Optimize memory when working with a lot of data
Hello there!
I'm iterating many analyses in a loop - millions of them - and I need to store the results. These results are text values and numeric values. So far I have stored the results in a data table, but as the table grows, so does the memory use, and the analyses tend to slow down.
I'm looking for a way to save time and memory.
The table is already private (as suggested in another topic on the forum); that helps, but it's not sufficient.
I tried saving my text variables in a list and my numeric ones in a vector (which is pretty efficient in R). Maybe I did it wrong, but here it seems to increase the loop time.
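Roughly what I tried, as a simplified sketch (Sqrt() stands in for the real analysis and the names are made up):

```
nRuns = 1000;              // in reality this is millions
textResults = {};          // list for the text values
numResults = [];           // matrix (vector) for the numeric values
For( i = 1, i <= nRuns, i++,
    Insert Into( textResults, "result " || Char( i ) );  // append the text result
    numResults = numResults |/ Sqrt( i );                // append the numeric result
);
```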
Does anyone have a good idea?
I'm using JMP 14, but I have access to JMP 16.
Re: Optimize memory when working with a lot of data
Re: Optimize memory when working with a lot of data
Adding to @Jed_Campbell's reply, you can also use the profiling feature in the JMP Debugger to determine where the bottlenecks are.
Re: Optimize Memory working with many data
I use the Kmean plateform ( and open and close reports along the process) .
I'm use to use tick second () , I didn't know about Hp time() . It seems to be similar ?
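For reference, something like this is how I'd compare them (the loop is just dummy work to time):

```
t0 = Tick Seconds();   // seconds
h0 = HP Time();        // microseconds
For( i = 1, i <= 100000, i++,
    x = Sqrt( i )      // dummy work
);
Show( Tick Seconds() - t0 );      // elapsed seconds
Show( (HP Time() - h0) / 1e6 );   // elapsed seconds, converted from microseconds
```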
I'll check out the discussions you pointed to, thanks.
Re: Optimize memory when working with a lot of data
If you are running out of memory, watch out for private data tables that you forgot to close. (Try making them invisible rather than private to make sure you've closed them; invisible tables still show up in the Home Window.)
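For example, a minimal sketch (the table name is just a placeholder):

```
dt = New Table( "scratch", Invisible );   // invisible, but still listed in the Home Window
// ... fill and use dt ...
Close( dt, No Save );                     // close it explicitly when done
```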
If JMP runs slower and slower:
- If you have an old-school desktop with a disk light, keep an eye on it to see if it stays on a lot. You might be creating matrices that are really big: J(10000) makes a 10000 x 10000 (2D, not 1D) matrix.
- If you are building a string or matrix by concatenating to the end, that gets really slow as the string or matrix grows. Pre-allocate the matrix to the right size, or collect string parts in a list and call Concat Items() when done; see the sketch after this list.
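A minimal sketch of both ideas (Sqrt() is just placeholder work):

```
n = 100000;
nums = J( n, 1, 0 );        // pre-allocated n x 1 column vector; J( n ) alone would be n x n
parts = {};                 // collect string pieces in a list
For( i = 1, i <= n, i++,
    nums[i] = Sqrt( i );                  // fill in place, no growing
    Insert Into( parts, Char( i ) );      // defer the string concatenation
);
result = Concat Items( parts, "," );      // join once at the end
```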
Do use the JSL debugger's profiler as @Mark_Bailey suggests.
Toolbar buttons: Wrench (debugger), Clock (profiler), Go (run).
If you get lucky, there will be an obvious hotspot that might match something above.
Re: Optimize memory when working with a lot of data
Thanks Craige, that's really good advice; I had no clue that pre-allocating the size would help!
I'm pretty sure it's not about tables I forgot to close, because I had a step in my script where my tables were visible.