johanna_younous
Level III

Optimize Memory working with many data

Hello there!

I'm iterating many analyses in a loop, like millions, and I need to store the results. The results are text values and numeric values. So far I have stored the results in a data table, but as the size of the table increases so does the memory use, and the speed of the analyses tends to decrease.

I'm looking for a way to save time and memory.

The table is already private (as suggested in another topic on the forum); it makes things better, but it's not sufficient.

I tried to save my text variables in a list and my numeric ones in a vector (that is pretty efficient in R). Maybe I did it wrong, but here it seems to increase the time of the loop. Roughly, what I tried looks like the sketch below (the values are placeholders standing in for the real analysis output):
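
    nIter = 1000000;                 // however many analyses the loop runs
    textResults = {};                // empty list for the text results
    numResults = [];                 // empty matrix for the numeric results
    For( i = 1, i <= nIter, i++,
        // placeholder values standing in for the real analysis output
        Insert Into( textResults, "some text result" );
        numResults = numResults |/ 3.14;   // grows the vector one row at a time
    );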

Does anyone have a good idea?

I'm using JMP 14, but I have access to JMP 16.

5 REPLIES

Jed_Campbell

Re: Optimize Memory working with many data

Which specific platforms are you using? JMP 16 and 17 made improvements to the speed of many platforms.

Also, you can use the HP Time() function to test iterations. Some good discussions are here and here.
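
For example, a minimal sketch of timing a block of code with HP Time() (the loop body here is just placeholder work):

    start = HP Time();                               // HP Time() returns microseconds
    For( i = 1, i <= 100000, i++,
        x = Sqrt( i )                                // placeholder work to be timed
    );
    elapsedSeconds = (HP Time() - start) / 1000000;
    Show( elapsedSeconds );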

 

Mark_Bailey

Re: Optimize Memory working with many data

Adding to @Jed_Campbell's reply, you can also use the profiling feature in the JMP Debugger to determine where the bottlenecks are.

johanna_younous
Level III

Re: Optimize Memory working with many data

I use the K Means platform (and open and close reports along the process).

I usually use Tick Seconds(); I didn't know about HP Time(). Is it similar?

I'll check out the discussions you pointed to, thanks.

Craige_Hales
Super User

Re: Optimize Memory working with many data

If you are running out of memory, watch out for private data tables that you forgot to close. (Try making them invisible rather than private so you can make sure you've closed them; invisible tables still show up in the Home Window.)
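
For example (a minimal sketch with a made-up table and column name): create the scratch table invisible rather than private, and make sure it gets closed when you're done.

    // Invisible: hidden from view, but still listed in the Home Window,
    // so a leftover copy is easy to spot.
    dt = New Table( "scratch results", Invisible,
        New Column( "value", Numeric )
    );

    // ... fill and use dt ...

    Close( dt, NoSave );    // close scratch tables as soon as you're done with them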

If JMP runs slower and slower:

  • If you have an old-school desktop with a disk light, keep an eye on it to see if it stays on a lot. You might be creating matrices that are really big; J(10000) makes a 2D (not 1D) matrix.
  • If you are building a string or matrix by concatenating to the end, that gets really slow as the string or matrix gets larger. Pre-allocate the matrix to the right size, or use a list of string parts followed by Concat Items() when done; see the sketch after this list.
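
A rough sketch of both ideas, with placeholder sizes and values:

    n = 1000000;

    // Pre-allocate the numeric results: J( n, 1, . ) is an n x 1 vector of missing values.
    // Careful: J( 10000 ) with a single argument makes a 10000 x 10000 matrix.
    numResults = J( n, 1, . );

    // Collect string pieces in a list, then join them once at the end.
    textParts = {};

    For( i = 1, i <= n, i++,
        numResults[i] = i * 2;                          // placeholder numeric result
        Insert Into( textParts, "row " || Char( i ) );  // placeholder text piece
    );

    allText = Concat Items( textParts, "\!N" );         // one join instead of millions of appends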

Do use the JSL debugger's profiler as @Mark_Bailey suggests.

[Image: Wrench, Clock, Go buttons]

If you get lucky, there will be an obvious hotspot that might match something above.

 

Craige
johanna_younous
Level III

Re: Optimize Memory working with many data

Thanks Craige, that's really good advice. I had no clue pre-allocating the size would help!

I'm pretty sure it's not about tables I forgot to close, because I had a step in my script where my tables were visible.