
JMP Performance

whom

Community Trekker

Joined:

Jun 23, 2011

I am trying to go through a data table (dt1) that has about 10,000 rows. I have a For loop that runs from 1 to N Rows( dt1 ). It appeared to be hanging, so I put in a bunch of Show statements to see what was happening. After processing about 2,000 records, no more Show statements were being displayed in the log. I decided to wait, and after about 10 minutes the script finished and the rest of the Show statements were blasted into the log.

Does anyone know why, after about 6,000 Show statements, JMP looks like it isn't doing anything? Is there a parameter I can set to keep the Show statements coming?

Thanks
6 REPLIES
pmroz

Super User

Joined:

Jun 23, 2011

You might get better performance if you load columns into arrays, and then process the arrays. But that's not always the case, so caveat emptor.

my_list = column(dt, "mycolumn") << get values;
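
For instance, a rough sketch of the pattern (the column name is hypothetical, and I'm assuming a character column, so Get Values returns a list):

```jsl
// Pull the whole column out of the table once...
vals = Column( dt1, "mycolumn" ) << Get Values;

// ...then loop over the in-memory list instead of the data table
total = 0;
For( i = 1, i <= N Items( vals ), i++,
	total += Length( vals[i] )  // list indexing avoids repeated table access
);
```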

Regarding your issue, try putting a Wait( 0 ); statement after the Show command. Wait( 0 ) briefly yields control back to JMP, which gives it a chance to flush the pending log output.
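
Something like this (a sketch, assuming your loop index is i):

```jsl
For( i = 1, i <= N Rows( dt1 ), i++,
	// ... your per-row processing here ...
	Show( i );
	Wait( 0 );  // yield so JMP can flush the log before continuing
);
```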

Or, you could use the caption statement, like:

caption("Processing row " || char(i) );
wait(0);

Good luck!
ms

Super User

Joined:

Jun 23, 2011

For fast-executing iterations, the log window may not manage to update continuously. You can insert Wait(0) after each Show() to get a smooth update of the log. However, both the Show() and the Wait() commands will, of course, slow down execution.

Ten minutes is a long time, though. It's hard to tell what may be slowing things down without more information about the type of calculations. You can try increasing the available RAM or, if possible, writing more efficient code to improve performance.
pmroz

Super User

Joined:

Jun 23, 2011

Rather than display a caption every iteration, try showing it every 100 iterations with a Mod() test (note the == comparison; a single = is assignment in JSL):

If( Mod( i, 100 ) == 0,
	Caption( "Processing row " || Char( i ) );
	Wait( 0 );
);

Can you give us an inkling as to the calculations you're doing? That may help us solve the riddle.

I had a case recently where I was looping through a table and building several lists from the data. It appeared that larger tables increased processing time exponentially. I finally figured out (from a previous post) that JMP runs dramatically faster if you insert new values at the beginning of a list rather than at the end. Processing a 21,000-row table went from 34 minutes to 15 seconds!
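
In sketch form (my_list and new_value are placeholders; the final Reverse() restores the original order, if that matters to you):

```jsl
// Slow pattern for large lists: appending at the end inside the loop
// Insert Into( my_list, new_value );

// Faster pattern, per the tip above: insert at position 1 instead
Insert Into( my_list, new_value, 1 );

// After the loop, reverse once if you need the original order:
my_list = Reverse( my_list );
```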

Good luck!
Peter
whom

Community Trekker

Joined:

Jun 23, 2011

I had to step away from this to address some other issues... but I'm back trying to figure out what is happening...

I'm trying to loop through over 10,000 records in a JMP data table. There are several If statements that use the LENGTH and INDEX functions, plus a couple of bean-counting variables... nothing complicated!
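
Roughly this shape (the column name and thresholds here are made up for illustration; Contains() is JSL's substring test, similar in spirit to INDEX):

```jsl
count_a = 0;
count_b = 0;
For( i = 1, i <= N Rows( dt1 ), i++,
	val = Column( dt1, "part_id" )[i];  // "part_id" is a made-up column name
	If( Length( val ) > 8,
		count_a++
	);
	If( Contains( val, "-" ),
		count_b++
	);
);
```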

Out of frustration, I altered the parameters of the For loop to look at rows 10585 to 10590 only. I never got any of the Show statements to display in the log, and after 5 minutes I killed the job.

Can you give me the link to the post that you referenced about exponentially long execution times for large files?
pmroz

Super User

Joined:

Jun 23, 2011

Look at the following JMPer Cable issue, in the Tips and Techniques section:

http://www.jmp.com/about/newsletters/jmpercable/pdf/20_summer_2006.pdf

Regards,
Peter
When you use a construction such as a For loop from 1 to N Rows(), the execution time does not scale linearly with the number of rows. For a large number of rows it becomes much more efficient to use the implicit row reference: For Each Row().
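
A minimal sketch of the rewrite (the column and counter names are hypothetical, just to show the pattern):

```jsl
// For Each Row iterates the current data table implicitly,
// so there is no explicit index bookkeeping.
Current Data Table( dt1 );
count = 0;
For Each Row(
	If( Length( :my_column ) > 8,  // :my_column is a made-up column name
		count++
	)
);
Show( count );
```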