I am working on a script that creates many graphs, determines a regression fit for each graph, and then saves all the regression coefficients to a table. The script performs the actions that I want, but comes to a grinding halt as the number of columns to evaluate increases. The script seems to retain the plots and panels generated within the For loop in the background, even when I try to close them inside the loop (I don't need the individual graphs or table box data once the plot has been sent to a journal for PDF export and the regression coefficients have been extracted).
When I step through the script using the debugger, it runs to the end without showing the individual windows, but when I stop the debugger (red square), all of the windows created inside the For loop appear. The script iterates over 600 columns, and the corresponding number of windows created in the background (set as invisible) brings JMP to a grinding halt over time.
When using a script to iterate through a dataset to perform some analysis or generate many graphs, what is the best way to close the "intermediate" windows, plots, etc. that are created, so that the memory burden is minimized?
Below is a partial snippet from the script where I iterate through the columns.
For( i = 7, i <= ncols, i++,
	biv = dt << Bivariate(
		Y( Column( colList[i] ) ),
		X( :Temperature ),
		Fit Line( {Line Color( "Medium Dark Red" )} ),
		Invisible
	);
	rbiv = biv << Report;
	rbiv[Frame Box( 1 )] << Frame Size( 550, 400 );
	rbiv << Page Break;
	rbiv << Journal;
	dtx = rbiv["Parameter Estimates"][Table Box( 1 )] << Make Combined Data Table;
	// dty = rbiv["Lack Of Fit"][Table Box( 1 )] << Make Combined Data Table;
	dtSlope << Concatenate( dtx, "Append to first table" );
	// dtFit << Concatenate( dty, dtSlope, "Append to first table" );
	// Close( dty, NoSave );
	Close( dtx, NoSave );
	biv << Close Window; // close the report only after the journal and estimates are captured
);
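For reference, the ordering I have been trying to move toward is: journal first, extract the estimates, then dispose of the helper table and the report window, with an occasional Wait( 0 ) so JMP gets a chance to actually release the closed windows. This is only a sketch using the same names as the snippet above (dt, colList, ncols, dtSlope); Close Window, Close( dt, NoSave ), and Wait() are standard JSL as far as I know, but the Wait( 0 ) throttle is my own workaround, not something from the documentation.

```jsl
For( i = 7, i <= ncols, i++,
	biv = dt << Bivariate(
		Y( Column( colList[i] ) ),
		X( :Temperature ),
		Fit Line(),
		Invisible
	);
	rbiv = biv << Report;
	rbiv << Journal;                      // capture the graph for later PDF export
	dtx = rbiv["Parameter Estimates"][Table Box( 1 )] << Make Combined Data Table;
	dtSlope << Concatenate( dtx, "Append to first table" );
	Close( dtx, NoSave );                 // dispose of the helper data table first
	biv << Close Window;                  // then dispose of the (invisible) report window
	If( Mod( i, 50 ) == 0, Wait( 0 ) );   // periodically yield so JMP can reclaim windows
);
```

Even with this ordering, the invisible windows still seem to pile up until the script finishes, which is what prompted the question.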