<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: building multiple neural network models in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338592#M58664</link>
    <description>&lt;P&gt;Yes! This is exactly what I want: a table with all the NN input parameters (e.g., number of TanH nodes, number of layers, etc.) that syncs with the NN platform. I have not tried it yet, and I do imagine it will take a while, but I am using a small 20 sample x 90 predictor "practice" dataset, so it might be doable. In reality, I've really only found that the type of activation and the number of nodes (as well as number of tours) actually lower my RMSE, with boosting not helping very much. But even then, this will be much faster than doing all ~500 models I planned to run (not nearly as many as I had intended to run yesterday thankfully since I want to stay under 4 nodes/activation type). I will try it out tomorrow and let everyone know how it goes.&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Tue, 01 Dec 2020 21:32:22 GMT</pubDate>
    <dc:creator>abmayfield</dc:creator>
    <dc:date>2020-12-01T21:32:22Z</dc:date>
    <item>
      <title>building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338337#M58618</link>
      <description>&lt;P&gt;I am using the neural platform to make predictions, and, being new to neural networking, I am only slightly familiar with all the input parameters: number of hidden layers, number of nodes/hidden layer, boosting models, learning rate, and tours. What I want to do is try to minimize RMSE and the validation model misclassification rate. What I've been doing is iteratively changing each parameter one by one, saving the model performance parameters, and pasting them into a new JMP table, but this is going to take days since there are so many combinations of layers, nodes, tours, etc. Would it be possible to write a script whereby JMP Pro builds, say, 1,000 models and dumps the data into a table so that I don't have to change each model input parameter manually?&amp;nbsp;&lt;/P&gt;&lt;P&gt;#Hidden layers: 1 or 2&lt;/P&gt;&lt;P&gt;#Sigmoidal nodes: 0, 1, 2, 3, or 4&lt;/P&gt;&lt;P&gt;#Linear nodes: 0, 1, 2, 3, or 4&lt;/P&gt;&lt;P&gt;#Radial nodes: 0, 1, 2, 3, or 4&lt;/P&gt;&lt;P&gt;#boosting models: no idea, 1-5 maybe?&lt;/P&gt;&lt;P&gt;#learning rate: 0-0.5 in 0.1 increments&lt;/P&gt;&lt;P&gt;#tours: 1, 5, 10, 20, 100&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;So that would be 2 x 5 x 5 x 5 x 5 x 6 x 5 = 37,500 (the learning rate spans six levels).&amp;nbsp;&lt;/P&gt;&lt;P&gt;I guess I could whittle a few of these down once I am more familiar with the dataset. Of course, having many nodes and a high number of boosting models doesn't make sense (nor do certain other combinations), but we're still talking about potentially hundreds of models worth testing. Surely this sort of model screening/comparing could be scripted, right?&lt;/P&gt;</description>
      <pubDate>Fri, 09 Jun 2023 21:59:29 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338337#M58618</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2023-06-09T21:59:29Z</dc:date>
    </item>
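The grid size above can be checked with a quick JSL one-liner (a sketch; the seven factors and their level counts are taken straight from the list in the post, with the learning rate contributing six levels):

```jsl
// Full-factorial count of the candidate settings listed above:
// layers (2) x sigmoid (5) x linear (5) x radial (5) x boosts (5)
// x learning rates (6: 0 to 0.5 by 0.1) x tours (5)
n = 2 * 5 * 5 * 5 * 5 * 6 * 5;
Show( n ); // n = 37500
```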
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338358#M58620</link>
      <description>&lt;P&gt;You could design an experiment with fewer levels to fit a second-order model of RMSE and then use the model to optimize the settings. Also, you can make each model in turn and then extract all the results at once by right-clicking on one of the results tables and selecting Make Into Combined Table.&lt;/P&gt;</description>
      <pubDate>Mon, 30 Nov 2020 20:08:50 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338358#M58620</guid>
      <dc:creator>Mark_Bailey</dc:creator>
      <dc:date>2020-11-30T20:08:50Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338521#M58645</link>
      <description>&lt;P&gt;Mark, thanks for your suggestions. I've tried the second one without success. I know the "make into combined table" option from other platforms, but maybe it was left out of the neural platform for some reason. Strangely, the "option" trick, whereby you hold down the option key, make a selection, and it applies to all parts of the report, does not work in the neural platform either; otherwise, that could speed things up.&lt;/P&gt;&lt;P&gt;It would be cool if I could have my neural network parameters (# nodes, # tours, etc.) as X's in a DOE, with the goal of minimizing RMSE, but it seems not to be possible because of the way you must enter the parameters in the particular boxes in the neural model launch. You'd have to do some really advanced scripting, and even then I'm not sure if DOE and Neural could communicate properly. But if anyone out there knows of a way I could get the neural platform to basically run the models reflected in this screenshot (a sub-selection of the grand total), I would greatly appreciate it. In other words, the values in the first screenshot table would need to go to the appropriate input boxes in the second screenshot (i.e., the neural model launch).&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 01 Dec 2020 15:00:04 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338521#M58645</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-01T15:00:04Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338523#M58646</link>
      <description>&lt;P&gt;BTW, the "option" trick to get all relevant features to show the same information as the feature you are actively selecting is actually the "command" trick on a Mac (hence why that didn't previously work).&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 01 Dec 2020 15:04:41 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338523#M58646</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-01T15:04:41Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338590#M58663</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; This is something I've been wanting to script up for a while, and it was really fun; your post gave me the fire to get it going, thanks! I've been wanting a way to tune the NN platform for a while, as it is very sensitive to the initial conditions, and it's rather slow to do manually.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; So, I adapted some scripts that I have made to automate the tuning process for other model platforms, for example, the boosted tree, bootstrap forest, and XGBoost platforms. The scripts are meant to be saved into your data table and run with the green hot button.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; Hopefully you know a bit about scripting and will know where to adapt these scripts to your specific situation/data structure/column names, etc. I have entered some comments to help explain what I'm doing in each larger section. I'm attaching two scripts -- one for continuous Y's and one for Nominal/Ordinal Y's. I did have to include a test of whether the output Y is nominal or ordinal, because the Number Col Box() value changes depending on this. I used the Big Class Families.jmp data table in the sample data as a test data set. I'm also including two output data tables I generated using either the numerical NN tuning or the nominal/ordinal NN tuning. I did nominal on :sex using :age, :weight, :height, and then I changed :age from Ordinal to Continuous and modeled age with :weight and :sex for the continuous NN tuning.&amp;nbsp;The fits and stats are horrible, but it's intended only for seeing whether the JSL works or not. 
I also added a :Validation column to the data table by stratifying on :age, just to check that it all worked correctly.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; I'm sure someone else could script it up more elegantly to test whether the Y() input column's modeling type is Continuous, Nominal, or Ordinal and then insert the appropriate Number Col Box() value to extract the data, all within a single JSL script.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; You will need to edit the Y() and X() inputs to the dt &amp;lt;&amp;lt; Neural() section, as well as the part where you get the name of the report window, as it's named after the variable you're modeling.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp;The NN tuning tables could in principle be treated like a space-filling DOE, where the lower and upper settings are whatever you want to set them to, except that some factors are constrained: N_Layers can only be 1 or 2 because JMP only allows 1 or 2 layers, Penalty_Method takes only the options that are valid in the NN platform, and Transform Covariates and Robust Fit only take values of 0 or 1 (off or on). I'm including the NN tuning table as well.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If you have any questions, let me know.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; Here's my code for the Nominal/Ordinal modeling type:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-jsl"&gt;Names Default To Here( 1 );

dt = Data Table( "Big Class Families" );//enter the name of the current data table
dt_parms = Data Table( "NN Tuning" );//enter name of parameter data table for tuning the NN

dt_results = dt_parms &amp;lt;&amp;lt; Subset( All rows, Selected Columns Only( 0 ) );//this copies the tuning table with all the different rows in it

//these commands create columns to record the fit results
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Training" );
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Training" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Validation" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Training" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Training" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Training" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Validation" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Training" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Training" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Validation" );

//This part creates a separate modeling-progress window; delete it if you don't want it.
i = 1;
imax = N Row( dt_parms );
dlgStatus = New Window( "Overall Modeling Progress",
	V List Box(
		Text Box( " Overall Modeling Progress ", &amp;lt;&amp;lt;Set Font Size( 12 ), &amp;lt;&amp;lt;Justify Text( "center" ), &amp;lt;&amp;lt;Set width( 200 ) ),
		dlg_gb = Graph Box( FrameSize( 200, 15 ), X Scale( 0, 100 ), Y Scale( 0, 1 ) ),
		tb = Text Box(
			"Current step " || Char( i ) || " of " || Char( imax ),
			&amp;lt;&amp;lt;Set Font Size( 12 ),
			&amp;lt;&amp;lt;Justify Text( "center" ),
			&amp;lt;&amp;lt;Set width( 200 )
		)
	)
);
dlg_gb[Axis Box( 2 )] &amp;lt;&amp;lt; Delete;
dlg_gb[Axis Box( 1 )] &amp;lt;&amp;lt; Delete;

//This big for loop goes through your tuning table putting each value into the fit to generate a new NN model and save the statistics
For( i = 1, i &amp;lt;= imax, i++,
	prog = (i / imax) * 100;//take out if you don't want the update window
	dlgStatus[FrameBox( 1 )] &amp;lt;&amp;lt; Add Graphics Script( {Fill Color( "purple" ), Rect( 0, 1, prog, 0, 1 )} );//take out if you don't want the update window
	tb &amp;lt;&amp;lt; set text( "Current step " || Char( i ) || " of " || Char( imax ) );//take out if you don't want the update window
	Nlayer = dt_parms:N_Layers[i];
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);//zero out the second-layer nodes when there is only one layer
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);//can't boost if you have two layers
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt &amp;lt;&amp;lt; Neural(
       Y( :sex ),
		X(
		:height,
		:age,
		:weight
		), 
       Validation ( :Validation ),
       Informative Missing(0), 
       Transform Covariates(^TCov^),
       Fit(
       	NTanH(^NTH1^),
       	NLinear(^NL1^),
       	NGaussian(^NG1^),
       	NTanH2(^NTH2^),
       	NLinear2(^NL2^),
       	NGaussian2(^NG2^),
       	Transform Covariates(^TCov^),
       	Penalty Method(\!"PMethod\!"),
       	Number of Tours(^Ntours^),
       	N Boost(^Nboosts^),
       	Learning Rate(^LR^)
       )),
       Go,
              invisible
       ) &amp;lt;&amp;lt; Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt &amp;lt;&amp;lt; GetName || " - " || "Neural of sex" );//here, you'll want to change "sex" to whatever your response is
	
	//a simple test of whether the response is continuous ("age") or nominal/ordinal ("sex"), based on the report window title, since the Number Col Box() index differs

	If( !Is Missing( Regex( w &amp;lt;&amp;lt; Get Window Title, "age" ) ) == 1,
		ncol_box = 14
	);
	If( !Is Missing( Regex( w &amp;lt;&amp;lt; Get Window Title, "sex" ) ) == 1,
		ncol_box = 6
	);

	
	
	T_stats = w[Outline Box( 3 ), Number Col Box( 1 )] &amp;lt;&amp;lt; Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	V_stats = w[Outline Box( 3 ), Number Col Box( ncol_box )] &amp;lt;&amp;lt; Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	report &amp;lt;&amp;lt; Close Window;
	
	//training results
	dt_results:Generalized R² Training[i] = T_stats[1];
	dt_results:Entropy R² Training[i] = T_stats[2];
	dt_results:RMSE Training[i] = T_stats[3];
	dt_results:Mean Abs Dev Training[i] = T_stats[4];
	dt_results:Misclassification Rate Training[i] = T_stats[5];
	dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
	dt_results:Sum Freq Training[i] = T_stats[7];
	
	//validation results
	dt_results:Generalized R² Validation[i] = V_stats[1];
	dt_results:Entropy R² Validation[i] = V_stats[2];
	dt_results:RMSE Validation[i] = V_stats[3];
	dt_results:Mean Abs Dev Validation[i] = V_stats[4];
	dt_results:Misclassification Rate Validation[i] = V_stats[5];
	dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
	dt_results:Sum Freq Validation[i] = V_stats[7];
	
	
);
dlgStatus &amp;lt;&amp;lt; closewindow();&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp; One thing I don't understand: I get this warning in the log when I run it for the NN platform, but I've never gotten it from my other platforms. I can't follow where the warning comes from, but the script still runs correctly. Maybe someone can fix that part of my code?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Unexpected ",". Perhaps there is a missing ")".&lt;BR /&gt;Trying to parse operand for "&amp;lt;&amp;lt;" operator.&lt;BR /&gt;Line 23 Column 10: ))►,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As a last note: be careful with the settings, as the modeling can take a VERY long time to run if you use huge numbers of boosts, tours, or transfer functions. Just be cautious not to go overboard.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Tue, 01 Dec 2020 20:52:23 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338590#M58663</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-01T20:52:23Z</dc:date>
    </item>
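The window-title Regex test in the script above could instead query the response column's modeling type directly, which is the more elegant version the post says someone might script up. A minimal sketch, assuming the standard JSL `Get Modeling Type` message and the Big Class Families example from the post (the Number Col Box indices 14 and 6 are the ones used above):

```jsl
Names Default To Here( 1 );
dt = Data Table( "Big Class Families" );
yName = "sex"; // response column -- adjust to your data

// Pick the Number Col Box index from the response's modeling type
// instead of Regex-matching the report window title
// (per the script above: 14 for a continuous Y, 6 for nominal/ordinal).
mt = Column( dt, yName ) << Get Modeling Type;
ncol_box = If( mt == "Continuous", 14, 6 );
```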
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338592#M58664</link>
      <description>&lt;P&gt;Yes! This is exactly what I want: a table with all the NN input parameters (e.g., number of TanH nodes, number of layers, etc.) that syncs with the NN platform. I have not tried it yet, and I do imagine it will take a while, but I am using a small 20 sample x 90 predictor "practice" dataset, so it might be doable. In reality, I've really only found that the type of activation and the number of nodes (as well as number of tours) actually lower my RMSE, with boosting not helping very much. But even then, this will be much faster than doing all ~500 models I planned to run (not nearly as many as I had intended to run yesterday thankfully since I want to stay under 4 nodes/activation type). I will try it out tomorrow and let everyone know how it goes.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 01 Dec 2020 21:32:22 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338592#M58664</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-01T21:32:22Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338619#M58668</link>
      <description>&lt;P&gt;Well, I failed, but I think it's probably because of some tiny scripting mistake. I used your NN tuning table with my data (attached) and made all the necessary updates (I used three training+validation+test samples). If you run the last script ("neural network comparison"), it will build the new table, but no analyses are run. I just included a few X terms, but it will actually be all 86 of the columns in the "standardized data" group. When I use the debugger, it says something is going wrong with the&lt;/P&gt;&lt;P&gt;"&lt;SPAN class="s1"&gt;For&lt;/SPAN&gt;( i = &lt;SPAN class="s2"&gt;1&lt;/SPAN&gt;, i &amp;lt;= imax, i++," area. I am hoping/guessing a more competent script writer can probably find the issue in seconds!&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 02:34:30 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338619#M58668</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-02T02:34:30Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338671#M58676</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; Thanks for sharing the data table; it helped in debugging the changes you made in the script.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; A few things I noticed:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;You needed to add additional columns for the "Test" data set statistics you want to record.&lt;/LI&gt;&lt;LI&gt;You accidentally removed the definition of imax, which is the number of rows of the parameter table.&lt;/LI&gt;&lt;LI&gt;You also accidentally changed the call to the NN platform: where the script says dt &amp;lt;&amp;lt; Neural(), it sends the command to perform the neural net modeling on the data table "dt". You had changed it to something else.&lt;/LI&gt;&lt;LI&gt;The definition of the window "w" needs the full name of the NN window. If you're running only a single X, JMP apparently puts that X in the window name, so it needs to appear in both the GetName command and the IF statement below it; with more than one X, you don't need the "by......" part in either the window name or the IF statement.&lt;/LI&gt;&lt;LI&gt;The last part is to make sure that the stats for the Training, Validation, and Test data are correctly called in the Number Col Box() command.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp; The code below works on your data table (also attached with the modified script). It works with the NN Tuning table I gave earlier.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If you are really going to use all 86 columns, you need to enter them manually in the X(col1, col2,...) part of the script. 
You should then also edit the GetName part, because it won't include all 86 names; it'll just be the window name with the response column.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-jsl"&gt;Names Default To Here( 1 );
dt = Current Data Table();
dt_parms = Data Table( "NN Tuning" );
dt_results = dt_parms &amp;lt;&amp;lt; Subset( All rows, Selected Columns Only( 0 ) );
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Training" );
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Test" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Training" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Test" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Training" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Validation" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Test" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Training" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Test" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Training" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Test" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Training" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Validation" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Test" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Training" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Validation" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Test" );

imax = N Row( dt_parms );

For( i = 1, i &amp;lt;= imax, i++,
	Nlayer = dt_parms:N_Layers[i];
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt &amp;lt;&amp;lt; Neural(
        Y( :health designation ),
		 X(
		:OFA...22_c0_g2_i1.p1
        ), 
       Validation ( :Validation with test ),
       Informative Missing(0), 
       Transform Covariates(^TCov^),
       Fit(
       	NTanH(^NTH1^),
       	NLinear(^NL1^),
       	NGaussian(^NG1^),
       	NTanH2(^NTH2^),
       	NLinear2(^NL2^),
       	NGaussian2(^NG2^),
       	Transform Covariates(^TCov^),
       	Penalty Method(\!"PMethod\!"),
       	Number of Tours(^Ntours^),
       	N Boost(^Nboosts^),
       	Learning Rate(^LR^)
       )),
       Go,
              invisible
       ) &amp;lt;&amp;lt; Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt &amp;lt;&amp;lt; GetName || " - " || "Neural of health designation by OFA...22_c0_g2_i1.p1" );
	If( !Is Missing( Regex( w &amp;lt;&amp;lt; Get Window Title, "health designation by OFA...22_c0_g2_i1.p1" ) ) == 1,
		ncol_box_v = 10;
		ncol_box_test = 19;
	);
	T_stats = w[Outline Box( 3 ), Number Col Box( 1 )] &amp;lt;&amp;lt; Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	V_stats = w[Outline Box( 3 ), Number Col Box( ncol_box_v )] &amp;lt;&amp;lt; Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	Test_stats = w[Outline Box( 3 ), Number Col Box( ncol_box_test )] &amp;lt;&amp;lt; Get;
	report &amp;lt;&amp;lt; Close Window;
	
	dt_results:Generalized R² Training[i] = T_stats[1];
	dt_results:Entropy R² Training[i] = T_stats[2];
	dt_results:RMSE Training[i] = T_stats[3];
	dt_results:Mean Abs Dev Training[i] = T_stats[4];
	dt_results:Misclassification Rate Training[i] = T_stats[5];
	dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
	dt_results:Sum Freq Training[i] = T_stats[7];
	
	dt_results:Generalized R² Validation[i] = V_stats[1];
	dt_results:Entropy R² Validation[i] = V_stats[2];
	dt_results:RMSE Validation[i] = V_stats[3];
	dt_results:Mean Abs Dev Validation[i] = V_stats[4];
	dt_results:Misclassification Rate Validation[i] = V_stats[5];
	dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
	dt_results:Sum Freq Validation[i] = V_stats[7];
	
	dt_results:Generalized R² Test[i] = Test_stats[1];
	dt_results:Entropy R² Test[i] = Test_stats[2];
	dt_results:RMSE Test[i] = Test_stats[3];
	dt_results:Mean Abs Dev Test[i] = Test_stats[4];
	dt_results:Misclassification Rate Test[i] = Test_stats[5];
	dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
	dt_results:Sum Freq Test[i] = Test_stats[7];
);&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 13:38:04 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338671#M58676</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-02T13:38:04Z</dc:date>
    </item>
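For the "all 86 columns" case, typing every column into X() by hand can likely be avoided if the predictors live in a named column group, as the "standardized data" group mentioned earlier suggests they do. A sketch, assuming that group exists and that the `Get Column Group` data table message is available in your JMP version:

```jsl
Names Default To Here( 1 );
dt = Current Data Table();

// Pull the predictor columns from the "standardized data" group
// (assumed to exist) rather than typing all 86 names into X().
xcols = dt << Get Column Group( "standardized data" );

// Build an X( col1, col2, ... ) expression from the returned list.
xExpr = Expr( X() );
For( k = 1, k <= N Items( xcols ), k++,
	Insert Into( xExpr, xcols[k] )
);

// xExpr can then be substituted into the Neural() launch, e.g. by
// building the launch with Substitute() on an expression instead of
// the Eval Insert() string used in the scripts above.
```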
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338687#M58678</link>
      <description>&lt;P&gt;Aha! Thanks so much for redoing it for me. I clearly have no knowledge of scripting, so I just changed "Neural" to the name of my data table... which obviously wouldn't work. Once I changed the name of my data table to "Neural," added the Test columns, and made the other changes you recommended, it worked.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Now what I might do is use DOE to make a conditionally constrained factorial design (since, as you noted, you can't boost with multiple hidden layers) and run a few hundred models on my work computer (so as not to overly tax my 8-GB RAM laptop).&amp;nbsp;&lt;/P&gt;&lt;P&gt;I feel like your script needs to become a GUI or add-in (any takers?). JMP Pro 16, which I am beta-testing, has a really great "model screening" tool, but I think it tests across modeling types (e.g., PLS vs. Gen-Reg), not WITHIN the potentially thousands of models that could be built within each modeling type (or else it might never stop running). I have heard of the "simulator," but I think that does something different (not generating tons of simulated models), though I could be wrong. Again, thanks so much for your help. This will surely save me DAYS of time!&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 14:27:51 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338687#M58678</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-02T14:27:51Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338816#M58685</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; Glad that it fixed your problem.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; With the tree-based modeling (XGBoost, boosted trees, and bootstrap forest), you can easily use the tuning table GUI within JMP to run many models, though one downside is that the GUI chews up a lot of RAM. The nice thing about running it by script is that you can run tens of thousands of different parameter settings without eating up much RAM, just CPU. I've run up to 40k different settings with bootstrap forest (which is slow), and it can take a few days on my work laptop (32 GB RAM and a 2 GHz CPU).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; I am also an early adopter for JMP 16, and the model screening is nice, but as you point out, it's not tuning any of the models, which can be a bit misleading. It just uses the default settings, which aren't always the best settings for a predictive model. It can at least help you determine which platforms are worth the time and effort to tune further and which aren't worth looking into.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; You might also want to consider the SVM platform for your nominal data, as it is supposed to be a good method for that kind of data.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; As for the simulator, that is something different: it is meant as a way to test the estimates under different initial conditions (randomization). 
This gives a distribution of values for the estimate along with a confidence interval for the original estimate, and you can then decide whether the model with the given estimates is good or not.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; As for the DOE, you could do a factorial or even a space-filling design with the factors set appropriately, and I think that platform also allows conditional constraints. Also, you might consider doing a predictor screening or bootstrapping on the predictors to see whether you actually need all 89 of them or can get by with a smaller subset.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; As a last note, since you have a short and wide data table, you might want to consider a leave-one-out approach for validation instead of a validation column. If you do that, you'll have to edit the script accordingly.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Good luck!&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 15:18:05 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338816#M58685</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-02T15:18:05Z</dc:date>
    </item>
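The leave-one-out suggestion in the post above can be illustrated outside JMP. The sketch below is plain Python rather than JSL, with a deliberately trivial placeholder model (majority-class prediction) and invented labels; it only shows the mechanics of leaving each row out in turn, which is why LOO suits a short table like a 20-sample practice set:

```python
# Leave-one-out (LOO) validation sketch: with only ~20 rows, each of the
# n fits trains on n-1 rows, so no data is "wasted" on a large holdback
# set. The model here is a trivial placeholder (predict the majority
# class of the remaining rows), purely for illustration.

def loo_misclassification(labels):
    """Return the LOO misclassification rate for a majority-class predictor."""
    n = len(labels)
    errors = 0
    for i in range(n):
        train = labels[:i] + labels[i + 1:]      # leave row i out
        pred = max(set(train), key=train.count)  # majority class of the rest
        errors += (pred != labels[i])
    return errors / n

# 20 hypothetical "corals", echoing the 20-sample practice data set.
labels = ["healthy"] * 13 + ["unhealthy"] * 7
rate = loo_misclassification(labels)
```

In JMP the equivalent would be handled through the platform's validation options rather than an explicit loop; the point is only that every row serves as training data in all but one of the fits.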
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338836#M58686</link>
      <description>&lt;P&gt;More excellent suggestions, especially regarding the type of validation. I see that Neural supports holdback, KFold, or row exclusion (whereby I'm guessing I'd just randomly choose a few rows to leave out). If I wanted to change the type of validation in the script, would I just change the "Validation ( :Validation with test )," line? And would JMP treat the held-back samples as equivalent to validation samples? If not, I'd need to remove the validation columns (or rename them), and the test columns would have to go as well.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 15:26:48 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338836#M58686</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-02T15:26:48Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338904#M58689</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; KFold is also a great way to validate the short data set. Yes, for row exclusion you would set a row state, which you could do by selecting the "test" rows and setting their row state to "excluded". If you use KFold or Holdback, JMP uses those rows for validation of the data, not for testing. Hopefully you also have a separate data set you can use as the test data, rather than keeping it in the same data table as the training and validation data.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; I would just use KFold for validation and then comment out the parts of the script that you don't need. Below is the edited version with a 5-fold (KFold) validation setting for the NN platform. It should work; it did for me.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-jsl"&gt;Names Default To Here( 1 );
dt = Current Data Table();
dt_parms = Data Table( "NN Tuning" );
dt_results = dt_parms &amp;lt;&amp;lt; Subset( All rows, Selected Columns Only( 0 ) );
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Training" );
dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Test" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Training" );
dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Test" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Training" );
dt_results &amp;lt;&amp;lt; New Column( "RMSE Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "RMSE Test" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Training" );
dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Test" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Training" );
dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Test" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Training" );
dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Test" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Training" );
dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Validation" );
//dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Test" );

imax = N Row( dt_parms );

For( i = 1, i &amp;lt;= imax, i++,
	Nlayer = dt_parms:N_Layers[i];
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt &amp;lt;&amp;lt; Neural(
        Y( :health designation ),
		 X(:OFA...22_c0_g2_i1.p1 ), 
       Validation ( \!"KFold\!", 5 ),
       Informative Missing(0), 
       Transform Covariates(^TCov^),
       Fit(
       	NTanH(^NTH1^),
       	NLinear(^NL1^),
       	NGaussian(^NG1^),
       	NTanH2(^NTH2^),
       	NLinear2(^NL2^),
       	NGaussian2(^NG2^),
       	Transform Covariates(^TCov^),
       	Penalty Method(\!"PMethod\!"),
       	Number of Tours(^Ntours^),
       	N Boost(^Nboosts^),
       	Learning Rate(^LR^)
       )),
       Go,
              invisible
       ) &amp;lt;&amp;lt; Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt &amp;lt;&amp;lt; GetName || " - " || "Neural of health designation by OFA...22_c0_g2_i1.p1" );
	If( !Is Missing( Regex( w &amp;lt;&amp;lt; Get Window Title, "health designation by OFA...22_c0_g2_i1.p1" ) ) == 1,
		ncol_box_v = 10;
		//ncol_box_test =19;
	);
	T_stats = w[Outline Box( 3 ), Number Col Box( 1 )] &amp;lt;&amp;lt; Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	V_stats = w[Outline Box( 3 ), Number Col Box( ncol_box_v )] &amp;lt;&amp;lt; Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	//Test_stats = w[Outline Box( 3 ), Number Col Box( ncol_box_test )] &amp;lt;&amp;lt; Get;
	report &amp;lt;&amp;lt; Close Window;
	
	dt_results:Generalized R² Training[i] = T_stats[1];
	dt_results:Entropy R² Training[i] = T_stats[2];
	dt_results:RMSE Training[i] = T_stats[3];
	dt_results:Mean Abs Dev Training[i] = T_stats[4];
	dt_results:Misclassification Rate Training[i] = T_stats[5];
	dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
	dt_results:Sum Freq Training[i] = T_stats[7];
	
	dt_results:Generalized R² Validation[i] = V_stats[1];
	dt_results:Entropy R² Validation[i] = V_stats[2];
	dt_results:RMSE Validation[i] = V_stats[3];
	dt_results:Mean Abs Dev Validation[i] = V_stats[4];
	dt_results:Misclassification Rate Validation[i] = V_stats[5];
	dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
	dt_results:Sum Freq Validation[i] = V_stats[7];
	
	//dt_results:Generalized R² Test[i] = Test_stats[1];
	//dt_results:Entropy R² Test[i] = Test_stats[2];
	//dt_results:RMSE Test[i] = Test_stats[3];
	//dt_results:Mean Abs Dev Test[i] = Test_stats[4];
	//dt_results:Misclassification Rate Test[i] = Test_stats[5];
	//dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
	//dt_results:Sum Freq Test[i] = Test_stats[7];
);&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Good luck!,&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 15:45:54 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338904#M58689</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-02T15:45:54Z</dc:date>
    </item>
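For readers outside JMP, the "NN Tuning" table the script walks row by row is just a cartesian product of parameter levels. A hedged Python sketch of building such a grid (the parameter names and levels are illustrative, echoing the node/tour ranges discussed in this thread; the actual model fits are omitted):

```python
# Enumerate a tuning grid the way a pre-built tuning table would list it:
# one dict per row, one key per parameter. itertools.product generates
# every combination of the listed levels.
from itertools import product

grid = {
    "n_tanh":   [0, 1, 2, 3, 4],
    "n_linear": [0, 1, 2, 3, 4],
    "n_gauss":  [0, 1, 2, 3, 4],
    "n_tours":  [1, 5, 10, 20, 100],
}

def all_settings(grid):
    """Yield one dict per row of the implied tuning table."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

# Drop nonsensical rows, e.g. a network with zero nodes of every type.
settings = [s for s in all_settings(grid)
            if s["n_tanh"] + s["n_linear"] + s["n_gauss"] > 0]
```

Filtering the grid up front, as in the last line, is the scripted analogue of "whittling down" combinations that don't make sense before spending CPU time on them.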
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338933#M58696</link>
      <description>&lt;P&gt;Wow, I can't believe I've run 300 neural network models before lunch, thanks to your scripting help. I hope this gets picked up and "kudo'd": as more and more people see the value of the NN platform, I imagine they will gravitate toward something like this, especially with a brand-new dataset where you don't really know where to "dive in" among the bewildering number of NN input parameters that can be tweaked (sometimes with huge differences resulting from, for instance, just increasing the learning rate from 0.05 to 0.1). I will play around with the different validation approaches next (validation column vs. holdback vs. KFold). I definitely need to be careful given my (currently) small sample size (I'm hoping to use these preliminary data to get funds to add more "rows," i.e., corals).&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 02 Dec 2020 17:18:13 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338933#M58696</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-02T17:18:13Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338946#M58697</link>
      <description>Glad to hear it's working out for you. Good luck with the modeling! You might want to check other model platforms as well and then compare models on a test data set to see which is the best predictor.&lt;BR /&gt;&lt;BR /&gt;DS</description>
      <pubDate>Wed, 02 Dec 2020 17:58:32 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/338946#M58697</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-02T17:58:32Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339025#M58707</link>
      <description>&lt;P&gt;I have a (likely simple) scripting question. Say I want to change the input dt to another one. Shouldn't I just be able to copy and paste the script, and substitute the dt from "Neural" to something else?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p1"&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE class=" language-jsl"&gt;str = Eval Insert(
"report = (dt &amp;lt;&amp;lt; Neural (
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Y( :health designation ),
        X(&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="p2"&gt;If I delete the original "Neural" file and substitute it with another table with the exact same information and a different name, the script won't work! Is it necessary to have the exact same data in two data tables, one of which is named "Neural?" Maybe I missed that part!&lt;/P&gt;</description>
      <pubDate>Thu, 03 Dec 2020 12:53:35 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339025#M58707</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-03T12:53:35Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339113#M58721</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If you go back and review the script, the second line defines the data table to be used during the analysis. The JSL code dt = Current Data Table() defines a local variable "dt" that is then used in the rest of the script instead of having to call the data table by name. Similarly, dt_parms is the local variable assigned to the NN tuning table, "NN Tuning". Those two tables are really the only things that you feed to the script. The rest is all internal.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If you copy/paste the JSL code into a new data table, it should work automatically because you're running it locally in that data table.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; You should not modify the call dt &amp;lt;&amp;lt; Neural(), except to define your Y()'s and X()'s, and maybe the Validation() method. The JSL code dt &amp;lt;&amp;lt; Neural() is a command telling JMP to perform NN modeling on the data table "dt"; hence the definition of dt at the beginning of the code.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; You will also need to modify the w = Window() call to reflect what the title of the NN report window will be, as well as the Regex() below it. 
Finally, depending on whether or not you will actually have the "test" data set as part of a validation column, you'll either want to comment out (with // at the start of the line) or un-comment the lines near the beginning of the code that create the Test result columns, as well as the Test_stats definition and the lines where the code writes those results to the new output data table.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; Other than that, you shouldn't have to edit much at all.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; With regard to your last question, it depends on how you're implementing the code. If you're running it from within the data table, i.e., it's saved in the script portion of the data table and you're running it via the green hot button, then you would only need to change the Y() and X()'s that you're using as response and predictors to match the column names from the new data table. You will of course need to have the "NN Tuning" table open. On the other hand, if you're running this as a separate script, then you might want to change the definition of dt from dt = Current Data Table() to dt = Data Table( " " ), where you'd insert the name of the new data table between the quotes.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;The "Neural" you're referring to is not a data table but the JMP command to run a neural net model on a specified data table. The "&amp;lt;&amp;lt;" operator means to perform the action on its right on the object on its left. You can see that near the start of the code with the New Column() command: where it says dt_results &amp;lt;&amp;lt; New Column(), the code is telling JMP to create a new column in the object dt_results, which is the subset table generated from the dt_parms data table.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!,&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Thu, 03 Dec 2020 13:39:33 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339113#M58721</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-03T13:39:33Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339116#M58722</link>
      <description>OK, great. This answers a few additional questions I had. Now the only thing I'm struggling with is, when I copy, paste, and run the script in a new data table, I get an outline box error, which you allude to in the script comments. "expected character arg 1 in access or evaluation of outline box." What does that mean?? I've increased and decreased the numbers, but I get the same error message every time, and the script will not run. I don't even know what an outline box is. It's weird that the tree structure is identical. I just changed the file name. The file name is longer. Maybe that means I need a larger outline box?</description>
      <pubDate>Thu, 03 Dec 2020 14:00:17 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339116#M58722</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-03T14:00:17Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339117#M58723</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; The outline box error probably comes from the point where the script tries to grab the statistic values from the NN report window. Can you share the new data table and code with me? I can troubleshoot it and see if I can make it easier to port from one data table to another.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; I think it might have to do with how I've set up the part that checks the data column type and sets the ncol_box variable: if the response is nominal, this value is different from what it is for ordinal or even numerical columns.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If you can share the data table and code that give you this error, I'm sure I can fix it. If you want, you could anonymize the data table first: Tables &amp;gt; Anonymize.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Thu, 03 Dec 2020 14:17:31 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339117#M58723</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-03T14:17:31Z</dc:date>
    </item>
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339118#M58724</link>
      <description>&lt;P&gt;It's weird. If I just keep running the script, it will eventually work!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Another strange thing: no matter what I change the holdback percentage to, it always gives me 11 training vs. 9 validation (~55/45%). Is it not possible to do, say, 70/30%? I have a strong feeling this is because I simply entered in the command incorrectly, rather than it being impossible!&lt;/P&gt;</description>
      <pubDate>Thu, 03 Dec 2020 14:26:45 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339118#M58724</guid>
      <dc:creator>abmayfield</dc:creator>
      <dc:date>2020-12-03T14:26:45Z</dc:date>
    </item>
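As a quick sanity check on the split sizes in the question above: with 20 rows, a 30% holdback should put roughly 6 rows in validation and 14 in training, not 9 and 11. A minimal Python sketch of that arithmetic (JMP's own rounding rules may differ slightly; this is only the expected back-of-envelope split):

```python
# Expected holdback split sizes for a small table. With n_rows = 20,
# a 0.30 holdback fraction gives a 14/6 training/validation split,
# while the observed 11/9 split corresponds to roughly 45% holdback.
def holdback_split(n_rows, holdback_fraction):
    """Return (n_training, n_validation) for a simple rounded holdback."""
    n_validation = round(n_rows * holdback_fraction)
    return n_rows - n_validation, n_validation

train, valid = holdback_split(20, 0.30)
```

Seeing 11/9 regardless of the requested percentage is therefore a sign the holdback argument isn't being applied, which is consistent with the command-syntax issue discussed in this thread.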
    <item>
      <title>Re: building multiple neural network models</title>
      <link>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339164#M58730</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/12111"&gt;@abmayfield&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; Thanks for sharing the data table. I was able to find a few things.&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;I found the error that kept writing to the log window and fixed it. It's in the str call: a parenthesis had to be moved to a different location.&lt;/LI&gt;&lt;LI&gt;I found out that when doing K-fold or Holdback, you need to change Validation() to Validation Method(). Then it works and proportions the data appropriately.&lt;/LI&gt;&lt;LI&gt;I recoded how it tests for the modeling type of the column and made some associated changes.&lt;/LI&gt;&lt;LI&gt;I also wrote it so the same code can be used for Nominal, Ordinal, and Continuous responses. This can be done simply by changing the variable y_resp -- the Y response column name. It does all the rest automatically.&lt;/LI&gt;&lt;LI&gt;I also added a title to the results data table that changes with the "host" data table and the validation type, so you can run all three methods and there will be no naming interference from JMP.&lt;/LI&gt;&lt;LI&gt;The three NN fitting scripts in your data table can now be copy-pasted from one data table to the next without issues. I tested it with the first data table you sent, and it worked just fine.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&amp;nbsp; It's getting a little more automated now, and is a bit improved. I would prefer to have it check the validation method and then generate the extra columns and assign values accordingly. Maybe in the next roll-out!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here's what the code looks like in general. Notice that I was testing on the temperature column to check the continuous modeling type.&lt;/P&gt;&lt;PRE&gt;&lt;CODE class=" language-jsl"&gt;Names Default To Here( 1 );
dt = Current Data Table();
dt_parms = Data Table( "NN Tuning 8" );
dtp_name = dt_parms &amp;lt;&amp;lt; get name;
dt_name = dt &amp;lt;&amp;lt; get name;

//edit the line below so that it has the name of the column you want to model.
y_resp = "temperature (1, 2, 3)";
response_col = Column( dt, Eval( Eval Expr( Expr( y_resp ) ) ) );

dt_results = dt_parms &amp;lt;&amp;lt; Subset( Output Table( "Output of " || dtp_name || " with (validation type)" || " for " || dt_name), All rows, Selected Columns Only( 0 ) );

//This tests the response column type to automatically set the values for pulling the data out of the report window.
//It also will create the appropriate columns depending on the modeling type of the response column.
coltype = response_col &amp;lt;&amp;lt; Get Modeling Type;
If( coltype == "Ordinal",
	o_box = 3;
	ncol_box_v = 10;
	ncol_box_test = 19;
	dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Test" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Training" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Test" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Training" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Test" );
);
If( coltype == "Nominal",
	o_box = 3;
	ncol_box_v = 10;
	ncol_box_test = 19;
	dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Generalized R² Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Entropy R² Test" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Training" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Misclassification Rate Test" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Training" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Test" );
);
If( coltype == "Continuous",
	o_box = 3;
	ncol_box_v = 2;
	ncol_box_test = 3;
	dt_results &amp;lt;&amp;lt; New Column( "R² Training" );
	dt_results &amp;lt;&amp;lt; New Column( "R² Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "R² Test" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Training" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "RMSE Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Mean Abs Dev Test" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Training" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "-LogLiklihood Test" );
	dt_results &amp;lt;&amp;lt; New Column( "SSE Training" );
	dt_results &amp;lt;&amp;lt; New Column( "SSE Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "SSE Test" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Training" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Validation" );
	dt_results &amp;lt;&amp;lt; New Column( "Sum Freq Test" );
);


imax = N Row( dt_parms );

For( i = 1, i &amp;lt;= imax, i++,
	Nlayer = dt_parms:N_Layers[i];
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt &amp;lt;&amp;lt; Neural(
        Y( response_col ),
		 X(
		 :OFA...22_c0_g2_i1.p1,
:OFA...47_c0_g1_i1.p1,
:OFA...73_c1_g1_i3.p1,
:OFA...72_c1_g1_i1.p1,
:OFA...5_c1_g2_i15.p1,
:OFA...78_c2_g1_i3.p1,
:OFA...51_c4_g1_i1.p1,
:OFA...38_c0_g1_i8.p1,
:OFA...19_c0_g1_i1.p1,
:OFA...22_c2_g1_i4.p1,
:OFA...27_c1_g3_i5.p1,
:OFA...82_c3_g2_i3.p1,
:OFA...11_c1_g1_i7.p1,
:OFA...50_c2_g4_i1.p3,
:OFA...18_c1_g1_i7.p1,
:OFA...66_c2_g1_i3.p1,
:OFA...85_c0_g1_i2.p1,
:OFA...99_c5_g1_i6.p1,
:OFA...03_c2_g1_i1.p1,
:SYM...24_c0_g1_i1.p1,
:SYM...75_c0_g2_i1.p1,
:SYM...42_c0_g1_i1.p1,
:SYM...04_c0_g1_i1.p1,
:SYM...97_c0_g1_i1.p1,
:SYM...13_c0_g1_i1.p1,
:SYM...66_c0_g1_i1.p1,
:SYM...72_c0_g6_i1.p1,
:SYM...13_c0_g1_i1.p1 2,
:SYM...33_c0_g1_i1.p1 3,
:SYM...43_c0_g1_i1.p1,
:SYM...51_c0_g1_i1.p1,
:SYM...65_c0_g1_i1.p1 2 
), 
       Validation ( :Validation with test ),
       Informative Missing(0), 
       Transform Covariates(^TCov^),
       Fit(
       	NTanH(^NTH1^),
       	NLinear(^NL1^),
       	NGaussian(^NG1^),
       	NTanH2(^NTH2^),
       	NLinear2(^NL2^),
       	NGaussian2(^NG2^),
       	Transform Covariates(^TCov^),
       	Penalty Method(\!"PMethod\!"),
       	Number of Tours(^Ntours^),
       	N Boost(^Nboosts^),
       	Learning Rate(^LR^)
       ),
       Go,
              invisible
       )) &amp;lt;&amp;lt; Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt &amp;lt;&amp;lt; GetName || " - " || "Neural of " || y_resp );
	
	
	
	T_stats = w[Outline Box( o_box ), Number Col Box( 1 )] &amp;lt;&amp;lt; Get;
	V_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_v )] &amp;lt;&amp;lt; Get;
	Test_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_test )] &amp;lt;&amp;lt; Get;
	report &amp;lt;&amp;lt; Close Window;
	
	If( coltype == "Ordinal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
	
		dt_results:Generalized R² Test[i] = Test_stats[1];
		dt_results:Entropy R² Test[i] = Test_stats[2];
		dt_results:RMSE Test[i] = Test_stats[3];
		dt_results:Mean Abs Dev Test[i] = Test_stats[4];
		dt_results:Misclassification Rate Test[i] = Test_stats[5];
		dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
		dt_results:Sum Freq Test[i] = Test_stats[7];
	);
	
	If( coltype == "Nominal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
	
		dt_results:Generalized R² Test[i] = Test_stats[1];
		dt_results:Entropy R² Test[i] = Test_stats[2];
		dt_results:RMSE Test[i] = Test_stats[3];
		dt_results:Mean Abs Dev Test[i] = Test_stats[4];
		dt_results:Misclassification Rate Test[i] = Test_stats[5];
		dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
		dt_results:Sum Freq Test[i] = Test_stats[7];
	);
	
	If( coltype == "Continuous",
		dt_results:R² Training[i] = T_stats[1];
		dt_results:RMSE Training[i] = T_stats[2];
		dt_results:Mean Abs Dev Training[i] = T_stats[3];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[4];
		dt_results:SSE Training[i] = T_stats[5];
		dt_results:Sum Freq Training[i] = T_stats[6];
	
		dt_results:R² Validation[i] = V_stats[1];
		dt_results:RMSE Validation[i] = V_stats[2];
		dt_results:Mean Abs Dev Validation[i] = V_stats[3];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[4];
		dt_results:SSE Validation[i] = V_stats[5];
		dt_results:Sum Freq Validation[i] = V_stats[6];
	
		dt_results:R² Test[i] = Test_stats[1];
		dt_results:RMSE Test[i] = Test_stats[2];
		dt_results:Mean Abs Dev Test[i] = Test_stats[3];
		dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[4];
		dt_results:SSE Test[i] = Test_stats[5];
		dt_results:Sum Freq Test[i] = Test_stats[6];
	);
);&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;Let me know if there are any further issues.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks!,&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Thu, 03 Dec 2020 17:48:47 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/building-multiple-neural-network-models/m-p/339164#M58730</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2020-12-03T17:48:47Z</dc:date>
    </item>
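Once a results table like dt_results is filled, model selection is just a minimum (or sort) over the validation metrics. A generic Python sketch of that last step (the rows, parameter names, and values here are invented for illustration; in JMP you would simply sort the output data table by the validation column instead):

```python
# Pick the winning parameter setting from a filled-in results table:
# each row pairs the tuning parameters with the metrics the script
# harvested from the report window. min() over the validation RMSE
# selects the best-generalizing setting.
results = [
    {"N_TanH_1": 1, "N_Tours": 5,   "RMSE Validation": 0.42},
    {"N_TanH_1": 2, "N_Tours": 20,  "RMSE Validation": 0.31},
    {"N_TanH_1": 4, "N_Tours": 100, "RMSE Validation": 0.36},
]

best = min(results, key=lambda row: row["RMSE Validation"])
```

Selecting on the validation (not training) metric matters here: with ~500 candidate models and only 20 rows, the best training RMSE will almost certainly belong to an overfit model.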
  </channel>
</rss>

