abmayfield
Level VI

building multiple neural network models

I am using the neural platform to make predictions, and, being new to neural networks, I am only slightly familiar with the input parameters: number of hidden layers, number of nodes per hidden layer, boosting models, learning rate, and tours. What I want to do is minimize RMSE and the validation misclassification rate. What I've been doing is iteratively changing each parameter one at a time, saving the model performance statistics, and pasting them into a new JMP table, but this is going to take days since there are so many combinations of layers, nodes, tours, etc. Would it be possible to write a script so that JMP Pro builds, say, 1,000 models and dumps the results into a table, so that I don't have to change each model input parameter manually?

#Hidden layers: 1 or 2

#Sigmoidal nodes: 0, 1, 2, 3, or 4

#Linear nodes: 0, 1, 2, 3, or 4

#Radial nodes: 0, 1, 2, 3, or 4

#boosting models: no idea, 1-5 maybe?

#learning rate: 0-0.5 in 0.1 increments

#tours: 1, 5, 10, 20, 100

 

So that would be 2 x 5 x 5 x 5 x 5 x 6 x 5 = 37,500 (the learning rate at 0-0.5 in 0.1 increments gives six values, not five).

I guess I could whittle a few of these down once I am more familiar with the dataset. Of course, having many nodes and a high number of boosting models doesn't make sense (nor do certain other combinations), but we're still talking about potentially hundreds of models worth testing. Surely this sort of model screening/comparison could be scripted, right?
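For what it's worth, the grid itself is easy to script: build a full-factorial "tuning table" of parameter combinations first, then let a second script loop over its rows and fit one model per row. The sketch below only shows the table-building half, and the column names and parameter lists are placeholders, not from any particular data table:

```jsl
Names Default To Here( 1 );

//Hypothetical parameter lists -- trim these as the dataset becomes familiar.
layers = {1, 2};
tanh_nodes = {0, 1, 2, 3, 4};
tours = {1, 5, 10, 20, 100};

//Each row of this table defines one candidate NN model for a fitting script.
dt_tune = New Table( "NN Tuning Sketch",
	New Column( "N_Layers", Numeric ),
	New Column( "N_TanH", Numeric ),
	New Column( "N_Tours", Numeric )
);

//Nested loops generate the full factorial (Cartesian product) of the lists.
For( a = 1, a <= N Items( layers ), a++,
	For( b = 1, b <= N Items( tanh_nodes ), b++,
		For( c = 1, c <= N Items( tours ), c++,
			dt_tune << Add Rows( 1 );
			r = N Rows( dt_tune );
			dt_tune:N_Layers[r] = layers[a];
			dt_tune:N_TanH[r] = tanh_nodes[b];
			dt_tune:N_Tours[r] = tours[c];
		)
	)
);
```

Extending to the remaining factors (linear nodes, radial nodes, boosts, learning rate) is just more nested loops, or more columns driven the same way.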

Anderson B. Mayfield
60 REPLIES
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  I'm reposting this because my original post, which I tried to edit, seems to have gotten lost. Thanks for sharing the data table; it helped me fix the issues.

 

  Anyway, I made some changes, so a few things to note:

  1. I made it so a single script can handle Ordinal, Nominal, and Continuous modeling types; two separate versions are no longer needed.
  2. You will need to edit the variable y_resp, which is the Y response column by entering the column name between the quotes (see line 8). You no longer need to edit it in the call within Neural(). You'll still have to edit the list of X()'s, though.
  3. The code then tests the modeling type of y_resp column and sets some variables and generates the appropriate columns in the output data table.
  4. The output data table is now given a name depending on the validation type and host data table. This way you can run all the validation types and then compare without having weird naming issues from JMP.
  5. I edited each of the different validation type scripts you have in your data table, and they can easily be copy/pasted to new data tables. I tested this on the previous table you sent, and it worked just fine.
  6. I fixed the error my code was writing to the log each time it ran; I had to move a parenthesis to a new location.
  7. I found out that the reason your holdback and my k-fold weren't actually working correctly is that for those types you need to change Validation() to Validation Method(). I've done that for your different methods, and the scripts are fixed.
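For reference, the corrected call in point 7 looks like this in isolation (:y and :x1 are placeholder columns, not from your table):

```jsl
//Validation( "Kfold" ) is not the right argument for the Neural platform here;
//the working form is Validation Method( <method name>, <portion or folds> ).
dt << Neural(
	Y( :y ),
	X( :x1 ),
	Validation Method( "Kfold", 5 ),
	Fit( NTanH( 3 ) )
);
```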

  So, it's improving bit by bit, which is nice. I'd really prefer to have it test the response column type and validation method before it generates the extra columns, but I'm not sure how to do that yet. Maybe I'll add that in the next rollout!

 

Below is the code example for Kfold validation. I've also attached your new data table with the corrected scripts saved to it.

 

Names Default To Here( 1 );
dt = Current Data Table();
dt_parms = Data Table( "NN Tuning 8" );
dtp_name = dt_parms << get name;
dt_name = dt << get name;

//edit the line below so that it has the name of the column you want to model.
y_resp = "health designation";
response_col = Column( dt, Eval( Eval Expr( Expr( y_resp ) ) ) );

dt_results = dt_parms << Subset( Output Table( "Output of " || dtp_name || " with Kfold" || " for " || dt_name ), All rows, Selected Columns Only( 0 ) );

//This tests the response column type to automatically set the values for pulling the data out of the report window.
//It also will create the appropriate columns depending on the modeling type of the response column.
coltype = response_col << Get Modeling Type;
//Ordinal and Nominal responses use the same report layout, so one branch covers both.
If( coltype == "Ordinal" | coltype == "Nominal",
	o_box = 3;
	ncol_box_v = 10;
	ncol_box_test = 19;
	dt_results << New Column( "Generalized R² Training" );
	dt_results << New Column( "Generalized R² Validation" );
	//dt_results << New Column( "Generalized R² Test" );
	dt_results << New Column( "Entropy R² Training" );
	dt_results << New Column( "Entropy R² Validation" );
	//dt_results << New Column( "Entropy R² Test" );
	dt_results << New Column( "RMSE Training" );
	dt_results << New Column( "RMSE Validation" );
	//dt_results << New Column( "RMSE Test" );
	dt_results << New Column( "Mean Abs Dev Training" );
	dt_results << New Column( "Mean Abs Dev Validation" );
	//dt_results << New Column( "Mean Abs Dev Test" );
	dt_results << New Column( "Misclassification Rate Training" );
	dt_results << New Column( "Misclassification Rate Validation" );
	//dt_results << New Column( "Misclassification Rate Test" );
	dt_results << New Column( "-LogLiklihood Training" );
	dt_results << New Column( "-LogLiklihood Validation" );
	//dt_results << New Column( "-LogLiklihood Test" );
	dt_results << New Column( "Sum Freq Training" );
	dt_results << New Column( "Sum Freq Validation" );
	//dt_results << New Column( "Sum Freq Test" );
);
If( coltype == "Continuous",
	o_box = 3;
	ncol_box_v = 2;
	ncol_box_test = 3;
	dt_results << New Column( "R² Training" );
	dt_results << New Column( "R² Validation" );
	//dt_results << New Column( "R² Test" );
	dt_results << New Column( "RMSE Training" );
	dt_results << New Column( "RMSE Validation" );
	//dt_results << New Column( "RMSE Test" );
	dt_results << New Column( "Mean Abs Dev Training" );
	dt_results << New Column( "Mean Abs Dev Validation" );
	//dt_results << New Column( "Mean Abs Dev Test" );
	dt_results << New Column( "-LogLiklihood Training" );
	dt_results << New Column( "-LogLiklihood Validation" );
	//dt_results << New Column( "-LogLiklihood Test" );
	dt_results << New Column( "SSE Training" );
	dt_results << New Column( "SSE Validation" );
	//dt_results << New Column( "SSE Test" );
	dt_results << New Column( "Sum Freq Training" );
	dt_results << New Column( "Sum Freq Validation" );
	//dt_results << New Column( "Sum Freq Test" );
);


imax = N Row( dt_parms );

For( i = 1, i <= imax, i++,
	Nlayer = dt_parms:N_Layers[i];
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt << Neural(
        Y( response_col ),
		 X(
		 :OFA...22_c0_g2_i1.p1,
:OFA...47_c0_g1_i1.p1,
:OFA...73_c1_g1_i3.p1,
:OFA...72_c1_g1_i1.p1,
:OFA...5_c1_g2_i15.p1,
:OFA...78_c2_g1_i3.p1,
:OFA...51_c4_g1_i1.p1,
:OFA...38_c0_g1_i8.p1,
:OFA...19_c0_g1_i1.p1,
:OFA...22_c2_g1_i4.p1,
:OFA...27_c1_g3_i5.p1,
:OFA...82_c3_g2_i3.p1,
:OFA...11_c1_g1_i7.p1,
:OFA...50_c2_g4_i1.p3,
:OFA...18_c1_g1_i7.p1,
:OFA...66_c2_g1_i3.p1,
:OFA...85_c0_g1_i2.p1,
:OFA...99_c5_g1_i6.p1,
:OFA...03_c2_g1_i1.p1,
:SYM...24_c0_g1_i1.p1,
:SYM...75_c0_g2_i1.p1,
:SYM...42_c0_g1_i1.p1,
:SYM...04_c0_g1_i1.p1,
:SYM...97_c0_g1_i1.p1,
:SYM...13_c0_g1_i1.p1,
:SYM...66_c0_g1_i1.p1,
:SYM...72_c0_g6_i1.p1,
:SYM...13_c0_g1_i1.p1 2,
:SYM...33_c0_g1_i1.p1 3,
:SYM...43_c0_g1_i1.p1,
:SYM...51_c0_g1_i1.p1,
:SYM...65_c0_g1_i1.p1 2 
), 
       Validation Method ( \!"Kfold\!", 5 ),
       Informative Missing(0), 
       Transform Covariates(^TCov^),
       Fit(
       	NTanH(^NTH1^),
       	NLinear(^NL1^),
       	NGaussian(^NG1^),
       	NTanH2(^NTH2^),
       	NLinear2(^NL2^),
       	NGaussian2(^NG2^),
       	Transform Covariates(^TCov^),
       	Penalty Method( \!"^PMethod^\!" ), //^PMethod^ lets Eval Insert substitute the variable's value
       	Number of Tours(^Ntours^),
       	N Boost(^Nboosts^),
       	Learning Rate(^LR^)
       ),
       Go,
              invisible
       )) << Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt << GetName || " - " || "Neural of " || y_resp );
	
	
	
	T_stats = w[Outline Box( o_box ), Number Col Box( 1 )] << Get;
	V_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_v )] << Get;
	//Test_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_test )] << Get;
	report << Close Window;
	
	//Ordinal and Nominal reports share the same statistics, so one branch covers both.
	If( coltype == "Ordinal" | coltype == "Nominal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
	
		//dt_results:Generalized R² Test[i] = Test_stats[1];
		//dt_results:Entropy R² Test[i] = Test_stats[2];
		//dt_results:RMSE Test[i] = Test_stats[3];
		//dt_results:Mean Abs Dev Test[i] = Test_stats[4];
		//dt_results:Misclassification Rate Test[i] = Test_stats[5];
		//dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
		//dt_results:Sum Freq Test[i] = Test_stats[7];
	);
	
	If( coltype == "Continuous",
		dt_results:R² Training[i] = T_stats[1];
		dt_results:RMSE Training[i] = T_stats[2];
		dt_results:Mean Abs Dev Training[i] = T_stats[3];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[4];
		dt_results:SSE Training[i] = T_stats[5];
		dt_results:Sum Freq Training[i] = T_stats[6];
	
		dt_results:R² Validation[i] = V_stats[1];
		dt_results:RMSE Validation[i] = V_stats[2];
		dt_results:Mean Abs Dev Validation[i] = V_stats[3];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[4];
		dt_results:SSE Validation[i] = V_stats[5];
		dt_results:Sum Freq Validation[i] = V_stats[6];
	
		//dt_results:R² Test[i] = Test_stats[1];
		//dt_results:RMSE Test[i] = Test_stats[2];
		//dt_results:Mean Abs Dev Test[i] = Test_stats[3];
		//dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[4];
		//dt_results:SSE Test[i] = Test_stats[5]
		//dt_results:Sum Freq Test[i] = Test_stats[6];
	);
);

 

Let me know if you find any other issues and I'll try to fix them.

 

Hope this helps!

DS

abmayfield
Level VI

Re: building multiple neural network models

Hmmm, I downloaded your new table and now I get the outline box error every time. I am going to try a JMP reboot, then a computer reboot. It seems strange that it would work on your end but not on mine!

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  I hope the JMP and/or computer reset did the trick. If not, I also went back and built a GUI for the NN modeling -- I plan on expanding it to work across different platforms. It doesn't work perfectly, and I get some strange errors at times (an expert scripter could probably debug it within seconds). But the nice thing is that it works with any modeling type for the response column -- ordinal, nominal, and continuous -- and with the validation methods the Neural platform accepts: excluded rows, holdback, Kfold, and validation column. It works with validation columns both with and without Test data. It's the script called NN GUI in your data table (attached).

 

  For those interested, I borrowed heavily from a blog post by JMP staff @juliagong (retired) and a support note from @Wendy_Murphrey. I plan on creating a GUI that can automate this for the other platforms I have also automated from within data tables: boosted tree, bootstrap forest, and XGBoost. I plan on posting it in the community, but it could take some time -- especially to make sure it's fully debugged.

 

  The only change from the code below to the GUI script saved in the data table is changing dt = Data Table( "..." ) to dt = Current Data Table().
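In other words, the only difference is the first table reference:

```jsl
//As posted below (tied to one specific open table):
dt = Data Table( "Neural networking Ofav 2017 dataset" );

//As saved in the NN GUI script (runs on whichever table is active):
dt = Current Data Table();
```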

 

  I hope this helps to make progress on your work. It does for me as I can improve this to make my modeling efforts easier, too!

 

Good luck,

DS

 

NN GUI code:

Names Default To Here( 1 );
//Clear Symbols();

dt = Data Table( "Neural networking Ofav 2017 dataset" );
dt_parms = Data Table( "NN Tuning 8" );
nc = N Col( dt );

dtp_name = dt_parms << get name;
dt_name = dt << get name;

valid_list = {"Excluded Rows Holdback", "Holdback", "KFold", "Validation Column"};

//the width of the column list box in pixels
lbWidth = 110;
/* Expression to clear all current settings */
clearRoles = Expr(
	colListY << RemoveAll;
	colListV << RemoveAll;
);

/* Expression to store the current settings in global variables */
recallRoles = Expr(
	::ycolRecall = colListY << GetItems;
	::vcolRecall = colListV << GetItems;
);

VmethodDlg = New Window( "Choose Validation Type",
	<<Return Result,
	<<Modal,
	<<OnValidate,
	V List Box(
		Align( center ),
		H List Box( Spacer Box( Size( 10, 15 ) ), Text Box( "Automated Tuning" ), Spacer Box( Size( 10, 10 ) ) ), 
            
             
		Spacer Box( Size( 20, 10 ) ), 
             
		H List Box(
			Spacer Box( Size( 20, 10 ) ),
			V List Box(
				Panel Box( "Column Selection", colListData = Col List Box( All, nLines( Min( nc, 5 ) ) ) ),
				V List Box(
					Panel Box( "Validation Method", methodObj = Radio Box( valid_list ) ),
					Panel Box( "Validation Options",
						V List Box(
							Text Box( "Holdback Portion" ),
							HP_input = Number Edit Box( . ),
							Spacer Box( Size( 10, 5 ) ),
							Text Box( "Number of Folds" ),
							KF_input = Number Edit Box( . )
						)
					)
				)
			),
			V List Box(
				Panel Box( "Cast Selected Columns into Roles",
					Lineup Box( N Col( 2 ), Spacing( 3 ),
						Button Box( "Y, Response", colListY << Append( colListData << GetSelected ) ),
						colListY = Col List Box( width( lbWidth ), nLines( 1 ) ),
						Button Box( "Validation Column", colListV << Append( colListData << GetSelected ) ),
						colListV = Col List Box( width( lbWidth ), nLines( 1 ) )
							
					)
				), 

			),
			Panel Box( "Action",
				Lineup Box( N Col( 1 ), 
                    //OK Button
					Button Box( "OK",
						recallRoles;
                        //script that executes upon clicking "OK"
						If(
							(MethodObj << Get) == 1, 
					
								VMeth = "Excluded Rows Holdback";
								y_resp = ColListY << GetItems;
								VmethodDlg << CloseWindow;,
							(MethodObj << Get) == 2, 
					
								VMeth = "Holdback";
								y_resp = ColListY << GetItems;
								VmethodDlg << CloseWindow;,
							(MethodObj << Get) == 3, 
						
								VMeth = "Kfold";
								y_resp = ColListY << GetItems;
								VmethodDlg << CloseWindow;,
							(MethodObj << Get) == 4, 
							
								VMeth = "Validation Column";
								y_resp = ColListY << GetItems;
								ValidCol = ColListV << GetItems;
								VmethodDlg << CloseWindow;
						);
						VmethodDlg << CloseWindow;
					), 
                    //Cancel Button
					Button Box( "Cancel",
						VmethodDlg << CloseWindow;
						Break();
					),
					Button Box( "Reset", clearRoles ),
					Text Box( " " ), 
					//Remove Button
					Button Box( "Remove",
						colListY << RemoveSelected;
						colListV << RemoveSelected;
					),
					Button Box( "Recall",
						clearRoles;
							/* Restore any previous settings from the global variables */
						Try(
							colListY << Append( ::ycolRecall );
							colListV << Append( ::vcolRecall );
						);
					), 
					//Help not implemented here, buttons are shown in UI
					Button Box( "Help", notImplemented )
				), 

			)
		)
	)
);

If(
	VMeth == "Excluded Rows Holdback",
		HP = .;
		KF = .;,
	VMeth == "Holdback",
		HP = VmethodDlg["HP_input"];
		KF = .;,
	VMeth == "Kfold",
		HP = .;
		KF = VmethodDlg["KF_input"];,
	VMeth == "Validation Column",
		HP = .;
		KF = .;
		lvl = 0;
		//count Test rows (validation value 2) over the whole table
		For( r = 1, r <= N Rows( dt ), r++,
			If( Column( dt, ValidCol[1] )[r] == 2,
				lvl++
			)
		);
);

response_col = Column( dt, y_resp[1] ); //y_resp is a list here (from GetItems), so take its first item
coltype = response_col << Get Modeling Type;

dt_results = dt_parms << Subset(
	Output Table( "Output of " || dtp_name || " with " || Vmeth || " Validation" || " for " || dt_name ),
	All rows,
	Selected Columns Only( 0 )
);

If( VMeth == "Excluded Rows Holdback",
	VMeth_Insert = "Validation Method(\!"Excluded Rows Holdback\!")";
	If(
		coltype == "Ordinal",
			set = 1;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Nominal",
			set = 2;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Continuous",
			set = 3;
			o_box = 3;
			ncol_box_v = 2;
			ncol_box_test = 3;
			dt_results << New Column( "R² Training" );
			dt_results << New Column( "R² Validation" );
		//dt_results << New Column( "R² Test" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
		//dt_results << New Column( "RMSE Test" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
		//dt_results << New Column( "Mean Abs Dev Test" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
		//dt_results << New Column( "-LogLiklihood Test" );
			dt_results << New Column( "SSE Training" );
			dt_results << New Column( "SSE Validation" );
		//dt_results << New Column( "SSE Test" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
		//dt_results << New Column( "Sum Freq Test" );
		
	);
);

If( VMeth == "Holdback",
	VMeth_Insert = "Validation Method(\!"Holdback\!", " || Char( HP ) || ")";
	If(
		coltype == "Ordinal",
			set = 1;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Nominal",
			set = 2;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Continuous",
			set = 3;
			o_box = 3;
			ncol_box_v = 2;
			ncol_box_test = 3;
			dt_results << New Column( "R² Training" );
			dt_results << New Column( "R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "SSE Training" );
			dt_results << New Column( "SSE Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
	);
);

If( VMeth == "Kfold",
	VMeth_Insert = "Validation Method(\!"Kfold\!", " || Char( KF ) || ")";
	If(
		coltype == "Ordinal",
			set = 1;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Nominal",
			set = 2;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Continuous",
			set = 3;
			o_box = 3;
			ncol_box_v = 2;
			ncol_box_test = 3;
			dt_results << New Column( "R² Training" );
			dt_results << New Column( "R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "SSE Training" );
			dt_results << New Column( "SSE Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
	);
);

If( VMeth == "Validation Column" & lvl > 0,
	VMeth_Insert = "Validation(" || ":" || validCol[1] || ")";
	If(
		coltype == "Ordinal",
			set = 1;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Generalized R² Test" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "Entropy R² Test" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "RMSE Test" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Mean Abs Dev Test" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "Misclassification Rate Test" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "-LogLiklihood Test" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
			dt_results << New Column( "Sum Freq Test" );,
		coltype == "Nominal",
			set = 2;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Generalized R² Test" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "Entropy R² Test" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "RMSE Test" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Mean Abs Dev Test" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "Misclassification Rate Test" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "-LogLiklihood Test" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
			dt_results << New Column( "Sum Freq Test" );,
		coltype == "Continuous",
			set = 3;
			o_box = 3;
			ncol_box_v = 2;
			ncol_box_test = 3;
			dt_results << New Column( "R² Training" );
			dt_results << New Column( "R² Validation" );
			dt_results << New Column( "R² Test" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "RMSE Test" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Mean Abs Dev Test" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "-LogLiklihood Test" );
			dt_results << New Column( "SSE Training" );
			dt_results << New Column( "SSE Validation" );
			dt_results << New Column( "SSE Test" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
			dt_results << New Column( "Sum Freq Test" );
	);
);

If( VMeth == "Validation Column" & lvl == 0,
	VMeth_Insert = "Validation(" || ":" || validCol[1] || ")";
	If(
		coltype == "Ordinal",
			set = 1;
			o_box = 3;
			ncol_box_v = 10;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Nominal",
			set = 2;
			o_box = 3;
			ncol_box_v = 10;
			ncol_box_test = 19;
			dt_results << New Column( "Generalized R² Training" );
			dt_results << New Column( "Generalized R² Validation" );
			dt_results << New Column( "Entropy R² Training" );
			dt_results << New Column( "Entropy R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "Misclassification Rate Training" );
			dt_results << New Column( "Misclassification Rate Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );,
		coltype == "Continuous",
			set = 3;
			o_box = 3;
			ncol_box_v = 2;
			ncol_box_test = 3;
			dt_results << New Column( "R² Training" );
			dt_results << New Column( "R² Validation" );
			dt_results << New Column( "RMSE Training" );
			dt_results << New Column( "RMSE Validation" );
			dt_results << New Column( "Mean Abs Dev Training" );
			dt_results << New Column( "Mean Abs Dev Validation" );
			dt_results << New Column( "-LogLiklihood Training" );
			dt_results << New Column( "-LogLiklihood Validation" );
			dt_results << New Column( "SSE Training" );
			dt_results << New Column( "SSE Validation" );
			dt_results << New Column( "Sum Freq Training" );
			dt_results << New Column( "Sum Freq Validation" );
	);
);

imax = N Row( dt_parms );

For( i = 1, i <= imax, i++,
	// Read this run's layer count and enforce consistent settings:
	// a one-layer fit has no second-layer nodes, and boosting only
	// applies to single-layer fits.
	Nlayer = dt_results:N_Layers[i];
	If( Nlayer == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( Nlayer == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt << Neural(
        Y( response_col ),
		 X(
		 :OFA...22_c0_g2_i1.p1,
:OFA...47_c0_g1_i1.p1,
:OFA...73_c1_g1_i3.p1,
:OFA...72_c1_g1_i1.p1,
:OFA...5_c1_g2_i15.p1,
:OFA...78_c2_g1_i3.p1,
:OFA...51_c4_g1_i1.p1,
:OFA...38_c0_g1_i8.p1,
:OFA...19_c0_g1_i1.p1,
:OFA...22_c2_g1_i4.p1,
:OFA...27_c1_g3_i5.p1,
:OFA...82_c3_g2_i3.p1,
:OFA...11_c1_g1_i7.p1,
:OFA...50_c2_g4_i1.p3,
:OFA...18_c1_g1_i7.p1,
:OFA...66_c2_g1_i3.p1,
:OFA...85_c0_g1_i2.p1,
:OFA...99_c5_g1_i6.p1,
:OFA...03_c2_g1_i1.p1,
:SYM...24_c0_g1_i1.p1,
:SYM...75_c0_g2_i1.p1,
:SYM...42_c0_g1_i1.p1,
:SYM...04_c0_g1_i1.p1,
:SYM...97_c0_g1_i1.p1,
:SYM...13_c0_g1_i1.p1,
:SYM...66_c0_g1_i1.p1,
:SYM...72_c0_g6_i1.p1,
:SYM...13_c0_g1_i1.p1 2,
:SYM...33_c0_g1_i1.p1 3,
:SYM...43_c0_g1_i1.p1,
:SYM...51_c0_g1_i1.p1,
:SYM...65_c0_g1_i1.p1 2 
), 
       ^VMeth_insert^,
       Informative Missing(0), 
       Transform Covariates(^TCov^),
       Fit(
       	NTanH(^NTH1^),
       	NLinear(^NL1^),
       	NGaussian(^NG1^),
       	NTanH2(^NTH2^),
       	NLinear2(^NL2^),
       	NGaussian2(^NG2^),
       	Transform Covariates(^TCov^),
       	Penalty Method(\!"PMethod\!"),
       	Number of Tours(^Ntours^),
       	N Boost(^Nboosts^),
       	Learning Rate(^LR^)
       ),
       Go,
              invisible
       )) << Report;"
	);

	Eval( Parse( str ) );
	w = Window( dt_name || " - " || "Neural of " || y_resp[1] );
	
	T_stats = w[Outline Box( o_box ), Number Col Box( 1 )] << Get;
	V_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_v )] << Get;
	If( VMeth == "Validation Column" & lvl > 0,
		Test_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_test )] << Get
	);
	report << Close Window;
	
	If( coltype == "Ordinal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
		
		If( VMeth == "Validation Column" & lvl > 0,
			dt_results:Generalized R² Test[i] = Test_stats[1];
			dt_results:Entropy R² Test[i] = Test_stats[2];
			dt_results:RMSE Test[i] = Test_stats[3];
			dt_results:Mean Abs Dev Test[i] = Test_stats[4];
			dt_results:Misclassification Rate Test[i] = Test_stats[5];
			dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
			dt_results:Sum Freq Test[i] = Test_stats[7];
		);
	);
	
	If( coltype == "Nominal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
	
		If( VMeth == "Validation Column" & lvl > 0,
			dt_results:Generalized R² Test[i] = Test_stats[1];
			dt_results:Entropy R² Test[i] = Test_stats[2];
			dt_results:RMSE Test[i] = Test_stats[3];
			dt_results:Mean Abs Dev Test[i] = Test_stats[4];
			dt_results:Misclassification Rate Test[i] = Test_stats[5];
			dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
			dt_results:Sum Freq Test[i] = Test_stats[7];
		);
	);
	
	If( coltype == "Continuous",
		dt_results:R² Training[i] = T_stats[1];
		dt_results:RMSE Training[i] = T_stats[2];
		dt_results:Mean Abs Dev Training[i] = T_stats[3];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[4];
		dt_results:SSE Training[i] = T_stats[5];
		dt_results:Sum Freq Training[i] = T_stats[6];
	
		dt_results:R² Validation[i] = V_stats[1];
		dt_results:RMSE Validation[i] = V_stats[2];
		dt_results:Mean Abs Dev Validation[i] = V_stats[3];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[4];
		dt_results:SSE Validation[i] = V_stats[5];
		dt_results:Sum Freq Validation[i] = V_stats[6];
		
		If( VMeth == "Validation Column" & lvl > 0,
			dt_results:R² Test[i] = Test_stats[1];
			dt_results:RMSE Test[i] = Test_stats[2];
			dt_results:Mean Abs Dev Test[i] = Test_stats[3];
			dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[4];
			dt_results:SSE Test[i] = Test_stats[5];
			dt_results:Sum Freq Test[i] = Test_stats[6];
		);
	);
);
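
The tuning table this loop iterates over (dt_parms) can itself be generated in JSL rather than filled in by hand, which is the quickest way to enumerate the thousands of parameter combinations discussed above. A hedged sketch, not part of the original script: it assumes the column names the loop reads (N_Layers, N_TanH_1, N_Boosts, Learn_Rate, N_Tours) and uses illustrative ranges with only a subset of the columns shown.

```jsl
// Sketch only: build a factorial grid of NN hyperparameters as a tuning table.
// Column names match those the loop reads; ranges here are illustrative --
// widen or trim them to keep the total run count manageable.
dt_parms = New Table( "NN Tuning",
	New Column( "N_Layers", Numeric ),
	New Column( "N_TanH_1", Numeric ),
	New Column( "N_Boosts", Numeric ),
	New Column( "Learn_Rate", Numeric ),
	New Column( "N_Tours", Numeric )
);
For( nl = 1, nl <= 2, nl++,
	For( nt = 1, nt <= 4, nt++,
		For( nb = 0, nb <= 5, nb++,
			For( li = 1, li <= 5, li++,  // integer loop avoids 0.1-step float drift
				dt_parms << Add Rows( 1 );
				r = N Rows( dt_parms );
				dt_parms:N_Layers[r] = nl;
				dt_parms:N_TanH_1[r] = nt;
				dt_parms:N_Boosts[r] = nb;
				dt_parms:Learn_Rate[r] = li / 10;
				dt_parms:N_Tours[r] = 10; // fixed here; loop over it too if desired
			)
		)
	)
);
```

The same pattern extends to the remaining columns (N_TanH_2, N_Linear_1, Penalty_Method, etc.) by adding more nested loops, at the cost of a multiplicatively larger table.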
abmayfield
Level VI

Re: building multiple neural network models

Wow! Now this is truly great. The only slight glitch is that, if the script window is directly on top of the "tuning" table, it "calls" the tuning table in the model window instead of the data table. As long as the data table and script are at the forefront of your screen, it loads the model terms correctly. In fact, I just ran 100 models in a few minutes. I wonder if you could put this on the "add-in" depot (I forget its actual name) on the JMP website so that it gets more widespread attention. I'm certain this would help MANY people (if not immediately, then in the future as more see the utility of NNs in model building). Hell, it should probably make its way in some form into JMP Pro 17 (might be too late for 16). Thanks so much for all your help, and I hope this helps your own research as well.

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  Thanks for your feedback and kind words. Yeah, I had that problem too, about it pulling up the wrong data table if it's not "on top". I'm not sure why it does that at all. There's also something funky about how it responds when I call certain things in the new window for passing on to the fit portion later. The "cancel" button is not working as I intend, and the "help" button doesn't go anywhere right now, but it's a start! I think a scripting expert could easily fix those bugs and have it work as intended. Perhaps once it runs on more platforms and those bugs are gone, I could post it to the add-in page. It would be an honor to have it incorporated into JMP.
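
One way to make a script immune to which window happens to be on top (a sketch, not tested against the GUI code; the table names are hypothetical placeholders) is to capture explicit table references once at startup instead of relying on Current Data Table():

```jsl
// Grab explicit handles once; every later message then goes to the right
// table regardless of which window is frontmost.
dt = Data Table( "my data" );               // hypothetical names --
dt_parms = Data Table( "my tuning table" ); // substitute your own
// e.g., later: report = (dt << Neural( ... )) << Report;
//              Nlayer = dt_parms:N_Layers[i];
```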

 

  It does help me with my work as it'll make the process of tuning my models even more automated and simplified. I had wanted to do something for the NN platform, but was putting it off until your post, and now that I've made the GUI, I'm wanting to expand on that to make it more cross-functional. The ability to tune a large set of models and focus on just the stats really helps to narrow down the hyperparameter space so that it's easier to compare models across different platforms and see which is the best at predicting new data. Then, you can go back and run the optimal settings in the standard platform and get all the rich graphical feedback JMP provides to visualize the model and all the other stuff.

 

Good luck!

DS

abmayfield
Level VI

Re: building multiple neural network models

I think part of this may have to do with the fact that, for many modeling types, building ever-more complex models is NOT what you want to do: if a simple model fits your data well, and the validation or test RMSE values are low, then just stop there. But for NN, adding layers, nodes, boosts, tours, etc., is not necessarily a BAD thing because, although more complex, such models are also more flexible. So I can kind of see why having this GUI for ALL modeling platforms might lead those unfamiliar with modeling to data mine exhaustively, to their detriment. From my limited understanding of NN, though, if one model has a respectable validation RMSE under the default conditions (TanH=3), and another with 3 TanH, 3 linear, and 3 Gaussian nodes across two hidden layers has a similar RMSE, you wouldn't automatically reject the latter model on account of its being more complex yet equally adequate in terms of RMSE, because it might end up being MORE flexible with future test datasets. Therefore, it could very well be that most people WOULD do well to build potentially hundreds to thousands of NN models. It is entirely possible, though, that I am wrong and NN is no different from other modeling types, in which case, all else being equal, simpler is better. I'm just spit-balling as to why you don't see more people trying to optimize NN input parameters: is it out of ignorance or fear of data mining?!

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  I definitely agree with you about not having overly complex models. That's absolutely an issue that should be avoided.

  My take on things is that NN modeling is not too dissimilar from other platforms. Each one has its benefits and drawbacks. For example, NNs are not that great at handling missing data, they're not very computationally scalable, and they're notoriously difficult to interpret. For people using such models to help build theories, that interpretability is important. NNs also can't handle all types of data sets or irrelevant inputs. But there are no assumptions going into the model, and they tend to have high predictive ability.

 

  But, since each modeling method has a different set of benefits/drawbacks, I tend to model across different platforms and then compare the best of the models from each platform against each other on a new data set to see which perform the best at predicting the new data set and use that one.

 

  That being said, I just finished Ver 1.0 of a GUI interface that can be run from a JMP menu hot button (not just from within the data table). You feed it your data table and tuning table, and depending on which of the four methods you choose (NN, boosted tree, bootstrap forest, XGBoost), it will run the number of fits you send it based on the tuning table. It'll work across the different data types for the Y input and across different validation column types, even multiple validation columns for the XGBoost platform. It's by no means bug-free, but I'm pretty excited I got it to work within a week or so of posting the first GUI for the NN platform. It does require the user to know which tuning table to use for each platform, along with other platform-specific requirements -- like knowing that the boosted tree platform in JMP doesn't work with more than 2-level nominal/ordinal data (at least in JMP 15), or that the XGBoost platform can use multiple validation columns, but each must have only 2 levels.

 

  It was fun working on this project and will help me in future efforts. Thanks!

 

Good luck,

DS

abmayfield
Level VI

Re: building multiple neural network models

That is great! Let me know when you post it. I will definitely try it out, especially since I'm still in the early stages of figuring out this dataset (though NN was the clear winner over all other model types I tested in JMP Pro 16). I do see the value of not going DIRECTLY to NN for the reasons you stated AND because it's not great for predictor (variable) reduction. In my case, I would need to repeatedly measure the same suite of 30-40 proteins in every sample (with no missing data, as you pointed out), which will drive up costs versus a more traditional candidate-biomarker approach (with one or just a few predictors). So I basically make simple, stripped-down models with just a few biomarkers (typically using Gen-Reg) and compare them to the optimized NN model from your script. In THIS instance, NN is superior, but that's not to say it ALWAYS will be for every future dataset I collect! Looking forward to trying out the GUI, bugs and all.

Anderson B. Mayfield
abmayfield
Level VI

Re: building multiple neural network models

Hello!

   I am back at this NN work after a 6-month hiatus, and I noticed that when I try to run your NN GUI, I get a new error message: "Send expects scriptable object in access or evaluation of 'list', {/*###*/"Validation"}". Do you think this might be because of some change between JMP 15 and 16 (i.e., do certain JMP 15 scripts need to be modified for JMP 16)?

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

No, I'm guessing it's because you selected the validation-column option with the radio buttons (vs. holdback, holdback excluded, etc.) but forgot to include the validation column in the list of columns when you cast them into their roles. That's my guess, but if it's not that, then I might need a step-by-step description from you.
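
A small guard at the top of the fitting loop could turn that cryptic message into a plain one. This is a sketch, assuming (as in the script earlier in the thread) that dt is the data table handle and validCol holds the chosen validation column name:

```jsl
// Fail early, with a readable message, if the validation column was never cast.
If( !Contains( dt << Get Column Names( "String" ), validCol[1] ),
	Throw( "Validation column '" || validCol[1] ||
		"' was not included in the column list." )
);
```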

 

  I have also updated the code somewhat to correct for a few other mistakes that were being made. My code includes the progress window, which I think you didn't want, but anyway, I'm attaching the new GenAt.jsl file so you can have the source code. I use it all the time to assist with my modeling -- NN, boosted tree, random forest, and XGBoost. I like it, but it does require some knowledge/understanding of what you're doing from a modeling perspective and what the different platforms allow you to do.

 

  I don't know if this link will work, but I tried writing up a description on how to use it for the Add-ins repository. I know I can get to that link through my SAS account, just not sure if others can. It's not yet made public on the repository, though.

 

Hope this helps!

DS