abmayfield
Level VI

building multiple neural network models

I am using the Neural platform to make predictions, and, being new to neural networks, I am only slightly familiar with all the input parameters: number of hidden layers, number of nodes per hidden layer, boosting models, learning rate, and tours. What I want to do is minimize RMSE and the validation misclassification rate. So far I have been iteratively changing each parameter one at a time, saving the model performance statistics, and pasting them into a new JMP table, but this is going to take days given how many combinations of layers, nodes, tours, etc. there are. Would it be possible to write a script so that JMP Pro builds, say, 1,000 models and dumps the results into a table, so that I don't have to change each model input parameter by hand?

#Hidden layers: 1 or 2

#Sigmoidal nodes: 0, 1, 2, 3, or 4

#Linear nodes: 0, 1, 2, 3, or 4

#Radial nodes: 0, 1, 2, 3, or 4

#boosting models: no idea, 1-5 maybe?

#learning rate: 0.1-0.5 in 0.1 increments

#tours: 1, 5, 10, 20, 100

 

So that would be 2 × 5 × 5 × 5 × 5 × 5 × 5 = 31,250 combinations.

I guess I could whittle a few of these down once I am more familiar with the dataset. Of course, some combinations don't make sense (for instance, many nodes together with a high number of boosting models), but we're still talking about potentially hundreds of models worth testing. Surely this sort of model screening/comparison could be scripted, right? I imagine the grid itself could be generated with something like the sketch below.
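Here is a rough sketch of what I have in mind (the table and column names are just my guesses; I haven't tested whether these are the names an actual tuning script would use):

Names Default To Here( 1 );
// Rough sketch (hypothetical table/column names): enumerate the full factorial grid
tours_list = {1, 5, 10, 20, 100};
dt_parms = New Table( "NN Tuning",
	New Column( "N_Layers" ), New Column( "N_TanH_1" ), New Column( "N_Linear_1" ),
	New Column( "N_Gauss_1" ), New Column( "N_Boosts" ), New Column( "Learn_Rate" ),
	New Column( "N_Tours" )
);
For( lay = 1, lay <= 2, lay++,
	For( nth = 0, nth <= 4, nth++,
		For( nl = 0, nl <= 4, nl++,
			For( ng = 0, ng <= 4, ng++,
				For( nb = 1, nb <= 5, nb++,
					For( lr = 1, lr <= 5, lr++,
						For( t = 1, t <= 5, t++,
							dt_parms << Add Rows( 1 );
							r = N Rows( dt_parms );
							dt_parms:N_Layers[r] = lay;
							dt_parms:N_TanH_1[r] = nth;
							dt_parms:N_Linear_1[r] = nl;
							dt_parms:N_Gauss_1[r] = ng;
							dt_parms:N_Boosts[r] = nb;
							dt_parms:Learn_Rate[r] = lr * 0.1;
							dt_parms:N_Tours[r] = tours_list[t];
						)
					)
				)
			)
		)
	)
);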

Anderson B. Mayfield
abmayfield
Level VI

Re: building multiple neural network models

More excellent suggestions, especially regarding the type of validation. I see that Neural supports holdback, KFold, or row exclusion (whereby I'm guessing I'd just randomly choose a few rows to leave out). If I wanted to change the script's validation type, would I just change the "Validation ( :Validation with test )," line? And would the held-back samples be treated by JMP as validation samples? If not, I'd need to remove (or rename) the validation columns, and the test columns would have to go as well.

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  KFold is another great way to validate a small data set. Yes, for row exclusion you would set a row state, which you could do by selecting the "test" rows and setting their row state to "excluded". If you use KFold or Holdback, JMP uses those rows for validation of the data, not for testing. Hopefully you also have a separate data set you can use as the test data, kept out of the table holding the training and validation data.
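  To illustrate the row-state idea, here is a minimal sketch (the 20% fraction and the random selection are arbitrary choices on my part):

Names Default To Here( 1 );
dt = Current Data Table();
// Randomly exclude ~20% of rows; excluded rows are skipped by the Neural fit
For( i = 1, i <= N Rows( dt ), i++,
	If( Random Uniform() < 0.2,
		Excluded( Row State( i ) ) = 1  // same effect as Rows > Exclude/Unexclude
	)
);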

 

  I would just use KFold for validation and comment out the parts that you don't need. Below is the edited version with a 5-fold (KFold) validation setting for the NN platform. It should work; it did for me.

 

Names Default To Here( 1 );
dt = Current Data Table();
dt_parms = Data Table( "NN Tuning" );
dt_results = dt_parms << Subset( All rows, Selected Columns Only( 0 ) );
dt_results << New Column( "Generalized R² Training" );
dt_results << New Column( "Generalized R² Validation" );
//dt_results << New Column( "Generalized R² Test" );
dt_results << New Column( "Entropy R² Training" );
dt_results << New Column( "Entropy R² Validation" );
//dt_results << New Column( "Entropy R² Test" );
dt_results << New Column( "RMSE Training" );
dt_results << New Column( "RMSE Validation" );
//dt_results << New Column( "RMSE Test" );
dt_results << New Column( "Mean Abs Dev Training" );
dt_results << New Column( "Mean Abs Dev Validation" );
//dt_results << New Column( "Mean Abs Dev Test" );
dt_results << New Column( "Misclassification Rate Training" );
dt_results << New Column( "Misclassification Rate Validation" );
//dt_results << New Column( "Misclassification Rate Test" );
dt_results << New Column( "-LogLiklihood Training" );
dt_results << New Column( "-LogLiklihood Validation" );
//dt_results << New Column( "-LogLiklihood Test" );
dt_results << New Column( "Sum Freq Training" );
dt_results << New Column( "Sum Freq Validation" );
//dt_results << New Column( "Sum Freq Test" );

imax = N Row( dt_parms );

For( i = 1, i <= imax, i++,
	// dt_results is a full-row subset of dt_parms, so the settings can be read from it directly
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	// Note: for KFold/Holdback the launch option is Validation Method(), not Validation()
	str = Eval Insert(
		"report = (dt << Neural(
			Y( :health designation ),
			X( :OFA...22_c0_g2_i1.p1 ),
			Validation Method( \!"KFold\!", 5 ),
			Informative Missing( 0 ),
			Transform Covariates( ^TCov^ ),
			Fit(
				NTanH( ^NTH1^ ),
				NLinear( ^NL1^ ),
				NGaussian( ^NG1^ ),
				NTanH2( ^NTH2^ ),
				NLinear2( ^NL2^ ),
				NGaussian2( ^NG2^ ),
				Transform Covariates( ^TCov^ ),
				Penalty Method( \!"^PMethod^\!" ),
				Number of Tours( ^Ntours^ ),
				N Boost( ^Nboosts^ ),
				Learning Rate( ^LR^ )
			),
			Go,
			invisible
		)) << Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt << GetName || " - " || "Neural of health designation by OFA...22_c0_g2_i1.p1" );
	If( !Is Missing( Regex( w << Get Window Title, "health designation by OFA...22_c0_g2_i1.p1" ) ),
		ncol_box_v = 10;
		//ncol_box_test = 19;
	);
	T_stats = w[Outline Box( 3 ), Number Col Box( 1 )] << Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	V_stats = w[Outline Box( 3 ), Number Col Box( ncol_box_v )] << Get;//You might need to adjust the Outline Box () and Number Col Box() values depending on your tree structure
	//Test_stats = w[Outline Box( 3 ), Number Col Box( ncol_box_test )] << Get;
	report << Close Window;
	
	dt_results:Generalized R² Training[i] = T_stats[1];
	dt_results:Entropy R² Training[i] = T_stats[2];
	dt_results:RMSE Training[i] = T_stats[3];
	dt_results:Mean Abs Dev Training[i] = T_stats[4];
	dt_results:Misclassification Rate Training[i] = T_stats[5];
	dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
	dt_results:Sum Freq Training[i] = T_stats[7];
	
	dt_results:Generalized R² Validation[i] = V_stats[1];
	dt_results:Entropy R² Validation[i] = V_stats[2];
	dt_results:RMSE Validation[i] = V_stats[3];
	dt_results:Mean Abs Dev Validation[i] = V_stats[4];
	dt_results:Misclassification Rate Validation[i] = V_stats[5];
	dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
	dt_results:Sum Freq Validation[i] = V_stats[7];
	
	//dt_results:Generalized R² Test[i] = Test_stats[1];
	//dt_results:Entropy R² Test[i] = Test_stats[2];
	//dt_results:RMSE Test[i] = Test_stats[3];
	//dt_results:Mean Abs Dev Test[i] = Test_stats[4];
	//dt_results:Misclassification Rate Test[i] = Test_stats[5];
	//dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
	//dt_results:Sum Freq Test[i] = Test_stats[7];
);
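One more tip: if the Outline Box() / Number Col Box() indices flagged in the comments above don't line up with your report layout, you can dump the report window's display tree and count the boxes yourself, e.g.:

// Print the display-box tree of the open Neural report window
w << Show Tree Structure;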

 

Good luck!

DS

abmayfield
Level VI

Re: building multiple neural network models

Wow, I can't believe I've run 300 neural network models before lunch, thanks to your scripting help. I hope this gets picked up and "kudo'd": as more and more people see the value of the NN platform, I imagine they will gravitate toward something like this, especially with a brand-new dataset where you don't really know where to "dive in" among the bewildering number of NN model input parameters, each of which can be slightly tweaked (sometimes with huge differences resulting from, for instance, just increasing the learning rate from 0.05 to 0.1). I will play around with the different validation approaches next (validation column vs. holdback vs. KFold). I definitely need to be careful given my (currently) small sample size; I'm hoping to use these preliminary data to get funds to add more "rows", i.e., corals.

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Glad to hear it's working out for you. Good luck with the modeling! You might want to check other model platforms as well and then compare models on a test data set to see which is the best predictor.

DS
abmayfield
Level VI

Re: building multiple neural network models

I have a (likely simple) scripting question. Say I want to change the input dt to another table. Shouldn't I just be able to copy and paste the script and substitute something else for "Neural" as the dt?

 

 

str = Eval Insert(
"report = (dt << Neural (
        Y( :health designation ),
        X(

 

 

If I delete the original "Neural" file and substitute a table with exactly the same information but a different name, the script won't work! Is it necessary to have the same data in two data tables, one of which is named "Neural"? Maybe I missed that part!

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  If you go back and review the script, the second line defines the data table to be used in the analysis. The JSL code dt = Current Data Table() defines a local variable dt that is then used throughout the script instead of calling the data table by name. Similarly, dt_parms is the local variable assigned to the tuning table, "NN Tuning". Those two tables are really the only things you feed to the script; the rest is all internal.

 

  If you copy/paste the JSL code into a new data table it should work automatically because you're running it locally in that data table.

 

  You should not modify the call dt << Neural(), except to define your Y()'s and X()'s, and maybe the Validation() method. The JSL code dt << Neural() is a command telling JMP to perform NN modeling on the data table dt; hence the definition of dt at the beginning of the code.

 

  You will also need to modify the w = Window() call to reflect the title of the NN report window, as well as the Regex() below it (see the example just below). Finally, depending on whether you will actually have the "test" data set as part of a validation column, you'll either want to comment out (with // at the start of the line) or un-comment the lines near the beginning of the code that create the Test result columns, as well as the Test_stats definition and the lines that write the Test results to the output data table.
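  For example, the window call might end up looking something like this (the title string here is hypothetical; use whatever your NN report window is actually named):

// Hypothetical: the string must match your actual NN report window title
w = Window( dt << Get Name || " - " || "Neural of my response column" );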

 

  Other than that, you shouldn't have to edit much at all.

 

  With regard to your last question, it depends on how you're implementing the code. If you're running it from within the data table, i.e., it's saved as a table script and run via the green run button, then you would only need to change the Y()'s and X()'s to match the column names of the new data table; you will of course need to have the "NN Tuning" table open. On the other hand, if you're running this as a separate script, then you might want to change the definition of dt from dt = Current Data Table() to dt = Data Table( " " ), where you'd insert the name of the new data table between the quotes.
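  In code form (the second table name is just a placeholder):

dt = Current Data Table();          // runs on whichever table hosts/launches the script
// or pin the script to one specific table:
dt = Data Table( "my coral data" ); // hypothetical name; must match an open table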

 

 The "Neural" you're referring to is not a data table but the JMP command to run a Neural Net model on a specified data table. The "<<" command means to perform the action which is to the right of the << on the object to the left. Near the start of the code you can see that with the new column command. Where it says dt_results << New Column(), the code is telling JMP to create a new column for the object dt_results, which is the subset data generated from the dt_parms data table. 

 

Hope this helps!

DS

abmayfield
Level VI

Re: building multiple neural network models

OK, great. This answers a few additional questions I had. Now the only thing I'm struggling with is that when I copy, paste, and run the script in a new data table, I get an outline box error, which you allude to in the script comments: "expected character arg 1 in access or evaluation of outline box". What does that mean? I've increased and decreased the numbers, but I get the same error message every time, and the script will not run. I don't even know what an outline box is. It's weird, because the tree structure is identical; I just changed the file name (the new name is longer). Maybe that means I need a larger outline box?
Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  The outline box error probably comes from the part that grabs the statistic values from the NN report window. Can you share the new data table and code with me? I can troubleshoot it and see if I can make it easier to port from one data table to another.

 

  I think it might have to do with how I've set up the part that checks the data column type and assigns the ncol_box variable; its value is different for nominal, ordinal, and continuous columns.

 

  If you can share the data table and code that you're getting this error with, I'm sure I can fix it. If you want, you could anonymize the data table: Tables > Anonymize.

 

DS

abmayfield
Level VI

Re: building multiple neural network models

It's weird. If I just keep running the script, it will eventually work!

 

Another strange thing: no matter what I change the holdback percentage to, it always gives me 11 training vs. 9 validation rows (~55/45%). Is it not possible to do, say, 70/30? I have a strong feeling this is because I entered the command incorrectly, rather than it being impossible!

Anderson B. Mayfield
SDF1
Super User

Re: building multiple neural network models

Hi @abmayfield ,

 

  Thanks for sharing the data table. I was able to find a few things.

  1. I found the error that kept writing to the log window and fixed it. It's in the str call; I had to move a parenthesis to a different location.
  2. I found out that when doing KFold or Holdback, you need to change Validation() to Validation Method(). Then it works and proportions the data appropriately (see the sketch just below this list).
  3. I recoded how it tests for the modeling type of the column and made some associated changes.
  4. I also wrote it so the same code can be used for Nominal, Ordinal, and Continuous. This can be done simply by changing the variable y_resp -- the Y response column name. It does all the rest automatically.
  5. I also added a title to the results data table that changes with the "host" data table and the validation type so you can run all three methods and there will be no naming interference from JMP.
  6. The three NN fitting scripts in your data table can now be copy-pasted from one data table to the next without issues. I tested it with the first data table you sent, and it worked just fine.
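  Regarding point 2, the relevant change inside the Neural() launch looks like this (a minimal sketch reusing column names from earlier in the thread; the 0.3 holdback fraction is arbitrary):

// Note Validation Method(), not Validation(), for KFold/Holdback
report = (Current Data Table() << Neural(
	Y( :health designation ),
	X( :OFA...22_c0_g2_i1.p1 ),
	Validation Method( "Holdback", 0.3 ),  // or: Validation Method( "KFold", 5 )
	Go,
	invisible
)) << Report;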

  It's getting a little more automated now, and is a bit improved. I would prefer to have it check the validation method and then generate the extra columns and assign values accordingly. Maybe in the next roll-out!

 

Here's what the code looks like in general. Notice that I was testing on the temperature column, to exercise the continuous modeling type.

Names Default To Here( 1 );
dt = Current Data Table();
dt_parms = Data Table( "NN Tuning 8" );
dtp_name = dt_parms << get name;
dt_name = dt << get name;

//edit the line below so that it has the name of the column you want to model.
y_resp = "temperature (1, 2, 3)";
response_col = Column( dt, y_resp ); // look up the response column by name

dt_results = dt_parms << Subset( Output Table( "Output of " || dtp_name || " with (validation type)" || " for " || dt_name), All rows, Selected Columns Only( 0 ) );

//This tests the response column type to automatically set the values for pulling the data out of the report window.
//It also will create the appropriate columns depending on the modeling type of the response column.
coltype = response_col << Get Modeling Type;
If( coltype == "Ordinal",
	o_box = 3;
	ncol_box_v = 10;
	ncol_box_test = 19;
	dt_results << New Column( "Generalized R² Training" );
	dt_results << New Column( "Generalized R² Validation" );
	dt_results << New Column( "Generalized R² Test" );
	dt_results << New Column( "Entropy R² Training" );
	dt_results << New Column( "Entropy R² Validation" );
	dt_results << New Column( "Entropy R² Test" );
	dt_results << New Column( "RMSE Training" );
	dt_results << New Column( "RMSE Validation" );
	dt_results << New Column( "RMSE Test" );
	dt_results << New Column( "Mean Abs Dev Training" );
	dt_results << New Column( "Mean Abs Dev Validation" );
	dt_results << New Column( "Mean Abs Dev Test" );
	dt_results << New Column( "Misclassification Rate Training" );
	dt_results << New Column( "Misclassification Rate Validation" );
	dt_results << New Column( "Misclassification Rate Test" );
	dt_results << New Column( "-LogLiklihood Training" );
	dt_results << New Column( "-LogLiklihood Validation" );
	dt_results << New Column( "-LogLiklihood Test" );
	dt_results << New Column( "Sum Freq Training" );
	dt_results << New Column( "Sum Freq Validation" );
	dt_results << New Column( "Sum Freq Test" );
);
If( coltype == "Nominal",
	o_box = 3;
	ncol_box_v = 10;
	ncol_box_test = 19;
	dt_results << New Column( "Generalized R² Training" );
	dt_results << New Column( "Generalized R² Validation" );
	dt_results << New Column( "Generalized R² Test" );
	dt_results << New Column( "Entropy R² Training" );
	dt_results << New Column( "Entropy R² Validation" );
	dt_results << New Column( "Entropy R² Test" );
	dt_results << New Column( "RMSE Training" );
	dt_results << New Column( "RMSE Validation" );
	dt_results << New Column( "RMSE Test" );
	dt_results << New Column( "Mean Abs Dev Training" );
	dt_results << New Column( "Mean Abs Dev Validation" );
	dt_results << New Column( "Mean Abs Dev Test" );
	dt_results << New Column( "Misclassification Rate Training" );
	dt_results << New Column( "Misclassification Rate Validation" );
	dt_results << New Column( "Misclassification Rate Test" );
	dt_results << New Column( "-LogLiklihood Training" );
	dt_results << New Column( "-LogLiklihood Validation" );
	dt_results << New Column( "-LogLiklihood Test" );
	dt_results << New Column( "Sum Freq Training" );
	dt_results << New Column( "Sum Freq Validation" );
	dt_results << New Column( "Sum Freq Test" );
);
If( coltype == "Continuous",
	o_box = 3;
	ncol_box_v = 2;
	ncol_box_test = 3;
	dt_results << New Column( "R² Training" );
	dt_results << New Column( "R² Validation" );
	dt_results << New Column( "R² Test" );
	dt_results << New Column( "RMSE Training" );
	dt_results << New Column( "RMSE Validation" );
	dt_results << New Column( "RMSE Test" );
	dt_results << New Column( "Mean Abs Dev Training" );
	dt_results << New Column( "Mean Abs Dev Validation" );
	dt_results << New Column( "Mean Abs Dev Test" );
	dt_results << New Column( "-LogLiklihood Training" );
	dt_results << New Column( "-LogLiklihood Validation" );
	dt_results << New Column( "-LogLiklihood Test" );
	dt_results << New Column( "SSE Training" );
	dt_results << New Column( "SSE Validation" );
	dt_results << New Column( "SSE Test" );
	dt_results << New Column( "Sum Freq Training" );
	dt_results << New Column( "Sum Freq Validation" );
	dt_results << New Column( "Sum Freq Test" );
);


imax = N Row( dt_parms );

For( i = 1, i <= imax, i++,
	// dt_results is a full-row subset of dt_parms, so the settings can be read from it directly
	If( dt_results:N_Layers[i] == 1,
		dt_results:N_TanH_2[i] = 0;
		dt_results:N_Linear_2[i] = 0;
		dt_results:N_Gauss_2[i] = 0;
	);
	If( dt_results:N_Layers[i] == 2,
		dt_results:N_Boosts[i] = 0;
		dt_results:Learn_Rate[i] = 0.1;
	);
	NTH1 = dt_results:N_TanH_1[i];
	NTH2 = dt_results:N_TanH_2[i];
	NL1 = dt_results:N_Linear_1[i];
	NL2 = dt_results:N_Linear_2[i];
	NG1 = dt_results:N_Gauss_1[i];
	NG2 = dt_results:N_Gauss_2[i];
	Nboosts = dt_results:N_Boosts[i];
	LR = dt_results:Learn_Rate[i];
	TCov = dt_results:T_Cov[i];
	RFit = dt_results:Robust_Fit[i];
	PMethod = dt_results:Penalty_Method[i];
	Ntours = dt_results:N_Tours[i];
	
	str = Eval Insert(
		"report = (dt << Neural(
			Y( response_col ),
			X(
				:OFA...22_c0_g2_i1.p1,
				:OFA...47_c0_g1_i1.p1,
				:OFA...73_c1_g1_i3.p1,
				:OFA...72_c1_g1_i1.p1,
				:OFA...5_c1_g2_i15.p1,
				:OFA...78_c2_g1_i3.p1,
				:OFA...51_c4_g1_i1.p1,
				:OFA...38_c0_g1_i8.p1,
				:OFA...19_c0_g1_i1.p1,
				:OFA...22_c2_g1_i4.p1,
				:OFA...27_c1_g3_i5.p1,
				:OFA...82_c3_g2_i3.p1,
				:OFA...11_c1_g1_i7.p1,
				:OFA...50_c2_g4_i1.p3,
				:OFA...18_c1_g1_i7.p1,
				:OFA...66_c2_g1_i3.p1,
				:OFA...85_c0_g1_i2.p1,
				:OFA...99_c5_g1_i6.p1,
				:OFA...03_c2_g1_i1.p1,
				:SYM...24_c0_g1_i1.p1,
				:SYM...75_c0_g2_i1.p1,
				:SYM...42_c0_g1_i1.p1,
				:SYM...04_c0_g1_i1.p1,
				:SYM...97_c0_g1_i1.p1,
				:SYM...13_c0_g1_i1.p1,
				:SYM...66_c0_g1_i1.p1,
				:SYM...72_c0_g6_i1.p1,
				:SYM...13_c0_g1_i1.p1 2,
				:SYM...33_c0_g1_i1.p1 3,
				:SYM...43_c0_g1_i1.p1,
				:SYM...51_c0_g1_i1.p1,
				:SYM...65_c0_g1_i1.p1 2
			),
			Validation( :Validation with test ),
			Informative Missing( 0 ),
			Transform Covariates( ^TCov^ ),
			Fit(
				NTanH( ^NTH1^ ),
				NLinear( ^NL1^ ),
				NGaussian( ^NG1^ ),
				NTanH2( ^NTH2^ ),
				NLinear2( ^NL2^ ),
				NGaussian2( ^NG2^ ),
				Transform Covariates( ^TCov^ ),
				Penalty Method( \!"^PMethod^\!" ),
				Number of Tours( ^Ntours^ ),
				N Boost( ^Nboosts^ ),
				Learning Rate( ^LR^ )
			),
			Go,
			invisible
		)) << Report;"
	);
	Eval( Parse( str ) );
	w = Window( dt << GetName || " - " || "Neural of " || y_resp );
	
	
	
	T_stats = w[Outline Box( o_box ), Number Col Box( 1 )] << Get;
	V_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_v )] << Get;
	Test_stats = w[Outline Box( o_box ), Number Col Box( ncol_box_test )] << Get;
	report << Close Window;
	
	If( coltype == "Ordinal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
	
		dt_results:Generalized R² Test[i] = Test_stats[1];
		dt_results:Entropy R² Test[i] = Test_stats[2];
		dt_results:RMSE Test[i] = Test_stats[3];
		dt_results:Mean Abs Dev Test[i] = Test_stats[4];
		dt_results:Misclassification Rate Test[i] = Test_stats[5];
		dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
		dt_results:Sum Freq Test[i] = Test_stats[7];
	);
	
	If( coltype == "Nominal",
		dt_results:Generalized R² Training[i] = T_stats[1];
		dt_results:Entropy R² Training[i] = T_stats[2];
		dt_results:RMSE Training[i] = T_stats[3];
		dt_results:Mean Abs Dev Training[i] = T_stats[4];
		dt_results:Misclassification Rate Training[i] = T_stats[5];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[6];
		dt_results:Sum Freq Training[i] = T_stats[7];
	
		dt_results:Generalized R² Validation[i] = V_stats[1];
		dt_results:Entropy R² Validation[i] = V_stats[2];
		dt_results:RMSE Validation[i] = V_stats[3];
		dt_results:Mean Abs Dev Validation[i] = V_stats[4];
		dt_results:Misclassification Rate Validation[i] = V_stats[5];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[6];
		dt_results:Sum Freq Validation[i] = V_stats[7];
	
		dt_results:Generalized R² Test[i] = Test_stats[1];
		dt_results:Entropy R² Test[i] = Test_stats[2];
		dt_results:RMSE Test[i] = Test_stats[3];
		dt_results:Mean Abs Dev Test[i] = Test_stats[4];
		dt_results:Misclassification Rate Test[i] = Test_stats[5];
		dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[6];
		dt_results:Sum Freq Test[i] = Test_stats[7];
	);
	
	If( coltype == "Continuous",
		dt_results:R² Training[i] = T_stats[1];
		dt_results:RMSE Training[i] = T_stats[2];
		dt_results:Mean Abs Dev Training[i] = T_stats[3];
		dt_results:Name( "-LogLiklihood Training" )[i] = T_stats[4];
		dt_results:SSE Training[i] = T_stats[5];
		dt_results:Sum Freq Training[i] = T_stats[6];
	
		dt_results:R² Validation[i] = V_stats[1];
		dt_results:RMSE Validation[i] = V_stats[2];
		dt_results:Mean Abs Dev Validation[i] = V_stats[3];
		dt_results:Name( "-LogLiklihood Validation" )[i] = V_stats[4];
		dt_results:SSE Validation[i] = V_stats[5];
		dt_results:Sum Freq Validation[i] = V_stats[6];
	
		dt_results:R² Test[i] = Test_stats[1];
		dt_results:RMSE Test[i] = Test_stats[2];
		dt_results:Mean Abs Dev Test[i] = Test_stats[3];
		dt_results:Name( "-LogLiklihood Test" )[i] = Test_stats[4];
		dt_results:SSE Test[i] = Test_stats[5];
		dt_results:Sum Freq Test[i] = Test_stats[6];
	);
);

Let me know if there are any further issues.

 

Thanks!

DS