Nimaxim
Level II

Customization and Evaluation of Neural Networks in K-fold Cross-Validation in the Model Screening Platform

I need to use neural network modeling to predict variables, and to avoid overfitting I'm using K-fold cross-validation. In the Model Screening platform, the neural network is fitted using boosting with three TanH functions. Is it possible to customize the neural network with a different number and configuration of functions than the platform's default settings? I need to obtain the R-square value for each fold individually, and I also want to adjust the neural network settings, such as changing the number of functions and applying robust fitting.

ACCEPTED SOLUTION

Re: Customization and Evaluation of Neural Networks in K-fold Cross-Validation in the Model Screening Platform

@Nimaxim -

Here are a few thoughts on how to proceed with your NN models.

 

Model Screening

When you run Model Screening, the NN platform runs with a fixed baseline set of parameters: 3 TanH nodes with 20 boosts. As far as I know, you cannot adjust these baseline parameters in Model Screening.

 

Building Individual NN Models

You can of course create individual models by launching Analyze > Predictive Modeling > Neural. In this interface you can specify many additional parameters to find the best NN model for your system.
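If you prefer scripting, the same launch can be written in JSL. Here is a minimal sketch with placeholder column names (:Y, :X1, :X2); the Fit options shown follow the saved-script syntax I've seen from the Neural dialog, but verify the exact option names in the Scripting Index for your JMP version:

    // Hedged JSL sketch: launch Neural with a custom architecture.
    // Column names :Y, :X1, :X2 are placeholders for your own columns.
    dt = Current Data Table();
    nn = dt << Neural(
        Y( :Y ),
        X( :X1, :X2 ),
        Fit(
            NTanH( 6 ),           // six TanH nodes instead of the default 3
            N Boost( 0 ),         // 0 = no boosting; raise to enable it
            Learning Rate( 0.1 ), // only used when boosting
            Robust Fit( 1 ),      // least-absolute-deviation loss
            Number of Tours( 10 ) // restarts to avoid poor local optima
        )
    );

An easy way to get the exact syntax for your version is to set up the fit in the dialog once and use the red triangle's Save Script option.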

 

When you build a NN model in the NN platform, you can apply a KFold column. To do this, first create a KFold column by going to Analyze > Predictive Modeling > Make Validation Column. In the Selected Method menu, select Make K Fold Column, click OK, then specify the number of folds and click Go. A new column titled "n Fold Column" will appear in your data table. Launch the NN platform and assign this column to the Validation Column role:

[Image: scott_allen_3-1723828097539.png]
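If you'd rather script the fold column, one common JSL idiom is a formula column that assigns balanced folds. This sketch is an alternative to the Make Validation Column dialog (that platform has its own JSL messages too, which I'd confirm in the Scripting Index):

    dt = Current Data Table();
    // Balanced 5-fold assignment: Col Shuffle() randomly permutes
    // row numbers, so Mod( ..., 5 ) yields folds 0-4 of nearly equal size.
    dt << New Column( "Fold Column", Numeric, "Nominal",
        Formula( Mod( Col Shuffle(), 5 ) )
    );

You can then assign this column to the Validation role in the Neural launch (Validation( :Fold Column ) in JSL).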

 

Compare R2 Across All the Folds

Comparing the R2 across all the folds is a little tricky, but there is a way. First, build your NN models in the NN platform. Next, save the formula (Profile Formula is fine) for each model using its Red Triangle menu. If you have several models to save, hold Ctrl and select Save Profile Formulas to apply the command to all models in the report at once. In this example, I built two models and saved their formulas to the table (note that the report does not tell you which fold was used for validation):

[Image: scott_allen_0-1723825248675.png]
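The same save can be scripted. This one-liner assumes the message name matches the menu text, which is true for most JMP platform commands but worth confirming in the Scripting Index:

    // nn is the Neural platform reference from the launch script above.
    // Broadcasting the menu command saves formulas for the fitted models.
    nn << Save Profile Formulas;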


Next, go to Analyze > Predictive Modeling > Model Comparison, load the NN prediction formulas, and assign the KFold column to the Group role:

[Image: scott_allen_1-1723825410392.png]
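In JSL the launch looks roughly like this. The predictor column names are whatever the save step produced (typically "Predicted" followed by the response name), and :Fold Column stands in for your KFold column, so treat all the names here as placeholders:

    Model Comparison(
        Y( :Y ),
        Predictors( :Name( "Predicted Y" ), :Name( "Predicted Y 2" ) ),
        Group( :Fold Column )  // one set of fit measures per fold
    );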

In the Model Comparison report you get a table that shows the Measures of Fit for each Fold in each Model:

[Image: scott_allen_2-1723827035373.png]


You may notice that the models differ between runs. If they differ considerably, you can specify a random seed in the model launch dialog. If you use the same random seed and the same validation column, the measures of fit should be consistent each time you run the models.
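If you script the fit, the seed can be fixed in the launch. I believe the saved-script option is Set Random Seed, but treat the exact name as an assumption and check it against a script saved from your own dialog:

    dt << Neural(
        Y( :Y ),
        X( :X1, :X2 ),
        Validation( :Fold Column ),
        Set Random Seed( 1234 ),  // assumed option name; verify via a saved script
        Fit( NTanH( 6 ) )
    );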

Building and Comparing Multiple NNs

Building and comparing multiple NN models in JMP is simple. You can specify and change parameters easily in the model dialog. If you find that you want to build and compare many different NN models with a wide range of functions, nodes, and other fitting parameters, you might want to try the NN Tuning add-in I built. You can find it here: Neural Network Tuning Add-in 

 

-Scott
