I know the validation set can be used to inform the model-building process and the test set is not touched. For example, in other models the validation R2 can be used as a stopping rule. How is the validation set used in neural networks? I can't find any information about this in the JMP documentation or online. Thanks.
There are several choices to make when building a neural network with JMP Pro (the hidden layer structure, the activation types, the penalty method, the number of tours, and so on).
All of these choices are tuning parameters that you adjust to get the best possible fit. Tune them too aggressively and you get over-fitting, so the validation set is used to help protect against over-fitting your model as a result of your choices of these tuning parameters.
Since you are using the validation set to help choose the model parameters, it is no longer an independent assessment of your model's performance. That's where the test set comes in.
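To make that concrete, here is a minimal sketch of the same idea in Python with scikit-learn rather than JMP; the data, the candidate values for the number of hidden nodes, and the split percentages are all invented for illustration. The validation set picks the tuning parameter, and the test set is looked at only once, at the very end.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=300)

# Split into 60% training, 20% validation, 20% test
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=1)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=1)

best_nodes, best_val_r2 = None, -np.inf
for nodes in (2, 5, 10, 20):                      # candidate values of one tuning parameter
    model = MLPRegressor(hidden_layer_sizes=(nodes,), max_iter=2000, random_state=1)
    model.fit(X_train, y_train)                   # parameters estimated from training data only
    val_r2 = r2_score(y_val, model.predict(X_val))
    if val_r2 > best_val_r2:                      # validation R2 guides the choice
        best_nodes, best_val_r2 = nodes, val_r2

# The test set gives the one independent assessment, reported only at the end
final = MLPRegressor(hidden_layer_sizes=(best_nodes,), max_iter=2000, random_state=1)
final.fit(X_train, y_train)
print("chosen nodes:", best_nodes, "test R2:", r2_score(y_test, final.predict(X_test)))

The knobs are different in JMP Pro, but the division of labor among the three subsets is the same.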
I typically use the validation results in Neural Nets as a check for over-fitting.
Yes, but what makes it different from the test set? It must be used somehow during model building as a check.
The training subset of the data is used to fit the neural network model. This model is then used to predict the observations in the validation set. The validation set is in no way used in the parameter estimation. There is no 'stopping rule' because the model does not change (number of terms/parameters) as it would in Partition or Stepwise.
As previously suggested, you compare the performance on the training and validation sets to assess whether you are under- or over-fitting.
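As a rough illustration of that comparison, here is another short Python/scikit-learn sketch (again not JMP, with invented data and a deliberately flexible network): fit on the training subset once, then compare the two R2 values.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200)

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=3)

# An intentionally flexible network so any train/validation gap is easy to see
model = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=5000, random_state=3)
model.fit(X_train, y_train)

train_r2 = r2_score(y_train, model.predict(X_train))
val_r2 = r2_score(y_val, model.predict(X_val))
print(f"training R2 = {train_r2:.3f}, validation R2 = {val_r2:.3f}")
# A training R2 well above the validation R2 points to over-fitting;
# both being low points to under-fitting.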