Adele
Occasional Contributor

Re: Neural nets_validation sets get higher model fits than training sets

Thank you for your suggestions. We tried k-fold cross-validation, and it works better in our case. However, a new problem has come up: how can we do k-fold cross-validation for Bootstrap Forest?

Thanks.


Re: Neural nets_validation sets get higher model fits than training sets

You cannot use K-fold cross-validation with Bootstrap Forest model selection in JMP. You can only use hold-out sets.

 

Remember that most predictive model types are intended for large data sets. You might do as well with multiple regression in your case if the data meet the model assumptions.

Learn it once, use it forever!
Adele
Occasional Contributor

Re: Neural nets_validation sets get higher model fits than training sets

Thank you. Yes, I only found the script below, but I could not combine it with the Bootstrap Forest model.

 

// Launch the Partition platform on the Car Poll sample data
obj = Open( "$SAMPLE_DATA/Car Poll.jmp" ) << Partition(
	Y( :country ),
	X( :sex, :marital status, :age, :type, :size )
);

// Request 5-fold cross-validation before fitting
obj << K Fold Crossvalidation( 5 );

// Fit the tree
obj << Go;
Thank you so much for your help.

Adele
Occasional Contributor

Re: Neural nets_validation sets get higher model fits than training sets

Dear Mr. Bailey,

Thanks. I have one more question: can we use K-fold cross-validation with standard least squares regression in JMP?


Re: Neural nets_validation sets get higher model fits than training sets

No, the Fit Least Squares platform in JMP does not provide any cross-validation. The Generalized Regression platform in JMP Pro, which includes the standard linear regression model, provides both hold-out cross-validation in the launch dialog box and K-fold cross-validation as a platform validation option.

 

You might consider using AICc for model selection in place of cross-validation because of the small sample size.
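To make the AICc suggestion concrete, here is a minimal sketch (in Python with NumPy, not JMP itself) of computing the small-sample corrected AIC for a least-squares fit. The formula and the toy data are illustrative; note that some texts count the error variance as an extra parameter, which shifts every model's AICc by the same amount and so does not change model rankings.

```python
import numpy as np

def aicc(y, y_hat, num_params):
    """Small-sample corrected AIC for a least-squares fit.

    AICc = n*ln(RSS/n) + 2k + 2k(k+1)/(n - k - 1),
    where k is the number of estimated coefficients (including the
    intercept). Lower is better.
    """
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    n = y.size
    k = num_params
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

# Toy data: fit a straight line by least squares and score it
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 15)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)
coef = np.polyfit(x, y, 1)          # slope, intercept
y_hat = np.polyval(coef, x)
print(aicc(y, y_hat, num_params=2))
```

Comparing AICc across candidate models on the same data then replaces a cross-validation comparison when n is too small to split.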

 

You might also consider open-source software if it provides the flexibility you need. For example, JMP can connect to R or Python, so you can work with both tools together.

Learn it once, use it forever!


Re: Neural nets_validation sets get higher model fits than training sets

JMP Pro does offer k-fold cross-validation for regression analysis under the Stepwise platform. It is used to determine which terms to keep in the model. From the JMP documentation:

 

K-Fold Crossvalidation (Available only for continuous responses.) Performs K-fold cross-validation in the selection process. When selected, this option enables the Max K-Fold RSquare stopping rule ("Stepwise Regression Control Panel" on page 250). For more information about validation, see "Using Validation" on page 275.
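The idea behind a Max K-Fold RSquare stopping rule is to score each candidate model by its out-of-fold R-squared. A short sketch of that per-fold R-squared computation, using scikit-learn as an open-source stand-in (not JMP's own implementation) on made-up data:

```python
# Out-of-fold R^2 for a linear model, one value per fold
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.2, size=40)

# With a regressor, cross_val_score's default scorer is R^2 on the
# held-out fold; stepwise selection would keep the term set that
# maximizes the mean of these values.
r2_folds = cross_val_score(LinearRegression(), X, y, cv=5)
print(r2_folds, r2_folds.mean())
```

Selection then proceeds by adding or dropping terms and keeping the model whose mean k-fold R-squared is highest.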

Dan Obermiller

Re: Neural nets_validation sets get higher model fits than training sets

I forgot to mention that K-fold cross-validation is specifically intended for cases with small sample sizes. Did you try this method instead of a validation hold-out set?

Learn it once, use it forever!
