Hi @CentroidError56,
You're right, this option will only work with default hyperparameter settings.
Here is how I would do it when specific hyperparameter configurations need to be tested on the same folds:
- Create a K-fold validation column using the Make Validation Column platform. Specify the stratification columns (the X's) and the target (Y). You can then define the number of folds and set a random seed.
- Launch the modeling platform of your choice with this K-fold column in the validation role (a rough scripted analogue of these two steps is sketched just below).
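For anyone who prefers scripting or works outside JMP, here is a minimal sketch of the same fold-fixing idea in Python with scikit-learn. It is not part of the JMP workflow above; scikit-learn's diabetes data, the 5-fold count and the seed value are just stand-ins:

```python
# Rough analogue of the "Make Validation Column" step: build one fixed
# fold assignment (with a random seed) that every model will reuse.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.model_selection import KFold

X, y = load_diabetes(return_X_y=True)          # stand-in for the JMP Diabetes table

kf = KFold(n_splits=5, shuffle=True, random_state=42)   # 5 folds, fixed seed
fold_id = np.empty(len(y), dtype=int)
for fold, (_, test_idx) in enumerate(kf.split(X)):
    fold_id[test_idx] = fold   # like a K-fold validation column: one fold label per row
```

Because the seed is fixed, `kf.split(X)` returns exactly the same folds every time it is called, which is what lets all candidate models be scored on identical data splits.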
In this example, for illustration, I used the JMP Diabetes dataset and the SVM platform. I ran a tuning design to find 3 appropriate hyperparameter settings (models 2, 3 and 20):

- I then CTRL + click on the option Publish Prediction Formula (available in the red triangle next to any model). A new Formula Depot window is created with the formulas of the three models seen earlier:

- In the red triangle next to Formula Depot, I click on the option Model Comparison and select my three models:

JMP notices that my 5-fold validation column is used for all models and asks if it can be used as a grouping column, so I click Yes to get individual model results for each fold.
- I now have access to the per-fold performance measures for each of the three models:

This method should work no matter which modeling platform you use, as it only relies on the models' prediction formulas, so you can even combine different model types.
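If you ever want to reproduce this fold-by-fold comparison in a script rather than through Formula Depot, here is a small Python / scikit-learn sketch of the same logic. The three SVR settings below are placeholders, not the actual values from my tuning design:

```python
# Evaluate three hyperparameter settings on the same fixed folds and
# report one performance measure per fold for each model.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = load_diabetes(return_X_y=True)
kf = KFold(n_splits=5, shuffle=True, random_state=42)    # same fixed folds for every model

# Placeholder hyperparameter settings (named after models 2, 3 and 20 only for the analogy)
settings = {
    "model 2":  dict(C=1.0,  gamma=0.1),
    "model 3":  dict(C=10.0, gamma=0.1),
    "model 20": dict(C=10.0, gamma=0.01),
}

for name, params in settings.items():
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", **params))
    scores = cross_val_score(model, X, y, cv=kf, scoring="r2")   # one R² per fold
    print(name, np.round(scores, 3))   # comparable fold by fold, like the grouped Model Comparison
```

Since every model is scored with the same `kf` object (and seed), the per-fold scores are directly comparable across the three settings, which is the scripted equivalent of grouping the Model Comparison report by the validation column.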
Hope this answer solves your problem,
Victor GUILLER
"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)