Hi @Nazarkovsky,
It would probably have been easier to start a new post instead of replying to this old one, as it would have provided more visibility and enabled other JMP users to join the discussion about your question.
Yes, there are sometimes errors in JMP modeling platforms when using a fixed K-folds cross-validation column, as JMP mostly recognizes two levels (training and validation or test) or three levels (training, validation, and test) in a validation column. With the error message you have, no CV is done; a linear model is fit on all your data.
Since you're using a very simple linear model, you can perhaps try the Model Screening platform (which also includes PLS if you want to compare the different models' results), specifying your X's, your response Y, your 4-folds validation column (or performing the cross-validation directly from the platform), and the type of model you want to fit (linear regression model options + modeling options):

Or, using the K-folds cross-validation option directly from the platform, you can obtain these results:

If you want to run a Leave-One-Out cross-validation, simply specify K = n (n being the number of runs in your table). If you want access to individual fold results, you can adapt the solution I provided here to your modeling platform: Accessing out of fold metrics for K-Fold CV
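Outside JMP, the same idea can be sketched in Python with scikit-learn (a minimal illustration with made-up data, not your actual table): LOO is just K-fold with K equal to the number of runs.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                     # 20 runs, 3 factors (made-up data)
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=20)

model = LinearRegression()

# 4-fold cross-validation: 4 out-of-fold scores
cv4 = cross_val_score(model, X, y, scoring="neg_mean_squared_error",
                      cv=KFold(n_splits=4, shuffle=True, random_state=1))

# Leave-One-Out = K-fold with K = n: one out-of-fold score per run
loo = cross_val_score(model, X, y, scoring="neg_mean_squared_error",
                      cv=KFold(n_splits=len(X)))

print(len(cv4), len(loo))
```

Each entry of `loo` is the error on a single held-out run, which is why LOO gives you n individual fold results.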
Finally, I see at least one major problem in your modeling workflow that could lead to data leakage: you need to apply the cross-validation (or your splitting/validation strategy) to your principal component analysis too. If you don't, the PCA will see the entire dataset and learn the correlations between the factors from all of it, and the linear model you fit afterwards (and evaluate through cross-validation) will benefit from information from the entire dataset, not just from 3 out of 4 folds. Since JMP hasn't implemented a validation role for PCA (yet!), I have added this Wishlist item to prevent data leakage and possible errors in modeling workflows like yours: Add validation role option in Principal Component Analysis platform

To summarize, you should:
- Define and apply a validation strategy (CV, LOO, or a standard train/validation or train/validation/test split).
- Preprocess the data on the training set only (or preprocess the data several times, depending on the folds used).
- Fit a model using the same validation strategy with the preprocessed data from the training set (or from the corresponding training folds).
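The three steps above can be sketched in Python with scikit-learn (again an illustration with made-up data): putting the PCA inside a pipeline guarantees it is re-fit on the training folds only at every CV iteration, which is exactly what avoids the leakage described above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(24, 6))                     # 24 runs, 6 correlated factors (made-up)
y = 2.0 * X[:, 0] + rng.normal(scale=0.2, size=24)

# Leaky version (do NOT do this): PCA fit on ALL the data before CV,
# so the held-out folds influence the component directions.
#   pca = PCA(n_components=3).fit(X)
#   scores = cross_val_score(LinearRegression(), pca.transform(X), y, cv=4)

# Leakage-free version: PCA is part of the pipeline, so each CV fold
# fits PCA + regression on its training folds only.
pipe = make_pipeline(PCA(n_components=3), LinearRegression())
scores = cross_val_score(pipe, X, y,
                         cv=KFold(n_splits=4, shuffle=True, random_state=1))
print(scores)
```

The same principle applies to any preprocessing step (scaling, imputation, feature selection): it belongs inside the fold, not before it.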
I hope this answer helps,
Victor GUILLER
"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)