I really like the new Model Screening platform in JMP, under Analyze > Predictive Modeling > Model Screening, as described in the blog post Model Screening in JMP Pro 16:
https://community.jmp.com/t5/JMP-Blog/Model-Screening-in-JMP-Pro-16/bc-p/384039#M4078.
One feature I think would be especially useful - in the context of this example or otherwise - is an option similar to the Prediction Profiler's "Remember Settings" (available when fitting models in platforms like Analyze > Fit Model), so that I could effectively toggle between different tradeoff scenarios for false positive and false negative rates.
The scenario I am particularly interested in is the option to "profile" the decision threshold probability in order to tune the model toward minimizing false negatives, at the potential expense of more false positives. From a quality-risk perspective (protecting the end user at the cost of higher manufacturing scrap), a higher false-alarm rate can be tolerated.
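To make the idea concrete, here is a minimal sketch of the kind of threshold "profiling" I have in mind. It is plain Python rather than JSL, and the predicted probabilities and labels are made up purely for illustration: sweep the decision threshold, and each threshold plays the role of one "remembered setting" I could toggle between, trading false negatives for false positives.

```
# Sketch (plain Python, not JSL) of profiling the decision threshold.
# The (probability, label) pairs below are made up for illustration:
# label 1 = "bad" part, label 0 = "good" part.
scored = [
    (0.95, 1), (0.80, 1), (0.65, 1), (0.55, 0), (0.45, 1),
    (0.40, 0), (0.30, 0), (0.25, 1), (0.15, 0), (0.05, 0),
]

def rates(threshold):
    """Classify as 'bad' when prob >= threshold; return (FN rate, FP rate)."""
    fn = sum(1 for p, y in scored if y == 1 and p < threshold)
    fp = sum(1 for p, y in scored if y == 0 and p >= threshold)
    positives = sum(1 for _, y in scored if y == 1)
    negatives = sum(1 for _, y in scored if y == 0)
    return fn / positives, fp / negatives

# Each threshold corresponds to one "remembered setting" to toggle between.
for t in (0.20, 0.35, 0.50, 0.65):
    fnr, fpr = rates(t)
    print(f"threshold={t:.2f}  false-negative rate={fnr:.2f}  false-positive rate={fpr:.2f}")
```

Lowering the threshold drives the false-negative rate toward zero while the false-positive rate climbs, which is exactly the tradeoff I would like to be able to save and flip between interactively.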
Here is a picture that illustrates what I am talking about: a plain-vanilla "Remember Settings" table from the Prediction Profiler in the Fit Model platform, after selecting "Remember Settings" at several different combinations of the X settings on Y (using the same Diabetes.jmp sample dataset):
Thanks!