
Conformal Prediction

What inspired this wish list request? 

JMP has made a range of machine learning algorithms readily available, but a major deficiency in most of them is the lack of uncertainty measures for predictions, particularly for classification models.  There are a number of tools for assessing model accuracy (e.g., ROC curves and confusion matrices), but few for assessing the uncertainty in individual predictions.  Unlike classical statistical models, which carry an assumed error structure, machine learning models typically give no indication of how uncertain their predictions are.  Conformal prediction appears to be a relatively straightforward way to provide this for a variety of machine learning methods.

What is the improvement you would like to see? 

I would like to see a conformal prediction capability attached to any of the classification modeling methods in JMP.  For classification models, the resulting output should show the predicted classification sets, including an indeterminate set for observations where the desired level of certainty cannot be assured.
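To illustrate the kind of output this request has in mind, below is a minimal sketch of split conformal prediction for a classifier, written in Python with scikit-learn rather than JMP.  The classifier, data set, and alpha level are arbitrary choices for illustration only, not a proposal for how JMP should implement it.

# Minimal sketch of split conformal prediction for classification.
# Illustrative only: classifier, data, and alpha are assumed choices.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

alpha = 0.10  # target miscoverage: sets should contain the true class ~90% of the time

X, y = load_iris(return_X_y=True)
X_fit, X_rest, y_fit, y_rest = train_test_split(X, y, test_size=0.5, random_state=1)
X_cal, X_new, y_cal, y_new = train_test_split(X_rest, y_rest, test_size=0.5, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_fit, y_fit)

# Nonconformity score on the calibration set: 1 - estimated probability of the true class
cal_probs = model.predict_proba(X_cal)
scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

# Conformal threshold: the ceil((n+1)(1-alpha))-th smallest score (finite-sample correction)
n = len(scores)
k = int(np.ceil((n + 1) * (1 - alpha)))
q_hat = np.sort(scores)[min(k, n) - 1]

# Prediction set for each new observation: every class whose score is below the threshold.
# Sets containing more than one class are the "indeterminate" cases described above.
new_probs = model.predict_proba(X_new)
pred_sets = [set(np.flatnonzero(1.0 - p <= q_hat)) for p in new_probs]

coverage = np.mean([true_y in s for true_y, s in zip(y_new, pred_sets)])
print(f"empirical coverage: {coverage:.2f}")
print("example prediction sets:", pred_sets[:5])

In JMP terms, the calibration step could presumably reuse a validation or holdout column, and the prediction sets could be saved as additional columns alongside the usual predicted probabilities.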


Why is this idea important? 

Conformal prediction has been around for at least 20 years, but its adoption has been hindered (in my opinion) by an unnecessarily mathematical presentation of the technique; I think it is more straightforward than it is typically made to appear.  It is also a nonparametric technique for quantifying uncertainty in predictions, which is sorely needed as machine learning models are increasingly used.  The lack of quantified predictive uncertainty (rather than just accuracy) limits the usefulness of these methods and potentially makes their application misguided, yielding predictions with little to distinguish the observations that can be reliably predicted from those that cannot.  For one reference, see https://arxiv.org/pdf/2107.07511.pdf.