
XGBoost Add-In for JMP Pro

The XGBoost add-in for JMP Pro provides a point-and-click interface to the popular XGBoost open-source library for predictive modeling with extreme gradient boosted trees. Value-added functionality of this add-in includes:


•    Repeated k-fold cross validation with out-of-fold predictions, plus a separate routine to create optimized k-fold validation columns, optionally stratified or grouped (see the first sketch after this list).

•    Ability to fit multiple Y responses in one run.

•    Automated parameter search via a JMP Design of Experiments (DOE) Fast Flexible Filling Design (see the second sketch after this list).

•    Interactive graphical and statistical outputs.

•    Model comparison interface.

•    Profiling.

•    Export of JMP Scripting Language (JSL) and Python code for reproducibility.
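For readers who want to see the pattern outside JMP, here is a minimal Python sketch of repeated k-fold cross validation with out-of-fold predictions, the idea behind the first bullet. The dataset, parameter values, and fold counts are illustrative assumptions, not the add-in's defaults or its actual implementation.

```python
# Sketch: repeated k-fold CV with out-of-fold (OOF) predictions.
# Dataset and hyperparameters are illustrative only. For the add-in's
# stratified/grouped validation columns, the sklearn analogues would be
# StratifiedKFold and GroupKFold.
import numpy as np
import xgboost as xgb
from sklearn.datasets import load_diabetes
from sklearn.model_selection import RepeatedKFold
from sklearn.metrics import mean_squared_error

X, y = load_diabetes(return_X_y=True)

params = {"max_depth": 4, "eta": 0.1, "objective": "reg:squarederror"}
n_splits, n_repeats = 5, 3
rkf = RepeatedKFold(n_splits=n_splits, n_repeats=n_repeats, random_state=1)

# One OOF column per repeat: every row is predicted exactly once per
# repeat, by a model that never saw that row during training.
oof = np.zeros((len(y), n_repeats))
for i, (train_idx, test_idx) in enumerate(rkf.split(X)):
    repeat = i // n_splits  # splits are generated repeat by repeat
    dtrain = xgb.DMatrix(X[train_idx], label=y[train_idx])
    dtest = xgb.DMatrix(X[test_idx])
    booster = xgb.train(params, dtrain, num_boost_round=200)
    oof[test_idx, repeat] = booster.predict(dtest)

for r in range(n_repeats):
    rmse = mean_squared_error(y, oof[:, r]) ** 0.5
    print(f"repeat {r + 1}: out-of-fold RMSE = {rmse:.2f}")
```

Each repeat yields a complete out-of-fold prediction column, so accuracy is always assessed on rows the corresponding model never trained on.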

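Likewise, a minimal sketch of a space-filling hyperparameter search, the idea behind the automated parameter search bullet. The add-in uses JMP's DOE Fast Flexible Filling design; since that design is specific to JMP, a Latin hypercube from scipy stands in here as a comparable space-filling scheme, and the parameter ranges and point count are illustrative assumptions.

```python
# Sketch: space-filling hyperparameter search for XGBoost.
# A Latin hypercube stands in for JMP's Fast Flexible Filling design;
# ranges and the number of design points are illustrative only.
from scipy.stats import qmc
from sklearn.datasets import load_diabetes
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

X, y = load_diabetes(return_X_y=True)

# Hyperparameter ranges to fill: (low, high) per dimension.
bounds = {"max_depth": (2, 8), "learning_rate": (0.01, 0.3), "subsample": (0.5, 1.0)}
lows = [b[0] for b in bounds.values()]
highs = [b[1] for b in bounds.values()]

# 20 points spread evenly through the 3-D hyperparameter box.
sampler = qmc.LatinHypercube(d=len(bounds), seed=1)
grid = qmc.scale(sampler.random(n=20), lows, highs)

best = None
for point in grid:
    depth, lr, sub = int(round(point[0])), point[1], point[2]
    model = XGBRegressor(max_depth=depth, learning_rate=lr, subsample=sub,
                         n_estimators=200, objective="reg:squarederror")
    score = cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    if best is None or score > best[0]:
        best = (score, depth, lr, sub)

print(f"best CV RMSE = {-best[0]:.2f} at max_depth={best[1]}, "
      f"learning_rate={best[2]:.3f}, subsample={best[3]:.2f}")
```

The appeal of a space-filling design over a regular grid is that it covers the continuous parameter ranges evenly with far fewer model fits.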

Click the link above to download the add-in, then drag it onto JMP Pro 15 or 16 to install. The most recent interface features are available in the most recent JMP Pro 16 Early Adopter version.


See the attached .pdf for details and examples; it is included in the attached .zip file, along with a journal and the other examples used in this tutorial.


[Screenshots: xgb_wine1.JPG, xgb_wine2.JPG]


Related Materials

Video Tutorial with Downloadable Journal: XGBoost Add-In for JMP Pro (2020-US-45MP-540) 

Add-In for JMP Genomics: Python Predictive Methods

Add-In from @Franck_R: Machine Learning with XGBoost in a JMP Addin (JMP app + python)

Comments

Congrats Russ, this add-in indeed looks very promising and useful!

Thank you Franck, dittos for yours! On Kaggle it appears LightGBM is the most popular, so that may be a direction we want to pursue down the road, along with CatBoost.