New for JMP 17 Pro: An Autotune option makes it even easier to automatically tune hyperparameters using a Fast Flexible Filling design.
The XGBoost add-in for JMP Pro provides a point-and-click interface to the popular XGBoost open-source library for predictive modeling with extreme gradient boosted trees. Value-added functionality of this add-in includes:
• Repeated k-fold cross validation with out-of-fold predictions, plus a separate routine to create optimized k-fold validation columns, optionally stratified or grouped.
• Ability to fit multiple Y responses in one run.
• Automated parameter search via JMP Design of Experiments (DOE) Fast Flexible Filling Design
• Interactive graphical and statistical outputs
• Model comparison interface
• Profiling
• Export of JMP Scripting Language (JSL) and Python code for reproducibility
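The repeated k-fold cross validation with out-of-fold predictions listed above can be sketched in plain Python. This is not the add-in's own code; it is a minimal illustration of the idea, using scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost and a built-in toy dataset in place of a JMP data table:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import RepeatedStratifiedKFold

# Toy data standing in for a JMP data table.
X, y = load_breast_cancer(return_X_y=True)

# Stratified 5-fold CV repeated 3 times: every row gets one
# out-of-fold prediction per repeat, so predictions are honest
# (each made by a model that never saw that row in training).
n_splits, n_repeats = 5, 3
cv = RepeatedStratifiedKFold(n_splits=n_splits, n_repeats=n_repeats,
                             random_state=1)

oof = np.zeros((len(y), n_repeats))           # out-of-fold probabilities
for i, (train_idx, test_idx) in enumerate(cv.split(X, y)):
    rep = i // n_splits                       # which repeat this fold belongs to
    model = GradientBoostingClassifier(random_state=1)
    model.fit(X[train_idx], y[train_idx])
    oof[test_idx, rep] = model.predict_proba(X[test_idx])[:, 1]

# Averaging across repeats yields one stable out-of-fold prediction per row.
oof_mean = oof.mean(axis=1)
print(oof_mean.shape)
```

Averaging across repeats reduces the variance that comes from any single random fold assignment, which is the usual motivation for repeating the k-fold procedure.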
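The automated parameter search listed above uses JMP's Fast Flexible Filling design to spread candidate hyperparameter settings evenly over the search space. A rough analogue of that idea, sketched here with a Latin hypercube from SciPy rather than JMP's actual design algorithm, and with hypothetical search ranges for three common XGBoost hyperparameters:

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical search ranges (not the add-in's defaults).
ranges = {
    "learning_rate": (0.01, 0.3),
    "max_depth":     (2, 10),
    "subsample":     (0.5, 1.0),
}

# A Latin hypercube is one space-filling design: it scatters points
# so every 1-D margin of the search space is covered evenly.
sampler = qmc.LatinHypercube(d=len(ranges), seed=7)
unit = sampler.random(n=16)                   # 16 points in [0, 1)^3
lo = np.array([r[0] for r in ranges.values()])
hi = np.array([r[1] for r in ranges.values()])
grid = qmc.scale(unit, lo, hi)                # rescale to the real ranges

# Each row of `grid` is one candidate setting to fit and cross validate.
for point in grid[:3]:
    lr, depth, sub = point
    print(f"learning_rate={lr:.3f}, max_depth={int(round(depth))}, "
          f"subsample={sub:.2f}")
```

Each candidate setting would then be fit and scored by cross validation, with the best-scoring setting kept, which is the shape of the search the add-in automates.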
Click the link above to download the add-in, then drag the downloaded file onto JMP Pro 15 or higher to install it. The most recent features and improvements require the most recent version of JMP Pro, including early adopter versions.
See the .pdf in the attached .zip file for details and examples; the .zip also contains a journal and the other examples used in this tutorial.
Related Materials
Video Tutorial with Downloadable Journal: XGBoost Add-In for JMP Pro (2020-US-45MP-540)
Add-In for JMP Genomics:
Add-In from @Franck_R : Machine Learning with XGBoost in a JMP Addin (JMP app + python)