Oct 8, 2020 12:57 PM | Last Modified: Oct 15, 2020 8:26 AM
The XGBoost add-in for JMP Pro provides a point-and-click interface to the popular XGBoost open-source library for predictive modeling with extreme gradient boosted trees. Value-added functionality of this add-in includes:
• Repeated k-fold cross validation with out-of-fold predictions, plus a separate routine to create optimized k-fold validation columns, optionally stratified or grouped.
• Ability to fit multiple Y responses in one run.
• Automated parameter search via a JMP Design of Experiments (DOE) Fast Flexible Filling design.
• Interactive graphical and statistical outputs.
• Model comparison interface.
• Profiling.
• Export of JMP Scripting Language (JSL) and Python code for reproducibility.
Click the link above to download the add-in, then drag the file onto JMP Pro 15 or 16 to install it. The newest interface features require the latest JMP Pro 16 Early Adopter release.
For details and examples, see the attached .pdf; the attached .zip file contains it along with a journal and the other examples used in this tutorial.