
Learn JMP Events

Events designed to further your knowledge and exploration of JMP.

Understanding and Applying Tree-based Methods for Predictor Screening and Modeling

Published on 11-07-2024 03:29 PM by Community Manager | Updated on 11-07-2024 05:39 PM


See how to:

  • Model using the Partition, Bootstrap Forest, and Boosted Tree platforms
  • Understand pros and cons of decision trees
    • Pros: Uncover non-linear relationships, get results that are easy to understand, screen large numbers of factors
    • Cons: Handle one response at a time, produce if-then statements rather than a mathematical formula, and high variability can lead to major differences between models built on similar data
  • Build a JMP Partition classification tree that uses splitting to define the relationship between the predictors and a categorical response
  • Define the number of samples, size of the trees (models) and sample rate to build Bootstrap Forest models using a random-forest technique
  • Define the number of layers, splits per tree, learning rate and sampling rates to build Boosted Tree models that combine smaller models into a final model
  • Interpret results and tune models
  • Specify and run Profit Matrix reports to weight misclassifications for categorical responses, and interpret results using the Confusion Matrix and Decision Matrix
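The splitting idea behind the Partition platform can be illustrated outside JMP. Below is a minimal pure-Python sketch (with hypothetical toy data, not from the webinar): for a numeric predictor and a categorical response, it tries each candidate cut point and keeps the one that most reduces Gini impurity. JMP's Partition platform uses its own criteria (e.g. LogWorth); Gini is used here only as a simple stand-in for the same "pick the best split" concept.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(xs, ys):
    """Return (cut_point, impurity_reduction) for the best binary split
    of numeric predictor xs against categorical response ys."""
    n = len(xs)
    parent = gini(ys)
    best_cut, best_gain = None, 0.0
    for cut in sorted(set(xs))[1:]:  # candidate cut points
        left = [y for x, y in zip(xs, ys) if x < cut]
        right = [y for x, y in zip(xs, ys) if x >= cut]
        # weighted impurity of the two child nodes
        child = (len(left) * gini(left) + len(right) * gini(right)) / n
        if parent - child > best_gain:
            best_cut, best_gain = cut, parent - child
    return best_cut, best_gain

# Toy example: the response separates cleanly between x = 4 and x = 6
xs = [1, 2, 3, 4, 6, 7, 8, 9]
ys = ["A", "A", "A", "A", "B", "B", "B", "B"]
cut, gain = best_split(xs, ys)
print(cut, gain)  # → 6 0.5 (a perfect split removes all parent impurity)
```

A tree is grown by applying this search recursively to each child node; bagging many such trees on bootstrap samples gives the Bootstrap Forest idea, while fitting small trees sequentially to residuals gives the Boosted Tree idea.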

Note: Q&A segments appear at 17:00 and 38:00 in the recording.
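The Confusion Matrix mentioned above tabulates actual versus predicted classes. A minimal sketch, with hypothetical labels (not from the webinar): rows are actual classes, columns are predicted classes, and off-diagonal cells are the misclassifications that a Profit Matrix would weight by cost or profit.

```python
def confusion_matrix(actual, predicted, classes):
    """Count how often each actual class was predicted as each class."""
    counts = {a: {p: 0 for p in classes} for a in classes}
    for a, p in zip(actual, predicted):
        counts[a][p] += 1
    return counts

actual = ["Pass", "Pass", "Fail", "Fail", "Pass"]
predicted = ["Pass", "Fail", "Fail", "Pass", "Pass"]
cm = confusion_matrix(actual, predicted, ["Pass", "Fail"])
# Diagonal cells (Pass→Pass, Fail→Fail) are correct; the rest are errors.
print(cm)  # → {'Pass': {'Pass': 2, 'Fail': 1}, 'Fail': {'Pass': 1, 'Fail': 1}}
```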



Start:
Tue, Oct 6, 2020 02:00 PM EDT
End:
Tue, Oct 6, 2020 03:00 PM EDT