
Understanding and Applying Tree-based Methods for Predictor Screening and Modeling

Published on 11-07-2024 03:29 PM by Community Manager | Updated on 11-07-2024 05:39 PM

 

See how to:

  • Model using the Partition, Bootstrap Forest, and Boosted Tree platforms
  • Understand the pros and cons of decision trees
    • Pros: uncover non-linear relationships, produce results that are easy to understand, screen large numbers of factors
    • Cons: handle only one response at a time, produce if-then statements rather than a mathematical formula, and high variability can mean that similar data yield very different models
  • Build a JMP Partition classification tree that uses recursive splitting to define the relationship between the predictors and a categorical response
  • Define the number of samples, the size of the trees (models), and the sampling rate to build Bootstrap Forest models using a random-forest technique
  • Define the number of layers, splits per tree, learning rate, and sampling rates to build Boosted Tree models that combine many small models into a final model
  • Interpret results and tune models
  • Specify and run Profit Matrix reports to identify misclassifications for categorical responses, and interpret the results using the Confusion Matrix and Decision Matrix
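JMP drives these three platforms through its GUI, but the underlying ideas are the same as in open-source tree libraries. For readers who want a code-level feel for them, here is a minimal scikit-learn sketch; the dataset, hyperparameter values, and the mapping of JMP terms such as "layers" and "sampling rate" onto sklearn parameters are illustrative assumptions, not JMP's implementation:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1
)

# Single classification tree (analogous to JMP's Partition platform):
# recursive splitting on the predictor/cutpoint that best separates classes.
tree = DecisionTreeClassifier(max_depth=4, random_state=1).fit(X_train, y_train)

# Random-forest technique (analogous to Bootstrap Forest): many trees, each
# grown on a bootstrap sample, with a random subset of predictors per split.
forest = RandomForestClassifier(
    n_estimators=100,     # number of trees in the forest
    max_features="sqrt",  # predictors sampled at each split
    random_state=1,
).fit(X_train, y_train)

# Gradient boosting (analogous to Boosted Tree): a sequence of small trees
# ("layers"), each fit to the errors of the current model, combined via a
# learning rate into the final model.
boost = GradientBoostingClassifier(
    n_estimators=50,   # number of layers
    max_depth=3,       # controls splits per tree
    learning_rate=0.1,
    subsample=0.8,     # row sampling rate per layer
    random_state=1,
).fit(X_train, y_train)

# Holdout accuracy for each model.
for name, model in [("tree", tree), ("forest", forest), ("boost", boost)]:
    print(name, model.score(X_test, y_test))

# Confusion matrix for one model: rows = actual class, columns = predicted.
print(confusion_matrix(y_test, forest.predict(X_test)))
```

Comparing the three holdout scores mirrors the model-tuning step above: the ensembles typically beat the single tree precisely because averaging (forest) or sequential correction (boosting) damps the high variance that decision trees suffer on their own.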

Note: Q&A segments begin at 17:00 and 38:00 in the recording.



Start:
Tue, Oct 6, 2020 02:00 PM EDT
End:
Tue, Oct 6, 2020 03:00 PM EDT