
Understanding and Applying Tree-based Methods for Predictor Screening and Modeling

Published on 11-07-2024 03:29 PM by Community Manager | Updated on 11-07-2024 05:39 PM

 

Tree-Based Methods video (view in My Videos)

See how to:

• Model using the Partition, Bootstrap Forest, and Boosted Tree platforms
• Understand the pros and cons of decision trees
  • Pros: uncover nonlinear relationships, produce results that are easy to understand, and screen a large number of factors
  • Cons: handle only one response at a time, produce if-then rules rather than a mathematical formula, and their high variability can lead to very different models from similar data
• Build a JMP Partition classification tree that uses splitting to define the relationship between the predictors and a categorical response (first sketch after this list)
• Define the number of samples, the size of the trees (models), and the sample rate to build Bootstrap Forest models with a random-forest technique (second sketch after this list)
• Define the number of layers, splits per tree, learning rate, and sampling rates to build Boosted Tree models that combine many small models into a final model (third sketch after this list)
• Interpret results and tune models
• Specify and run Profit Matrix reports to identify misclassifications for categorical responses, and interpret the results using the Confusion Matrix and Decision Matrix (fourth sketch after this list)
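
The video itself works entirely in JMP's point-and-click platforms. As a rough, non-JMP illustration of the same classification-tree idea, here is a minimal scikit-learn sketch; the iris data and the max_depth value are arbitrary choices for the example, not anything taken from the video.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Illustrative data; the video uses its own JMP example table.
X, y = load_iris(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each split partitions the predictor space to separate the levels of the
# categorical response; max_depth=3 is an arbitrary size limit for the demo.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)

print(export_text(tree, feature_names=list(X.columns)))  # the if-then splitting rules
print("holdout accuracy:", tree.score(X_test, y_test))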
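
For the Bootstrap Forest step, a comparable random-forest sketch follows, again in scikit-learn rather than JMP. Here n_estimators, max_depth, and max_samples loosely correspond to the number of samples (trees), tree size, and bootstrap sample rate mentioned above; the particular values are illustrative only.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)  # illustrative data only

forest = RandomForestClassifier(
    n_estimators=100,   # number of bootstrap samples / trees that get averaged
    max_depth=5,        # limits the size of each tree
    max_samples=0.8,    # fraction of rows drawn for each bootstrap sample
    random_state=0,
)
print("5-fold CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())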
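
For the Boosted Tree step, a gradient-boosting sketch is below. In scikit-learn, n_estimators, max_depth, learning_rate, and subsample loosely correspond to the number of layers, splits per tree, learning rate, and sampling rate described above; the dataset and parameter values are placeholders, not JMP settings from the video.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # illustrative data only
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

boost = GradientBoostingClassifier(
    n_estimators=200,   # number of layers: small trees added one after another
    max_depth=3,        # keeps each layer a weak model with few splits
    learning_rate=0.1,  # how much each new layer contributes to the final model
    subsample=0.8,      # fraction of rows sampled to fit each layer
    random_state=0,
)
boost.fit(X_train, y_train)
print("holdout accuracy:", boost.score(X_test, y_test))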
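
JMP's Profit Matrix and Confusion Matrix reports are built into the platforms; the fourth sketch below only mimics the underlying bookkeeping by crossing a confusion matrix with a hypothetical cost matrix, so the cost values here are made up for illustration.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # illustrative data only
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pred = RandomForestClassifier(random_state=0).fit(X_train, y_train).predict(X_test)
cm = confusion_matrix(y_test, pred)  # rows = actual class, columns = predicted class

# Hypothetical cost matrix: off-diagonal cells are misclassifications, and here a
# false negative (actual 1, predicted 0) is weighted five times a false positive.
cost = np.array([[0, 1],
                 [5, 0]])
print(cm)
print("total misclassification cost:", (cm * cost).sum())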

      Note: Q&A included at time 17:00 and time 38:00


