See how to:
- Model using the Partition, Bootstrap Forest, and Boosted Tree platforms
- Understand the pros and cons of decision trees
  - Pros: uncover non-linear relationships, produce results that are easy to understand, and screen a large number of factors
  - Cons: handle only one response at a time, produce if-then statements rather than a mathematical formula, and have high variability that can lead to major differences between models built on similar data
- Build a JMP Partition classification tree that uses recursive splitting to define the relationship between the predictors and a categorical response
- Define the number of samples, the size of the trees (models), and the sampling rate to build Bootstrap Forest models using a random-forest technique
- Define the number of layers, splits per tree, learning rate, and sampling rates to build Boosted Tree models that combine many small models into a final model (see the modeling sketch after this list)
- Interpret results and tune models
- Specify and run Profit Matrix reports to identify misclassifications for categorical responses, and interpret the results using the Confusion Matrix and Decision Matrix (see the cost-weighted evaluation sketch at the end of this summary)
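
The Partition, Bootstrap Forest, and Boosted Tree platforms are driven interactively in JMP (or through JSL) rather than from code, but the underlying ideas map onto common open-source implementations. The scikit-learn sketch below is an illustrative analogy only: the dataset, validation split, and every parameter value (tree depth, number of trees, learning rate, sampling rates) are assumptions chosen for the example, not settings from the webinar.

```python
# Minimal scikit-learn analogy to the three JMP platforms discussed above.
# The dataset, holdout split, and all parameter values are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1  # holdout plays the role of a validation portion
)

# Single classification tree: recursive splitting on the predictors,
# analogous in spirit to the Partition platform.
tree = DecisionTreeClassifier(max_depth=4, min_samples_split=20, random_state=1)

# Random forest: many trees, each grown on a bootstrap sample with a random
# subset of predictors considered at each split, analogous in spirit to
# Bootstrap Forest (number of trees, tree size, sampling rate).
forest = RandomForestClassifier(
    n_estimators=100,      # number of trees in the forest
    max_depth=6,           # limits the size of each tree
    max_samples=0.7,       # bootstrap sample rate per tree
    random_state=1,
)

# Gradient boosting: many small trees fit in sequence, each layer correcting
# the previous one, analogous in spirit to Boosted Tree (layers, splits per
# tree, learning rate, sampling rate).
boosted = GradientBoostingClassifier(
    n_estimators=150,      # number of layers (small trees)
    max_depth=3,           # splits per tree kept small
    learning_rate=0.1,     # shrinks each layer's contribution
    subsample=0.8,         # row sampling rate per layer
    random_state=1,
)

for name, model in [("tree", tree), ("forest", forest), ("boosted", boosted)]:
    model.fit(X_train, y_train)
    print(f"{name}: holdout accuracy = {model.score(X_test, y_test):.3f}")
```

Comparing the holdout accuracies of the three fits mirrors the model comparison and tuning workflow shown in the video, where the ensemble methods typically trade interpretability for lower variability.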
Note: Q&A segments are included at 17:00 and 38:00.
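
As a companion to the Profit Matrix item above, the sketch below shows how a confusion matrix is read and how misclassification costs can reweight predictions for a categorical response. It is an analogy to the idea behind JMP's Profit Matrix and Decision Matrix reports, not those reports themselves; the cost values and the expected-profit decision rule are illustrative assumptions.

```python
# Minimal sketch: confusion matrix plus cost-aware classification.
# The cost values and threshold logic below are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

model = RandomForestClassifier(n_estimators=100, random_state=1).fit(X_train, y_train)

# Default decision rule: rows are actual classes, columns are predicted classes.
pred_default = model.predict(X_test)
print(confusion_matrix(y_test, pred_default))

# A profit (cost) matrix assigns a payoff to each (actual, predicted) cell.
# Here a false negative is assumed to be five times as costly as a false positive.
profit = np.array([[0.0, -1.0],    # actual 0: correct, false positive
                   [-5.0, 0.0]])   # actual 1: false negative, correct

# For each observation, choose the predicted class with the highest expected profit.
proba = model.predict_proba(X_test)   # P(actual = 0), P(actual = 1)
expected_profit = proba @ profit      # expected payoff of predicting each class
pred_cost_aware = expected_profit.argmax(axis=1)
print(confusion_matrix(y_test, pred_cost_aware))
```

Comparing the two confusion matrices shows how the cost weighting shifts predictions toward the class whose misclassification is more expensive, which is the kind of trade-off the Profit Matrix discussion in the video explores.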