
JMP User Community > Discussions > What is a correct approach to choose "Number of be...


May 18, 2015 7:06 PM

If I include lots of models, the Akaike weight (*wi*) of each variable will be lower than it would be for a smaller group of models with fewer variables.

Thanks
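To make the concern above concrete, here is a minimal Python sketch (not JMP output; the AICc scores are invented for illustration) of how Akaike weights are computed, and why each weight necessarily shrinks as more candidate models are added to the set:

```python
import math

def akaike_weights(aicc_scores):
    """Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i = AICc_i - min(AICc)."""
    best = min(aicc_scores)
    rel = [math.exp(-(a - best) / 2) for a in aicc_scores]
    total = sum(rel)
    return [r / total for r in rel]

# Two candidate models (hypothetical AICc scores):
few = akaike_weights([100.0, 102.0])
# The same two models plus three more plausible ones:
many = akaike_weights([100.0, 102.0, 101.0, 103.0, 101.5])

print(few[0])   # weight of the best model among 2 candidates
print(many[0])  # weight of the same model among 5 -- necessarily smaller
```

Because the weights always sum to 1, enlarging the candidate set dilutes every individual weight, which is exactly the effect described in the question.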

4 REPLIES


May 22, 2015 10:19 AM

It somewhat depends on the characteristics of the data table you are analysing and the questions you want to address with your model. Can you describe the number of predictors and rows in your table? Are your predictors correlated? Also, what questions are you aiming to answer with your model?


May 22, 2015 11:34 AM

Hi Malcom, thanks for your interest. I'm looking for the best model (using AICc) to predict bird abundance from 11 uncorrelated environmental variables.


May 22, 2015 1:27 PM

If your predictors are not correlated, then you could use stepwise regression with AICc as the criterion. If you suspect higher-order effects, you may also want to add interaction and quadratic terms in addition to the linear (main) effect terms, then see what stepwise regression using AICc gives you. If you are using JMP Pro and have 100+ rows, I would also try adding a validation column and using stepwise regression with model validation, as that may give you greater protection from overfitting.
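The forward-stepwise-with-AICc idea in the reply above can be sketched outside JMP as well. The following is a minimal Python/NumPy illustration, not JMP's Stepwise platform: it greedily adds whichever predictor most lowers AICc and stops when no addition helps. The data and variable names (`habitat`, `rain`, `noise1`) are made up for the example:

```python
import numpy as np

def aicc(y, X):
    """AICc for an OLS fit; X must already include the intercept column."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    p = k + 1  # parameters: coefficients plus the error variance
    aic = n * np.log(rss / n) + 2 * p
    return aic + 2 * p * (p + 1) / (n - p - 1)  # small-sample correction

def forward_stepwise(y, X, names):
    """Greedy forward selection: add the predictor that most lowers AICc;
    stop when no remaining predictor improves it."""
    n = len(y)
    selected, remaining = [], list(range(X.shape[1]))

    def design(cols):
        return np.column_stack([np.ones(n)] + [X[:, c] for c in cols])

    best = aicc(y, design(selected))
    while remaining:
        score, col = min((aicc(y, design(selected + [c])), c) for c in remaining)
        if score >= best:
            break
        best = score
        selected.append(col)
        remaining.remove(col)
    return [names[c] for c in selected], best

# Toy data: abundance depends on habitat and rainfall, not on noise1
rng = np.random.default_rng(1)
n = 120
habitat = rng.normal(size=n)
rain = rng.normal(size=n)
noise1 = rng.normal(size=n)
y = 3 + 2 * habitat + 1.5 * rain + rng.normal(scale=0.5, size=n)
X = np.column_stack([habitat, rain, noise1])

chosen, score = forward_stepwise(y, X, ["habitat", "rain", "noise1"])
print(chosen)
```

In JMP itself the equivalent is the Stepwise personality of Fit Model with the stopping rule set to minimum AICc; the sketch is only meant to show what that criterion is doing.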


May 22, 2015 1:37 PM

The advantage of viewing more than just the best model is that there might be an alternative, not meaningfully different from the best, that is better explained by the science. Since AIC and BIC are likelihood based, you can think of the units as standard deviations, and anything within 2 or 2.5 units of the best as not statistically different from it. How many alternatives you need to look at depends on the data and the model. If you're expecting certain variables to be in the model, I'd start with a small number of models and rerun the platform with an increased number until either those variables show up or the last model is clearly different from the best. If not, I'd only look at the best model from each group.
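The "within ~2 units of the best" screen described above amounts to a simple filter on ΔAICc. A short Python illustration, with entirely hypothetical model names and AICc scores:

```python
# Hypothetical AICc scores for five candidate models (values are illustrative)
models = {"M1": 412.3, "M2": 413.1, "M3": 414.0, "M4": 418.7, "M5": 421.2}

best = min(models.values())

# Models whose AICc falls within ~2 units of the best are plausible
# alternatives worth examining for scientific interpretability
candidates = {m: round(s - best, 1) for m, s in models.items() if s - best <= 2.0}
print(candidates)  # {'M1': 0.0, 'M2': 0.8, 'M3': 1.7}
```

Here M4 and M5 would be set aside, while M1–M3 form the set of models to compare against subject-matter knowledge before settling on one.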