
How to analyse a process which gives binary output...


Sep 22, 2016 7:18 AM
(1682 views)

I am studying the effect of input variables (about 30 inputs) on the output (accept or reject) of a manufacturing process. Is there any way I can do this analysis?

I have tried fitting this, but the R-square value is only 4-5%. Your help in this regard is highly appreciated.

- Tags:
- jmp_13

4 REPLIES


Sep 22, 2016 10:51 AM
(1611 views)

See my Quality Digest article on Binary Logistic Regression in the link above. It sounds like this is exactly what you need!

Hope this helps!

Steve


Sep 22, 2016 10:59 AM
(1611 views)

Also, make sure your Y variable is either ordinal or nominal, NOT continuous! JMP will do the Logistic Regression for you automatically in the Fit Y by X platform. My article mentioned above will help you understand what Logistic Regression is about. I use baseball data to determine the impacts of batting stats on the likelihood of the player being in the Hall of Fame (Binary output: 0=Out, 1=In). Batting stats are the continuous inputs.

Steve


Sep 22, 2016 11:19 AM
(1611 views)

One more point: McFadden's pseudo R-squared statistic for binary logistic regression is given by the Entropy R-squared in JMP. Values of 0.2-0.4 are considered fair, and values above 0.4 very good.

Steve
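McFadden's statistic is defined as 1 minus the ratio of the fitted model's log-likelihood to the intercept-only model's log-likelihood. A tiny worked example with hypothetical log-likelihood values:

```python
# McFadden's pseudo R-squared: 1 - LL(model) / LL(null)
ll_model = -120.0   # hypothetical log-likelihood of the fitted model
ll_null = -160.0    # hypothetical log-likelihood of the intercept-only model

r2_mcfadden = 1 - ll_model / ll_null
print(r2_mcfadden)  # 0.25 -> "fair" by the 0.2-0.4 rule of thumb above
```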


Sep 23, 2016 3:45 AM
(1611 views)

Here are some other ideas:

1. If you are in the business of variable identification, you might want to try JMP's and JMP Pro's partition platforms, or, if you have JMP Pro, workflows in the Generalized Regression platform or the PLS - Discriminant Analysis platform. I like the PLS - DA platform for the wide-and-shallow data situation, which classic logistic regression struggles with. Neural nets are also a possibility, with or without JMP Pro.

2. The above can also be used if you are trying to build a truly predictive model. In that case, again if you've got JMP Pro (version 13), take a look at the Formula Depot capability when you'd like to export your 'production predictive model' score code to your production environments...say a database.

3. If you've got JMP Pro (this can also be done in JMP, but with much more work involved), use the flexible model validation functionality to create training, validation, and test segments of your original data. The overall misclassification rate, and the types of misclassification, are important to reflect on before blindly putting a model into use. Take a look at those confusion matrices! I like to look at ROC curves as well in this instance.

4. Again, if you've got JMP Pro, use the Model Comparison platform to easily compare multiple models to each other.
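The validation workflow in points 3 and 4 can be approximated in Python with scikit-learn, again on synthetic stand-in data (the split fractions and model here are illustrative assumptions, not JMP Pro's defaults):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-outcome data: five continuous inputs
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Hold out a test segment, then fit on the training segment
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1)
clf = LogisticRegression().fit(X_train, y_train)

# Inspect misclassification by type, and the ROC AUC, on held-out data
pred = clf.predict(X_test)
cm = confusion_matrix(y_test, pred)                      # rows: actual, cols: predicted
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(cm)
print(auc)
```

The confusion matrix shows not just how often the model is wrong, but in which direction (false accepts vs. false rejects), which is exactly the point of looking before putting a model into use.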
