malinw
Level I

Analyse screening experiment

Hi,

I have some questions about how to analyse my screening experiment. I'm using JMP 10.

I have 5 factors at two levels. I made a screening design, a fractional factorial design with resolution IV, and added two center points. The two-factor interactions were confounded, so I augmented the design to resolve them. My design is not orthogonal due to difficulties controlling some conditions during the experiment (e.g. the temperature could not be held at 20°C; it turned out to be 22°C). I would like to find out which parameters seem to have an effect on the response and how large these effects are, and, if I can, also whether there is a sign of curvature in the model.
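For reference, here is a rough Python sketch of this kind of design (it is not the actual JMP-generated design; the generator E = ABC is just one common way to obtain a resolution IV half fraction) that shows why the two-factor interactions end up aliased:

    # Sketch (not the exact JMP design): a 16-run half fraction of a 2^5 design
    # built with the generator E = ABC (defining relation I = ABCE, resolution IV),
    # plus two center points, to show how two-factor interactions become aliased.
    import itertools
    import numpy as np

    base = np.array(list(itertools.product([-1, 1], repeat=4)))  # full 2^4 in A, B, C, D
    A, B, C, D = base.T
    E = A * B * C                                   # generator: E = ABC

    design = np.column_stack([A, B, C, D, E]).astype(float)
    design = np.vstack([design, np.zeros((2, 5))])  # add two center points at 0

    # With I = ABCE, the AB and CE interaction columns are identical, so their
    # effects cannot be separated without augmenting the design.
    print(np.array_equal(design[:16, 0] * design[:16, 1],   # A*B
                         design[:16, 2] * design[:16, 4]))  # C*E -> True

With the defining relation I = ABCE, the AB and CE columns are identical, which is exactly the kind of confounding the augmented runs are meant to break.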

If I use the Screening platform, some terms are highlighted: main effects, some two-factor interactions, but also two quadratic terms.

- What is the requirement for a term to be highlighted?

When I hit "Make Model" with these highlighted terms, the two quadratic terms are not significant (Prob > |t| is 0.08 and 0.17).

- How come they are highlighted by the Screening Platform, but not significant in this model?
- How should I interpret the result (their estimates are -1.02 and 0.78)? That there seems to be curvature (even if they are not significant)? But they are confounded, so I can't say which quadratic term is giving the effect.

- Should I not include them in the model?

- The RSquare Adj is 0.88 for the first model, and 0.57 when I exclude the quadratic terms.

- Do I need to add axial points (runs) to get significant quadratic terms?

Should I use another method instead, such as Fit Model or stepwise regression?

A lot of questions, I hope someone can help me sort out some of them. Thanks in advance!


5 REPLIES

Re: Analyse screening experiment

I would not worry about the fact that your levels were not exactly what was called for in the treatment. Make sure to update the values in the data table before the analysis! The correlation (non-orthogonality) introduced by these deviations should be small and therefore not a problem; regression handles such anomalies well.
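As a toy illustration of this point (made-up coded settings, not your data), a small deviation from a planned level introduces only a small correlation between factor columns:

    # Toy illustration with made-up numbers: compare planned coded settings with
    # as-run settings (e.g. a run that came out warmer than planned, recoded on
    # the -1/+1 scale) and look at the correlation the deviation introduces.
    import numpy as np

    planned = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
    as_run = planned.copy()
    as_run[0, 0] = -0.8   # one run came out a little warmer than the planned low level

    print(np.corrcoef(planned, rowvar=False))  # exactly orthogonal: off-diagonals are 0
    print(np.corrcoef(as_run, rowvar=False))   # only a small off-diagonal correlation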

I recommend that you study the chapter about the Screening platform in the guide (Help > Books > Design of Experiments Guide). This information will go a long way towards preparing you to use this platform and towards answering your questions.

Some points you might consider:

  • Contrasts with an individual p-value < 0.1 are automatically selected. You can override the selection by clicking and control-clicking contrasts to add or remove them from the current set of selected contrasts.
  • The p-values in the Screening platform depend on the current model, that is, on what other terms are included in the model. When you click Make Model, only the selected terms are included, so the p-values will change. Also, contrasts are used by the Screening platform, but Fit Least Squares uses the parameter estimates. When the estimates are not orthogonal, there will be a difference between them and the contrasts, leading to a change in the associated p-values.
  • You can use the Lack of Fit test in Fit Least Squares to help you decide if there is curvature. I would remove the quadratic terms first. Also understand that this test is weak since it will have only 1 degree of freedom from 2 center points. If this test is significant, then you have evidence of curvature, and that would support one or both of the quadratic terms (a sketch of the test's logic follows this list).
  • You do not have all of the necessary treatments in your design matrix to support a model matrix (regression) with both quadratic terms. If you decide that curvature is significant, then you can augment the design with new runs. (DOE > Augment Design. Study the chapter in the guide about augmenting a design.) Without augmentation, you would have to rely on belief based on external knowledge to select one of the quadratic terms. There is no support for such a selection in this data set.
  • The R square always increases when you add a term and always decreases when you remove a term, regardless of its significance to the model. Adjusted R square generally behaves the same way, but its penalty makes interpretation more complicated. Are you using adjusted R square for model selection?
  • The Screening platform assumes that all of your factors are continuous. I assume that they are because you added center points.
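To illustrate the logic of the Lack of Fit test outside JMP, here is a one-factor Python sketch with made-up numbers; JMP's Lack of Fit report performs the same kind of decomposition for your actual model:

    # One-factor sketch of the lack-of-fit logic with made-up data: replicated
    # center points supply pure error, and the rest of the residual sum of squares
    # is attributed to lack of fit of the straight-line model.
    import numpy as np
    from scipy import stats

    x = np.array([-1.0, 1.0, 0.0, 0.0, 0.0, 0.0])    # two factorial points, four center points
    y = np.array([5.0, 7.0, 7.5, 7.3, 7.6, 7.4])     # hypothetical response with curvature

    X = np.column_stack([np.ones_like(x), x])        # first-order (straight-line) model
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = float(np.sum((y - X @ beta) ** 2))         # total residual sum of squares

    center = y[x == 0]
    ss_pure = float(np.sum((center - center.mean()) ** 2))  # pure error from replicates
    df_pure = center.size - 1
    df_err = y.size - X.shape[1]
    ss_lof, df_lof = sse - ss_pure, df_err - df_pure

    F = (ss_lof / df_lof) / (ss_pure / df_pure)
    print("F =", F, " p =", stats.f.sf(F, df_lof, df_pure))  # a small p-value suggests curvature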

With only five factors, you can also try the All Possible Models feature of the Stepwise platform.

malinw
Level I

Re: Analyse screening experiment

Thank you for your useful answers!

Yes, I have only continuous factors, and I was wrong in my earlier post: I added 4 center points, not 2.

How do I decide which factors to include in the model?

Suppose I start by using the Screening platform and 6 terms are highlighted. I make a model with them, but in that model only four of them are significant. Should I make a new model with only the significant ones, and then analyse that model according to lack of fit, adjusted R square, etc.?

I understand that the half-normal plot is useful. The estimates of inactive terms will fall near the line, whose slope estimates the noise standard deviation, because they represent random noise that follows the normal distribution.
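To check the idea outside JMP, here is a rough Python sketch of the half-normal plot coordinates and Lenth's pseudo standard error with made-up contrast values (JMP's exact cut-offs and simulated p-values may differ from this textbook version):

    # Rough sketch of the half-normal plot idea and Lenth's pseudo standard error
    # (PSE) with made-up contrast values.
    import numpy as np
    from scipy import stats

    contrasts = np.array([8.1, -0.4, 0.6, 5.2, -0.3, 0.2, -0.7, 0.5])  # hypothetical

    s0 = 1.5 * np.median(np.abs(contrasts))
    pse = 1.5 * np.median(np.abs(contrasts)[np.abs(contrasts) < 2.5 * s0])

    # Half-normal plot coordinates: sorted |contrasts| against half-normal quantiles.
    # Inactive effects fall near a line through the origin whose slope estimates the
    # noise level; active effects sit well above it.
    abs_sorted = np.sort(np.abs(contrasts))
    n = abs_sorted.size
    q = stats.halfnorm.ppf((np.arange(1, n + 1) - 0.5) / n)
    print(np.column_stack([q, abs_sorted]))
    print("Lenth PSE:", pse, " |t|-like ratios:", np.abs(contrasts) / pse)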

Would the following procedure be appropriate, using the Screening platform?

1. Make a model (Fit Least Squares) with the terms that are far from the line in the half-normal plot, but exclude the quadratic terms.
2. Make a new model with only the significant terms (but still a hierarchical model in the presence of two-factor interactions).
     - Should I remove one factor at a time, the one with the lowest significance level, and make a new model? Or should I remove all non-significant terms at the same time?
3. Then analyse the model according to the previous discussion.

Re: Analyse screening experiment

The Screening platform uses alpha = 0.1 to select contrasts. This criterion carries twice the type I error risk (false positives) of the commonly used alpha of 0.05. The increase is acceptable for two reasons. First, you will continue to study the selected factors, and with more empirical evidence you can decide whether they confirm as true effects. Second, it decreases the type II error risk (false negatives): a factor wrongly screened out would not be examined again, while a false positive will be weeded out in later experiments. This trade-off is especially useful because screening experiments are usually small and of fixed size, so you cannot increase power with more runs. For now, I would include all six factors until further evidence shows that one or more of them do not confirm.
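As a rough numerical illustration of this trade-off (the effect size and standard error below are hypothetical, and a simple normal approximation is used rather than JMP's calculations):

    # Rough illustration: relaxing alpha from 0.05 to 0.10 raises the power to
    # detect a given effect when the design size (and hence the standard error)
    # is fixed. The effect and standard error are made-up numbers.
    from scipy import stats

    effect, se = 1.0, 0.45   # assumed true effect and its standard error

    for alpha in (0.05, 0.10):
        z_crit = stats.norm.ppf(1 - alpha / 2)
        power = (stats.norm.sf(z_crit - effect / se)
                 + stats.norm.cdf(-z_crit - effect / se))
        print(f"alpha = {alpha:.2f}  power = {power:.2f}")   # power rises with alpha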

I am confused about the quadratic terms in your model. You should not have the necessary treatments in your screening design to support the estimation of the quadratic terms in the model. You only have center points.

We generally do not recommend removing all of the terms that were found not to be significant in the initial fit of the model to the data. It is a good idea to conservatively remove the least significant term that does not break the model hierarchy and re-fit the model. Iterating this way is safer because the p-value for any term depends on the other terms in the model and so all of them will change as you simplify the linear model.
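Here is a rough sketch of that iteration outside JMP, using Python's statsmodels on simulated data rather than Fit Model; the hierarchy rule is a simplified stand-in for the judgment you would apply interactively:

    # Sketch of "remove the least significant term that does not break hierarchy,
    # then re-fit". The hierarchy rule keeps a main effect as long as any retained
    # interaction contains it.
    import itertools
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    runs = np.array(list(itertools.product([-1.0, 1.0], repeat=3)) * 3)   # replicated 2^3
    df = pd.DataFrame(runs, columns=["A", "B", "C"])
    df["y"] = 3 * df.A + 2 * df.A * df.B + rng.normal(scale=0.5, size=len(df))

    terms = ["A", "B", "C", "A:B", "A:C", "B:C"]

    def locked_by_hierarchy(term, terms):
        # A main effect cannot be removed while a retained interaction contains it.
        return any(":" in t and term in t.split(":") for t in terms if t != term)

    while True:
        fit = smf.ols("y ~ " + " + ".join(terms), data=df).fit()
        pvals = fit.pvalues.drop("Intercept")
        candidates = {t: p for t, p in pvals.items() if not locked_by_hierarchy(t, terms)}
        worst = max(candidates, key=candidates.get)
        if candidates[worst] < 0.05:
            break                      # everything removable is significant; stop
        terms.remove(worst)            # drop only the single least significant term

    print(fit.summary().tables[1])     # final hierarchical model

With this simulated data the loop drops C and the inactive interactions one at a time, but keeps B because A:B remains in the model, which is the hierarchy behaviour described above.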

Augmenting your design as you proceed is a great way to incrementally increase the empirical evidence as you extend and confirm your model.

Re: Analyse screening experiment

Two things to add to Mark's comments:

After you fit your model, go to the red triangle hot spot by the Fit Model title bar, go to Row Diagnostics, and select "Plot Residual by Predicted". Look at the graph for any patterns that would suggest curvature or some other trend in your data.
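If you want to see what such a pattern looks like, here is a small Python sketch with made-up data containing hidden curvature (not JMP output):

    # Sketch of the same check outside JMP with made-up data: a U-shaped pattern
    # in the residuals is the warning sign for curvature.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 30)
    y = 2 * x + 1.5 * x**2 + rng.normal(scale=0.1, size=x.size)  # quadratic signal

    coef = np.polyfit(x, y, deg=1)        # fit a straight line only
    residuals = y - np.polyval(coef, x)

    plt.scatter(np.polyval(coef, x), residuals)
    plt.axhline(0, linestyle="--")
    plt.xlabel("Predicted")
    plt.ylabel("Residual")
    plt.title("Residual by Predicted")
    plt.show()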

Also, depending on the version of JMP you are using, you should consider using a Definitive Screening Design for your next DOE. This will give you the fewest runs necessary to find many, if not all, of the active (important) factors, including curvature terms, for your model. In your case of 5 continuous factors, the design would potentially be 11 runs.

Best,

Bill

Re: Analyse screening experiment