
NominalGemsbok3
Level III

"Surprising" results in an DSD-Design

As part of a master's thesis, we conducted a Definitive Screening Design with six continuous factors. We generated a standard design in JMP with 17 runs (plus 3 additional runs at the zero level).

What surprised me is that a considerable number of interaction and quadratic effects are highly significant. My initial suspicion was overfitting, but I cannot find any indications supporting that: the R-squared is excellent (which is expected), there are no elevated VIFs, the Durbin–Watson test is unremarkable, etc. Furthermore, the model achieves a PRESS value of 0.044.
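As a side note for readers checking the same diagnostics: PRESS is the sum of squared leave-one-out prediction errors, and for a linear model it can be computed from the ordinary residuals and the hat-matrix leverages without refitting. A minimal Python sketch with made-up data (not the thesis data):

```python
import numpy as np

# Illustrative only: PRESS for a linear model via the leave-one-out
# residual shortcut e_i / (1 - h_ii). The data below are fabricated.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.uniform(-1, 1, (20, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(0, 0.1, 20)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat                        # ordinary residuals
h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)   # leverages h_ii

# PRESS: sum of squared leave-one-out prediction errors
press = np.sum((resid / (1 - h)) ** 2)
print(press)
```

Because each leverage satisfies 0 < h_ii < 1, PRESS always exceeds the ordinary residual sum of squares, which is why it is a more honest measure of predictive performance.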

These findings remain the same or very similar even when strict heredity is disabled in the DSD analysis. In that case, 2–3 additional interaction or quadratic effects appear, which are again highly significant.

Based on everything I know, I currently see no reason to doubt the validity of the results. Or am I overlooking something?

[Attached image: NominalGemsbok3_1-1776096715422.png]

 

statman
Super User

Re: "Surprising" results in a DSD design

Read Cuthbert Daniel, the inventor of the method.

Daniel, Cuthbert (1976) Applications of Statistics to Industrial Experiments, Wiley (ISBN 0-471-19469-7)

"All models are wrong, some are useful" G.E.P. Box
frankderuyck
Level VII

Re: "Surprising" results in a DSD design

OK, so patterns around the line are an indication of special causes; in this case, some moderately strong effects to take further into account.

statman
Super User

Re: "Surprising" results in a DSD design

Did you read his book already? Let me help... Yes, model effects that fall outside the distribution of random errors (which should be normally distributed with a mean of 0) are indeed assignable (they are different from the effects that are likely part of the error distribution and hence not assignable). But that is not necessarily the line drawn by Lenth's algorithm. Daniel did not have Lenth's algorithm portrayed on his normal plots; he interpreted the plots himself, judging where the distribution of errors lay (I use the "fat pencil" test). He mentions that an "S" curve on the normal plot suggests something other than the model effects was affecting the experiment. The special cause is in the noise, not a result of the model effects. This, in turn, would inflate the MSE in the ANOVA.
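For context, the reference line Lenth's algorithm draws is based on a "pseudo standard error" (PSE) computed from the effect estimates themselves: a robust initial scale, then a re-estimate after trimming likely-active effects. A minimal sketch (Python, with made-up effect estimates; this is an illustration of the published formula, not JMP's internal code):

```python
import numpy as np

def lenth_pse(effects):
    """Lenth's pseudo standard error from a vector of effect estimates."""
    abs_e = np.abs(np.asarray(effects, dtype=float))
    s0 = 1.5 * np.median(abs_e)        # robust initial scale estimate
    trimmed = abs_e[abs_e < 2.5 * s0]  # drop effects likely to be active
    return 1.5 * np.median(trimmed)    # pseudo standard error

# fabricated estimates: mostly noise plus two clearly active effects
effects = [0.1, -0.2, 0.15, -0.05, 0.08, 3.0, -2.5]
pse = lenth_pse(effects)
print(pse)
```

Note how the two large effects (3.0 and -2.5) are trimmed out before the final median, so the PSE reflects only the effects that behave like noise.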

"All models are wrong, some are useful" G.E.P. Box

Re: "Surprising" results in a DSD design

Along with @statman's 'fat pencil' criterion, think of shifts in the vertical direction as a change in the mean response, and a change in the slope of the line through a group of estimates as a difference in the standard deviation.

frankderuyck
Level VII

Re: "Surprising" results in a DSD design

Can we conclude that the half-normal plot can be used as a first screen for detecting very strong effects? And when the estimates do not scatter randomly but show patterns around the solid line, is that an indication of moderately strong effects?

statman
Super User

Re: "Surprising" results in a DSD design

Yes, strength relative to the other effects in the model. BTW, I always recommend doing these plots with a saturated model.

Also of note, if the signs of the factor settings are meaningful (i.e., continuous factors), then you can use normal plots and take advantage of the directional vector. A paper to read if you don't have access to his excellent book:

Daniel, Cuthbert (1959) "Use of Half-Normal Plots in Interpreting Factorial Two-Level Experiments", Technometrics, Vol. 1, No. 4, November
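For those without access to the paper, the plot coordinates Daniel describes are simple to reproduce: sort the absolute effect estimates and pair them with half-normal quantiles; noise-like effects fall on a line through the origin, and active effects break away at the top. A small Python sketch with fabricated effect estimates (purely illustrative):

```python
from statistics import NormalDist

# fabricated effect estimates: mostly noise plus two active effects
effects = [0.1, -0.2, 0.15, -0.05, 0.08, 3.0, -2.5]
m = len(effects)

# y-axis: sorted absolute estimates; x-axis: half-normal quantiles
abs_sorted = sorted(abs(e) for e in effects)
quantiles = [NormalDist().inv_cdf(0.5 + 0.5 * (i - 0.5) / m)
             for i in range(1, m + 1)]

for q, a in zip(quantiles, abs_sorted):
    print(f"{q:.3f}  {a:.3f}")
```

Plotting these pairs (quantile on x, |estimate| on y) and applying the "fat pencil" to the lower points reproduces Daniel's graphical screen.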

"All models are wrong, some are useful" G.E.P. Box

Re: "Surprising" results in a DSD design

Further to @statman's last reply, the Screening platform (an old one) implements his recommendations. See it here:

Names Default to Here( 1 );
// Open a sample factorial data table and run the legacy Screening platform,
// which produces the half-normal plot of contrasts described above
dt = Open( "$SAMPLE_DATA/Design Experiment/2x3x4 Factorial.jmp" );
dt << Screening( Y( :Y ), X( :X1, :X2, :X3 ) );

 

frankderuyck
Level VII

Re: "Surprising" results in a DSD design

Very interesting case and great discussion, thanks for sharing!

frankderuyck
Level VII

Re: "Surprising" results in a DSD design

It looks like with the two-level Screening platform we will not achieve the best Fit Definitive Screening result above (AICc = -30).

Last year I posted a 7-factor DSD case https://community.jmp.com/t5/Discussions/Setting-Stage-1-P-value-in-Analyis-of-DSD-at-a-high-level-t... where it was the opposite: I quickly got a good model with 5 active factors from two-level screening, while with Fit DSD I had to increase the stage 1 p-value to 0.4 to detect the 5 active factors. Interesting...

frankderuyck
Level VII

Re: "Surprising" results in a DSD design

So for detecting active main effects and flagging potential interaction and quadratic terms, I recommend the Fit DSD platform. However, be aware that the stage 1 p-value is a filter (!): when set too low, it may block potentially active effects from entering the model. Start Fit DSD with a stage 1 p-value of 0.2 to 0.4 (higher might result in overfitting).
