ADouyon
Level III

Error when comparing multiple DOE designs

Hello,

Can someone clarify why I get the error message below when comparing 4 custom DOE designs (two designs have 18 runs, another has 36 runs, which includes duplicates, and the last has 54 runs, which includes triplicates), and what this message means?
"Model for primary design cannot be fit by all designs. Removing inestimable model terms."

Thank you!!


8 REPLIES
ADouyon
Level III

Re: Error when comparing multiple DOE designs

PS: I was attempting to compare design v5 with the other 3 designs (v4, v6 and v7)

ADouyon
Level III

Re: Error when comparing multiple DOE designs

Also, if I click 'OK' in the error message (I'm not sure it's okay to do so, because I'm not sure what the error means), why are all my factors now shown as continuous, with ranges from -1 to 1, when doing the design comparison?
Before comparing the designs, my factors were definitely defined differently (attached): one is continuous, the other 3 are discrete numeric, and the ranges were not -1 to 1.

Thank you!!

ADouyon
Level III

Re: Error when comparing multiple DOE designs

factors table 

Victor_G
Super User

Re: Error when comparing multiple DOE designs

Hi @ADouyon,

 

You get this message because the designs you're trying to compare don't have the same terms in their models (you can check which terms are in each model by clicking the "Evaluate Design" script and looking at the "Model" part).

For example, the reference design V5 has the term X4*X4, but this term is not present in V4, so JMP can't compare the power and other parameters of X4*X4 across all designs. That is what the message warns you about: the models can differ between designs, so JMP will only compare the designs on the terms that are common to all models.
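As a sketch of what JMP does here (plain Python, not JSL, with hypothetical term lists standing in for the V5 and V4 models):

```python
# Hypothetical model term lists for two designs (illustration only)
v5_terms = ["X1", "X2", "X3", "X4", "X1*X2", "X4*X4"]
v4_terms = ["X1", "X2", "X3", "X4", "X1*X2"]

# The comparison keeps only terms that every design's model contains;
# the rest are the "inestimable model terms" the message warns about
common = [t for t in v5_terms if t in v4_terms]
dropped = [t for t in v5_terms if t not in v4_terms]
print(common)   # ['X1', 'X2', 'X3', 'X4', 'X1*X2']
print(dropped)  # ['X4*X4']
```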

 

In this platform, continuous and discrete numeric factors are coded as -1 for the low level and +1 for the high level (and 0 for the middle level), so that all designs share the same basis even when they use different ranges/values.

This variable coding is always done in the DoE process (high level = +1, low level = -1, though not shown directly in JMP). It gives all factors the same variation range, independent of their units, so the model coefficients can be compared directly.
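A minimal sketch of that coding in plain Python (not JSL; the temperature range is an invented example):

```python
def code_factor(x, low, high):
    """Map a raw factor value onto the coded -1..+1 scale used in DOE."""
    mid = (low + high) / 2
    half_range = (high - low) / 2
    return (x - mid) / half_range

# e.g. a hypothetical temperature factor run from 20 to 80 degrees
print(code_factor(20, 20, 80))  # -1.0 (low level)
print(code_factor(50, 20, 80))  #  0.0 (middle level)
print(code_factor(80, 20, 80))  #  1.0 (high level)
```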

 

Victor GUILLER

"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)
ADouyon
Level III

Re: Error when comparing multiple DOE designs

Thank you @Victor_G for catching that!!

I thought I had included the 2nd order interactions and the quadratic effects in all designs.

Following up on that: when I click the "Evaluate Design" script, it shows me the model. However, I don't see the number of replicate runs I added. How can I see/confirm this in my design once I have already generated the table?
For example, my v5 design has no replicate runs, while my v4 design had 2 replicate runs added. Both designs have a total of 18 runs. When I go to "Evaluate Design", however, these 2 designs look the same: all it shows is that the total number of runs is 18.

Thank you @Victor_G !

ADouyon
Level III

Re: Error when comparing multiple DOE designs

Hi again,

Something very strange is happening...
When I build the v4 design, I make sure I include the term X4*X4. However, after I generate the table, this term disappears from my design. I don't understand this. I made sure it was there (see screenshot), but when I go to the "Evaluate Design" script, I see that the term X4*X4 has disappeared... it is the only term that disappeared.

Do you know why this happens? It only happens with this specific v4 design; it is supposed to be the simplest of all my designs. (I have 4 factors and 1 response, I want all quadratic and interaction effects, I added only 2 replicate runs, and I have some specific constraints.)

Thank youuu!

Victor_G
Super User

Re: Error when comparing multiple DOE designs

Hi again,

Not entirely sure, but I have a very likely explanation: your design without replicate runs has 18 runs, the default number recommended by JMP to estimate all main effects and two-factor interactions (plus quadratic effects "if possible").

Since you specified 2 replicate runs while keeping the same total number of runs as the other designs, JMP uses 2 of the 18 runs as replicates, which decreases the number of terms the model can estimate, because these runs don't bring new information (although they do help estimate pure error and test the lack of fit of the model).

 

But if you click on the "Model" script, you'll see that JMP didn't forget that you intended to have X4*X4 (it's present in the terms of the model, see screenshot "Model"). However, since 2 of the runs of this design are used for replication (to estimate pure error rather than parameter estimates), you don't have sufficient data to fit such a model: the model cannot estimate the term coefficients in a "standard way" (independently from each other) because of this singularity (see screenshot "Model-2"). This situation requires a different way of analyzing your data (some possible options are stepwise regression if you have JMP, or generalized regression models like the Lasso or Elastic Net if you have JMP Pro), or you can simply not try to estimate these "missing" terms.
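The singularity can be shown on a deliberately tiny, hypothetical example (plain Python/NumPy, not JSL): one factor, a quadratic model, and three runs. Replicating a run instead of adding a new level makes the model matrix rank-deficient, so the quadratic coefficient becomes inestimable:

```python
import numpy as np

def model_matrix(xs):
    """Columns: intercept, x, x^2 -- a one-factor quadratic model."""
    xs = np.asarray(xs, dtype=float)
    return np.column_stack([np.ones_like(xs), xs, xs ** 2])

# Three distinct levels: all 3 coefficients are estimable
X_ok = model_matrix([-1, 0, 1])
print(np.linalg.matrix_rank(X_ok))   # 3 -> full rank

# Replace the centre point with a replicate of x = 1: only two
# distinct levels remain, so x^2 is confounded with 1 and x
X_rep = model_matrix([-1, 1, 1])
print(np.linalg.matrix_rank(X_rep))  # 2 -> singular, x^2 inestimable
```

The replicate still helps estimate pure error, but it contributes no new information about the curvature, which is the same trade-off at work in the 18-run v4 design.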

 

From my side, good night!

Victor GUILLER

"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)
Victor_G
Super User

Re: Error when comparing multiple DOE designs

Hi @ADouyon,

 

I have one idea for spotting replicate runs, though there may be an easier way to do this in JMP:

  1. First, in your DoE data table, select all the factor columns, then open the "Rows" menu, choose "Row Selection", and finally "Select Duplicate Rows". JMP will then highlight the rows that are not unique (screenshot 1).
  2. Then, keeping those rows highlighted (and the column headers for the factors selected), go back to the same "Rows" menu, then "Row Selection", and click "Select Matching Cells" (screenshot 2). JMP will then highlight the rows whose cell values match the rows selected in step 1.
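For readers outside JMP, the same idea can be sketched in a few lines of pandas (a hypothetical two-factor table; the factor settings are invented for illustration):

```python
import pandas as pd

# Hypothetical DOE factor settings; rows 0 and 3 are replicate runs
runs = pd.DataFrame({
    "X1": [-1,  1, -1, -1,  1],
    "X2": [-1, -1,  1, -1,  1],
})

# keep=False flags every member of a duplicated group, which mirrors
# selecting all non-unique rows rather than just the extra copies
replicates = runs[runs.duplicated(keep=False)]
print(replicates.index.tolist())  # [0, 3]
```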

 

Hope this will help you,

 

Victor GUILLER

"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)