<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: DoE How to treat replicate measurements in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18595#M16947</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Do you mean &lt;A href="http://www.jmp.com/support/help/Mixed_and_Random_Effect_Model_Reports_and_Option.shtml" title="http://www.jmp.com/support/help/Mixed_and_Random_Effect_Model_Reports_and_Option.shtml"&gt;Mixed and Random Effect Model Reports and Options&lt;/A&gt;​ ? I will read about it and see how I can implement this.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Fri, 20 May 2016 07:04:25 GMT</pubDate>
    <dc:creator>roland_goers</dc:creator>
    <dc:date>2016-05-20T07:04:25Z</dc:date>
    <item>
      <title>DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18588#M16940</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;&lt;EM&gt;(I needed to extend the question; the update is below the images)&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Hey everyone,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I am looking for some advice. I have performed a definitive screening design (JMP 11) with 3 factors, 1 response (9 experiments) three times. I augmented the design and entered the new data. The problem is that some combinations produce highly variable results. I have repeated some of these experiments an additional three times and the variation seems to be "inherent".&lt;/P&gt;&lt;P&gt;My question is how to treat these replicate runs. When I augment the design to have 2 replicate runs (3 runs in total), my regression looks pretty bad.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="11580_NoMean.jpg" style="width: 1112px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/3095i7034B47E48FD7AFE/image-size/medium?v=v2&amp;amp;px=400" role="button" title="11580_NoMean.jpg" alt="11580_NoMean.jpg" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;However, when I average my trial runs first and then perform the regression, I get much better results.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="11581_WithMean.jpg" style="width: 1090px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/3096iB107B6B28A0A1B7A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="11581_WithMean.jpg" alt="11581_WithMean.jpg" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;But when I do the averaging first, I think I disregard the information about the standard deviation/variance, which I would like to include (and maybe penalize, because we would like to work in a "stable" region).&lt;/P&gt;&lt;P&gt;Is there any way to include this in the DoE? Maybe use the SD as a response and minimize it?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Edit:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;I've thought about my problem and would like to extend it a "little" bit. I thought that I could use the SD as a weight for my data, in order to take into account that some conditions used for the experiments lead to very variable results while others are more reproducible. So I read about weighted least squares regression and now I am very confused, because choosing the weight appears to be non-trivial. Thus my question extends to: if I use a weight, which one would be appropriate? Currently I have thought about and tried the following approaches:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Use the SD (from what I have seen, it should be used as 1/SD or 1/SD^2). This sounds very reasonable to me, but on the other hand I have read that one needs lots of replicates to have a proper estimate of the SD (in the range of dozens), otherwise this method is not accurate enough.&lt;/LI&gt;&lt;LI&gt;Use one of my responses as a weight. I am also measuring the so-called polydispersity index, which gives me some information about the homogeneity/heterogeneity of my samples. The results look quite good, but I am not sure if it is allowed to use a response as a weight (I think I would introduce some kind of bias?)&lt;/LI&gt;&lt;LI&gt;Use the residuals. I found this in a lecture handout. The idea is to make a non-weighted fit first and then use 1/(residuals)^2 as the weight. But again, I am worried about "pushing" the resulting fit in a wrong direction.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;SPAN style="font-size: 10pt; line-height: 1.5em;"&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: 10pt; line-height: 1.5em;"&gt;Thanks a lot! &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Message edited by Roland Goers&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 19 Oct 2016 02:43:31 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18588#M16940</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-10-19T02:43:31Z</dc:date>
    </item>
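The 1/SD^2 weighting asked about above can be tried outside JMP; here is a minimal numpy sketch comparing ordinary and weighted least squares on made-up replicate data (all numbers are hypothetical, purely for illustration of the mechanics):

```python
import numpy as np

# Hypothetical replicate data: 4 factor settings, 3 true replicates each
x = np.array([0.0, 1.0, 2.0, 3.0])
reps = np.array([
    [1.1, 0.9, 1.0],   # low scatter
    [2.0, 2.2, 1.8],
    [3.5, 2.5, 3.0],   # higher scatter
    [4.9, 3.1, 4.6],   # highest scatter
])

y_mean = reps.mean(axis=1)
y_sd = reps.std(axis=1, ddof=1)

# Design matrix for a straight-line fit: columns [1, x]
X = np.column_stack([np.ones_like(x), x])

# Ordinary least squares on the replicate means
beta_ols, *_ = np.linalg.lstsq(X, y_mean, rcond=None)

# Weighted least squares with w = 1/SD^2; caveat from the thread:
# with only 3 replicates, each SD is itself a very noisy estimate
w = 1.0 / y_sd**2
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_mean)

print("OLS [intercept, slope]:", beta_ols)
print("WLS [intercept, slope]:", beta_wls)
```

With these numbers the WLS fit is pulled toward the low-variance settings, which is exactly the behavior (and the risk) the poster describes.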
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18589#M16941</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Roland,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I can think of a few things I would do if I were in your situation.&lt;/P&gt;&lt;P&gt;First, if you feel the replicates are showing excessive variation, perhaps you have missed one or more key input variables that are varying without your knowledge.&amp;nbsp; Also, what about your measurement system?&amp;nbsp; Is your measurement stable?&lt;/P&gt;&lt;P&gt;You might also "normalize" your input variables before analysis (like calculating a Z score) - subtract the average of the column from each value in the input column and divide the difference by the standard deviation of that column.&amp;nbsp; Now you have a new column of normalized values.&lt;/P&gt;&lt;P&gt;Even though the amount of data is small, I sometimes go ahead and try some predictive modeling like partition and neural networks.&amp;nbsp; Sometimes you get some good clues here.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 18 May 2016 20:15:38 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18589#M16941</guid>
      <dc:creator>Steven_Moore</dc:creator>
      <dc:date>2016-05-18T20:15:38Z</dc:date>
    </item>
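The Z-score calculation described above is a one-liner in any environment; a minimal numpy sketch with a hypothetical input column:

```python
import numpy as np

# Hypothetical input column from the design table
col = np.array([10.0, 20.0, 30.0, 40.0])

# Z score: subtract the column average from each value,
# then divide by the column standard deviation
z = (col - col.mean()) / col.std(ddof=1)

print(z)
```

The result is centered at 0 with unit standard deviation, regardless of the original units.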
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18590#M16942</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P style="font-size: 13.3333px;"&gt;Thank you very much for your answer!&lt;/P&gt;&lt;P style="font-size: 13.3333px;"&gt;I think the experimental setup is fine, even though I agree that there are probably more factors which should be taken into account. However, these are factors we do not know (yet) and/or cannot control. That's one of the issues in fundamental research &lt;SPAN __jive_emoticon_name="confused" __jive_macro_name="emoticon" class="jive-image jive_macro_emoticon jive_emote jive_macro" src="https://community.jmp.com/7.0.4.3b79b96/images/emoticons/confused.png"&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P style="font-size: 13.3333px;"&gt;I followed your advice and computed the z-score for my measurements. I used the z-scores as weights during the regression and the results look promising - close to the regression I obtained when I simply used the mean values, which I think makes sense.&lt;/P&gt;&lt;P style="font-size: 13.3333px;"&gt;However, I've read that, similar to the 1/SD approach, it is required to know the population mean and its SD. But as the experiments are very time consuming, I think this might be a trade-off I need to "risk". &lt;/P&gt;&lt;P style="font-size: 13.3333px;"&gt;Of course, if anyone knows a better way for data with small numbers of replicate runs, I would be happy to try it.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 19 May 2016 12:19:19 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18590#M16942</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-05-19T12:19:19Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18591#M16943</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Roland, in general there are three sources of variation: the process, sampling, and the measurement system.&amp;nbsp; From what you describe, it sounds like you have significant variation in sampling and/or measurement.&amp;nbsp; Consider a Components of Variance (nested, hierarchical) designed experiment to isolate the sources of variation and their contribution to the overall experimental variation.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 19 May 2016 13:49:16 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18591#M16943</guid>
      <dc:creator>waynergf</dc:creator>
      <dc:date>2016-05-19T13:49:16Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18592#M16944</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;{continuing...}&amp;nbsp; I have many times in research found the sampling and/or measurement variation to be so great as to obscure the effects of the controllable factors we were investigating.&amp;nbsp; Sometimes just a thorough review of how the experiment is run, samples taken, and measurements made revealed differences (among those involved in the "system" for example) that were causing the large variation.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 19 May 2016 14:04:12 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18592#M16944</guid>
      <dc:creator>waynergf</dc:creator>
      <dc:date>2016-05-19T14:04:12Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18593#M16945</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I want to be clear about your protocol. You actually replicated runs, not merely measured the outcome more than once. For example, the response in an experiment to study the make-up of a chemical buffer might be the pH. My treatments define different levels of ingredients in each buffer. Each run produces a new buffer solution. If I make another solution, that is replication, and observing the resulting pH provides new information. If I simply measure the pH of the same buffer solution three times, that is a repeated measurement.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If you actually replicated runs, then you should include all the individual replicates in the analysis and not the average. The root mean square error estimates the random variation between runs. The mean square error provides the proper denominator in the ANOVA F-test and the root mean square error provides the proper standard error for the parameter estimate t-tests.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;On the other hand, if you simply repeated the measurement of the response for a single run, then using the average makes sense if you believe that the variation in the repetitions is purely in the measurement (repeatability).&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 19 May 2016 14:05:54 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18593#M16945</guid>
      <dc:creator>Mark_Bailey</dc:creator>
      <dc:date>2016-05-19T14:05:54Z</dc:date>
    </item>
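The point above about replicates supplying the proper error term can be illustrated numerically: the pooled within-setting scatter of true replicates gives the pure-error mean square that anchors the ANOVA F-test. A minimal numpy sketch with hypothetical data:

```python
import numpy as np

# Hypothetical: 3 factor settings, each run as 3 true replicates
y = np.array([
    [1.1, 0.9, 1.0],
    [2.3, 1.9, 2.1],
    [3.0, 3.2, 2.8],
])

# Pure-error sum of squares: pooled within-setting variation of replicates
ss_pe = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum()
df_pe = y.size - y.shape[0]   # 9 runs - 3 settings = 6 degrees of freedom
ms_pe = ss_pe / df_pe         # proper error mean square for the F-test

print(ms_pe)
```

Averaging the replicates first would discard exactly this quantity, which is why the thread recommends entering all individual runs.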
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18594#M16946</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I would first add a new nominal variable called 'Rep' or 'Run' and tabulate which set of replicates were run together, for example 1, 2, 3 etc. Now include this in your model analyses. Start by adding this new variable as a random or fixed effect. This will account for some of the rep-to-rep variation, but does little to diagnose it. You can add more complex effects such as rep-to-rep interactions to explore what may be causing the variation.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 19 May 2016 14:51:43 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18594#M16946</guid>
      <dc:creator>cipollone_mg</dc:creator>
      <dc:date>2016-05-19T14:51:43Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18595#M16947</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Do you mean &lt;A href="http://www.jmp.com/support/help/Mixed_and_Random_Effect_Model_Reports_and_Option.shtml" title="http://www.jmp.com/support/help/Mixed_and_Random_Effect_Model_Reports_and_Option.shtml"&gt;Mixed and Random Effect Model Reports and Options&lt;/A&gt;​ ? I will read about it and see how I can implement this.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 20 May 2016 07:04:25 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18595#M16947</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-05-20T07:04:25Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18596#M16948</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;All samples I measure are prepared individually, thus I think I did "true" replicates. They are also all included in the analysis (the design consists of 9 runs, I entered 27 data points). The result is shown in figure 1. &lt;/P&gt;&lt;P&gt;Maybe it helps if I briefly describe what I am doing or want to do: I try to produce some kind of protein-decorated nanoparticles for my PhD thesis. The creation relies totally on the self-assembly of the components, thus I can control the composition and the environment (e.g. temperature in the lab), but the formation takes place over 2 days and is spontaneous. &lt;/P&gt;&lt;P&gt;We do not really understand what happens during that time; we only have a rough idea of what should happen. My idea was to use DoE to get a better idea of what the important factors are and how they influence the outcome. The factors I have chosen to investigate are the most likely ones, determined from former experiments. In the end, this experiment should tell me which parameters I have to adjust in order to obtain particles of a certain size that are as homogeneous as possible. &lt;/P&gt;&lt;P&gt;In these experiments, there seem to be parameters which give varying results (I ran another triplicate for these compositions). Other conditions appear to be much more reliable. Thus I would like to be able to make a statement like "On average we get this size; however, this parameter region is 'unstable' and this one here is more 'stable'". And of course, the model would need to reflect this. That's why I thought about including the SD as a weight and using the PdI as a second response (max. size, min. PdI).&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 20 May 2016 07:29:59 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18596#M16948</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-05-20T07:29:59Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18597#M16949</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;The Components of Variance design would be represented by a Random Effects model.&amp;nbsp; You can read about them in this excerpt from Box, Hunter, and Hunter's _Statistics for Experimenters_: &lt;A href="https://app.box.com/files/1/f/74723508/1/f_9510142201" title="https://app.box.com/files/1/f/74723508/1/f_9510142201"&gt;https://app.box.com/files/1/f/74723508/1/f_9510142201&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The excerpt also includes a detailed explanation and example of how to optimize (e.g., minimize cost) the sampling and measurement for a desired total experimental variance.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 20 May 2016 12:30:49 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18597#M16949</guid>
      <dc:creator>waynergf</dc:creator>
      <dc:date>2016-05-20T12:30:49Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18598#M16950</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;re: "&lt;SPAN style="font-size: 14px; color: #666666; font-family: Helvetica, Arial, sans-serif; font-variant-ligatures: normal; font-variant-position: normal; font-variant-numeric: normal; font-variant-alternates: normal; font-variant-east-asian: normal; line-height: 20px; background-color: #ffffff;"&gt;Do you mean &lt;/SPAN&gt;&lt;A _jive_internal="true" href="https://community.jmp.com/www.jmp.com/support/help/Mixed_and_Random_Effect_Model_Reports_and_Option.shtml" rel="nofollow" style="font-size: 14px; font-family: Helvetica, Arial, sans-serif; font-variant-ligatures: normal; font-variant-position: normal; font-variant-numeric: normal; font-variant-alternates: normal; font-variant-east-asian: normal; line-height: 20px;" target="_blank"&gt;Mixed and Random Effect &lt;/A&gt;​&lt;SPAN style="font-size: 10pt;"&gt;"&amp;nbsp; &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Actually, I mean &lt;SPAN style="text-decoration: underline;"&gt;F&lt;/SPAN&gt;ixed and Random effects.... Yes, do check out these tools.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;After you add a 'Rep' or 'batch' variable to your data, make sure it is nominal categorical type, then include this as an input factor in your model regression, along with your design factors. While you are at it, were there any other uncontrolled variables that you tracked, e.g. temperature variation, time, etc.? If so, include those as well. By default the 'batch' variable will be a Fixed effect. This means it will appear in the model equation. This is useful to diagnose the magnitude of batch-to-batch effects and to see where they occur. As a main effect, it will show only the offset for that particular batch. &lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If you specify the batch variable as a Random effect, it will not appear in the model equation. There are advantages to doing this, but it is less useful at this point since you want to 'see' the variation effects.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;When you include all individual reps (as opposed to the means) you are able to get Lack Of Fit diagnostics, which are very useful for understanding the process and highlighting any lurking effects. I recommend always regressing the individual reps, although I understand it is sometimes useful to use the means as a second step.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I like your idea of including the size variation as a second response, although I would not recommend using any weighting schemes at this point.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 20 May 2016 13:58:55 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18598#M16950</guid>
      <dc:creator>cipollone_mg</dc:creator>
      <dc:date>2016-05-20T13:58:55Z</dc:date>
    </item>
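The fixed-effect version of the advice above (a nominal 'Rep'/'batch' variable entering the model equation as an offset per batch) can be sketched outside JMP with dummy coding; a minimal numpy example on fabricated data (factor values and batch offsets are hypothetical):

```python
import numpy as np

# Hypothetical: one design factor x run in 3 replicate blocks ('Rep'),
# each block shifted by a constant batch offset
x = np.tile([0.0, 1.0, 2.0], 3)
rep = np.repeat([0, 1, 2], 3)
offsets = np.array([0.0, 0.5, -0.5])
y = 1.0 + 2.0 * x + offsets[rep]

# Dummy-code Rep as a nominal fixed effect (rep 0 is the reference level)
dummies = (rep[:, None] == np.array([1, 2])).astype(float)
X = np.column_stack([np.ones(len(x)), x, dummies])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta = [intercept, slope, offset of rep 1, offset of rep 2]
print(beta)
```

The fit recovers the per-batch offsets, which is what makes the fixed-effect form useful for diagnosing batch-to-batch differences; treating Rep as a random effect would instead pool those offsets into a single variance component.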
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18599#M16951</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Regarding the "Rep" factor:&lt;/P&gt;&lt;P&gt;How should I integrate it into my modeling? I usually choose my three factors, click the RSM macro in JMP and proceed. Do I need to include the "Rep" factor in the RSM macro or leave it out (thus no interactions with other factors)? &lt;/P&gt;&lt;P&gt;Furthermore, if the "Rep" factor is considered significant and I have to include it in the model, it also appears in the profilers. However, this is kind of confusing for me, because on the one hand I understand that this factor is significant and thus my results depend on the batch. This makes perfect sense; I have a biological component in there, and they are never the same. On the other hand, if I now optimize my results, the batch is also included, and for example batch #1 does not exist anymore....&lt;/P&gt;&lt;P&gt;Am I missing something? Is there a way to include the batch variance without including a batch variable?&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 26 May 2016 14:17:43 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18599#M16951</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-05-26T14:17:43Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18600#M16952</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;After you create your RSM with your design variables, add the Rep variable. At this point it is considered a Fixed effect, and will appear in the model equation. Go ahead and do the regression this way first. You can check to see if it is significant, and how big an effect it is. If it is significant, transform the Rep variable to a 'Random Effect' (click the Attributes red triangle), and do the regression again. This time it will not appear in the model equation, rather, it is considered a random effect. "&lt;A name="739673" style="font-size: 10pt; border: none; outline: none; color: #666666; -webkit-transition: color 0.5s ease; transition: color 0.5s ease; font-family: Arial, Helvetica, sans-serif;"&gt;A random effect is a factor whose levels are considered a random sample from some population. Often, the precise levels of the random effect are not of interest, rather it is the variation reflected by the levels that is of interest (the &lt;/A&gt;&lt;SPAN class="Emphasis" style="font-size: 10pt; font-style: italic; color: #666666; font-family: Arial, Helvetica, sans-serif;"&gt;variance components&lt;/SPAN&gt;&lt;SPAN style="font-size: 10pt; color: #666666; font-family: Arial, Helvetica, sans-serif;"&gt;). However, there are also situations where you want to predict the response for a given level of the random effect. Technically, a random effect is considered to have a normal distribution with mean zero and nonzero variance."&lt;/SPAN&gt;&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 26 May 2016 14:52:51 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18600#M16952</guid>
      <dc:creator>cipollone_mg</dc:creator>
      <dc:date>2016-05-26T14:52:51Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18601#M16953</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Thank you very much for your answer!&lt;/P&gt;&lt;P&gt;Out of curiosity, why is the "Rep" factor not included in the RSM (or not allowed to be)? I just compared the two results.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Following your guidelines:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="11718_pastedImage_0.png" style="width: 360px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/3189iDFDC5F8387AE937D/image-size/medium?v=v2&amp;amp;px=400" role="button" title="11718_pastedImage_0.png" alt="11718_pastedImage_0.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Including it in the RSM:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="11719_pastedImage_1.png" style="width: 354px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/3190i8B3A3E94BA503A2F/image-size/medium?v=v2&amp;amp;px=400" role="button" title="11719_pastedImage_1.png" alt="11719_pastedImage_1.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;What I can see is that the first model includes fewer factors, namely X1, X2, X3, X3*X3 and Xrep, whereas the second model additionally uses X1*Xrep, X2*Xrep and X3*Xrep. Thus it has more terms and might overfit. However, all criteria like Rsq adj., BIC and AICc are also better for the 2nd model.&lt;/P&gt;&lt;P&gt;The results of the optimization are very similar.&lt;/P&gt;&lt;P&gt;Using simply common sense, I would imagine that during each replicate run the input factors varied, which these terms would take into account. &lt;/P&gt;&lt;P&gt;Or am I totally wrong and this is simply forbidden? &lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 19 Oct 2016 02:56:03 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18601#M16953</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-10-19T02:56:03Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18602#M16954</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I like to look at the higher XRep terms as a diagnostic, for example to help figure out what may be happening and maybe to reduce the effects in the future. Based on the factors you listed, it looks like there may have been something that occurred over time. But I would not include anything except the XRep as a random effect in the final model. As you know, XRep is not something that can be assigned in future simulations.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Hope this helps,&lt;/P&gt;&lt;P&gt;Mark&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 26 May 2016 19:01:53 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18602#M16954</guid>
      <dc:creator>cipollone_mg</dc:creator>
      <dc:date>2016-05-26T19:01:53Z</dc:date>
    </item>
    <item>
      <title>Re: DoE How to treat replicate measurements</title>
      <link>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18603#M16955</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Thank you all very much for your help and detailed explanation! Especially cipollone.mg/Mark!&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 26 May 2016 20:32:32 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/DoE-How-to-treat-replicate-measurements/m-p/18603#M16955</guid>
      <dc:creator>roland_goers</dc:creator>
      <dc:date>2016-05-26T20:32:32Z</dc:date>
    </item>
  </channel>
</rss>

