Level I

## Does JMP normalize the data for RSM linear regression?

Y = XB, where Y and X are known.

I made a matrix X whose first column is all 1s and whose remaining columns are the variable values (RSM model, e.g. x1, x2, x3, x1^2, x1x2, x1x3, x2^2, x2x3, x3^2).

Columns 2-10 are not normalized.

I fit this un-normalized X matrix in JMP, selecting Macros > Response Surface.

The first step uses all of the parameters, and the linear regression result gives an R-square of 0.91.

If I fit the same un-normalized X matrix with linear regression in Python, there are several methods:

1. `np.linalg.pinv(mat_X) @ mat_Y`

2. `np.linalg.pinv(mat_X.T @ mat_X) @ mat_X.T @ mat_Y`

3. `np.linalg.lstsq(mat_X, mat_Y.T)`

4. `sm.OLS(mat_Y, mat_X)`

The R-square result is about 0.92, which is similar to the JMP result.
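The agreement is expected: the four routes above are algebraically equivalent ways of solving the same least-squares problem, so on the same X they must give the same R-square. A minimal sketch with synthetic data (the data and coefficients here are made up, not the poster's; the `sm.OLS` route is omitted to keep it pure numpy):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
x1, x2 = rng.uniform(0, 10, n), rng.uniform(0, 10, n)
# RSM-style model matrix: intercept, linear, quadratic, and interaction terms
X = np.column_stack([np.ones(n), x1, x2, x1**2, x1 * x2, x2**2])
y = X @ np.array([1.0, 0.5, -0.3, 0.02, 0.01, -0.04]) + rng.normal(0, 0.5, n)

b1 = np.linalg.pinv(X) @ y              # 1. pseudo-inverse of X
b2 = np.linalg.pinv(X.T @ X) @ X.T @ y  # 2. normal equations via pinv
b3 = np.linalg.lstsq(X, y, rcond=None)[0]  # 3. lstsq

def r2(b):
    """R-square of the fit y ~ X @ b."""
    resid = y - X @ b
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

print(r2(b1), r2(b2), r2(b3))  # all three agree
```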

If I normalize this X matrix:

`mat_X[:,1:] = np.around(2*(mat_X[:,1:] - mat_X[:,1:].mean(0)) / np.ptp(mat_X[:,1:], axis=0))`

the R-square result is about 0.87, much smaller than the JMP result.

So I would like to know whether JMP normalizes the data during Fit Model...

According to DOE analysis, this normalization is necessary...
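For what it's worth, the gap can be reproduced outside JMP: an invertible linear coding of the columns (to roughly -1...+1) does not change the OLS R-square, because the coded model matrix spans the same column space, whereas `np.around` snaps every coded value to -1/0/1 and genuinely changes the fit. A minimal sketch with synthetic data (names and numbers are made up, not the poster's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 80
x = rng.uniform(2.0, 8.0, (n, 2))
X = np.column_stack([np.ones(n), x[:, 0], x[:, 1], x[:, 0] * x[:, 1]])
y = X @ np.array([2.0, 1.0, -0.5, 0.1]) + rng.normal(0, 0.3, n)

def r2(X, y):
    """R-square of the least-squares fit of y on X."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Invertible affine coding to roughly [-1, 1]: same column space,
# so the fitted values and R-square are identical to the raw fit.
coded = 2 * (x - x.mean(0)) / np.ptp(x, axis=0)
Xc = np.column_stack([np.ones(n), coded[:, 0], coded[:, 1],
                      coded[:, 0] * coded[:, 1]])

# Rounding the coded values to -1/0/1 discards within-level variation,
# so the fit degrades.
rounded = np.around(coded)
Xr = np.column_stack([np.ones(n), rounded[:, 0], rounded[:, 1],
                      rounded[:, 0] * rounded[:, 1]])

print(r2(X, y), r2(Xc, y), r2(Xr, y))
```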

Staff

## Re: Does JMP normalize the data for RSM linear regression?

If you design the experiment with JMP and make the design data table, each continuous factor will have the Coding property. Fit Least Squares uses this property to code the factor levels internally to -1 (min) to +1 (max). The label of the term in the Parameter Estimates changes to something like X1(min,max) to indicate that the estimates used coded levels, not actual levels.
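The coding transform implied by that property is the standard midrange/half-range map, so min -> -1 and max -> +1. A rough sketch of that transform (a hypothetical helper for illustration, not JMP's internal code):

```python
def code_factor(values, low, high):
    """Map factor levels so that low -> -1.0 and high -> +1.0."""
    mid = (low + high) / 2.0        # midrange of the factor
    half_range = (high - low) / 2.0  # half the factor's range
    return [(v - mid) / half_range for v in values]

# A factor running from 10 to 20 is coded onto [-1, +1]:
print(code_factor([10.0, 15.0, 20.0], low=10.0, high=20.0))  # -> [-1.0, 0.0, 1.0]
```

Note the coded values stay continuous: a level halfway between min and max codes to 0.0 exactly, and every other level lands proportionally in between, with no rounding.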

If you enter the factors into a data table some other way, then they likely do not have this column property and Fit Least Squares will not code the levels.

Learn it once, use it forever!
Level I

## Re: Does JMP normalize the data for RSM linear regression?

Hi, thanks for your suggestion.

I attached the JMP result. From this result,

you are right: JMP did code the input variables,

and I figured out why there is a difference.

My bug was that I rounded the normalized values (to -1/0/1) while JMP doesn't.

Obviously the continuous values give a better fit.

```
0.286238698909496
0.0108723016776135 * :var1_item
-0.0030783246785283 * :var2_item
-0.00103156547312312 * :var3_item
-0.00639159696577367 * :var4_item
0.00591958635149878 * :var5_item
-0.0116101343183908 * :var6_item
-0.00576482105310824 * :var7_item
-0.0000968264730577947 * :var8_item
0.0119152879877735 * :var9_item
-0.0848833900498315 * :var10_item
0.0622163225141193 * :var11_item
0.0402091328320561 * :var12_item
:var1_item * :var1_item * -0.00777777703149709
:var1_item * (:var2_item - 0.00260416666666667) * -0.00221885710855179
:var1_item * (:var3_item - 24.5807291666667) * -0.000359320174969306
:var1_item * (:var4_item - (-0.0104166666666667)) * -0.00362969350701834
:var1_item * (:var6_item - 0.00520833333333333) * -0.00199166114217802
:var1_item * (:var8_item - 27.3046875) * -0.0000902825315175603
:var1_item * (:var9_item - (-0.00520833333333333)) * 0.00138867940298831
:var1_item * (:var10_item - 0.776757812500002) * -0.0126640508917091
:var1_item * (:var12_item - 0.0515624999999999) * 0.00723226555572464
(:var2_item - 0.00260416666666667) * (:var3_item - 24.5807291666667) * -0.000335886721396323
(:var2_item - 0.00260416666666667) * (:var4_item - (-0.0104166666666667)) * 0.00191935297074069
(:var2_item - 0.00260416666666667) * (:var5_item - 1.14609374999999) * -0.00196457112219932
(:var2_item - 0.00260416666666667) * (:var10_item - 0.776757812500002) * -0.00920871165427011
(:var3_item - 24.5807291666667) * (:var3_item - 24.5807291666667) * -0.0000815627812120544
(:var3_item - 24.5807291666667) * (:var4_item - (-0.0104166666666667)) * 0.000191618725149221
(:var3_item - 24.5807291666667) * (:var5_item - 1.14609374999999) * 0.000636976851886725
(:var3_item - 24.5807291666667) * (:var6_item - 0.00520833333333333) * -0.000168527994113173
(:var3_item - 24.5807291666667) * (:var9_item - (-0.00520833333333333)) * 0.0000915720731626415
(:var3_item - 24.5807291666667) * (:var10_item - 0.776757812500002) * 0.000333488716894479
(:var3_item - 24.5807291666667) * (:var11_item - 0.0503906249999999) * -0.000439774767505078
(:var4_item - (-0.0104166666666667)) * (:var6_item - 0.00520833333333333) * 0.00096610960767461
(:var4_item - (-0.0104166666666667)) * (:var10_item - 0.776757812500002) * 0.00450546279579516
(:var5_item - 1.14609374999999) * (:var6_item - 0.00520833333333333) * 0.0021106676783288
(:var5_item - 1.14609374999999) * (:var7_item - 13.821875) * 0.00217097837772603
(:var5_item - 1.14609374999999) * (:var11_item - 0.0503906249999999) * -0.0115442487991345
(:var6_item - 0.00520833333333333) * (:var7_item - 13.821875) * -0.00125862999783603
(:var6_item - 0.00520833333333333) * (:var10_item - 0.776757812500002) * -0.00957578865813035
(:var6_item - 0.00520833333333333) * (:var11_item - 0.0503906249999999) * 0.00993582976587471
(:var7_item - 13.821875) * (:var7_item - 13.821875) * -0.000431452298824955
(:var7_item - 13.821875) * (:var8_item - 27.3046875) * -0.000126745570160173
(:var7_item - 13.821875) * (:var9_item - (-0.00520833333333333)) * 0.000354895921488306
(:var7_item - 13.821875) * (:var10_item - 0.776757812500002) * -0.00632400305434231
(:var7_item - 13.821875) * (:var11_item - 0.0503906249999999) * 0.00395422868916678
(:var8_item - 27.3046875) * (:var9_item - (-0.00520833333333333)) * 0.000473319071720134
(:var8_item - 27.3046875) * (:var10_item - 0.776757812500002) * 0.00073928041332545
(:var9_item - (-0.00520833333333333)) * (:var10_item - 0.776757812500002) * -0.00587332411782299
(:var9_item - (-0.00520833333333333)) * (:var11_item - 0.0503906249999999) * 0.00953104032282317
(:var10_item - 0.776757812500002) * (:var12_item - 0.0515624999999999) * -0.0646790675737112
(:var11_item - 0.0503906249999999) * (:var11_item - 0.0503906249999999) * -0.180683768377719
(:var11_item - 0.0503906249999999) * (:var12_item - 0.0515624999999999) * 0.104048651671741
```
