Danijel_V
Level III

No of DF in p-value calculation from t Ratio in Parameter Estimates table (Standard least squares)

Hello,

 

I may have a trivial question, but I have not found an answer anywhere else.

 

I am running a linear regression model with standard least squares (the DOE was made with a definitive screening design, DSD). In the Parameter Estimates table (shown in the attached picture), a t Ratio and a p-value (Prob > |t|) are reported for each effect included in the model. I am wondering what the critical t value in this test is, or, put another way, what number of degrees of freedom is used for the two-sided t-test at the 5 % significance level.

 

By trial and error I figured out that it is the number of degrees of freedom used for Error. It also seems odd to me that the probabilities from the F Ratio are the same as those from the t Ratio. Could you please explain the logic behind this? Why aren't all the degrees of freedom used (16 in the example instead of 11)?

 

Thanks, Danijel

1 ACCEPTED SOLUTION


Re: No of DF in p-value calculation from t Ratio in Parameter Estimates table (Standard least squares)

You should look at the Statistical Details chapter in Help > Books > Fitting Linear Models.

 

The t ratios use the error DF. You can use the JSL function q = t Quantile( probability, df ) to obtain the 'critical' quantile. Use p/2 for the two-sided case. See Help > Scripting Index > Functions > Probability > t Quantile.
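
As a rough illustration (the 11 error DF below are taken from the example in the question; the t Ratio is a made-up value, not one from the attached table), the quantile and probability functions can be used like this in JSL:

alpha = 0.05;  // two-sided significance level
dfError = 11;  // error DF from the example in the question

// Split alpha across both tails; the upper critical value is the (1 - alpha/2) quantile.
tCritical = t Quantile( 1 - alpha / 2, dfError );
Show( tCritical );  // roughly 2.20

// The reported Prob > |t| can be reproduced from the t Ratio the same way:
tRatio = 2.5;  // hypothetical t Ratio for illustration only
pValue = 2 * (1 - t Distribution( Abs( tRatio ), dfError ));
Show( pValue );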

 

In the case of a balanced design (your case), the F ratios and t ratios test the same hypothesis. So the F ratio is the square of the t ratio. They must yield the same p value in this special case.
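
A quick way to see this, again with a made-up t Ratio and the 11 error DF from the example, is to compute the p value both ways in JSL:

dfError = 11;         // error DF from the example
tRatio = 2.5;         // assumed t Ratio for illustration
fRatio = tRatio ^ 2;  // the corresponding 1-DF F Ratio

pFromT = 2 * (1 - t Distribution( Abs( tRatio ), dfError ));
pFromF = 1 - F Distribution( fRatio, 1, dfError );  // numerator DF = 1, denominator DF = error DF
Show( pFromT, pFromF );  // the two p values are identical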

 

You do not understand 'degrees of freedom.' The sum of squares includes more terms than the amount of independent information. So dividing by the DF corrects for that in the mean squares. If you estimated 15 parameters, for example, you would have only 1 DF of independent information left to assess the estimates. You cannot 'eat your cake and have it, too.'
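
To make the bookkeeping concrete (the counts below are assumptions consistent with the 16 and 11 mentioned above, so adjust them to your actual design):

nRuns = 16;        // assumed number of runs in the design
nParameters = 5;   // assumed number of estimated parameters, including the intercept
dfError = nRuns - nParameters;  // DF left over for estimating error
Show( dfError );   // 11

// Estimating 15 parameters instead would leave 16 - 15 = 1 error DF.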
