
Adjusting data to a different covariate value in ANCOVA


Jul 9, 2010 9:49 AM
(1538 views)

Hi,

I have a straightforward ANCOVA with an IV, a DV and a covariate. Somewhere in the help files I read that when performing ANCOVA, JMP adjusts the least squares means of the DV according to the covariate mean. This covariate mean is rather inconvenient. Is there a way to select a value of my own from within the range of the covariate values?

Thanks!

4 REPLIES

Jul 15, 2010 5:03 AM
(1411 views)

Can you find the section in the help file you're referring to and post it here? I would have thought it was more likely that it indicates that the least squares means are adjusted for the covariate, as opposed to the covariate *mean*.

If all the values of your covariate are the same - whatever that value is - you ought to find that the model fitting procedure gives you the same answer as if you hadn't fitted a covariate at all. That's because the procedure is effectively incorporating a regression of your dependent variable against the covariate, and if all your covariate values are the same the effect would be like regressing Y against a constant. Or is this not what you're asking?
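That equivalence is easy to check numerically outside JMP. Here is a minimal sketch in plain Python (simulated data; all names are illustrative): adding a constant "covariate" column just duplicates the intercept's column space, so the fitted values cannot change.

```python
import numpy as np

rng = np.random.default_rng(1)
group = np.repeat([0, 1], 10)               # two treatment groups
y = 100 + 5 * group + rng.normal(0, 3, 20)  # simulated response

# Model A: intercept + group only
XA = np.column_stack([np.ones(20), group])
fitA = XA @ np.linalg.lstsq(XA, y, rcond=None)[0]

# Model B: the same model plus a constant "covariate" of 7.
# The extra column is a multiple of the intercept column, so the
# column space -- and hence the fitted values -- cannot change.
# (lstsq handles the rank deficiency via the minimum-norm solution.)
XB = np.column_stack([np.ones(20), group, np.full(20, 7.0)])
fitB = XB @ np.linalg.lstsq(XB, y, rcond=None)[0]

print(np.allclose(fitA, fitB))  # True: the constant covariate adds nothing
```

The same projection argument holds for any constant value of the covariate, which is why the fit is identical to a model with no covariate at all.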

Jul 15, 2010 6:40 AM
(1411 views)

Hi David, thanks for the response.

The section I was talking about is under "Standard Least Squares: an introduction". I found it by clicking on "analysis of covariance examples" in the help index. The excerpt says:

"The least squares means are now different from the ordinary mean because they are adjusted for the effect of X, the covariate, on the response, Y. Now the least squares means are the predicted values that you expect for each of the three values of Drug given that the covariate, X, is held at some constant value. **The constant value is chosen for convenience to be the mean of the covariate,** which is 10.7333."

To clarify, the values of my covariate are not all the same. If they were, as you say, I wouldn't have to worry about correcting my DV for covariate effects. The method I'm using seems to adjust the DV vs. IV relationship to the mean covariate value. I'd like to change that, because when comparing across studies, the covariate value you adjust to would - in my mind - put the least squares means on a different scale in each study. Does that make sense?

Cheers.

Jul 15, 2010 7:49 AM
(1411 views)

Ah - I understand now: yes, I can see the problem, but I've just been doing a little experimenting, and it looks as though it may not be a problem after all.

You should be able to adjust all the actual values of the covariate in any individual study up or down by a constant without changing the results of the analysis at all - after all, all you're doing is rescaling one of your explanatory variables. You should therefore get a set of treatment means that always differ by the same amount, regardless of the value of that constant; the only remaining question is whether their actual *values* are the same. To test that, I loaded up the Big Class data set and created a new column called Height2 = Height minus a constant (I used 50, but it could have been anything):

dt = Open( "$SAMPLE_DATA/Big Class.jmp" );
dt << New Column( "Height2", Numeric, Formula( :Height - 50 ) );

I then tried fitting two models to it: one with Height as the covariate in an analysis of Weight using Sex as my treatment effect, and a second one with Height2 as the covariate in the same analysis. The Sex treatment means were identical (M=103.02; F=107.42), so on the basis of that comparison, it doesn't seem to matter how the covariate is scaled.

I also did the same thing using Age as my factor (don't forget to change the modelling type of Age to "Nominal" before running this one, since it's a numeric variable), and again got two identical sets of mean scores.

Do check me on this, but it looks as though the problem may just conveniently disappear.
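The invariance described above can also be checked outside JMP. Below is a minimal sketch in plain Python (simulated data; the group/covariate names are made up, not the Big Class variables): shifting the covariate by a constant changes only the intercept, and since the least squares means are evaluated at the (equally shifted) covariate mean, the adjusted means come out identical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated ANCOVA data: two groups, one covariate (names are illustrative)
n = 20
group = np.repeat([0, 1], n)                           # e.g. F = 0, M = 1
x = rng.normal(60, 5, size=2 * n)                      # covariate
y = 5 + 10 * group + 1.5 * x + rng.normal(0, 2, 2 * n) # response

def ls_means(x, group, y):
    """Fit y ~ intercept + group + x by least squares and return the
    predicted value for each group at the covariate mean."""
    X = np.column_stack([np.ones_like(x), group, x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    xbar = x.mean()
    return np.array([beta[0] + g * beta[1] + beta[2] * xbar for g in (0, 1)])

m1 = ls_means(x, group, y)       # covariate as-is
m2 = ls_means(x - 50, group, y)  # covariate shifted down by a constant

print(np.allclose(m1, m2))       # True: the adjusted means are identical
```

Algebraically, if the shift is c, the new intercept is b0 + c*b2 and the new covariate mean is xbar - c, so the prediction b0 + c*b2 + g*b1 + b2*(xbar - c) collapses back to the original adjusted mean.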

Jul 15, 2010 8:33 AM
(1411 views)

I'm kind of kicking myself right now because that is such a simple solution! I do think it works, because all that really matters is the relative differences among the covariate values, not their absolute values. Thanks so much, David!