
Slope and intercept functions

Apr 6, 2017 11:09 PM
(4556 views)

I was very surprised to find that JMP does not have a slope or intercept function. Can this really be so?

It would be great if functions such as Slope(xMatrix, yMatrix), Intercept(xMatrix, yMatrix), Col Slope(xCol, yCol, By Variables) and Col Intercept(xCol, yCol, By Variables) were added (along with diagnostic functions such as R^2, SE, etc.), yielding simple least-squares linear regression coefficients.

I know I can do the regressions in Bivariate, generate a data table of slopes etc. and update my original table with these, but this is not dynamic, and it is not practical for a large number of regression analyses.

I hope you will add such functions in the future.

BR

Jesper

Solved! Go to Solution.

1 ACCEPTED SOLUTION

Apr 28, 2017 10:09 AM
(6869 views)

Solution

Jesper,

Thanks for the recommendation. This is a good suggestion, and I'll look into adding a better solution in a future version of JMP. In the meantime, I added a JSL function in JMP 13 called Least Squares Solve that may be useful to you.

In two lines of code, you can get estimates for the intercept, slope, and standard errors (not counting the code to extract the results). It's more general than what you're requesting and is easiest to understand with some knowledge of linear algebra. Note that it doesn't report R^2 values or the model-fitting diagnostics available in platforms such as Fit Y by X.

To show you how to fit a simple linear regression with Least Squares Solve, I generated a random vector for x, and then generated y that depends linearly on x.

```
// Generate some data with a known intercept and slope
intercept = 2; slope = 4;
x = J( 100, 1, Random Normal() );
y = intercept + x * slope + J( 100, 1, Random Normal( 0, 0.3 ) );
```

In JMP 13, Least Squares Solve requires a design matrix, and it won't add an intercept term for you. To add an intercept, you have to add a column of 1's to the observed x. The JSL to do this is:

```
// add an intercept term by adding a column of 1s to the design matrix, X
X.design = J(nrow(x),1) || x;
```

If you're familiar with Linear Algebra, Least Squares Solve will solve the normal equations associated with the model:

y = X.design * beta + error

However, since X.design has 2 columns and the first is a column of 1's, this model simplifies to simple linear regression, i.e. for individual i:

y_i = intercept + slope * x_i + error_i.

```
// Solve for intercept, slope, and variance terms for each
{beta, var} = Least Squares Solve(X.design, y);
```

Least Squares Solve() returns the estimated beta coefficients (intercept and slope in this case) as well as their associated variances and covariances. Note that the standard error is the square root of the variance of an estimate. To extract the intercept, slope, and their respective standard errors, the code is:

```
// To retrieve the intercept, slope, and their associated standard errors:
beta[1]; //the intercept
beta[2]; //the slope
sqrt(var[1,1]); // the standard error for the intercept
sqrt(var[2,2]); // the standard error for the slope
```
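For readers who want to cross-check these results outside JMP, the simple-regression case also has a closed form: slope = Sxy/Sxx and intercept = ybar - slope * xbar. Here is a minimal plain-Python sketch (not JSL, and not part of the original reply); the data are made up so that the true coefficients are recovered exactly:

```python
# Closed-form simple linear regression for the model
# y_i = intercept + slope * x_i + error_i.

def simple_ols(x, y):
    n = len(x)
    mx = sum(x) / n                                   # mean of x
    my = sum(y) / n                                   # mean of y
    sxx = sum((xi - mx) ** 2 for xi in x)             # corrected sum of squares
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Noise-free data generated from intercept = 2, slope = 4 (the same true
# values as the JSL example) should be recovered exactly.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [2 + 4 * xi for xi in x]
print(simple_ols(x, y))  # -> (2.0, 4.0)
```

With the noisy data from the JSL example, the estimates would only be close to 2 and 4 rather than exact.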

I hope this helps,

Milo

7 REPLIES

Apr 7, 2017 2:00 AM
(4549 views)

Perhaps you have seen it already, but if not there's a related thread here.

Apr 7, 2017 3:26 AM
(4544 views)

I see that it provides a workaround. I was already able to do the calculations, but not in an easy way. My post was more of a feature request. The lack of such functions seems like an odd omission.

BR

Jesper

Apr 7, 2017 10:01 AM
(4519 views)

The JMP Scripting Guide has a section on Statistical Examples in the Data Structures section on Matrices that shows how to use the matrix operations to compute a linear regression.

I've used that example to create the slope() and intercept() functions that you were looking for.

```
slope = Function( {x, y},
{beta},
If( !(Is Matrix( x ) & Is Matrix( y ) & N Col( x ) == 1 & N Col( y ) == 1),
Show( "x or y arguments are not matrices or they have more than one column" );
Throw();
,
x = J( N Row( x ), 1 ) || x; // put in an intercept column of 1s
beta = Inv( X` * X ) * X` * Y; // the least square estimates
beta[2];
)
);
intercept = Function( {x, y},
{beta},
If( !(Is Matrix( x ) & Is Matrix( y ) & N Col( x ) == 1 & N Col( y ) == 1),
Show( "x or y arguments are not matrices or they have more than one column" );
Throw();
,
x = J( N Row( x ), 1 ) || x; // put in an intercept column of 1s
beta = Inv( X` * X ) * X` * Y; // the least square estimates
beta[1];
)
);
Y = [98, 112.5, 84, 102.5, 102.5, 50.5, 90, 77, 112, 150, 128, 133, 85, 112];
X = [65.3, 69, 56.5, 62.8, 63.5, 51.3, 64.3, 56.3, 66.5, 72, 64.8, 67, 57.5, 66.5];
s = slope( x, y );
i = intercept( x, y );
Show( s, i );
```
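As a cross-check of the matrix arithmetic (not part of the original reply, and in plain Python rather than JSL), the same normal equations can be solved by writing out the 2x2 inverse of X'X by hand, using the same X and Y vectors as the JSL above:

```python
# Plain-Python check of beta = Inv(X'X) * X' * Y for simple regression.
# With an intercept column of 1s, X'X is the 2x2 matrix [[n, Sx], [Sx, Sxx]],
# so its inverse can be written out directly instead of calling a solver.

Y = [98, 112.5, 84, 102.5, 102.5, 50.5, 90, 77, 112, 150, 128, 133, 85, 112]
X = [65.3, 69, 56.5, 62.8, 63.5, 51.3, 64.3, 56.3, 66.5, 72, 64.8, 67, 57.5, 66.5]

n = len(X)
sx = sum(X)                              # sum of x
sxx = sum(x * x for x in X)              # sum of x^2
sy = sum(Y)                              # sum of y
sxy = sum(x * y for x, y in zip(X, Y))   # sum of x*y

det = n * sxx - sx * sx                  # determinant of X'X
intercept = (sxx * sy - sx * sxy) / det
slope = (n * sxy - sx * sy) / det

print(intercept, slope)
```

The printed values should match what Show( s, i ) reports in the JSL script.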

The Statistical Examples section expands on the example to show how to compute residuals and other diagnostic results for multiple regression as well.

I'll repeat that example below.

```
// open the data table
dt = Open( "$SAMPLE_DATA/Big Class.jmp" );
// get data into matrices
x = (Column( "Age" ) << Get Values) || (Column( "Height" ) << Get Values);
x = J( N Row( x ), 1, 1 ) || x;
y = Column( "Weight" ) << Get Values;
// regression calculations
xpxi = Inv( x` * x );
beta = xpxi * x` * y; // parameter estimates
resid = y - x * beta; // residuals
sse = resid` * resid; // sum of squared errors
dfe = N Row( x ) - N Col( x ); // degrees of freedom
mse = sse / dfe; // mean square error, error variance estimate
// additional calculations on estimates
stdb = Sqrt( Vec Diag( xpxi ) * mse ); // standard errors of estimates
alpha = .05;
qt = Students t Quantile( 1 - alpha / 2, dfe );
betau95 = beta + qt * stdb; // upper 95% confidence limits
betal95 = beta - qt * stdb; // lower 95% confidence limits
tratio = beta :/ stdb; // Student's T ratios
probt = (1 - t Distribution( Abs( tratio ), dfe )) * 2; // p-values
// present results
New Window( "Big Class Regression",
Table Box(
String Col Box( "Term", {"Intercept", "Age", "Height"} ),
Number Col Box( "Estimate", beta ),
Number Col Box( "Std Error", stdb ),
Number Col Box( "TRatio", tratio ),
Number Col Box( "Prob>|t|", probt ),
Number Col Box( "Lower95%", betal95 ),
Number Col Box( "Upper95%", betau95 )
)
);
```
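For readers who want to verify the diagnostic formulas in that script (mse = sse/dfe, standard errors from the diagonal of Inv(X'X) * mse, t-ratios = beta/se) outside JMP, here is a plain-Python sketch for the one-predictor case. The data are hypothetical, and the t-based confidence limits and p-values are omitted because the Python standard library has no Student's t distribution:

```python
import math

# Cross-check of the diagnostic arithmetic from the JSL script above,
# for the one-predictor case. The data below are made up for illustration.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 7.8, 10.1]

n, p = len(X), 2                      # observations, parameters (intercept + slope)
sx, sy = sum(X), sum(Y)
sxx = sum(x * x for x in X)
sxy = sum(x * y for x, y in zip(X, Y))

det = n * sxx - sx * sx               # determinant of X'X (with intercept column)
xpxi = [[sxx / det, -sx / det],       # Inv(X'X), written out for the 2x2 case
        [-sx / det, n / det]]
beta = [(sxx * sy - sx * sxy) / det,  # intercept estimate
        (n * sxy - sx * sy) / det]    # slope estimate

resid = [y - (beta[0] + beta[1] * x) for x, y in zip(X, Y)]
sse = sum(r * r for r in resid)       # sum of squared errors
dfe = n - p                           # degrees of freedom
mse = sse / dfe                       # mean square error, error variance estimate

stdb = [math.sqrt(xpxi[i][i] * mse) for i in range(2)]  # standard errors
tratio = [beta[i] / stdb[i] for i in range(2)]          # Student's t ratios
print(beta, stdb, tratio)             # beta is roughly [0.05, 1.99] here
```

Because the model includes an intercept, the residuals sum to zero, which is a quick sanity check on the fit.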

-Jeff

Apr 9, 2017 10:41 PM
(4456 views)

Thank you

I think I also saw the latter example when I was looking for a solution, but I still don't think it addresses my point. My problem is not that I cannot calculate slope and intercept; it is that doing so seems needlessly cumbersome.

In your example I need to script the functions, and also generate code to extract x- and y-matrices for each relevant By-variable from my table.

My question is not a request for help with the calculations. I just want to voice my desire for such built-in functions. If they existed I would use them regularly; as it is, I miss them just as often.

BR

Jesper

Apr 10, 2017 6:25 AM
(4443 views)

Hello Jesper,

Have you seen the addin for collecting slopes and intercepts?

I believe this will do what you want.

Cheers,

Stan

Apr 11, 2017 2:03 AM
(4408 views)

It does the calculations I want, but not in the way I would prefer. Again, I can do the calculations, so that is not the issue. I just think there could (and should) be an easier way.

BR

Jesper
