JesperJohansen
Level IV

Slope and intercept functions

I was very surprised to find that JMP does not have a slope or intercept function. Can this really be so?

It would be great if functions such as Slope(xMatrix, yMatrix), Intercept(xMatrix, yMatrix), Col Slope(xCol, yCol, By Variables), and Col Intercept(xCol, yCol, By Variables) were added (along with diagnostic functions such as R^2, SE, etc.), yielding simple least-squares linear regression coefficients.
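For illustration, usage might look like this (hypothetical, since no such functions exist in JMP today):

// Hypothetical usage of the proposed functions (they do not exist in JMP):
x = [1, 2, 3, 4];
y = [2.1, 3.9, 6.2, 8.1];
s = Slope( x, y );     // least-squares slope, about 2
b = Intercept( x, y ); // least-squares intercept, about 0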

I know I can do the regressions in Bivariate, generate a data table of slopes etc. and update my original table with these, but this is not dynamic, and it is not practical for a large number of regression analyses.

I hope you will add such functions in the future.

BR
Jesper
1 ACCEPTED SOLUTION
Milo
Staff (Retired)

Re: Slope and intercept functions

** Edit:

Based on this post, I added the Linear Regression() JSL function in JMP 14. The JSL to extract the slope and intercept is now much simpler:

coefs = Linear Regression( y, X )[1];
coefs[1]; // intercept
coefs[2]; // slope

Various diagnostics such as t-test statistic, p-value, and R-Squared are also returned and (optionally) printed neatly to the log window. Note that Linear Regression() is capable of more than simple linear regression. See the Scripting Index for three different examples of using the function with all optional features. 
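For a self-contained illustration, here is a minimal sketch (this assumes, per the call above, that y comes first and that Linear Regression() handles the intercept term itself):

// Minimal sketch: fit y = 2 + 4*x + noise, then pull out the coefficients
x = J( 100, 1, Random Normal() );
y = 2 + 4 * x + J( 100, 1, Random Normal( 0, 0.3 ) );
coefs = Linear Regression( y, x )[1];
Show( coefs[1] ); // estimated intercept, near 2
Show( coefs[2] ); // estimated slope, near 4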

 

***

Jesper,

 

Thanks for the recommendation. This is a good suggestion, and I'll look into adding a better solution in a future version of JMP. In the meantime, I added a JSL function in JMP 13 called Least Squares Solve that may be useful to you. 

 

In two lines of code, you can get estimates for the intercept, slope, and standard errors (not counting the code to extract the results). It's more general than what you're requesting and is easiest to understand with some knowledge of Linear Algebra. However, it doesn't report R^2 values or the model-fitting diagnostics that are available in platforms such as Fit Y by X.

 

To show you how to fit a simple linear regression with Least Squares Solve, I generated a random vector for x, and then generated y that depends linearly on x.

// Generate some data
intercept = 2; slope = 4;
x = J( 100, 1, Random Normal() );
y = intercept + x * slope + J( 100, 1, Random Normal( 0, 0.3 ) );

In JMP 13, Least Squares Solve requires a design matrix, and it won't add an intercept term for you. To add an intercept, you have to add a column of 1's to the observed x. The JSL to do this is:

// add an intercept term by adding a column of 1s to the design matrix, X
X.design = J(nrow(x),1) || x;

If you're familiar with Linear Algebra, Least Squares Solve will solve the normal equations associated with the model:

     y = X.design * beta + error
However, since X.design has 2 columns and the first is a column of 1's, this model simplifies to simple linear regression, i.e. for individual i:

     y_i = intercept + slope * x_i + error_i.

// Solve for intercept, slope, and variance terms for each
{beta, var} = Least Squares Solve(X.design, y);
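As an optional sanity check, the closed-form normal-equations solution (the same matrix algebra that appears later in this thread) should reproduce these estimates:

// Optional sanity check: the closed-form least-squares solution
betaCheck = Inv( X.design` * X.design ) * X.design` * y; // should match beta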

Least Squares Solve() returns the estimated beta coefficients (intercept and slope in this case) as well as their associated variances and covariances. Note that the standard error is the square root of the variance of an estimate. To extract the intercept, slope, and their respective standard errors, the code is:

// To retrieve the intercept, slope, and their associated standard errors:
beta[1]; //the intercept
beta[2]; //the slope
sqrt(var[1,1]); // the standard error for the intercept
sqrt(var[2,2]); // the standard error for the slope
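If you also want the usual t-test for the slope, it can be built from these same pieces; a minimal sketch (assuming the usual n - 2 error degrees of freedom for simple linear regression):

// Sketch: t-ratio and two-sided p-value for the slope, from beta and var above
dfe = N Row( x ) - 2;                  // error degrees of freedom (n - 2)
tratio = beta[2] / Sqrt( var[2, 2] );  // slope estimate over its standard error
pvalue = 2 * (1 - t Distribution( Abs( tratio ), dfe ));
Show( tratio, pvalue );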

I hope this helps,

 

Milo


7 REPLIES
ian_jmp
Level X

Re: Slope and intercept functions

Perhaps you have seen it already, but if not there's a related thread here.

JesperJohansen
Level IV

Re: Slope and intercept functions

I see that it provides a workaround. I was already able to do the calculations, just not in an easy way. My post was more of a feature request. The lack of said functions seems like an odd omission.

BR
Jesper
Jeff_Perkinson
Community Manager

Re: Slope and intercept functions

The JMP Scripting Guide has a Statistical Examples section (under Data Structures > Matrices) that shows how to use matrix operations to compute a linear regression.

 

I've used that example to create the slope() and intercept() functions that you were looking for.

slope = Function( {x, y},
	{beta},
	If( !(Is Matrix( x ) & Is Matrix( y ) & N Col( x ) == 1 & N Col( y ) == 1),
		Show( "x or y arguments are not matrices or they have more than one column" );
		Throw();
	,
		x = J( N Row( x ), 1 ) || x;   // prepend an intercept column of 1s
		beta = Inv( x` * x ) * x` * y; // the least squares estimates
		beta[2];                       // return the slope
	)
);

intercept = Function( {x, y},
	{beta},
	If( !(Is Matrix( x ) & Is Matrix( y ) & N Col( x ) == 1 & N Col( y ) == 1),
		Show( "x or y arguments are not matrices or they have more than one column" );
		Throw();
	,
		x = J( N Row( x ), 1 ) || x;   // prepend an intercept column of 1s
		beta = Inv( x` * x ) * x` * y; // the least squares estimates
		beta[1];                       // return the intercept
	)
);

y = [98, 112.5, 84, 102.5, 102.5, 50.5, 90, 77, 112, 150, 128, 133, 85, 112];
x = [65.3, 69, 56.5, 62.8, 63.5, 51.3, 64.3, 56.3, 66.5, 72, 64.8, 67, 57.5, 66.5];

s = slope( x, y );
i = intercept( x, y );
Show( s, i );

The Statistical Examples section expands on the example to show how to compute residuals and other diagnostic results for multiple regression as well. I'll repeat that example below.

 

// open the data table
dt = Open( "$SAMPLE_DATA/Big Class.jmp" );
 
// get data into matrices
x = (Column( "Age" ) << Get Values) || (Column( "Height" ) << Get Values);
x = J( N Row( x ), 1, 1 ) || x;
y = Column( "Weight" ) << Get Values;
 
// regression calculations
xpxi = Inv( x` * x );
beta = xpxi * x` * y; // parameter estimates
resid = y - x * beta; // residuals
sse = resid` * resid; // sum of squared errors
dfe = N Row( x ) - N Col( x ); // degrees of freedom
mse = sse / dfe; // mean square error, error variance estimate
 
// additional calculations on estimates
stdb = Sqrt( Vec Diag( xpxi ) * mse ); // standard errors of estimates
alpha = .05;
qt = Students t Quantile( 1 - alpha / 2, dfe );
betau95 = beta + qt * stdb; // upper 95% confidence limits
betal95 = beta - qt * stdb; // lower 95% confidence limits
tratio = beta :/ stdb; // Student's T ratios
probt = (1 - t Distribution( Abs( tratio ), dfe )) * 2; // p-values
 
// present results
New Window( "Big Class Regression",
	Table Box(
		String Col Box( "Term", {"Intercept", "Age", "Height"} ),
		Number Col Box( "Estimate", beta ),
		Number Col Box( "Std Error", stdb ),
		Number Col Box( "TRatio", tratio ),
		Number Col Box( "Prob>|t|", probt ),
		Number Col Box( "Lower95%", betal95 ),
		Number Col Box( "Upper95%", betau95 )
	)
);
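Since the original request also mentioned R^2, here's one more short sketch built on the quantities above (ybar and sst are new names introduced here):

// Sketch: R-squared from the quantities computed above
ybar = Sum( y ) / N Row( y );   // mean of the response
sst = (y - ybar)` * (y - ybar); // total sum of squares (1x1 matrix)
rsquare = 1 - sse[1] / sst[1];  // coefficient of determination
Show( rsquare );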

 

-Jeff
JesperJohansen
Level IV

Re: Slope and intercept functions

Thank you

I think I also saw the latter example when I was looking for a solution. I still don't think it solves my problem, though. My problem is not that I cannot calculate slope and intercept, but that it seems needlessly cumbersome to do so.

In your example I need to script the functions, and also write code to extract x- and y-matrices for each relevant By variable from my table.

My question is not a request for help with the calculations. I just want to voice my desire for such built-in functions. If they existed I would use them regularly; as it is, I miss them with the same frequency.

BR
Jesper
stan_koprowski
Community Manager

Re: Slope and intercept functions

Hello Jesper,

Have you seen the add-in for collecting slopes and intercepts?

I believe this will do what you want.

Cheers,

Stan

JesperJohansen
Level IV

Re: Slope and intercept functions

It does the calculations I want, but not in the way I would prefer. Again, I can do the calculations, so that is not the issue. I just think there could (and should) be an easier way.

BR
Jesper