** Edit:
Based on this post, I added the Linear Regression() JSL function in JMP 14. The JSL to extract the slope and intercept is now much simpler:
coefs = Linear Regression( y, X )[1];
coefs[1]; // intercept
coefs[2]; // slope
Various diagnostics, such as the t-test statistic, p-value, and R-squared, are also returned and (optionally) printed neatly to the log window. Note that Linear Regression() is capable of more than simple linear regression; see the Scripting Index for three different examples of using the function with all of its optional features.
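Put together with some simulated data, a complete JMP 14 fit might look like the sketch below. This is a sketch based on the call shown above; check the Scripting Index entry for Linear Regression() for the exact return structure and optional arguments.

```jsl
// Simulate data with known intercept (2) and slope (4)
x = J( 100, 1, Random Normal() );
y = 2 + 4 * x + J( 100, 1, Random Normal( 0, 0.3 ) );

// Fit; the first element of the returned list holds the coefficients
coefs = Linear Regression( y, x )[1];
Show( coefs[1], coefs[2] ); // intercept and slope estimates
```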
***
Jesper,
Thanks for the recommendation. This is a good suggestion, and I'll look into adding a better solution in a future version of JMP. In the meantime, I added a JSL function in JMP 13 called Least Squares Solve that may be useful to you.
In two lines of code (not counting the code to extract the results), you can get estimates for the intercept, the slope, and their standard errors. It's more general than what you're requesting and is easiest to understand with some knowledge of linear algebra. However, it doesn't report R^2 values or the model-fitting diagnostics that are available in platforms such as Fit Y by X.
To show you how to fit a simple linear regression with Least Squares Solve, I generated a random vector x, and then generated y to depend linearly on x (plus noise).
// Generating some data
intercept = 2; slope = 4;
x = J( 100, 1, Random Normal() );
y = intercept + x * slope + J( 100, 1, Random Normal( 0, 0.3 ) );
In JMP 13, Least Squares Solve requires a design matrix, and it won't add an intercept term for you. To add an intercept, you have to add a column of 1's to the observed x. The JSL to do this is:
// add an intercept term by adding a column of 1s to the design matrix, X
X.design = J(nrow(x),1) || x;
If you're familiar with Linear Algebra, Least Squares Solve will solve the normal equations associated with the model:
y = X.design * beta + error
However, since X.design has two columns and the first is a column of 1's, this model simplifies to simple linear regression, i.e., for observation i:
y_i = intercept + slope * x_i + error_i.
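For reference, the closed-form solution to those normal equations is the standard textbook result below (written with X standing for X.design, n = 100 observations, and p = 2 coefficients; this is general linear-model theory, not anything JMP-specific):

```latex
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y,
\qquad
\widehat{\mathrm{Var}}(\hat{\beta}) = \hat{\sigma}^{2} (X^{\top} X)^{-1},
\qquad
\hat{\sigma}^{2} = \frac{(y - X\hat{\beta})^{\top} (y - X\hat{\beta})}{n - p}
```

The diagonal of the estimated variance-covariance matrix is what the standard errors below are extracted from.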
// Solve for intercept, slope, and variance terms for each
{beta, var} = Least Squares Solve(X.design, y);
Least Squares Solve() returns the estimated beta coefficients (intercept and slope in this case) as well as their associated variances and covariances. Note that the standard error is the square root of the variance of an estimate. To extract the intercept, slope, and their respective standard errors, the code is:
// To retrieve the intercept, slope, and their associated standard errors:
beta[1]; //the intercept
beta[2]; //the slope
sqrt(var[1,1]); // the standard error for the intercept
sqrt(var[2,2]); // the standard error for the slope
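Putting the pieces above together, the complete script (the same code, assembled in one place so it can be run top to bottom) is:

```jsl
// Generate simulated data with known intercept and slope
intercept = 2; slope = 4;
x = J( 100, 1, Random Normal() );
y = intercept + x * slope + J( 100, 1, Random Normal( 0, 0.3 ) );

// Add an intercept term: a column of 1's in the design matrix
X.design = J( N Row( x ), 1 ) || x;

// Solve for the coefficients and their variance-covariance matrix
{beta, var} = Least Squares Solve( X.design, y );

// Extract estimates and standard errors
Show( beta[1], beta[2], Sqrt( var[1, 1] ), Sqrt( var[2, 2] ) );
```

The printed estimates should land close to the true intercept (2) and slope (4) used in the simulation.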
I hope this helps,
Milo