olithun
Level II

Calculating roots of a function

Hi everyone,

How do I calculate the roots of a function in JSL for confidence intervals?

I have a special case where I need to calculate confidence intervals that aren't implemented in JMP.

Using Wilks' theorem, I want to calculate confidence intervals by finding the roots of the following function:

 

f(x) = -2 Log( Likelihood(x | Data) ) - ChiSquare Quantile( 1 - alpha, 1 )

where x is a real number and Data is a univariate sample.
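As a sanity check of this setup, here is a minimal sketch in Python (not JSL), for a hypothetical concrete case: normal data with known sigma, reading the -2 log term as the likelihood-ratio statistic. In that case f has a closed form and its two roots are exactly the familiar z-interval endpoints, so the construction can be verified by hand. All names and values here are illustrative assumptions, not from the original post.

```python
import math

# Hypothetical concrete case: normal data with known sigma. There,
# -2 Log(likelihood ratio) has the closed form n * (xbar - mu)^2 / sigma^2,
# so the two roots of f are the usual z-interval endpoints.
CHI95_DF1 = 3.84145882069412  # ChiSquare Quantile(0.95, 1) = z_0.975^2

def f(mu, xbar, n, sigma, q=CHI95_DF1):
    return n * (xbar - mu) ** 2 / sigma ** 2 - q

xbar, n, sigma = 5.0, 25, 2.0   # illustrative sample summaries
half_width = math.sqrt(CHI95_DF1) * sigma / math.sqrt(n)
lower, upper = xbar - half_width, xbar + half_width  # the two roots of f
```

For general likelihoods no such closed form exists, which is why a numerical root solver is needed.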

 

Is there any built-in JSL function I can use, or at least something similar?

I couldn't find anything on the web or in the Scripting Index.

 

I would really appreciate some suggestions.

Thanks in advance

 


7 REPLIES

Re: Calculating roots of a function

See Help about the Minimize() function. Your expression is Abs( -2 Log( Likelihood(x | Data) ) - ChiSquare Quantile( 1 - alpha, 1 ) ). Numerical optimization should be able to solve for the roots. The easiest way is to use a script editor: select File > New > Script, then right-click and select Show Embedded Log so that the result of the function call is shown in the same window.
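To make the idea concrete: on a bracket containing exactly one sign change, |f| is V-shaped, so any derivative-free minimizer drives it to the root. Here is a sketch of that principle in Python using golden-section search; this is not JMP's actual Minimize() implementation, and the toy objective is an assumption for illustration.

```python
import math

def golden_min(g, a, b, tol=1e-10):
    # Golden-section search for the minimizer of a unimodal g on [a, b].
    # (Re-evaluates g at both probe points each pass; fine for a sketch.)
    gr = (math.sqrt(5.0) - 1.0) / 2.0  # ~0.618
    while b - a > tol:
        c = b - gr * (b - a)
        d = a + gr * (b - a)
        if g(c) < g(d):
            b = d   # minimum lies in [a, d]
        else:
            a = c   # minimum lies in [c, b]
    return 0.5 * (a + b)

# Toy case: minimizing |x^2 - 2| on [1, 2] recovers the root sqrt(2).
root = golden_min(lambda x: abs(x * x - 2.0), 1.0, 2.0)
```

The same trick transfers directly to Minimize() on Abs( f(x) ) in JSL, provided the search interval brackets only one root.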

olithun
Level II

Re: Calculating roots of a function

Thanks for the answer,

the Minimize function seems to be a little unstable in extreme situations.

However, your tip gave me a good starting point to come up with a solution.

 

 

DunnTiger764
Level II

Re: Calculating roots of a function

Minimize(Abs(function)) can behave badly in some cases. JMP should really add a bracketed root solver, along the lines of uniroot() in R. Until that happens, Ridders' method (Ridders' method - Wikipedia) can be programmed in JSL. It is nearly as efficient as Brent's method (Brent's method - Wikipedia) but easier to program. Numerical Recipes includes an implementation.
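For readers unfamiliar with it, here is a sketch of Ridders' method in Python (following the Wikipedia description above); the function name, tolerances, and test equations are illustrative assumptions, not library code.

```python
import math

def ridders(f, a, b, tol=1e-12, maxit=60):
    # Bracketed root solver: f(a) and f(b) must have opposite signs.
    fa, fb = f(a), f(b)
    if fa == 0.0:
        return a
    if fb == 0.0:
        return b
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    x = a
    for _ in range(maxit):
        m = 0.5 * (a + b)                  # bisection point
        fm = f(m)
        s = math.sqrt(fm * fm - fa * fb)   # s > |fm| because fa*fb < 0
        if s == 0.0:
            return m
        # False-position step through the exponentially "straightened" f.
        x = m + (m - a) * math.copysign(1.0, fa - fb) * fm / s
        fx = f(x)
        if abs(fx) < tol or abs(b - a) < tol:
            return x
        # Keep the sub-interval that still brackets the sign change.
        if fm * fx < 0:
            a, fa, b, fb = (m, fm, x, fx) if m < x else (x, fx, m, fm)
        elif fa * fx < 0:
            b, fb = x, fx
        else:
            a, fa = x, fx
    return x

# Example: the root of cos(x) - x on [0, 1] (the Dottie number).
root = ridders(lambda t: math.cos(t) - t, 0.0, 1.0)
```

The bracket never widens and the new iterate always falls inside it, which is what makes the method robust compared with unbracketed minimization of |f|.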

Re: Calculating roots of a function

Why don't you add this suggestion to the Wish List discussion? Search first to see if anyone else has already suggested it and vote for it (kudo). If no one else has, then add it for consideration. Thanks!

DunnTiger764
Level II

Re: Calculating roots of a function

rcast15
Level II

Re: Calculating roots of a function

@DunnTiger764 Did you end up finding a solution to this problem?

Re: Calculating roots of a function

Hello, @olithun -san, @rcast15 -san and @DunnTiger764 -san,

 

I am Yusuke Ono, Senior Tester at JMP Japan. I apologize for my poor English in advance.

 

If you want to calculate the profile likelihood confidence interval for a general model estimated by maximum likelihood, I would like to recommend that you use the Nonlinear platform.
 
To use this feature in the Nonlinear platform, you need to specify the negative log-likelihood as the Loss function yourself. Once you specify the Loss function in a data table, the Nonlinear platform automatically calculates the profile likelihood confidence limits when you click the [Confidence Limits] button.
 
[Steps]
(1) Specify the Loss formula in a column in a JMP data table. This Loss function is the negative log-likelihood (= -LogLikelihood(theta|data)), and the model parameters are defined as Parameter in the formula.
 
(2) Select [Analyze] > [Specialized Modeling] > [Nonlinear].
(3) Assign the column that contains the negative log-likelihood function to [Loss].
    Click [OK].
(4) In the report, click the [Go] button.
(5) Change the Alpha value if needed.
    Change the Convergence Criterion if needed.
    Click the [Confidence Limits] button.
 
The following is a toy example that calculates the profile likelihood confidence interval for the probability parameter of a one-sample binomial distribution. In the formula, I drop the constant term (Log(_nC_k)).
Note that the score interval (the method implemented in the Distribution platform) is statistically better than the profile likelihood interval in this example. The example only shows how to use the Nonlinear platform to calculate the profile likelihood interval.
 
dt = New Table( "Binomial Data",
	New Column( "k", Values( [10] ) ),
	New Column( "n", Values( [500] ) ),
	New Column( "Negative Loglikelihood",
		Formula( Parameter( {p = 0.5}, -:k * Log( p ) - (:n - :k) * Log( 1 - p ) ) ),
	)
);
dt << Nonlinear(
	Loss( :Negative Loglikelihood ),
	CL Limit( 1e-15 ),
	Loss is Neg LogLikelihood( 1 ),
	Newton,
	Finish,
	Plot( 0 ),
	Confidence Limits
);
The following code checks that the upper limit calculated by the above script is really what we expect.
p0 = 0.0348142211885808;
p1 = 0.02;
Show(ChisquareQuantile(0.95,1));
Show(2 *(Log(Binomial Probability(p1,500,10)) - Log(Binomial Probability(p0,500,10))));

/*
ChiSquare Quantile(0.95, 1) = 3.84145882069412;
2 * (Log(Binomial Probability(p1, 500, 10)) - Log(Binomial Probability(p0, 500, 10))) = 3.84145882069407;
*/
 
The Nonlinear platform can also calculate profile likelihood confidence intervals when your model has more than one parameter.
 
Also note that some platforms in JMP and JMP Pro calculate confidence intervals by the profile likelihood method.
 
If you want to calculate the root of a general equation, I think how to solve it in JMP depends on how complicated the equation is. JMP does not offer a general equation solver. The Profiler might be useful in some situations. You might get a reasonable solution by specifying (f(x))^2 as the objective function and using the Minimize() JSL function or the Nonlinear platform. And I think it is not very hard, although not trivial, to program the Regula Falsi method, Ridders' method, or Newton's method in the JMP Scripting Language (JSL). If your equation is very complicated, it might be better to use Python code from JMP; in JMP 18, you can use Python from JMP very smoothly.
 
The following is a very rough example programming the Regula Falsi method and Ridders' method. You can also pass one JSL function as an argument to another JSL function. The code calculates the profile likelihood confidence interval for the probability parameter of a binomial distribution with k = 10 and n = 500.
This script is just a rough sample to show how the algorithms can be programmed in JSL. I have not checked the details, and there is no error handling in the code.
 
Names Default To Here(1);
/* Regula Falsi method */
/* https://en.wikipedia.org/wiki/Regula_falsi */
Regula Falsi = Function({f, data, x0, x2, tol=1e-8, niter = 1000}, {Default Local},
	fx0 = f(x0, data);
	fx2 = f(x2, data);
	If(fx0 * fx2>0, Write("f(pl)*f(pu) should be less than zero"); Return(.));
	For(i = 1, i <= niter, i++, 
		If(Abs(fx0)<tol, Return(x0), 
			Abs(fx2)<tol, Return(x2)
		);
		x1 = (fx0*x2 - fx2*x0)/(fx0 - fx2);
		fx1 = f(x1, data);
		If(fx1*fx0>0, fx0=fx1; x0=x1,
			fx1*fx2>0, fx2=fx1; x2=x1,
			Return(x1)
		);
	);
	Return(.);
	
);

/* Ridders' method */
/* https://en.wikipedia.org/wiki/Ridders%27_method */
Ridders = Function({f, data, x0, x2, tol=1e-8, niter = 1000}, {Default Local},
	fx0 = f(x0, data);
	fx2 = f(x2, data);
	If(fx0 * fx2>0, Write("f(pl)*f(pu) should be less than zero"); Return(.));
	For(i = 1, i <= niter, i++, 
		If(Abs(fx0)<tol, Return(x0), 
			Abs(fx2)<tol, Return(x2)
		);
		x1 = (x0 + x2) / 2;
		fx1 = f(x1, data);
		sign fx0 = If(fx0>0, 1, fx0<0, -1, Return(x0));
		x3 = x1 + (x1-x0) * sign fx0 * fx1 /Sqrt(fx1^2 - fx0*fx2);
		fx3 = f(x3, data);
		If( fx1*fx3 < 0, 
				x0 = x1; fx0 = fx1; 
				x2 = x3; fx2 = fx3;
			,
			fx3*fx0>0, fx0=fx3; x0=x3,
			fx3*fx2>0, fx2=fx3; x2=x3,
			Return(x3)
		);
	);
	Return(.);
	
);


/* Solve for x to satisfy f(x, data) = 0 */
f = Function({p, data}, {Default Local},
	k = data[1];
	n = data[2];
	chi = data[3];
	p1 = k/n;
	2*(k*Log(p1/p)+(n-k)*Log((1-p1)/(1-p))) - chi
);


k = 10;
n = 500;
chi = Chisquare Quantile(1-0.05, 1);
data = k||n||chi;

/* Upper Confidence Limit */
pl = k/n;
pu = 0.99999;
ci up rf= Regula Falsi(NameExpr(f), data, pl, pu);
ci up = Ridders(NameExpr(f), data, pl, pu);

/* Lower Confidence Limit */
pl = 0.00001;
pu = k/n;
ci low rf= Regula Falsi(NameExpr(f), data, pl, pu);
ci low = Ridders(NameExpr(f), data, pl, pu);

Show(ci low rf||ci up rf);
Show(ci low || ci up);
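As an independent cross-check of the limits the JSL script solves for (k = 10, n = 500, alpha = 0.05), plain bisection in Python recovers the same upper limit reported earlier in this thread. The chi-square quantile value is taken from the log output quoted above; the helper names are my own.

```python
import math

# Cross-check of the binomial profile-likelihood limits: k = 10, n = 500.
k, n = 10, 500
chi = 3.84145882069412  # ChiSquare Quantile(0.95, 1), from the log above
p1 = k / n              # maximum likelihood estimate

def f(p):
    # Same root function as in the JSL code: -2 log LR minus the quantile.
    return 2 * (k * math.log(p1 / p) + (n - k) * math.log((1 - p1) / (1 - p))) - chi

def bisect(g, lo, hi, tol=1e-12):
    # Assumes g(lo) and g(hi) have opposite signs.
    glo = g(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if glo * g(mid) <= 0:
            hi = mid            # sign change is in [lo, mid]
        else:
            lo, glo = mid, g(mid)
    return 0.5 * (lo + hi)

upper = bisect(f, p1, 0.99999)  # f is -chi at the MLE and positive near 1
lower = bisect(f, 0.00001, p1)
```

Bisection is slower than Regula Falsi or Ridders' method but makes a convenient reference implementation when validating the JSL results.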

I hope this information is useful, even a little bit.
Yusuke Ono (Senior Tester at JMP Japan)
