
## How can I obtain the BIC and AIC from the -log-likelihood for a neural network?

Hello,

I am using JMP Pro 12.

I would like to know how I can obtain the AIC or BIC from the -log-likelihood in a neural network analysis.

Regards,

Angelo

4 REPLIES
David_Burnham
Super User

## Re: How can I obtain the BIC and AIC from the -log-likelihood for a neural network?

I have some code I can share with you to do this. Unfortunately I'm travelling at the moment; if you don't get a reply from someone else, I'll be able to post it on Monday.

-Dave

## Re: How can I obtain the BIC and AIC from the -log-likelihood for a neural network?

Dear David,

That works for me. I will wait until Monday.

Thank you so much,
Angelo
David_Burnham
Super User

## Re: How can I obtain the BIC and AIC from the -log-likelihood for a neural network?

Sorry, I was pretty sure I'd done this for neural nets, but it was actually for the Nonlinear platform. Here's the code:

```
dt = Open( "$SAMPLE_DATA/Nonlinear Examples/Chemical Kinetics.jmp" );
fit = dt << Nonlinear(
	Y( :Name( "Velocity (y)" ) ),
	X( :Name( "Model (x)" ) ),
	Newton,
	Finish
);
rep = fit << Report;
matRmse = rep[NumberColBox( 5 )] << Get As Matrix;

/***************************************************
/          calculate AICc                          /
***************************************************/

// AIC = N.ln(SS/N) + 2k
// where N = # data points
// 		k = # parameters + 1
//		SS = sum of squares for the residuals

// AICc = AIC + [ 2k(k+1) / (N-k-1) ]

N = NRows(dt);
nParameters = 2;
k = nParameters + 1;
rmse = matRmse;
dfSS = N - nParameters;
MS = rmse^2;
SS = dfSS * MS; // this is correct compared to fit model
// note: this expression is -2 x log-likelihood for a Gaussian model
LL = N * Log( 2 * Pi() * e() * SS / N );
AIC = LL + 2*k;
AICc = AIC + ( (2*k*(k+1))/(N-k-1) );

/***************************************************
/          calculate BIC                           /
***************************************************/
BIC = LL + k*Log(N);
```

In this code I had to determine the log-likelihood value myself, whereas the neural net platform reports it for you, so it should be easier. I think you just need to work out the value of k. I assume it would be 1 plus the number of weights being estimated (for a single-layer network that would be #factors x #nodes).
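The same arithmetic can be sketched in Python. Everything here is hypothetical: the -LogLikelihood, k, and N values are placeholders you would read from your own Neural model report in JMP.

```python
import math

# Hypothetical inputs, read from the Neural platform's Model fit report
neg_log_lik = 123.4  # the reported -LogLikelihood (placeholder value)
k = 50               # number of estimated parameters + 1 (an assumption)
N = 506              # number of training rows (placeholder value)

# AIC/BIC are defined on -2 x log-likelihood, so double the reported value
AIC = 2 * neg_log_lik + 2 * k
AICc = AIC + (2 * k * (k + 1)) / (N - k - 1)
BIC = 2 * neg_log_lik + k * math.log(N)
```

The small-sample correction term makes AICc larger than AIC, and for N this size the k*log(N) penalty makes BIC larger than AIC as well.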

-Dave

## Re: How can I obtain the BIC and AIC from the -log-likelihood for a neural network?

Is there a way to get k just from the fast column formula for a neural network, or do I have to know how the model was created? For instance:

```
Names Default To Here( 1 );
dt = Open( "$SAMPLE_DATA/Boston Housing.jmp" );
nn = dt << Neural(
	Y( :mvalue ),
	X(
		:crim, :zn, :indus, :chas, :nox, :rooms,
		:age, :distance, :tax, :pt, :b, :lstat
	),
	Informative Missing( 0 ),
	Validation Method( "Holdback", 0.3333 ),
	Fit( NTanH( 3 ), NLinear( 2 ), NLinear2( 1 ), NGaussian2( 1 ) )
);

nn << (Fit << Save Fast Formulas);
pred = Column( dt, N Cols( dt ) );
pred << Get Formula();
```

Returns:

```
T#1 = Matrix(
{{:crim, :zn, :indus, Design Nom( :chas, [0 1], <<Else Missing ), :nox, :rooms,
:age, :distance, :radial, :tax, :pt, :b, :lstat, }}
);
T#2 = Matrix(
{{T#1 * [0.0255351134390746, -0.0182659064376583, -0.0133657385920784,
0.147341979132032, 4.23887426583154, -0.4759713227295, -0.00456605888120198,
0.26468295131318, -0.0848679297282956, 0.00131989077607407, 0.179512094612108,
0.000822977008301723, 0.19096622684487, -6.88253368570889],
Exp(
-(0.5 * (T#1 * [-0.0381871935060958, -0.0555904841384534,
-0.0411067081795327, -0.0895161828161914, -1.57902743275984,
-0.411055020789087, 0.0078474002018379, 0.0101959901834069,
-0.0847563310399709, 0.00315505316787576, -0.120404341355212,
-0.011567453308019, -0.0508784652936048, 10.3295713240773]) ^ 2)
), 1}}
);
T#3 = Matrix(
{{TanH( 0.5 * T#2 * [0.35672808282841, -0.17630023981386, 0.271293314269094] ),
TanH( 0.5 * T#2 * [-0.622667700216216, 0.120210664201856, 0.530160002865065] ),
TanH( 0.5 * T#2 * [0.841925648920495, 0.63556140976371, -0.0364323446544486] ),
T#2 * [-0.329754708259113, 0.052504565555951, 0.050660282817552], T#2 * [
-1.44800816791098, 1.36833970880568, 0.980670022307775], 1}}
);
T#3 * [-5.78972326240828, -30.0242107869451, -2.5668761531561, 0.482110229175916,
5.9163758026328, 14.9646328172308];
```
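One way to estimate k by hand is to count the coefficients in the saved fast formula: each weight vector in T#2, T#3, and the final output includes an intercept term. The Python sketch below does that arithmetic for the example network above; it assumes the "1 plus the number of estimated weights" definition of k suggested earlier in the thread, which may not match JMP's internal accounting.

```python
# Counting estimated weights in the saved fast formula above
n_input_terms = 13        # terms per T#2 weight vector, excluding the intercept
second_layer_nodes = 2    # NLinear2( 1 ) + NGaussian2( 1 ), fit on the inputs (T#2)
first_layer_nodes = 3 + 2 # NTanH( 3 ) + NLinear( 2 ), fit on the second layer (T#3)

second_layer_weights = second_layer_nodes * (n_input_terms + 1)      # 2 * 14 = 28
first_layer_weights = first_layer_nodes * (second_layer_nodes + 1)   # 5 * 3 = 15
output_weights = first_layer_nodes + 1                               # 5 + 1 = 6

n_weights = second_layer_weights + first_layer_weights + output_weights  # 49
k = n_weights + 1  # +1, assuming the residual variance counts as a parameter
```

So for this architecture the count comes out to 49 weights, giving k = 50 under that assumption.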