sreekumarp
Level I

PREDICTIVE MODELS COMPARISONS - LINEAR REGRESSION AND NEURAL NETWORK

I am using JMP Pro 16 to validate results from a Finite Element simulation. I want to fit both Linear Regression and Neural Network models and compare them. For the Linear Regression model, I found that variable transformations are needed to satisfy the assumptions of linearity, normality of residuals, and constant variance: squaring two of the predictors and taking the logarithm to base 10 of the response variable. With these transformations the Linear Regression results are acceptable.
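For concreteness, the transformed model has roughly this form; below is a small Python sketch of the same idea outside JMP (illustrative only, with made-up predictor names x1 and x2 and simulated numbers, since the actual fitting is done in JMP Pro):

```python
# Illustrative sketch only (not JMP): a linear model fit on squared predictors
# and the log10 of the response, using hypothetical data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x1 = rng.uniform(1, 10, 200)          # hypothetical predictor 1
x2 = rng.uniform(1, 10, 200)          # hypothetical predictor 2
y = 10 ** (0.5 + 0.02 * x1**2 + 0.01 * x2**2 + rng.normal(0, 0.05, 200))

# Square the two predictors and take log10 of the response so the usual
# linearity / normality / constant-variance checks hold.
X = np.column_stack([x1**2, x2**2])
model = LinearRegression().fit(X, np.log10(y))

# Predictions come back on the log10 scale; 10**pred returns the original units.
y_pred = 10 ** model.predict(X)
```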

When I model the same data with a Neural Network, do I need to apply the same transformations to the predictors and the response variable before modelling?

 

Sreekumar

4 REPLIES
mzwald
Staff

Re: PREDICTIVE MODELS COMPARISONS - LINEAR REGRESSION AND NEURAL NETWORK

As with regression models, it's a good idea to normalize your response for neural network models as well, so the network learns well over the full range of response values.  Linearizing the factors is less important, because JMP's neural network platform has nonlinear activation nodes (sigmoid and Gaussian).
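Outside of JMP, the idea looks roughly like this (a hedged scikit-learn sketch, not the Neural platform's own code; the data and settings are made up):

```python
# Rough illustration: scale the response before training a neural network,
# then invert the scaling so predictions are in the original units.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(300, 2))            # hypothetical predictors
y = 50 + 400 * X[:, 0] ** 2 + 100 * X[:, 1] + rng.normal(0, 5, 300)

y_scaler = StandardScaler()
y_scaled = y_scaler.fit_transform(y.reshape(-1, 1)).ravel()

# The hidden-layer nonlinearity (tanh here) plays the role of the sigmoid/Gaussian
# activation nodes, so the factors themselves do not need to be linearized first.
nn = MLPRegressor(hidden_layer_sizes=(5,), activation="tanh",
                  max_iter=5000, random_state=1).fit(X, y_scaled)

# Back-transform predictions to the original response units before comparing models.
y_pred = y_scaler.inverse_transform(nn.predict(X).reshape(-1, 1)).ravel()
```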

sreekumarp
Level I

Re: PREDICTIVE MODELS COMPARISONS - LINEAR REGRESSION AND NEURAL NETWORK

Thank you for your prompt response. I will proceed accordingly for Neural Networks as well.

Sreekumar

Re: PREDICTIVE MODELS COMPARISONS - LINEAR REGRESSION AND NEURAL NETWORK

What is the basis for your comparison? You can't use AICc if you have different responses across the models.
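For reference, AICc is built from the log-likelihood of the response exactly as it was modeled, so a model fit to y and a model fit to log10(y) produce values on different scales and cannot be ranked against each other:

\[ \mathrm{AICc} = -2\ln L(\hat{\theta}\mid \text{response}) + 2k + \frac{2k(k+1)}{n-k-1} \]

where k is the number of estimated parameters and n the number of observations.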

 

How are you validating the selection of each model?

sreekumarp
Level I

Re: PREDICTIVE MODELS COMPARISONS - LINEAR REGRESSION AND NEURAL NETWORK

I have now fitted Linear Regression, Bootstrap Forest, and Neural Network models. I intend to compare model performance using R square, RASE, and AAE. For the first two models I used the logarithm to base 10 as the transformation of the response variable, but for the Neural Network this transformation yielded erroneous results in the model comparison.
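For reference, the measures being compared follow the usual definitions, so they only line up when every model's predictions are on the same response scale:

\[ \mathrm{RASE} = \sqrt{\tfrac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}, \qquad \mathrm{AAE} = \tfrac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right| \]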

I also noticed that for the Linear Regression and Bootstrap Forest models, JMP Pro converts the predicted response back to its regular value (instead of the Log10 value) before saving it to the data table. For the Neural Network, I did not normalize the response and used the actual observed value as Y. After that, the Model Comparison platform yielded proper results; the comparison table is attached.
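One way to keep the comparison on a common footing, sketched in Python outside JMP (illustrative only; the column of log10-scale predictions and the numbers are hypothetical), is to back-transform the log10 predictions to regular units before computing R square, RASE, and AAE:

```python
# Illustrative check: put every model's predictions back on the original
# response scale before computing R-square, RASE and AAE.
import numpy as np

def compare(y, y_pred):
    resid = y - y_pred
    rase = np.sqrt(np.mean(resid ** 2))        # root average squared error
    aae = np.mean(np.abs(resid))               # average absolute error
    r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
    return r2, rase, aae

# pred_log10 is a hypothetical column of predictions saved on the log10 scale;
# 10**pred_log10 converts it to the regular units of the observed response y.
y = np.array([120.0, 85.0, 240.0, 310.0, 150.0])
pred_log10 = np.array([2.05, 1.95, 2.40, 2.48, 2.20])
print(compare(y, 10 ** pred_log10))
```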

Is this approach appropriate for this problem? Is there a way in Neural Network to normalize the response values and still obtain the predictions in their regular form in the data table when saving or publishing the prediction formula?