
How does JMP Pro compute the RSquare for neural networks and bootstrap forests?



Feb 7, 2019 6:36 AM
(912 views)

Dear all,

I am wondering how JMP computes the RSquare for neural nets and bootstrap forests. What are the formulas? For example, I know the calculation of RSquare for least squares regression: "RSquare: Estimates the proportion of variation in the response that can be attributed to the model rather than to random error. Using quantities from the corresponding Analysis of Variance table, RSquare (also called the coefficient of multiple determination) is calculated as: sum of squares (Model) / sum of squares (C. Total)."

I found that in the manual there are two RSquares for neural nets and bootstrap forests: Generalized RSquare and Entropy RSquare. However, when I ran my neural nets and bootstrap forests, I only got one RSquare, which is not explained in the manual. So I am writing to ask how this RSquare is calculated. Is there a formula? Thank you.
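As a point of reference, the least-squares formula quoted above can be sketched in a few lines. This is only an illustration with made-up numbers, not JMP's implementation; for arbitrary predictions the equivalent form 1 - SS(Error)/SS(C. Total) is used, which matches SS(Model)/SS(C. Total) for a least-squares fit with an intercept:

```python
import numpy as np

# Hypothetical data: observed response y and model predictions y_hat.
y = np.array([2.0, 4.0, 6.0, 8.0])
y_hat = np.array([2.5, 3.5, 6.5, 7.5])

# RSquare = SS(Model) / SS(C. Total), computed here as 1 - SS(Error) / SS(C. Total).
ss_total = np.sum((y - y.mean()) ** 2)  # corrected total sum of squares
ss_error = np.sum((y - y_hat) ** 2)     # residual sum of squares
r_square = 1.0 - ss_error / ss_total    # 0.95 for this toy data
```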

1 ACCEPTED SOLUTION


If your response variable is continuous, then you get the same RSquare as for linear regression. If your response is categorical, then you get the other two RSquare measures.

The Generalized RSquare is the same as RSquare for a continuous response.

The Entropy RSquare is the ratio of the log-likelihood difference between the full and reduced models to the log-likelihood of the reduced model.

Learn it once, use it forever!

3 REPLIES



Re: How does JMP Pro compute the RSquare for neural networks and bootstrap forests?

Is there an equation for how to calculate RSquare for neural networks with 10-fold cross validation?

I am just wondering if JMP overestimates the RSquare.

I ran a neural network (10-fold cross validation, Robust Fit, 1 Tour), and the RSquares for both Training and Validation are > 0.8 (Fig1.png).

Then I generated a new column "Predicted Y" using "Save Fast Formulas".

However, the RSquare between Y and Predicted Y is only 0.74 (Fig2.png).


Re: How does JMP Pro compute the RSquare for neural networks and bootstrap forests?

Hello Leon,

The validation strategy separates your data into two groups, and the RSquare is calculated for each group separately. When you save the prediction formula and plot predicted vs. observed, that pools all of the data into one set. The RSquare of this combined regression can therefore be lower than that of the two separate groups.
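This effect can be demonstrated with a small numeric sketch. The data below are entirely hypothetical, and RSquare is taken as the squared correlation between observed and predicted values (as in a fit of Y against Predicted Y); the point is only that pooling groups with different offsets can lower the combined RSquare even when each group is predicted well on its own:

```python
import numpy as np

def r_square(y, y_hat):
    """Squared correlation between observed and predicted values."""
    return np.corrcoef(y, y_hat)[0, 1] ** 2

# Hypothetical illustration: each group is predicted perfectly up to a
# group-specific offset.
y_train, p_train = np.array([0., 1., 2.]), np.array([0., 1., 2.])
y_valid, p_valid = np.array([3., 4., 5.]), np.array([1., 2., 3.])

r2_train = r_square(y_train, p_train)  # 1.0 within the training group
r2_valid = r_square(y_valid, p_valid)  # 1.0 within the validation group

# Pooling both groups into a single regression lowers the RSquare,
# because the offset between groups now shows up as prediction error.
y_all = np.concatenate([y_train, y_valid])
p_all = np.concatenate([p_train, p_valid])
r2_pooled = r_square(y_all, p_all)     # about 0.75 for this toy data
```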