<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: When running Logistic Regressions with no intercepts, has anyone observed very high General RSquares, about 25-30 points higher than models with intercepts? in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/487182#M73101</link>
    <description>&lt;P&gt;Yes, I am sending my example JMP file to JMP Support today.&amp;nbsp; The model without an intercept has a Generalized RSquare of 98%, while the model with an intercept has a Generalized RSquare of 55%.&amp;nbsp; I'll keep the community posted.&lt;/P&gt;</description>
    <pubDate>Mon, 16 May 2022 14:59:24 GMT</pubDate>
    <dc:creator>Liz_S</dc:creator>
    <dc:date>2022-05-16T14:59:24Z</dc:date>
    <item>
      <title>When running Logistic Regressions with no intercepts, has anyone observed very high General RSquares, about 25-30 points higher than models with intercepts?</title>
      <link>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/486173#M73015</link>
      <description>&lt;P&gt;On a couple of different projects with a 0/1 outcome variable to predict, I have noticed that running a logistic regression (generalized, with binomial variance) with the No Intercept box checked boosts the Generalized R Square substantially, by about 25 to 30 percentage points.&amp;nbsp; My latest model is for a rare event observed at about 1.2% in the experience-period population.&amp;nbsp; Predictive models with an intercept yield Generalized R Squares of about 0.60 to 0.67, while models without the intercept come in at about 0.95 to 0.98.&amp;nbsp; I do have some key variables that are highly predictive, so at first the Gen R Squares above 90% seemed reasonable.&amp;nbsp; But the confusion matrix shows more errors than I would like, even after lowering the threshold to 2%-5%.&amp;nbsp; I do like the idea that No Intercept implies a neutral log-odds for the constant term: since Intercept=0, the baseline probability is 0.50, like flipping a coin.&amp;nbsp; But perhaps that baseline is too easy to beat, inflating the Generalized R Square, which depends on the likelihood ratio of L0 (the intercept-only model) to LM (the fitted model with X predictors), particularly when I know beforehand (a priori) that the outcome event is rare.&amp;nbsp; So, while it would be great to write a brief reporting a Generalized R Square of 95%-98%, I think it would be more prudent and practical to use a model with an intercept that comes in at a Gen R Square of 60%.&amp;nbsp; Please respond if you have any advice for me.&amp;nbsp; Thanks!&lt;/P&gt;</description>
      <pubDate>Sat, 10 Jun 2023 20:50:26 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/486173#M73015</guid>
      <dc:creator>Liz_S</dc:creator>
      <dc:date>2023-06-10T20:50:26Z</dc:date>
    </item>
    <item>
      <title>Re: When running Logistic Regressions with no intercepts, has anyone observed very high General RSquares, about 25-30 points higher than models with intercepts?</title>
      <link>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/486304#M73034</link>
      <description>&lt;P&gt;I compared models with and without an intercept in a few examples and always observed the opposite trend: the R square metrics were better when the model included an intercept term. If you can reproduce the results that you reported, then I suggest contacting JMP Technical Support (&lt;A href="mailto:support@jmp.com" target="_blank"&gt;support@jmp.com&lt;/A&gt;) for a resolution. Please reply to this discussion to capture their findings for the benefit of the Community.&lt;/P&gt;</description>
      <pubDate>Thu, 12 May 2022 15:30:18 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/486304#M73034</guid>
      <dc:creator>Mark_Bailey</dc:creator>
      <dc:date>2022-05-12T15:30:18Z</dc:date>
    </item>
    <item>
      <title>Re: When running Logistic Regressions with no intercepts, has anyone observed very high General RSquares, about 25-30 points higher than models with intercepts?</title>
      <link>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/487182#M73101</link>
      <description>&lt;P&gt;Yes, I am sending my example JMP file to JMP Support today.&amp;nbsp; The model without an intercept has a Generalized RSquare of 98%, while the model with an intercept has a Generalized RSquare of 55%.&amp;nbsp; I'll keep the community posted.&lt;/P&gt;</description>
      <pubDate>Mon, 16 May 2022 14:59:24 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/487182#M73101</guid>
      <dc:creator>Liz_S</dc:creator>
      <dc:date>2022-05-16T14:59:24Z</dc:date>
    </item>
    <item>
      <title>Re: When running Logistic Regressions with no intercepts, has anyone observed very high General RSquares, about 25-30 points higher than models with intercepts?</title>
      <link>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/488203#M73161</link>
      <description>&lt;P&gt;Hello, this morning I received a detailed response from JMP Support with numerous suggestions that will help me work more efficiently in JMP when modeling, as well as some techniques I have not tried yet.&amp;nbsp; The response (from&amp;nbsp;Patrick Giuliano) referenced this article and advice: "The importance of "many approaches" leads to a common and defendable solution. From Lavine, M., Frequentist, Bayes, or Other? (Summarized in Editorial THE AMERICAN STATISTICIAN, 2019, VOL. 73, NO. S1, 1-19): 1. Look for and present results from many models that fit the data well. 2. Evaluate models, not just procedures."&lt;/P&gt;&lt;P&gt;Essentially, I learned that the very high Generalized R Squares (~98%) for the no-intercept models probably indicate a lack of stability; it was too strong an assumption to force the linear models through the origin.&amp;nbsp; Perhaps I should also revisit some of the modeling issues created by multicollinearity in the predictors.&amp;nbsp; It was a helpful reply!&amp;nbsp; I appreciate being able to reach out to JMP Support with my de-identified data and scripts.&amp;nbsp; Thanks much!&lt;/P&gt;</description>
      <pubDate>Wed, 18 May 2022 17:38:00 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/488203#M73161</guid>
      <dc:creator>Liz_S</dc:creator>
      <dc:date>2022-05-18T17:38:00Z</dc:date>
    </item>
    <item>
      <title>Re: When running Logistic Regressions with no intercepts, has anyone observed very high General RSquares, about 25-30 points higher than models with intercepts?</title>
      <link>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/488462#M73195</link>
      <description>&lt;P&gt;I'm glad that you got a helpful answer. Best of luck in all your modeling!&lt;/P&gt;</description>
      <pubDate>Thu, 19 May 2022 12:40:47 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/When-running-Logistic-Regressions-with-no-intercepts-has-anyone/m-p/488462#M73195</guid>
      <dc:creator>Mark_Bailey</dc:creator>
      <dc:date>2022-05-19T12:40:47Z</dc:date>
    </item>
  </channel>
</rss>

