<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Different slopes between lin. regression and main axis of density ellipse: why? in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/64974#M34273</link>
    <description>&lt;P&gt;Because the regression line assumes there is no error/variability in the X-values. All of the variability is in the Y. A density ellipse does not make that assumption. To see this, from your Fit Y by X plot, choose to Fit Orthogonal &amp;gt; Univariate Variances, Prin Comp. That line will be the major axis of the density ellipse.&lt;/P&gt;</description>
    <pubDate>Mon, 30 Jul 2018 16:48:13 GMT</pubDate>
    <dc:creator>Dan_Obermiller</dc:creator>
    <dc:date>2018-07-30T16:48:13Z</dc:date>
    <item>
      <title>Different slopes between lin. regression and main axis of density ellipse: why?</title>
      <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/64954#M34268</link>
      <description>&lt;P&gt;Hi JMP Community,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This may be a silly question, but I cannot figure out why the slope of a linear regression does not match that of the major axis of the density ellipse for the same data set (see below).&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DOT PLOT LIN REG + DENSITY ELIPSE.png" style="width: 400px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/11743i7D168188DFD7040A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="DOT PLOT LIN REG + DENSITY ELIPSE.png" alt="DOT PLOT LIN REG + DENSITY ELIPSE.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your help.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Sincerely,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;TS&lt;/P&gt;</description>
      <pubDate>Mon, 30 Jul 2018 16:20:16 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/64954#M34268</guid>
      <dc:creator>Thierry_S</dc:creator>
      <dc:date>2018-07-30T16:20:16Z</dc:date>
    </item>
    <item>
      <title>Re: Different slopes between lin. regression and main axis of density ellipse: why?</title>
      <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/64974#M34273</link>
      <description>&lt;P&gt;Because the regression line assumes there is no error/variability in the X-values. All of the variability is in the Y. A density ellipse does not make that assumption. To see this, from your Fit Y by X plot, choose to Fit Orthogonal &amp;gt; Univariate Variances, Prin Comp. That line will be the major axis of the density ellipse.&lt;/P&gt;</description>
      <pubDate>Mon, 30 Jul 2018 16:48:13 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/64974#M34273</guid>
      <dc:creator>Dan_Obermiller</dc:creator>
      <dc:date>2018-07-30T16:48:13Z</dc:date>
    </item>
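Editor's note: Dan's point can be checked numerically. The sketch below (plain Python, stdlib only, with made-up sample data) computes the ordinary least squares slope, which assumes all error is in Y, and the slope of the major axis of the covariance ellipse, which is the first eigenvector of the sample covariance matrix. The two generally differ, as the reply above explains.

```python
import math

def ols_slope(x, y):
    """Ordinary least squares slope of y on x (all error assumed in Y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def major_axis_slope(x, y):
    """Slope of the major axis of the covariance ellipse, i.e. the first
    eigenvector of the 2x2 sample covariance matrix."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / (n - 1)
    syy = sum((b - my) ** 2 for b in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] ...
    lam1 = 0.5 * (sxx + syy) + math.sqrt(0.25 * (sxx - syy) ** 2 + sxy ** 2)
    # ... and the slope of the corresponding eigenvector.
    return (lam1 - sxx) / sxy

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
print(ols_slope(x, y))         # 0.6 for this sample
print(major_axis_slope(x, y))  # about 0.721, steeper than the OLS slope
```

Note this is the axis of the unscaled covariance ellipse; how it appears on a plot also depends on the axis scaling, which is why the quoted reply points at the Fit Orthogonal options.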
    <item>
      <title>Re: Different slopes between lin. regression and main axis of density ellipse: why?</title>
      <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/233903#M46372</link>
      <description>&lt;P&gt;Hi Dan,&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I have a question about Fit Orthogonal: the difference between Univariate Variances, Prin Comp and Fit X and Y.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;As I understand it, Fit X and Y plots the orthogonal line with no error, while Univariate Variances, Prin Comp adds an error variance ratio to the calculation and finds the equation symmetric between the upper and lower orthogonal lines.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;My question is: how does JMP calculate the variance ratio, UpperCL, and LowerCL? I have been stuck on this for a long time.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I would very much appreciate your help. Thank you.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2019-11-15 15_55_48-Evans_at99 - Fit Y by X of CTQ_WRT_WDTH by PWT95 - JMP.png" style="width: 999px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/20321i2A62DDE8792A8F5E/image-size/large?v=v2&amp;amp;px=999" role="button" title="2019-11-15 15_55_48-Evans_at99 - Fit Y by X of CTQ_WRT_WDTH by PWT95 - JMP.png" alt="2019-11-15 15_55_48-Evans_at99 - Fit Y by X of CTQ_WRT_WDTH by PWT95 - JMP.png" /&gt;&lt;/span&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 15 Nov 2019 09:03:54 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/233903#M46372</guid>
      <dc:creator>Pisit_Seagate</dc:creator>
      <dc:date>2019-11-15T09:03:54Z</dc:date>
    </item>
    <item>
      <title>Re: Different slopes between lin. regression and main axis of density ellipse: why?</title>
      <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/233916#M46374</link>
      <description>&lt;P&gt;You could start with Help &amp;gt; Books &amp;gt; Basic Analysis &amp;gt; Bivariate. The Statistical Details section at the end of that chapter provides this explanation:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Fit Orthogonal&lt;/STRONG&gt;&lt;BR /&gt;Standard least squares fitting assumes that the X variable is fixed and the Y variable is a function of X plus error. If there is random variation in the measurement of X, you should fit a line that minimizes the sum of the squared perpendicular differences (Figure 5.26). However, the perpendicular distance depends on how X and Y are scaled, and the scaling for the perpendicular is reserved as a statistical issue, not a graphical one.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Figure 5.26 Line Perpendicular to the Line of Fit&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Capture.JPG" style="width: 386px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/20322i4CD219780EFA2955/image-size/large?v=v2&amp;amp;px=999" role="button" title="Capture.JPG" alt="Capture.JPG" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The fit requires that you specify the ratio of the variance of the error in Y to the error in X. This is the variance of the error, not the variance of the sample points, so you must choose carefully. The ratio is infinite in standard least squares because the error variance in X is zero. If you do an orthogonal fit with a large error ratio, the fitted line approaches the standard least squares line of fit. If you specify a ratio of zero, the fit is equivalent to the regression of X on Y, instead of Y on X.&lt;/P&gt;
&lt;P&gt;The most common use of this technique is in comparing two measurement systems that both have errors in measuring the same value. Thus, the Y response error and the X measurement error are both the same type of measurement error. Where do you get the measurement error variances? You cannot get them from bivariate data because you cannot tell which measurement system produces what proportion of the error. So, you either must blindly assume some ratio like 1, or you must rely on separate repeated measurements of the same unit by the two measurement systems.&lt;/P&gt;
&lt;P&gt;An advantage to this approach is that the computations give you predicted values for both Y and X; the predicted values are the point on the line that is closest to the data point, where closeness is relative to the variance ratio.&lt;/P&gt;
&lt;P&gt;Confidence limits are calculated as described in Tan and Iglewicz (1999).&lt;/P&gt;</description>
      <pubDate>Fri, 15 Nov 2019 10:58:36 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/233916#M46374</guid>
      <dc:creator>Mark_Bailey</dc:creator>
      <dc:date>2019-11-15T10:58:36Z</dc:date>
    </item>
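Editor's note: the behavior the excerpt describes can be sketched with the standard closed-form Deming (errors-in-variables) slope, where the analyst supplies lam, the assumed ratio of the Y error variance to the X error variance. This is a stdlib-only illustration on made-up data, not JMP's internal code: a very large ratio approaches ordinary least squares of Y on X, and a ratio of zero reproduces the regression of X on Y.

```python
import math

def deming_slope(x, y, lam):
    """Deming slope of y on x for an assumed error-variance ratio
    lam = Var(error in Y) / Var(error in X)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / (n - 1)
    syy = sum((b - my) ** 2 for b in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    d = syy - lam * sxx
    # Positive root of the quadratic that minimizes the lam-weighted
    # squared distances from the points to the line.
    return (d + math.sqrt(d * d + 4.0 * lam * sxy * sxy)) / (2.0 * sxy)

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
print(deming_slope(x, y, 1e9))  # approaches the OLS slope (0.6 here)
print(deming_slope(x, y, 0.0))  # syy/sxy, the X-on-Y line expressed as Y vs X
```

The fitted line always passes through the mean point, so the intercept follows as mean(y) minus slope times mean(x).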
    <item>
      <title>Re: Different slopes between lin. regression and main axis of density ellipse: why?</title>
      <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/238299#M47073</link>
      <description>&lt;P&gt;Hi Mark,&lt;/P&gt;&lt;P&gt;Thank you for your comment. After studying &lt;STRONG&gt;Fit Orthogonal with a variance ratio&lt;/STRONG&gt; in the JMP documentation, I now understand the concept and how to calculate the intercept and slope when the variance ratio = 0.&lt;/P&gt;&lt;P&gt;However, the documentation does not explain &lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;how to calculate the slope and intercept when the variance ratio is not equal to 0.&lt;/FONT&gt;&lt;/STRONG&gt; Do you have any knowledge about this?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="2019-12-12 10_53_28-.png" style="width: 633px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/20618i08EAF0FFE804029E/image-dimensions/633x729?v=v2" width="633" height="729" role="button" title="2019-12-12 10_53_28-.png" alt="2019-12-12 10_53_28-.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 12 Dec 2019 04:00:15 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/238299#M47073</guid>
      <dc:creator>Pisit_Seagate</dc:creator>
      <dc:date>2019-12-12T04:00:15Z</dc:date>
    </item>
    <item>
      <title>Re: Different slopes between lin. regression and main axis of density elipse: why?</title>
      <link>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/238336#M47086</link>
      <description>&lt;P&gt;See this &lt;A href="https://en.wikipedia.org/wiki/Total_least_squares" target="_self"&gt;article&lt;/A&gt; for background on the computation of the orthogonal regression estimates.&lt;/P&gt;</description>
      <pubDate>Thu, 12 Dec 2019 12:59:44 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Different-slopes-between-lin-regression-and-main-axis-of-density/m-p/238336#M47086</guid>
      <dc:creator>Mark_Bailey</dc:creator>
      <dc:date>2019-12-12T12:59:44Z</dc:date>
    </item>
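Editor's note: for the ratio-equals-one case the linked article covers, total least squares has a simple closed form: minimize the sum of squared perpendicular distances from the points to the line. The sketch below (stdlib only, made-up data) computes that fit and checks numerically that perturbing the slope only increases the perpendicular error.

```python
import math

def tls_fit(x, y):
    """Total least squares (orthogonal regression, error ratio 1):
    returns (slope, intercept); the line passes through the mean point."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
    return slope, my - slope * mx

def perp_sse(x, y, slope, intercept):
    """Sum of squared perpendicular distances to the line y = slope*x + intercept."""
    return sum((b - slope * a - intercept) ** 2 for a, b in zip(x, y)) / (1.0 + slope ** 2)

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
slope, intercept = tls_fit(x, y)
best = perp_sse(x, y, slope, intercept)
# Re-anchor a perturbed slope at the mean point (3, 4) for a fair comparison;
# the closed-form slope should give the smaller perpendicular error.
worse = perp_sse(x, y, slope + 0.1, 4.0 - (slope + 0.1) * 3.0)
print(best, worse)
```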
  </channel>
</rss>

