<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Whitening the correlation using PCA in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371819#M62172</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/10363"&gt;@AT&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If I understand what you're after, you want to know how to go from the principal components back to X and Y, is that correct?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; For example, if you have a three-variable system (X1, X2, X3) and perform a PCA, you get Prin1, Prin2, and Prin3. These PCs are linear combinations of X1, X2, and X3, with coefficients given by the eigenvectors. You can get the eigenvalues and eigenvectors from the red triangle (hot button) next to Principal Components at the top of the report.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; The Eigenvector matrix shows you the coefficients of the linear combinations in either direction: what X1 is in terms of Prin1, Prin2, and Prin3 (a row in the table), or what Prin1 is in terms of X1, X2, and X3 (a column in the table). See the example screenshot below.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DiedrichSchmidt_0-1616783208631.png" style="width: 400px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/31606i4F1F86639B3CA97D/image-size/medium?v=v2&amp;amp;px=400" role="button" title="DiedrichSchmidt_0-1616783208631.png" alt="DiedrichSchmidt_0-1616783208631.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; The Eigenvector matrix should give you what you're looking for.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
    <pubDate>Fri, 26 Mar 2021 18:28:10 GMT</pubDate>
    <dc:creator>SDF1</dc:creator>
    <dc:date>2021-03-26T18:28:10Z</dc:date>
    <item>
      <title>Whitening the correlation using PCA</title>
      <link>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371753#M62170</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I have a simple Y vs. X with some linear correlation. I'd like to de-correlate (whiten) Y vs. X. I know how to use Principal Component Analysis to get the new, uncorrelated principal components of Y and X. How do you transform the new components back to the original X and Y scale in JMP?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I appreciate your help.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Adam&lt;/P&gt;</description>
      <pubDate>Fri, 09 Jun 2023 00:31:07 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371753#M62170</guid>
      <dc:creator>AT</dc:creator>
      <dc:date>2023-06-09T00:31:07Z</dc:date>
    </item>
    <item>
      <title>Re: Whitening the correlation using PCA</title>
      <link>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371819#M62172</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/10363"&gt;@AT&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; If I understand what you're after, you want to know how to go from the principal components back to X and Y, is that correct?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; For example, if you have a three-variable system (X1, X2, X3) and perform a PCA, you get Prin1, Prin2, and Prin3. These PCs are linear combinations of X1, X2, and X3, with coefficients given by the eigenvectors. You can get the eigenvalues and eigenvectors from the red triangle (hot button) next to Principal Components at the top of the report.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; The Eigenvector matrix shows you the coefficients of the linear combinations in either direction: what X1 is in terms of Prin1, Prin2, and Prin3 (a row in the table), or what Prin1 is in terms of X1, X2, and X3 (a column in the table). See the example screenshot below.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="DiedrichSchmidt_0-1616783208631.png" style="width: 400px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/31606i4F1F86639B3CA97D/image-size/medium?v=v2&amp;amp;px=400" role="button" title="DiedrichSchmidt_0-1616783208631.png" alt="DiedrichSchmidt_0-1616783208631.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; The Eigenvector matrix should give you what you're looking for.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;&lt;P&gt;DS&lt;/P&gt;</description>
      <pubDate>Fri, 26 Mar 2021 18:28:10 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371819#M62172</guid>
      <dc:creator>SDF1</dc:creator>
      <dc:date>2021-03-26T18:28:10Z</dc:date>
    </item>
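    <!-- A sketch of the reconstruction DS describes, as a hypothetical NumPy example (not JMP output, and the data are made up): the PC scores are the standardized variables times the eigenvector matrix, and because that matrix is orthogonal, multiplying the scores by its transpose goes back the other way.

    ```python
    import numpy as np

    # Made-up correlated data standing in for the poster's X and Y.
    rng = np.random.default_rng(0)
    x1 = rng.normal(size=200)
    x2 = 0.8 * x1 + 0.3 * rng.normal(size=200)
    X = np.column_stack([x1, x2])

    # Standardize, so the PCA is on correlations (as in JMP's default).
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

    # Columns of V are eigenvectors of the correlation matrix; sort so
    # Prin1 (largest eigenvalue) comes first.
    eigvals, V = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
    order = np.argsort(eigvals)[::-1]
    eigvals, V = eigvals[order], V[:, order]

    # Scores: each PC is a linear combination of the standardized
    # variables with eigenvector (not eigenvalue) coefficients.
    scores = Z @ V

    # Going back: V is orthogonal, so Z is recovered as scores @ V.T.
    Z_back = scores @ V.T
    ```

    The same round trip works for any number of variables; the eigenvector matrix shown in JMP's Eigenvectors report is this V.
    -->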
    <item>
      <title>Re: Whitening the correlation using PCA</title>
      <link>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371870#M62176</link>
      <description>&lt;P&gt;Hi DS,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for the quick response. After PCA, Prin1 and Prin2 are given in terms of X and Y (the two variables). I know my original X and Y, but I'd like to get a de-correlated Y vs. X. For instance, from a highly correlated weight vs. height, I'd like to get de-correlated, unscaled weight vs. height after removing the correlation effect.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks again.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Adam&lt;/P&gt;</description>
      <pubDate>Fri, 26 Mar 2021 20:47:06 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371870#M62176</guid>
      <dc:creator>AT</dc:creator>
      <dc:date>2021-03-26T20:47:06Z</dc:date>
    </item>
    <item>
      <title>Re: Whitening the correlation using PCA</title>
      <link>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371889#M62179</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/10363"&gt;@AT&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am a little confused about your goal here.&amp;nbsp; Here are two things I think you might be trying to do.&amp;nbsp; Is either close?&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;After accounting for the variation in height that corresponds with weight, what &lt;I&gt;other&amp;nbsp;&lt;/I&gt;variation in height is there?&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;To do this, you could run a PCA with height and weight and save the predicted values (PCA Platform &amp;gt; Red Triangle &amp;gt; Save Columns &amp;gt; Predicteds), and then subtract the predicted from the actual value using additional columns in your data table.&amp;nbsp; This gives you the&amp;nbsp;&lt;EM&gt;residual&lt;/EM&gt;, which is the variation in height that is not correlated with weight.&amp;nbsp; The residuals keep the units of the original variables, but their values will be centered around zero.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;OR you might mean:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Produce a single set of variables to represent both height and weight, such that the new variables are uncorrelated with&amp;nbsp;each other, and can be used in other analyses.&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;In this case you would not transform the variables back to the original space, but rather use them as latent variables.&amp;nbsp; To do this, save the principal component values (PCA Platform &amp;gt; Red Triangle &amp;gt; Save Columns &amp;gt; Principal Components). These latent variables do not have units, and when examining them in any other analysis, you would need to consider their loadings to see how they affect both height and weight.&lt;/P&gt;</description>
      <pubDate>Fri, 26 Mar 2021 22:10:34 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371889#M62179</guid>
      <dc:creator>ih</dc:creator>
      <dc:date>2021-03-26T22:10:34Z</dc:date>
    </item>
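    <!-- ih's first approach (save the PCA predicteds, then subtract them from the actuals) can be sketched outside JMP. This is a hypothetical NumPy illustration with made-up height/weight data, keeping only Prin1 to play the role of the "predicted" part; names and numbers are assumptions, not the thread's data.

    ```python
    import numpy as np

    # Made-up correlated height/weight data.
    rng = np.random.default_rng(1)
    height = rng.normal(170, 10, size=300)
    weight = 0.9 * (height - 170) + rng.normal(70, 5, size=300)
    X = np.column_stack([height, weight])

    # Standardize for a correlation-based PCA.
    mean, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
    Z = (X - mean) / sd

    # Eigenvectors of the correlation matrix, largest eigenvalue first.
    eigvals, V = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
    V = V[:, np.argsort(eigvals)[::-1]]

    # Keep only Prin1 and map its reconstruction back to original units,
    # mirroring Save Columns > Predicteds with one retained component.
    scores = Z @ V[:, :1]
    predicted = (scores @ V[:, :1].T) * sd + mean

    # Residual = actual - predicted: the variation in each variable not
    # shared with Prin1, in original units and centered around zero.
    residual = X - predicted
    ```

    The residual height column is what Adam asked for: height with the common height/weight component removed, still in the original units.
    -->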
    <item>
      <title>Re: Whitening the correlation using PCA</title>
      <link>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371901#M62181</link>
      <description>&lt;P&gt;Hi DS,&lt;/P&gt;&lt;P&gt;Thanks for your response, and sorry for the confusion. My goal is your first scenario: I'd like to remove the effect of height from weight and see the distribution of weight without the effect of height. Thanks for the help.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Adam&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 26 Mar 2021 23:57:56 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Whitening-the-correlation-using-PCA/m-p/371901#M62181</guid>
      <dc:creator>AT</dc:creator>
      <dc:date>2021-03-26T23:57:56Z</dc:date>
    </item>
  </channel>
</rss>

