<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic k-fold r2 in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/k-fold-r2/m-p/484230#M72859</link>
    <description>&lt;P&gt;I am performing a stepwise procedure and evaluating its performance with a k-fold R-squared. Currently, the only k-fold R-squared provided is the average across all 5 folds. Is it possible to obtain the R-squared for each individual fold in a model?&lt;/P&gt;</description>
    <pubDate>Fri, 09 Jun 2023 00:48:54 GMT</pubDate>
    <dc:creator>bbrown1</dc:creator>
    <dc:date>2023-06-09T00:48:54Z</dc:date>
    <item>
      <title>k-fold r2</title>
      <link>https://community.jmp.com/t5/Discussions/k-fold-r2/m-p/484230#M72859</link>
      <description>&lt;P&gt;I am performing a stepwise procedure and evaluating its performance with a k-fold R-squared. Currently, the only k-fold R-squared provided is the average across all 5 folds. Is it possible to obtain the R-squared for each individual fold in a model?&lt;/P&gt;</description>
      <pubDate>Fri, 09 Jun 2023 00:48:54 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/k-fold-r2/m-p/484230#M72859</guid>
      <dc:creator>bbrown1</dc:creator>
      <dc:date>2023-06-09T00:48:54Z</dc:date>
    </item>
    <item>
      <title>Re: k-fold r2</title>
      <link>https://community.jmp.com/t5/Discussions/k-fold-r2/m-p/484352#M72866</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.jmp.com/t5/user/viewprofilepage/user-id/39964"&gt;@bbrown1&lt;/a&gt;,&lt;BR /&gt;&lt;BR /&gt;To see the individual results for each fold, you can instead use the "Model Screening" platform ("Analyze", "Predictive Modeling", "Model Screening").&lt;BR /&gt;Enter your factors and response(s), check "K-fold cross-validation", specify the number of folds, and check "Fit Stepwise" in the "Methods" menu.&lt;BR /&gt;You should then see the individual fold results for training and validation, as in this screenshot:&amp;nbsp;&lt;/P&gt;&lt;DIV class=""&gt;&amp;nbsp;&lt;/DIV&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Victor_G_1-1651735401115.png" style="width: 400px;"&gt;&lt;img src="https://community.jmp.com/t5/image/serverpage/image-id/42250iA4469BD71C09004A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Victor_G_1-1651735401115.png" alt="Victor_G_1-1651735401115.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Be careful not to cherry-pick the fold with the best results: doing so means choosing the training and validation sets that maximize performance, which defeats the purpose of this validation technique (and of validation in general). K-fold cross-validation is better suited to assessing a model's robustness on small datasets, where no data is "lost" by being permanently excluded into a validation set.&lt;BR /&gt;You can find more information on cross-validation and model assessment in this discussion:&amp;nbsp;&lt;A href="https://community.jmp.com/t5/Discussions/How-good-is-K-fold-cross-validation-for-small-datasets/m-p/250294#M49129" target="_blank" rel="noopener"&gt;Solved: How good is K-fold cross validation for small datasets? - JMP User Community&lt;/A&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;I hope this helps,&lt;/P&gt;</description>
      <pubDate>Thu, 05 May 2022 07:36:06 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/k-fold-r2/m-p/484352#M72866</guid>
      <dc:creator>Victor_G</dc:creator>
      <dc:date>2022-05-05T07:36:06Z</dc:date>
    </item>
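    <!--
    The per-fold R-squared the question asks about can be illustrated outside JMP. This is a minimal sketch in Python with scikit-learn; the synthetic data and plain linear model are assumptions for illustration, not part of the original thread or of JMP's stepwise procedure:

    ```python
    # Per-fold R^2 via 5-fold cross-validation, analogous to the
    # individual fold results JMP's Model Screening platform reports.
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_score

    # Synthetic regression data (assumption: stands in for the user's dataset).
    X, y = make_regression(n_samples=100, n_features=5, noise=10.0, random_state=0)

    kfold = KFold(n_splits=5, shuffle=True, random_state=0)
    # One R^2 score per fold, rather than only the average.
    scores = cross_val_score(LinearRegression(), X, y, cv=kfold, scoring="r2")

    for i, r2 in enumerate(scores, start=1):
        print(f"Fold {i}: R^2 = {r2:.3f}")
    print(f"Mean R^2 across folds = {scores.mean():.3f}")
    ```

    Reporting `scores` directly, instead of only `scores.mean()`, is the per-fold breakdown discussed above.
    -->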
  </channel>
</rss>

