<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Set Probability Threshold / Confusion Matrix in Discussions</title>
    <link>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/39048#M22833</link>
    <description>Yes! Thank you!</description>
    <pubDate>Tue, 09 May 2017 21:34:18 GMT</pubDate>
    <dc:creator>AaronNHorvitz</dc:creator>
    <dc:date>2017-05-09T21:34:18Z</dc:date>
    <item>
      <title>Set Probability Threshold / Confusion Matrix</title>
      <link>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/38959#M22776</link>
      <description>&lt;P&gt;The default probability threshold is set to 0.5. That can make it difficult to compare models when one has higher specificity and lower sensitivity, but both have similar accuracies and positive predictive values. How do we change the probability threshold to compare model performance?&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;Aaron&lt;/P&gt;</description>
      <pubDate>Mon, 08 May 2017 07:41:37 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/38959#M22776</guid>
      <dc:creator>AaronNHorvitz</dc:creator>
      <dc:date>2017-05-08T07:41:37Z</dc:date>
    </item>
    <item>
      <title>Re: Set Probability Threshold / Confusion Matrix</title>
      <link>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/39043#M22830</link>
      <description>Aaron,&lt;BR /&gt;It is unclear which platform you are working in. If you happen to have JMP Pro and are fitting models with generalized regression, then one of my favorite JMP 13 features is the ability to interactively change the probability threshold and watch my sensitivity and specificity estimates change. But I digress.&lt;BR /&gt;&lt;BR /&gt;To explore various cut-offs on a model, I might save the model formula to my data table so that I can set up a column with the "model call" (use the formula score &amp;gt; cut-off to get a column of 0s and 1s) and use that column to evaluate model performance (thinking in the diagnostic-model world). If you are in the logistic platform and have ROC curves for your models, then the ROC Table will provide a "2x2" table for each possible cut-off in your data set. Finally, there is an add-in in the File Exchange that will calculate a series of performance measures and confidence intervals for diagnostic-type models:&lt;BR /&gt;&lt;BR /&gt;&lt;A href="https://community.jmp.com/t5/JMP-Add-Ins/Performance-Summary-for-Diagnostic-Tests/ta-p/22524" target="_blank"&gt;https://community.jmp.com/t5/JMP-Add-Ins/Performance-Summary-for-Diagnostic-Tests/ta-p/22524&lt;/A&gt;&lt;BR /&gt;&lt;BR /&gt;Hopefully something here is helpful for what you are trying to do.</description>
      <pubDate>Tue, 09 May 2017 20:14:49 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/39043#M22830</guid>
      <dc:creator>KarenC</dc:creator>
      <dc:date>2017-05-09T20:14:49Z</dc:date>
    </item>
    <item>
      <title>Re: Set Probability Threshold / Confusion Matrix</title>
      <link>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/39048#M22833</link>
      <description>Yes! Thank you!</description>
      <pubDate>Tue, 09 May 2017 21:34:18 GMT</pubDate>
      <guid>https://community.jmp.com/t5/Discussions/Set-Probability-Threshold-Confusion-Matrix/m-p/39048#M22833</guid>
      <dc:creator>AaronNHorvitz</dc:creator>
      <dc:date>2017-05-09T21:34:18Z</dc:date>
    </item>
  </channel>
</rss>

