Hi @Maureen,
It sounds like you want to do something like a Profit Matrix or cost-sensitive learning. In JMP, you could set this up with a nominal logistic fit in the Fit Model platform.
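If it helps to see the idea outside of JMP, here's a minimal Python sketch of a cost-sensitive decision rule. The profit matrix entries and the predicted probabilities are made-up numbers, just to show the mechanics (pick the predicted class that maximizes expected profit, not the most probable one):

```python
import numpy as np

# Hypothetical profit matrix: rows = true class, cols = predicted class.
# Values are illustrative only, not from your data.
profit = np.array([
    [ 1.0, -5.0],   # true 0: reward TN, penalize FP heavily
    [-2.0,  3.0],   # true 1: penalize FN, reward TP
])

# Predicted P(class = 1) from a logistic fit (made-up numbers).
p1 = np.array([0.10, 0.45, 0.80, 0.60])
probs = np.column_stack([1 - p1, p1])   # P(class 0), P(class 1)

# Expected profit of each possible decision = probs @ profit;
# choose the column (predicted class) that maximizes it.
expected = probs @ profit
decision = expected.argmax(axis=1)
print(decision)   # [0 0 1 1] with these particular numbers
```

The effect is to shift the classification threshold away from 0.5 according to how costly each kind of error is, which is essentially what the Profit Matrix does for you.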
Either that, or you would have to replicate your experiment several times to gather enough data for each method that you can put p-values on the summary statistics of sensitivity, specificity, etc.
For example, if Run 1 of the experiment results in:
Ref = TP
A = FP
B = TP
C = FN
But if you repeat Run 1 (after you've gone through all the other runs), maybe you get:
Ref = TP
A = FP
B = FN
C = TP
This would result in Run 1 having different counts for the different methods. If you then do a Fit Y by X, with the counts for each run on the y-axis and the method (Ref, A, B, C) on the x-axis, then each method (except maybe the reference method) should show a spread in the counts, which in turn gives slightly different summary statistics. The ANOVA then compares the means of those summary statistics, relative to their spread, to find whether there are any significant differences between the methods. Since you'd be comparing four different methods, you'd want to compare all pairs with Tukey-Kramer rather than with Student's t tests (a sketch of this is below).
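Outside of JMP, here's a minimal Python sketch of that comparison using scipy's f_oneway and tukey_hsd. The per-run sensitivities are made-up placeholder numbers; you'd replace them with values tallied from your replicated runs:

```python
import numpy as np
from scipy.stats import f_oneway, tukey_hsd

# Hypothetical per-run sensitivities (TP / (TP + FN)) for each method,
# one value per replicate run; replace with your own tallied values.
ref = np.array([0.95, 0.93, 0.96, 0.94, 0.95])
a   = np.array([0.88, 0.90, 0.85, 0.89, 0.87])
b   = np.array([0.91, 0.93, 0.90, 0.92, 0.94])
c   = np.array([0.86, 0.84, 0.88, 0.85, 0.87])

# One-way ANOVA: do the method means differ at all?
f_stat, p_val = f_oneway(ref, a, b, c)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Tukey's HSD (Tukey-Kramer when group sizes differ):
# which pairs of methods actually differ?
print(tukey_hsd(ref, a, b, c))
```

In JMP itself this is just the Oneway report from Fit Y by X with the all-pairs Tukey comparison turned on; the script is only there to show what's being computed.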
Another option would be to look into a Bland-Altman analysis. You'd have to recode the TP, TN, FP, FN to actual numerical values. See for example here. This analysis plots the difference between one method and the reference against the average of the two. In other words, it would look at DIF = Ref - A (on the y-axis) versus AVG = (Ref + A)/2 (on the x-axis). You could do this analysis on the FP, TN, etc. counts or on the summary statistics (see the sketch after this paragraph).
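Again as a rough Python sketch of the arithmetic, with made-up per-run values for the reference and method A (the bias and 95% limits of agreement are the usual Bland-Altman quantities):

```python
import numpy as np

# Hypothetical per-run summary statistic (e.g. sensitivity) for the
# reference method and for method A; replace with your recoded values.
ref = np.array([0.95, 0.93, 0.96, 0.94, 0.95])
a   = np.array([0.88, 0.90, 0.85, 0.89, 0.87])

dif = ref - a            # DIF = Ref - A (y-axis)
avg = (ref + a) / 2      # AVG = (Ref + A)/2 (x-axis)

bias = dif.mean()        # mean difference (bias)
sd   = dif.std(ddof=1)   # SD of the differences
loa  = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"bias = {bias:.3f}, "
      f"limits of agreement = ({loa[0]:.3f}, {loa[1]:.3f})")
```

Plotting dif against avg with horizontal lines at the bias and the two limits gives the standard Bland-Altman plot.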
Depending on how many rows (experimental runs) you have, I think you might be better off with an analysis like the one in the first paragraph.
As mentioned before, if you can share your data table (you can always anonymize it first via Tables > Anonymize), it would be a lot easier to determine exactly how to help you.
Good luck!
DS