I'm transforming some data into Z-scores. A friend transformed the data in Excel and the values came out slightly different from those in JMP. While trying to track down why, I found that the variance and standard deviation in Excel differed from JMP's. I decided to calculate the standard deviation by hand, using Excel as a glorified calculator, and my variance and standard deviation still came out different from what JMP was giving me.

For example, Excel gives mean = 1.86, var = 0.1482, st. dev. = 0.3850, while JMP gives mean = 1.8533, var = 0.1512, st. dev. = 0.38889. You can see they are close, but still off. I'm not exactly sure how JMP got that mean; even with a hand-held calculator the mean is exactly 1.86. And even if I use JMP's mean to calculate the variance and standard deviation in Excel, I get different numbers (using mean = 1.8533 gives var = 0.1483 and st. dev. = 0.3851 in Excel). Note that the variance in each case is calculated as a sample variance (n-1).
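For reference, here's a minimal sketch of the by-hand calculation I'm describing, in Python with placeholder values rather than my actual data:

```python
import math

# Placeholder data -- not my actual data set
data = [1.5, 2.1, 1.8, 2.3, 1.6]

n = len(data)
mean = sum(data) / n

# Sample variance uses the n-1 (Bessel-corrected) denominator,
# which is what Excel's VAR/VAR.S computes and what I understand
# JMP's default summary statistics to use as well.
var = sum((x - mean) ** 2 for x in data) / (n - 1)
std = math.sqrt(var)

print(f"mean = {mean:.4f}, var = {var:.4f}, st. dev. = {std:.4f}")
```

Since both programs should be using this same n-1 formula, I'd expect the results to match.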
Does anyone know why JMP is giving me higher values? This matters because it makes the difference between P = 0.05 and P = 0.06.
Thanks for any help!
Disregard! I found the error. One value in my data set changed between Excel and JMP, and I completely blanked on updating that value in Excel. Dumb mistake!!!