I'd like to know why fitting a lognormal distribution to a data set yields a different standard deviation than fitting a normal distribution to the log-transformed data. For example, suppose I have 10 datapoints: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. When I fit a lognormal distribution, I get a standard deviation of 0.6954075. When I log-transform the 10 datapoints and fit a normal distribution, I get a standard deviation of 0.7330239. See the snippet below. Thank you!
![JMP report snippet](https://community.jmp.com/t5/image/serverpage/image-id/48884iCAC6F80EF91C0744/image-dimensions/792x94?v=v2)
JMP Version: 16.2
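For reference, the two reported values are consistent with the two fits using different variance estimators: the lognormal fit reporting the maximum-likelihood estimate of sigma (sum of squared deviations of the logs divided by n), and the normal fit reporting the usual sample standard deviation (divided by n - 1). This is an assumption about what JMP computes internally, but a minimal Python sketch reproduces both numbers from that premise:

```python
import math

# The 10 datapoints from the question.
data = range(1, 11)
logs = [math.log(x) for x in data]
n = len(logs)

mean_log = sum(logs) / n
ss = sum((v - mean_log) ** 2 for v in logs)  # sum of squared deviations

# Assumed MLE of sigma (divide by n) -- matches the lognormal fit's 0.6954075.
sigma_mle = math.sqrt(ss / n)

# Assumed sample estimate (divide by n - 1) -- matches the normal fit's 0.7330239.
s_sample = math.sqrt(ss / (n - 1))

print(sigma_mle)  # ≈ 0.6954
print(s_sample)   # ≈ 0.7330
```

The two estimates differ only by the factor sqrt((n - 1) / n), so with n = 10 the ratio is sqrt(0.9) ≈ 0.9487, which accounts for the entire gap between 0.6954075 and 0.7330239.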