Hello all,
In this toy example I want to do a capability analysis on my data. The dataset follows a lognormal distribution, so I apply a Box-Cox transformation and the data becomes normally distributed. So far so good, but when I transform the specification limits with the same transform that was used for the data, I do not get the same number of data points within and outside the specification limits (a minimal Python sketch of the steps is included below the screenshots).
My question is: how do I correctly transform my specification limits using a Box-Cox transform?
Original data (lognormal):
![asd11.png](https://community.jmp.com/t5/image/serverpage/image-id/50688i246B8DB2A684E102/image-size/medium?v=v2&px=400)
The Box-Cox transform (lambda is almost 0, as expected since the original data is lognormal):
![asd3.png](https://community.jmp.com/t5/image/serverpage/image-id/50689i6D0767EFA3255328/image-size/large?v=v2&px=999)
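For reference, the usual textbook form of the Box-Cox transform (JMP's implementation may scale it slightly differently) is

$$
y(\lambda) =
\begin{cases}
\dfrac{x^{\lambda} - 1}{\lambda}, & \lambda \neq 0,\\[1ex]
\ln x, & \lambda = 0,
\end{cases}
$$

so a lambda near 0 is essentially a log transform, which fits the data being lognormal.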
The transformed data with the transformed (apparently wrong) spec limits; notice that the ratio of points inside and outside the spec limits is different:
![asd1.png](https://community.jmp.com/t5/image/serverpage/image-id/50690i24878178FDCCAC9B/image-size/medium?v=v2&px=400)
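In case it helps to reproduce the issue outside JMP, here is a minimal sketch of what I am trying to do, written with NumPy/SciPy. The data and spec limits are made up for illustration, and SciPy's Box-Cox parameterisation is assumed, which may not match JMP's exactly:

```python
import numpy as np
from scipy import stats
from scipy.special import boxcox  # same (x**lmbda - 1)/lmbda convention as stats.boxcox

# Made-up lognormal process data and spec limits (illustrative only, not my real values)
rng = np.random.default_rng(1)
data = rng.lognormal(mean=1.0, sigma=0.5, size=500)
lsl, usl = 1.0, 8.0  # hypothetical lower/upper specification limits on the original scale

# Fit lambda by maximum likelihood; it should come out close to 0 for lognormal data
transformed, lmbda = stats.boxcox(data)
print(f"estimated lambda = {lmbda:.3f}")

# Transform the spec limits with the SAME lambda used for the data
t_lsl = boxcox(lsl, lmbda)
t_usl = boxcox(usl, lmbda)

# The transform is monotone increasing for positive x, so I expect the counts
# inside/outside the limits to be identical before and after the transform
inside_raw = int(np.sum((data >= lsl) & (data <= usl)))
inside_trf = int(np.sum((transformed >= t_lsl) & (transformed <= t_usl)))
print(f"inside spec (raw) = {inside_raw}, inside spec (transformed) = {inside_trf}")
```

In this sketch the two counts come out equal, which is the behaviour I expected to see in JMP as well.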
Best regards,
Mathias