mmeewes
Level II

How to offset Spec Limits

Hi all, I work for a food manufacturer currently making granola bars. This is a new process and we are finding significant bias in the weight readings on our X-Ray system. My analysis seems to show that anything north of the teach point picks up positive bias and anything south of the teach point picks up negative bias. Because of this bias, we are rejecting bars that are within specification as they approach the spec limits. In my mind, I've shown sufficient evidence that values north of the teach point always carry some positive bias, and the same goes for anything south. From here, I want to adjust our spec limits on the X-Ray machines to account for this bias. Can anyone offer a recommendation on how I should find this intercept for each spec? Thanks

Byron_JMP
Staff

Re: How to offset Spec Limits

Maybe try fitting a 4P Rodbard model, then saving the inverse prediction formula and using that for bias correction.

Works well for all the Teach Points, maybe a little less well on the USL Chewy, but still pretty good.

 


//Fit :Data vs. :Standard with a 4P Rodbard curve for each Teach Point
Fit Curve(
	Y( :Data ),
	X( :Standard ),
	Group( :Teach Point ),
	Fit Logistic 4P Rodbard(
		Plot Actual by Predicted( 1 ),
		Plot Residual by Predicted( 1 )
	),
	SendToReport(
		Dispatch(
			{"Logistic 4P Rodbard", "Residual by Predicted Plot"},
			"Fit Nonlinear Diagnostic Plots",
			FrameBox,
			{Frame Size( 434, 230 )}
		)
	)
);

 

 

//This is the inverse prediction formula

Match( :Teach Point,
	"LSL Chewy",
		24.9145298299807 * Exp(
			Log(
				(32.3147049980956 - 18.3751462995063) / (:Data - 18.3751462995063)
				 - 1
			) / -8.80085037624688
		),
	"23g Learning",
		25.3324944849549 * Exp(
			Log(
				(33.3922517886217 - 18.0024302265495) / (:Data - 18.0024302265495)
				 - 1
			) / -7.88036690230438
		),
	"24g Learning",
		25.5173152024576 * Exp(
			Log(
				(33.3698484779835 - 18.1286639991536) / (:Data - 18.1286639991536)
				 - 1
			) / -8.03788584010198
		),
	"USL Chewy",
		24.8653808538003 * Exp(
			Log(
				(31.1879817143387 - 18.9552914323179) / (:Data - 18.9552914323179)
				 - 1
			) / -10.2749089803876
		),
	.
)
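
If the inverse prediction column isn't saved directly from the platform, the same correction can be wrapped in a table column by hand. A minimal sketch, with a hypothetical column name and only the "USL Chewy" branch reproduced for brevity (the full Match() above covers all four teach points):

dt = Current Data Table();
dt << New Column( "Corrected Weight", Numeric, "Continuous",
	Formula(
		//USL Chewy branch of the inverse prediction: estimated true weight from :Data
		24.8653808538003 * Exp(
			Log(
				(31.1879817143387 - 18.9552914323179) / (:Data - 18.9552914323179) - 1
			) / -10.2749089803876
		)
	)
);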
JMP Systems Engineer, Health and Life Sciences (Pharma)
mmeewes
Level II

Re: How to offset Spec Limits

In your script I see standard predictor as the regressor. What is that referencing?
Byron_JMP
Staff

Re: How to offset Spec Limits

Oops, I pasted the wrong script, sorry about that; it's corrected now.

 

The model is fitting :Data vs. :Standard

Then I saved the inverse prediction column, which predicts the standard from the :Data column. 

In practice, this "transformation" will correct the bias from the non-linearity issue in the measurement system.

 

Something to try:

1. Use Fit Curve for Data vs. Standard, fit a 4P Rodbard model, then save columns / save inverse prediction column.

2. Use Fit Curve for Data vs. Standard, group by Teach Point, fit a 4P Rodbard model, then save columns / save inverse prediction column.

3. Analyze/Quality/Variability. In the dialog, :Standard goes into the Standard role, :Data and the two columns you just made go into the Y response role, Teach Point goes into the X grouping role, and Sample ID goes into the Sample ID role.

 

//this works too
Variability Chart(
    Y( :Data, :Standard Predictor, :Standard Predictor by Teach Point ),
    X( :Teach Point, :Sample ID ),
    Standard( :Standard )
);

Now that the Variability report is up, hold down the Ctrl key, and from the red triangle menu go to Gauge Studies and pick Bias Report and also Linearity report (click Cancel for the sigma question). Holding Ctrl broadcasts the command to every chart in the report.

The reports (if this worked) will show you that in the transformed (inverse prediction formula) columns the bias is smaller, and that there is little or no linearity in the bias.

 

This was a very fun data set, thank you

 

JMP Systems Engineer, Health and Life Sciences (Pharma)
mmeewes
Level II

Re: How to offset Spec Limits

Thank you for the awesome analysis, and I'm glad you enjoyed the data set! Now in practical terms, should I take one of the two prediction formulas, run them through the profiler, and then type in either spec? The predicted value should essentially tell me what I should change the spec values to in order to offset that bias, right?
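
For illustration, here is a minimal JSL sketch of that idea: it solves the "USL Chewy" branch of the inverse prediction formula for :Data at an assumed spec value, giving the raw reading that the correction maps back to the spec. The 26 g limit and the variable names are hypothetical; substitute the real spec and the branch for the teach point in question.

// Parameters copied from the "USL Chewy" branch of the inverse prediction formula
lower = 18.9552914323179;   // lower asymptote
upper = 31.1879817143387;   // upper asymptote
infl  = 24.8653808538003;   // inflection point
pwr   = -10.2749089803876;  // power term (the divisor inside the Log expression)

trueSpec = 26;              // hypothetical USL in grams; use the real spec limit here

// Solving  trueSpec = infl * Exp( Log( (upper - lower)/(Data - lower) - 1 ) / pwr )
// for Data gives the raw X-Ray reading that corresponds to the true spec weight
adjustedLimit = lower + (upper - lower) / (1 + (trueSpec / infl) ^ pwr);
Show( adjustedLimit );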
