
Statistical Process Control for Process Variables that have a Functional Form (2020-US-30MP-621)

Level: Intermediate

 

Steve Hampton, Process Control Manager, PCC Structurals
Jordan Hiller, JMP Senior Systems Engineer, JMP

 

Many manufacturing processes produce streams of sensor data that reflect the health of the process. In our business case, thermocouple curves are key process variables in a manufacturing plant. The process produces a series of sensor measurements over time, forming a functional curve for each manufacturing run. These curves have complex shapes, and blunt univariate summary statistics do not capture key shifts in the process. Traditional SPC methods can only use point measures, missing much of the richness and nuance present in the sensor streams. Forcing functional sensor streams into traditional SPC methods leaves valuable data on the table, reducing the business value of collecting this data in the first place. This discrepancy was the motivator for us to explore new techniques for SPC with sensor stream data. In this presentation, we discuss two tools in JMP — the Functional Data Explorer and the Model Driven Multivariate Control Chart — and how together they can be used to apply SPC methods to the complex functional curves that are produced by sensors over time. Using the business case data, we explore different approaches and suggest best practices, areas for future work and software development.

 

 

Auto-generated transcript...

 



Jordan Hiller Hi everybody. I'm Jordan Hiller, senior systems engineer at JMP, and I'm presenting with Steve Hampton, process control manager at PCC Structurals. Today we're talking about statistical process control for process variables that have a functional form.
  And that's a nice picture right there on the title slide. We're talking about statistical process control when it's not a single number, a point measure, but instead the thing that we're trying to control has the shape of a functional curve.
  Steve's going to talk through the business case, why we're interested in that in a few minutes. I'm just going to say a few words about methodology.
  We reviewed the literature in this area for the last 20 years or so. There are many, many papers on this topic. However, there doesn't really appear to be a clear consensus about the best way to approach statistical process control when your variables take the form of a curve. So we were inspired by some recent developments in JMP, specifically the model driven multivariate control chart introduced in JMP 15 and the functional data explorer introduced in JMP 14.
  Multivariate control charts are not really a new technique; they've been around for a long time. They just got a facelift in JMP recently.
  And they use either principal components or partial least squares to model and reduce many, many process variables so that you can look at them with a single chart. We're going to focus on the PCA case; we're not really going to talk about partial least squares here.
  Functional Data Explorer is the method we use in JMP to work with data in the shape of a curve, functional data. And it uses a form of principal components analysis, an extension of principal components analysis for functional data.
  So it was a very natural idea to say: what if we take our functional curves and reduce and model them using the Functional Data Explorer?
  The result of that is functional principal components, and just as you would take regular principal components and push them through a model driven multivariate control chart,
  what if we could do that with functional principal components? Would that be feasible, and would it be useful?
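As a rough illustration of this idea, and not JMP's actual implementation, here is a minimal Python sketch: it reduces a set of curves to a few principal component scores and then monitors new curves with a Hotelling's T-squared statistic on those scores, which is the essence of a PCA-based model driven multivariate control chart. The simulated data and the choice of three components are made up for the example.

```python
import numpy as np

# Simulated stand-in for the real sensor data: one row per batch,
# one column per (aligned) time point on the curve.
rng = np.random.default_rng(0)
historic = rng.normal(size=(100, 200)).cumsum(axis=1)  # in-control reference batches
current = rng.normal(size=(20, 200)).cumsum(axis=1)    # new batches to monitor

# "Functional" PCA approximated here by ordinary PCA on the discretized curves;
# JMP's FDE fits a smooth basis first, but the monitoring idea is the same.
mean_curve = historic.mean(axis=0)
centered = historic - mean_curve
_, _, Vt = np.linalg.svd(centered, full_matrices=False)

k = 3                                    # number of principal components kept
scores_hist = centered @ Vt[:k].T        # scores for the historic batches
scores_new = (current - mean_curve) @ Vt[:k].T

# Hotelling's T^2 on the scores: large values flag unusual curve shapes.
cov_inv = np.linalg.inv(np.cov(scores_hist, rowvar=False))
center = scores_hist.mean(axis=0)

def t_squared(scores):
    d = scores - center
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

print(t_squared(scores_new).round(2))    # compare against a chosen control limit
```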
  So with that, I'll turn things over to Steve and he will introduce the business case that we're going to discuss today.
Steve Hampton All right. Thank you very much, Jordan.
  Since I do not have video, I decided to let you guys know what I look like.
  There's me with my wife Megan and my son Ethan at last year's pumpkin patch. So I wanted to step into the case study with a little background on what I do, so you have an idea of where this information is coming from. I work in investment casting, for PCC's Investment Casting Division.
  Investment casting involves making a wax replica of what you want to sell, putting it into a pattern assembly, and dipping it multiple times in proprietary concrete until you get enough strength to be able to dewax that mold.
  Then we fire it so it has enough strength for us to pour metal into it. Then we knock off our concrete, take off the excess metal used for the casting process, do our non-destructive testing, and ship the part.
  The drive for looking at improved process control methods is the fact that Steps 7, 8, and 9 take up 75% of the standing costs because of process variability in Steps 1-6. So if we can tighten up Steps 1-6, where the work is much cheaper and much shorter, there is a large value add for the company and for our customers in making Steps 7, 8, and 9 much smaller.
  So PCC Structurals. My plant, the Titanium Plant, makes mostly aerospace components. On the left there you can see a fan ??? that is glowing green from some ??? developer.
  And then we have our land-based products, and right there is an N155 howitzer stabilizer leg.
  And just to give you an idea of where it goes, because every single airplane up in the sky basically has a part we make, or multiple parts: this is an engine section ???, it's about six feet in diameter, and it's a one-piece casting that goes into the very front of the core of a gas turbine engine. This one in particular is for the Trent XWB that powers the Airbus A350 jets.
  So let's get into JMP. The big driver here is that, as you can imagine, with something as complex as an investment casting process for a large part, there are tons of data coming our way. And more and more, it's becoming functional data as we increase the number of sensors we have and the number of machines that we use. So in this case study, we are looking at data that comes with a timestamp. We have 145 batches. We have our variable of interest, which is X1.
  We have our counter, which is a way that I've normalized that timestamp, so it's easier to overlay the runs in Graph Builder and it also has a little bit of added niceness in the FDE platform. We have our period, which gives us a historic period and a current period that line up with the model driven multivariate control chart platform, so that we can have our FDE look only at the historic data and it's not changing as we add more current data. This is how it would look if you were using this in practice. And then the test type is my own validation attempt: what you'll see here is that I've gone in and tagged things as bad, marginal, or good. So red is bad, purple is marginal, and green is good, and you can see how they overlay.
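A small sketch of how a table like the one just described might be prepared outside JMP, assuming a long-format file with one row per sensor reading; the file name, column names, and the historic/current cutoff are invented for the example.

```python
import numpy as np
import pandas as pd

# Hypothetical long-format sensor table: one row per thermocouple reading.
df = pd.read_csv("thermocouple_readings.csv")        # columns: batch, timestamp, X1
df["timestamp"] = pd.to_datetime(df["timestamp"])

# Counter: normalize the timestamp within each batch so the runs overlay cleanly.
t0 = df.groupby("batch")["timestamp"].transform("min")
df["counter"] = (df["timestamp"] - t0).dt.total_seconds()

# Period: mark which batches are "Historic" (used to fit the FDE model and limits)
# versus "Current" (only monitored). The cutoff batch number is made up here.
df["period"] = np.where(df["batch"] <= 100, "Historic", "Current")
```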
  Off the bat, you can see that we have some curves that deviate quite a bit from the mean. These are obviously what we will call out of control, or bad.
  This would be what manufacturing calls a disaster, because that would be discrepant product. So we want to be able to identify those earlier, so that we can go look at what's going on in the process and fix it.
  This is what it looks like broken out: you can see that the bad curves have some major deviations, sometimes from the mean curve, and a lot of character towards the end.
  The marginal ones are not quite as deviant from the mean curve but have more bouncing towards the tail, and the good ones are pretty tight. You can see there's still some bouncing. So the split between marginal and good is really based upon my judgment, and I would probably fail an attribute Gage R&R based on just visually looking at this.
  We have a total of 33 bad curves, 45 marginal, and 67 good. And just visually, you can see about 10 of them are out. So if you didn't want to use a point estimate, which I'll show a little bit later doesn't work that great, you would have the option of trying to
  control the curves point by point using the counter. How you do that would be to split the data table by counter, put it into an individual moving range control chart through Control Chart Builder, and then you would get out, like, 3,500 control charts in this case. You can then use the awesome ability to make combined data tables to turn the limits summary from each chart into its own data table, link that back to your main data table, and you get a pretty cool-looking
  analysis that looks like this, where you have control limits based upon the counters and the historic data, and you can overlay your curves. So if you had an algorithm that would flag whenever a curve went outside the control limits, that would be one option for getting control chart functionality with functional data. But you can see, especially with batch 38, which I've highlighted here, that you can have some major deviation and still stay within the control limits.
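Here is a rough sketch of that point-by-point alternative in generic Python, rather than the JMP Control Chart Builder workflow: compute individuals-chart style limits at each counter value from the historic batches, then flag any batch that wanders outside. It reuses the hypothetical table from the earlier sketch, and it illustrates the limitation just mentioned, since a curve can change shape while every point stays inside its own limits.

```python
import pandas as pd

# df as prepared in the earlier sketch, with batch, counter, X1, and period columns.
hist = df[df["period"] == "Historic"].sort_values("batch")

# Individuals-chart style limits at each counter value:
# center +/- 2.66 * average moving range across the historic batches.
g = hist.groupby("counter")["X1"]
center = g.mean()
mr_bar = g.apply(lambda x: x.diff().abs().mean())
limits = pd.DataFrame({"center": center,
                       "lcl": center - 2.66 * mr_bar,
                       "ucl": center + 2.66 * mr_bar}).reset_index()

# Flag any batch with a reading outside its pointwise limits.
merged = df.merge(limits, on="counter")
outside = merged[(merged["X1"] > merged["ucl"]) | (merged["X1"] < merged["lcl"])]
print(sorted(outside["batch"].unique()))
```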
  So that's where the FDE platform really can shine, in that it can identify an FPC that corresponds with some of these major deviations, and so we can tag the curves based upon those FPCs.
  And we'll see that a little later on. So, using the FDE platform is really straightforward. Here for this demonstration, we're going to focus on a step function with 100 knots.
  And you can see how the FPCs capture the variability. So the main FPC is saying that, at the beginning of the curve, this deviation from the mean is what's driving the most variability.
  And the setup: X1 is our output, counter is our input, batch number is the ID, and then I added test type so we can use that as some of our validation in the FPC table and in the model driven multivariate control chart, and the period so that only our historic data is driving the FDE fit.
  And so just looking at the fit is actually a pretty important part of making sure you get correct control charting later on. I'm using this P-spline step function, 100 knots model. You can see that if I use a B-spline instead, cubic with 20 knots, it actually looks pretty close to my P-spline.
  But from the BIC you can see that I should be going to more knots. If I do that, now we start to see it overfitting, really focusing on the isolated peaks, and it will give you an FDE model that doesn't look right and causes you to not be as sensitive in your model driven multivariate control chart.
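To make the knot-count trade-off concrete, here is a generic sketch (not JMP's FDE fitting code) that fits cubic B-spline bases with different numbers of interior knots to a simulated curve and compares them with a BIC-style criterion; the data and knot counts are invented for illustration.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 400)
y = np.sin(6 * t) + 0.1 * rng.normal(size=t.size)   # stand-in for one sensor curve

def bic_for_knots(n_knots):
    """Least-squares cubic B-spline with evenly spaced interior knots, scored by BIC."""
    knots = np.linspace(0, 1, n_knots + 2)[1:-1]     # interior knots only
    spline = LSQUnivariateSpline(t, y, knots, k=3)
    resid = y - spline(t)
    n, p = t.size, n_knots + 4                       # rough parameter count for a cubic spline
    return n * np.log(np.mean(resid ** 2)) + p * np.log(n)

for n_knots in (5, 20, 50, 100):
    print(n_knots, round(bic_for_knots(n_knots), 1))
# Past the point where the knots capture the real structure, BIC rises again
# because the extra knots only chase noise, i.e. the overfitting described above.
```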