Control Charting Dynamic Processes
Many industrial processes involve parameters that are dynamic for at least part of the manufacturing sequence, e.g., chemical batch reactor temperature or pressure, or an autoclave vacuum ramp. For batch-to-batch consistency, it is important to find a way to compare these dynamic phases of production or setup, similar to the way control charts are used to monitor steady-state production.
JMP can be used, with a few relatively simple steps, to create dynamic control limits from a number of representative batches. Future batches can then be overlaid in Graph Builder to ensure they fit within these historical limits. This demonstration shows the steps used to create the control limits and the use of the limits for batch-to-batch comparisons. Some of the challenges involved with this type of data are discussed.
Welcome to my talk on control charting dynamic processes. SPC, or statistical process control, has been used for many years to monitor processes that vary around a fixed set point. The process is monitored over time, and the variation can be separated into two components: common cause variation, which is present constantly throughout the process and will always be there unless some fundamental change is made to the process, and special cause variation, where a process goes out of control because something external to the process has increased the amount of variation.
However, lots of processes in industry do not vary around a set point. There are many batch processes where the batch is processed over time to a recipe with predefined steps, where temperature, pressure, or vacuum may be increased over time: for instance, converting monomers to polymers in a batch reaction vessel, or increasing the vacuum or pressure on a mixer to drive the solvent level or moisture content of the material. We still need a way to make sure that we achieve consistent results from batch to batch, that the behavior of the process is consistent over time, and to identify batches of material that are not made to the standard process.
This presentation demonstrates a pragmatic visual approach to putting limits over time on those processes, so that we can monitor deviations in the process, but also look at the setup and cooldown of production equipment, for condition monitoring, to make sure that those startup and cooldown phases are consistent over time.
Some examples from the industry I'm in include batch mixers, where a monomer is added to a mixer, heated, held at a constant temperature, and then cooled. Here, the temperature can be monitored over time using this approach, along with other parameters such as the torque on the motor, which might indicate the degree of polymerization. Another example is an autoclave, where we take carbon fiber composite materials and apply vacuum and temperature, again at a predefined rate. We might have some dwells in there, where we hold the material at a certain temperature for a given amount of time, and then we cool the parts down.
An example where we use this technique for condition monitoring is in twin-screw extruders, where we apply vacuum via a number of vacuum pumps in order to remove the volatile component from the resin. We want to ensure that those vacuum pumps are working at their optimum settings, so we look at the startup phase, where the pumps go from atmospheric pressure down to the defined set point at the start of the production run. All of these types of process, where the process parameters vary over time, lend themselves to this approach.
In this case, we've got some data that has come out of our data historian; we're lucky that the data is captured electronically. We have multiple batches stacked one on top of another, in this case labeled by date manufactured. I've got 21 batches of material here, each with a different batch number. I've got my process parameter, vacuum, and then I've got my predefined time increments. The data is captured every 5 seconds, and I know that each batch should achieve the set point of 210 millibars within no more than 17 minutes. That gives me 205 rows of data, at 5-second intervals, for every batch that's been manufactured. Here's my data for 21 batches.
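To make the later steps concrete, here is a minimal sketch of that stacked layout outside JMP, using pandas. The column names (Date Manufactured, Time, Vacuum) and the values are illustrative assumptions, not the actual historian extract.

```python
import pandas as pd

# Stacked layout as described in the talk: every 5-second reading for
# every batch, one batch on top of another. Names/values are illustrative.
stacked = pd.DataFrame({
    "Date Manufactured": ["B01"] * 3 + ["B02"] * 3,   # batch identifier
    "Time":   [5, 10, 15] * 2,                        # seconds into the pump-down
    "Vacuum": [980.0, 955.0, 930.0,                   # millibar
               978.0, 951.0, 926.0],
})
# The real table has 21 batches x 205 readings per batch.
```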
I need to take this data now, and I need to organize it in such a way that I can calculate what the normal range of values would be at each time increment. I'm going to split my data table and create one column for each time increment, so 5, 10, 15 seconds, et cetera, all the way up to my 17-minute mark. I'm going to have one row of data for each batch that's been manufactured. Because I've got 21 batches of data here, I'm going to use 15 of them to create the control limits. Then I'm going to look at my subsequent six batches against those control limits to make sure that the behavior of the process is continuing to be normal, stable, and predictable.
I'll take my stacked data set and go to Tables and Split. My process parameter, vacuum, is what I wish to split by, so I'm going to split my columns by vacuum. I'm going to group by my batch number, so I have one row of data for each batch, and I want one column of data for each time increment, so I'm going to split by time. In the preview table, we can check that the data is in the format we would expect: one row for each batch and one column for each time increment. If we're happy with that, we can click OK, and the data table is created. As I said, we're going to take our first 15 batches and use them to create the control limits, which means I'm going to hide and exclude the last six batches. My data table now looks like the one on the slide.
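Outside JMP, this split is a pivot. A sketch, continuing the assumed pandas table above:

```python
# Split: one row per batch, one column per time increment.
wide = stacked.pivot(index="Date Manufactured", columns="Time", values="Vacuum")

# Use the first 15 batches to build the limits; hold back the remaining
# batches for checking against those limits (JMP: hide and exclude).
baseline  = wide.iloc[:15]
held_back = wide.iloc[15:]
```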
Now, what I need to do is calculate control limits for each of those time increments. What I'm doing here is asking what vacuum level I would expect at each time period: what's the average, and what are the control limits around that average? In other words, what's the variability in vacuum that I would expect at time increment 5, time increment 10, et cetera?
I'm going to create individual moving range (I-MR) charts, because each of these batches is an independent run that's been set up separately. I want one chart for each time increment, and I want to make sure they're ordered in time sequence, which here means by date manufactured. There are 205 control charts to be generated, so it takes just a few seconds.
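For reference, the limits on an individuals chart are the mean plus or minus 2.66 times the average moving range (2.66 = 3/d2, with d2 = 1.128 for moving ranges of two consecutive points). A rough sketch of that calculation for a single time increment, continuing the pandas example:

```python
# Individuals-chart limits for one time increment:
#   centre = mean of the baseline values at that time
#   limits = centre +/- 2.66 * average moving range (2.66 = 3 / 1.128)
def imr_limits(x):
    centre = x.mean()
    mr_bar = x.diff().abs().mean()   # mean of |x[i] - x[i-1]|
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

lcl, avg, ucl = imr_limits(baseline[5])   # e.g. the 5-second column
```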
Now, I have one individual moving range chart for each time period, with the time period identified in the description of the chart. I can scroll through them and check that I don't have anything other than common cause variation in there: no strange-looking data points. If I'm happy with each of those charts, I can save these control limits. Looking at the 5-second example, at 5 seconds into the pump-down of the extruder, I would expect a vacuum level in the extruder of 914.6 millibars, with a range from 865 up to 963 millibars.
If I go to the red triangle, I can save those limits. It will save the limits for each of those time periods, and I want to save them in a tall table. This creates a table where each time period has an average value and the range within which I would expect the normal population to fall. I'll tidy this up and get rid of the columns I don't need, keeping just the mean and the control limits. That's all I need for the future analysis.
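The tall limits table can be sketched the same way: one row per time increment, keeping only the mean and the control limits. The numeric conversion at the end anticipates the data-type fix described next.

```python
# Tall limits table: one row per time increment, keeping only the mean
# and the control limits, mirroring JMP's save-limits-in-tall-table step.
limits = pd.DataFrame(
    [(t, *imr_limits(baseline[t])) for t in baseline.columns],
    columns=["Time", "LCL", "Avg", "UCL"],
)
limits["Time"] = pd.to_numeric(limits["Time"])   # ensure Time is numeric
```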
Then, because this column has taken its labels from the control charts, you can see that the data is not continuous numeric. I need to change that back to make sure it's the correct data type for time, and I can rename the column Time as well, so it's clear in the future exactly what that column is.
I now have one data table containing my control limits for each time increment. Next, I need to organize my original data again, with time running down the rows and each batch as a column changing over time, in order to produce the charts. I'll go back to my stacked data and split it again, but this time slightly differently. I still want to split by my process parameter, vacuum, but in this case I want to group by time, so I'll have one row for each time increment, and I want to split by my batch number, so I have one column for each batch. Again, I can check the preview to ensure the data is grouped the way I'm expecting, and when I'm happy, I can press OK. Now I have time-sequence data with each batch as a separate column, and I can add batches to this table as they become available in the future.
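In the pandas sketch, this is the same pivot with the roles of batch and time reversed:

```python
# Second split: one row per time increment, one column per batch.
by_time = stacked.pivot(index="Time", columns="Date Manufactured",
                        values="Vacuum").reset_index()
```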
I could either take the control limits I calculated and add them to that table of data, or I could use a virtual link. With a virtual link, the limits aren't permanently attached to the data table, so they can also be used in other data sets, or be recalculated and changed over time without changing those data sets, yet they're still available alongside the process data to use as limits. In other words, I can use the limits without actually incorporating them into the data table. That's what I'm going to do: a virtual link between the two data tables.
To do that, I need a column that's the same in both data sets; in this case, it's the time. In the limits table, I'll right-click on my Time column and set it as a link ID. This is the column that carries the identification information used to link the tables. In the process data table, I'll right-click Time again and set it as the link reference, linking it to the table that contains the control limits and the average.
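Pandas has no direct equivalent of a virtual link; the nearest analogue, for illustration only, is a left join on the shared Time key, which, unlike a virtual link, physically copies the limit columns into the table:

```python
# Nearest analogue to JMP's virtual link: join the limits on the shared
# Time key. Unlike a virtual link, this copies the limit columns in.
linked = by_time.merge(limits, on="Time", how="left")
```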
If I look at the column selection panel on the left-hand side, I can see each of my batches of material, my Time column, and then, at the bottom, the referenced virtual-linked columns. We can see from the little icon that there's a virtual link, that those columns are currently hidden, and that they're locked: I'm not able to change the data in those columns from this data table; I'd have to go back to the table where the data actually lives. I can unhide them if I want to see the data, but they don't need to be unhidden in order to be used. I'll hide them again to keep things neat.
Now, I'm going to plot those control limits using Graph Builder, which is an extremely flexible platform for visualizing data. Then I can overlay each of my batches in turn on those control limits over time and see whether the behavior of each batch is where I would expect it to be, based on the historical population of 15 batches used to create the limits.
In my data table, I go to Graph, Graph Builder. Time, obviously, goes on my X-axis. I'm then going to pull over my control limits; I only need the lines, not the individual data points. Then, to better see my production batches, I'm going to gray these out and change the line styles, so it's clear what's batch data and what's control limits: a dashed gray line for the average and dotted gray lines for the control limits. Now we can see what a normal process looks like over time: if the behavior of the process is stable and consistent, we would expect each value to fall within the dotted lines and, on average, to sit close to the dashed line.
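A rough matplotlib equivalent of that Graph Builder view, with the limits in gray and one batch overlaid (the batch column name B16 is an assumption):

```python
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
# Historical limits: dashed average, dotted control limits, all in gray.
ax.plot(limits["Time"], limits["Avg"], "--", color="gray", label="Average")
ax.plot(limits["Time"], limits["LCL"], ":", color="gray", label="Control limits")
ax.plot(limits["Time"], limits["UCL"], ":", color="gray")
# One new batch overlaid; "B16" is an assumed column name.
ax.plot(by_time["Time"], by_time["B16"], color="tab:blue", label="Batch B16")
ax.set_xlabel("Time (s)")
ax.set_ylabel("Vacuum (mbar)")
ax.legend()
plt.show()
```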
Now, I can superimpose one of my new batches that wasn't used to create the control limits, for instance, this 06/09/2023 batch, over the top of the limits and check its behavior over time against where we would expect it to be. We can also add a column switcher to scroll through the batches and check each one against the control limits. If I highlight the production batch at the top and then select all the other batches as replacement columns, I have a list in my column switcher of every batch that's been manufactured, and I can step through them one by one and check my six new batches against the historical control limits. If I add new batches to my data table, I can easily check them against these limits too. And if I want to send this data to somebody who maybe doesn't have JMP, I can animate the column switcher, record it, save it as a GIF, and attach it to an email.
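The column-switcher-and-GIF idea can also be sketched outside JMP; here, assuming the held-back batches are the last columns after the 15 baseline batches, with one animation frame per batch:

```python
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# Assumed ordering: Time column, then 15 baseline batches, then new ones.
new_batches = [c for c in by_time.columns if c != "Time"][15:]

fig, ax = plt.subplots()

def draw(i):
    # One frame per batch: redraw the limits, then overlay the next batch.
    ax.clear()
    ax.plot(limits["Time"], limits["Avg"], "--", color="gray")
    ax.plot(limits["Time"], limits["LCL"], ":", color="gray")
    ax.plot(limits["Time"], limits["UCL"], ":", color="gray")
    ax.plot(by_time["Time"], by_time[new_batches[i]], color="tab:blue")
    ax.set_title(f"Batch {new_batches[i]} vs. historical limits")

ani = FuncAnimation(fig, draw, frames=len(new_batches))
ani.save("batch_check.gif", writer="pillow", fps=1)  # shareable without JMP
```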
Here's another example, based on some of our data. We've got a resin process where we heat up the resin, keep it at a set point for a certain amount of time, and then cool it down again for discharge. Again, we created similar data and control charts. When we scroll through the batches, we can see that occasionally there's an anomaly in the data. Those are the cases we would need to go and investigate, because something is different in that particular batch compared to the standard population.
We've used this technique extensively in different areas of our manufacturing and testing processes. It gives us good early identification of when processes are starting to go out of control or when equipment might require maintenance, and it allows us to see these patterns in the data before we risk failure of the material or of the production equipment. What we don't want is unscheduled maintenance breakdowns; monitoring processes like this lets us plan ahead when equipment is starting to deteriorate. Thank you.