The use of different control charting approaches for complex in-process measurements (such as curves and distributions), together with statistical tools, automation, and data management, is critical for efficient and sustainable manufacturing processes.

While in R&D there is greater emphasis on extracting numerous complex measurements to characterise the products, in manufacturing the pass/fail criteria tend to be based on a reduced number of metrics, often single-value data. We strive to bridge the gap between R&D and manufacturing by exploring analytical tools to utilise valuable in-process complex measurements for monitoring and controlling processes.


Hi, my name is Patricia Blanco García. I am a Principal Scientist in the Formulations team at Johnson Matthey's Technology Centre near Reading in the South of England. I'm a chemist by training, and I've been with JM for about 4–5 years. In this contribution, I'm going to talk about the role of data analysis in new process analytical technologies. If you haven't heard of Johnson Matthey, JM is a global leader in sustainable technology solutions, which are helping leading energy, chemicals, and automotive companies to decarbonize, reduce harmful emissions, and improve their sustainability.

As an example, one in three cars is fitted with a catalytic converter from Johnson Matthey. In November 2023, JM enabled the world's first commercial transatlantic flight powered entirely by sustainable aviation fuel. Most of Johnson Matthey's products are formulated, and this picture illustrates that. From the left, we have catalytic converters that go in the exhaust of cars, and these are coated with a slurry that contains the active catalyst.

The second picture is a fuel cell ink; these inks are deposited to produce coated layers for fuel cell and electrolyzer applications. Then we have powders, which are used in inks but are also used to make granulated or pelletized catalysts for the production of chemicals.

The formulations contain a mixture of dispersed particles, as well as polymers and additives, in a solvent. As I mentioned, one example is the autocatalysts that go into emission control systems. In this case, the active materials for the autocatalyst are dispersed onto mixed metal oxide powders, these are formulated into a slurry, and that slurry is then coated onto the supports that go into the exhaust of the car.

At JM, we produce these slurries in batches of hundreds of kilos, and they are milled to certain specifications. For us, addressing the processing steps of these materials is crucial to sustainability, both in terms of the use of materials and the use of energy. Therefore, introducing online measurements is very important to achieve this.

The team I work in was part of a Horizon 2020 project, which was developing real-time laser diffraction technology for particle size analysis during the processing of inorganic catalyst dispersions. Particle size analysis is a critical measurement for many of JM's products and is part of the characterization suite that defines the pass criteria for products in different businesses.

However, particle size measurements are not online. Ideally, you want the measurement taken in-line, but in our case the samples are mostly collected manually at specified time intervals and then measured in the lab. The samples can age, and they can also be measured by different operators. This is inefficient, and it can lead to repeatability and reproducibility errors. Therefore, real-time measurements are important because they allow accurate endpoint assessment of expensive milling processes, and they also minimize the risk of over-processing.

What I want to discuss in this poster is the use of different control charting approaches for complex in-process measurements, such as curves and distributions, and how statistical tools, together with automation and data management, are critical for efficient and sustainable manufacturing processes.

While in R&D there is greater emphasis on extracting numerous complex measurements to characterize products, in manufacturing the pass/fail criteria tend to be based on a reduced number of metrics, often single-value data: for example, percentiles of a size distribution, like those shown here on the right. But a complex particle size distribution is better described using the full curve, like the ones on the left.
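To make the single-value metrics concrete: the D10, D50, and D90 percentiles are the sizes below which 10%, 50%, and 90% of the particle volume lies, read off the cumulative undersize curve. A minimal sketch of that calculation (illustrative only, assuming a NumPy-style binned volume distribution; the function name is hypothetical, not JM's actual tooling):

```python
import numpy as np

def percentiles_from_distribution(sizes, volume_pct, targets=(10, 50, 90)):
    """Interpolate Dx percentiles from a volume-based size distribution.

    sizes      : particle size for each size class, ascending
    volume_pct : volume percentage in each size class
    Returns {x: Dx} for each requested percentile, e.g. {50: D50}.
    """
    cum = np.cumsum(volume_pct)          # cumulative undersize curve
    cum = 100.0 * cum / cum[-1]          # normalise so the curve ends at 100 %
    return {t: float(np.interp(t, cum, sizes)) for t in targets}

# A uniform distribution over sizes 1..10 gives D10=1, D50=5, D90=9
d = percentiles_from_distribution(np.arange(1, 11), np.full(10, 10.0))
```

This illustrates the information loss the talk describes: three interpolated points summarize the whole curve, so two quite different multimodal distributions can share the same D10/D50/D90.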

On the left-hand side, the different colors correspond to different intervals in milling time. In both cases, we are showing a comparison between online and offline data. We can see that the trends are comparable, but this is all very subjective. The use of statistical tools can help us be less subjective. The first tool that I want to show you is a control chart. Control charts are typically used for monitoring one or several process variables. These process variables can be temperature, pressure, etc., and the x-axis is time.

In this example, we are looking at particle size measurements for a powder sample, and we are monitoring the percentiles independently: the D10, D50, and D90. The percentiles, although we use them all the time, are not very good descriptors for a multimodal distribution. Therefore, the ultimate aim is to monitor or visualize the whole curve, not just the percentiles.
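The per-percentile charts described here are standard Shewhart individuals charts: limits at the historical mean plus or minus three standard deviations, with a point outside the limits flagging a potentially out-of-control process. A minimal sketch, assuming NumPy and an in-control historical series of, say, D50 values (the function names are illustrative, not the presenter's implementation):

```python
import numpy as np

def control_limits(history, k=3.0):
    """Shewhart-style limits from historical in-control measurements."""
    mean = float(np.mean(history))
    sigma = float(np.std(history, ddof=1))  # sample standard deviation
    return mean - k * sigma, mean, mean + k * sigma

def out_of_control(history, new_value, k=3.0):
    """Flag a new measurement that falls outside the control limits."""
    lcl, _, ucl = control_limits(history, k)
    return not (lcl <= new_value <= ucl)

d50_history = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0]  # hypothetical D50 values
```

In practice each percentile (D10, D50, D90) would get its own chart of this form, which is exactly why the approach struggles with multimodal distributions: three independent charts cannot capture a change in the shape of the curve.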

We can build curve control charts to help us visualize a whole distribution, not just a value. We adapted this methodology from Nicola Brammer from Solvay, who presented it at last year's summit; she uses this tool to monitor a batch process, and we have adapted it to monitor particle size measurements.

Individual control charts are constructed for each size class in order to build the curve control chart. The control limits are a representation of the inherent variation in the process; we use historical measurements to calculate them. Visually, we can see that the process is out of control if a curve falls outside the control limits. The green line shows a current measurement plotted against the mean of the historical measurements, which is the solid black line, and against the upper and lower control limits, which are the dotted lines. As well as control limits, we could also add spec limits if we wanted to.
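The construction described here can be sketched in a few lines: treat the historical curves as a matrix with one row per run and one column per size class, compute limits column by column, and flag any size class where a new curve escapes its band. This is a minimal illustration assuming NumPy arrays, not the actual tool built at JM:

```python
import numpy as np

def curve_control_chart(historical_curves, new_curve, k=3.0):
    """Per-size-class control limits for a curve control chart.

    historical_curves : array of shape (n_runs, n_size_classes)
    new_curve         : array of shape (n_size_classes,)
    Returns (mean, lcl, ucl, violations) where violations marks the
    size classes in which the new curve leaves the control band.
    """
    mean = historical_curves.mean(axis=0)             # solid black line
    sigma = historical_curves.std(axis=0, ddof=1)     # inherent variation
    lcl, ucl = mean - k * sigma, mean + k * sigma     # dotted lines
    violations = (new_curve < lcl) | (new_curve > ucl)
    return mean, lcl, ucl, violations
```

The result is the chart described above: a band per size class that, plotted together, forms an envelope around the whole distribution, with the current measurement overlaid as a single curve.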

This is arguably better than the previous control chart because here we're looking at the whole distribution, not just a set of values, but it's still just visual. If we take this one step further and use principal components, we can monitor the size distributions through the principal components of each curve. In this case, the PCA model-driven control chart allows us to monitor the distribution using the functional principal components of each curve. A control chart can then be created to represent the whole distribution, based on those functional principal components.
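One common way to turn principal component scores into a single monitored quantity, which may approximate what is meant here, is Hotelling's T² on the scores: project each curve onto the leading components of the historical curves and sum the squared, variance-scaled scores of a new curve. A rough sketch under that assumption (plain PCA via SVD rather than a full functional-PCA basis expansion; NumPy only, names hypothetical):

```python
import numpy as np

def pca_t2(historical_curves, new_curve, n_components=2):
    """Hotelling's T^2 of a new curve in the PCA model of historical curves.

    historical_curves : array of shape (n_runs, n_size_classes)
    new_curve         : array of shape (n_size_classes,)
    """
    center = historical_curves.mean(axis=0)
    X = historical_curves - center                 # mean-centred history
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    comps = Vt[:n_components]                      # leading loadings
    scores = X @ comps.T                           # historical scores
    var = scores.var(axis=0, ddof=1)               # score variances
    new_scores = (new_curve - center) @ comps.T
    return float(np.sum(new_scores**2 / var))      # T^2 statistic
```

A single control chart of T² over time then represents the whole distribution: an in-control curve close to the historical mean gives a small T², while a curve whose shape departs from the historical variation gives a large one. A control limit for T² would typically come from an F-distribution, which is omitted here.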

I hope I have shown you that real-time measurements of process parameters are critical to controlling processes, and that statistical tools can help extract the most value from the data we produce. We are working to deploy this approach in real case studies, both in R&D and in manufacturing settings. To conclude, I'd just like to thank George Platt and Chand Malde for supplying some of the milling data in this presentation, and also Pilar Gomez, who helped turn my ideas into tools that we can apply to our data. Thank you for listening.

Skill level

Intermediate

Published on 12-15-2024 08:23 AM by Community Manager | Updated on 03-18-2025 01:12 PM





Start:
Thu, Mar 13, 2025 06:00 AM EDT
End:
Thu, Mar 13, 2025 06:40 AM EDT
Ballroom Gallery-Ped 4