A QbD update: Current and future trends in Quality by Design
Apr 16, 2020 6:48 AM | Last Modified: Sep 16, 2020 11:37 AM
QbD update: a look back and a look ahead at how QbD can help pharma and other industries streamline and optimize processes for state-of-the-art results.

Quality by Design (QbD) is about processes, products and clinical understanding. Five years ago, my colleagues and I launched a QbD column, which consisted of nine entries. I encourage you to review the original posts in the QbD column, since they are still very relevant today.
Before you revisit the column, take a moment to watch this four-minute video of the Sanofi factory of the future to see an advanced implementation of biotechnology, chemistry, manufacturing and controls (CMC). To appropriately manufacture a drug product, a specific manufacturing process, product characteristics, and product testing must all be defined to ensure that the product is safe, effective and consistent between batches.
The QbD column series provides a peek under the hood at how the pharmaceutical industry develops and optimizes products and processes so that they meet the requirements of the factory of the future. Let me give you an overview of each column:
These are all very brief sections designed to set up the context for the following columns.
A factorial design case study based on the formulation of a generic steroid lotion. The design was set up to optimize eight quality attributes simultaneously by varying temperature, blending time and cooling time. This study shows how JMP simulates responses to visualize how variability in factor-level settings affects variability in the responses.
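The idea of simulating responses can be sketched in a few lines. This is a minimal illustration, not the actual lotion model: the transfer function, factor names and ranges below are hypothetical stand-ins for a fitted model, and the jitter mimics variability in factor-level settings.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Hypothetical transfer function standing in for a fitted model of one
# quality attribute (e.g., viscosity) vs. temperature, blend time, cool time.
def viscosity(temp, blend, cool):
    return 50 + 0.8 * temp + 2.0 * blend - 1.5 * cool + 0.05 * temp * blend

# 2^3 full factorial over illustrative factor ranges.
levels = {"temp": (60, 80), "blend": (5, 15), "cool": (10, 30)}
design = list(product(*levels.values()))

# Propagate factor-level variability at each design point: jitter the
# settings and summarize the resulting spread in the response.
for temp, blend, cool in design:
    sims = viscosity(temp + rng.normal(0, 1.0, 1000),
                     blend + rng.normal(0, 0.5, 1000),
                     cool + rng.normal(0, 1.0, 1000))
    print(f"temp={temp} blend={blend} cool={cool} "
          f"mean={sims.mean():.1f} sd={sims.std():.2f}")
```

Design points whose simulated response distribution stays tight and on target are the robust ones, which is exactly the comparison the column's JMP simulations make visual.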
A QbD fractional factorial experiment. The goal of the experiment is to explore the process of preparing nanosuspensions, a popular formulation for water-insoluble drugs. Nanosuspensions involve colloidal dispersions of discrete drug particles, which are stabilized with polymers and/or surfactants. The experiment involves four three-level factors and one two-level factor.
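To see why a fraction is needed, consider the size of the full factorial for this factor structure. The factor names and levels below are illustrative, not those of the nanosuspension study; the sketch only builds the candidate grid from which DOE software would select a fraction.

```python
from itertools import product

# Four three-level factors and one two-level factor (illustrative names).
three_level = {"polymer_pct": [0.5, 1.0, 1.5],
               "surfactant_pct": [0.1, 0.3, 0.5],
               "mill_speed": [200, 400, 600],
               "mill_time": [30, 60, 90]}
two_level = {"stabilizer": ["A", "B"]}

# Full factorial candidate set: 3^4 * 2 = 162 runs.
factors = {**three_level, **two_level}
candidates = [dict(zip(factors, run)) for run in product(*factors.values())]
print(len(candidates))  # 162

# A fractional factorial replaces these 162 runs with a much smaller,
# carefully chosen subset (e.g., an orthogonal or D-optimal fraction
# selected by DOE software such as JMP's custom designer).
```

The point of the fraction is to estimate the main effects (and selected interactions) with far fewer runs than the full grid requires.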
Achieving robustness with stochastic emulators. Stochastic emulators were first proposed by Bates et al. (2006) to achieve robust, on-target performance. This very powerful approach is implemented in JMP. The method is demonstrated with a case study based on the guidance on Scale-Up and Post-Approval Changes (SUPAC) from the FDA Center for Drug Evaluation and Research (CDER).
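The core mechanic of a stochastic emulator can be sketched simply: fit a surrogate model to the experiment, then propagate assumed input variability through it by Monte Carlo. The surrogate coefficients and noise level below are illustrative, not taken from the SUPAC case study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical fitted surrogate (emulator) for a response such as
# dissolution; the coefficients are illustrative only.
def emulator(x1, x2):
    return 80 + 5 * x1 - 3 * x2 - 2 * x1**2 + 1.5 * x1 * x2

def simulate(x1_target, x2_target, sd=0.1, n=20000):
    """Propagate assumed factor variability (sd) through the emulator."""
    y = emulator(rng.normal(x1_target, sd, n),
                 rng.normal(x2_target, sd, n))
    return y.mean(), y.std()

# Compare candidate set points: the robust choice keeps the response
# on target with the smaller standard deviation.
for pt in [(1.0, 0.0), (0.5, 0.5)]:
    m, s = simulate(*pt)
    print(f"setting {pt}: mean={m:.2f} sd={s:.3f}")
```

Optimizing over such simulated means and standard deviations, rather than over the deterministic surrogate alone, is what delivers robust, on-target performance.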
Mixture designs. For these designs, the emphasis is on the effect of the relative proportions of ingredients in a formulation, rather than their absolute amounts. The column shows examples of ternary plots and profilers tailored to treat mixtures. The example used is a case study on mixing powders. Key questions for the blending operation are:
How to quantify components of powder blends simultaneously?
How to validate or confirm the process analytical technology (PAT) blending process monitoring results via other fast and convenient spectroscopic methods?
How to link the scale of scrutiny and the homogeneity of both API and excipients?
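What distinguishes a mixture design is the constraint that component proportions sum to one, so the design space is a simplex rather than a cube. A minimal sketch, assuming a standard {q, m} simplex-lattice construction (not the specific design of the powder-blending case study):

```python
from itertools import product
from fractions import Fraction

def simplex_lattice(q, m):
    """All blends of q components whose proportions are multiples of 1/m
    and sum to 1 -- the {q, m} simplex-lattice design."""
    levels = [Fraction(i, m) for i in range(m + 1)]
    return [pt for pt in product(levels, repeat=q) if sum(pt) == 1]

# {3, 2} lattice for a three-powder blend: the 3 pure components
# plus the 3 binary 50/50 blends, 6 points in all.
for blend in simplex_lattice(3, 2):
    print([str(p) for p in blend])
```

These lattice points are exactly the vertices, edges and interior points that ternary plots and mixture profilers display.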
Response surfaces and sequential experimentation. In this post, a Plackett-Burman (PB) design followed by a central composite design (CCD) is featured in a case study that considers three immediate outcomes from an experiment on formulations: encapsulation efficiency (EE), particle size and zeta potential. In addition, each formulation was diluted and divided into samples for storage, half at 4 °C and half at 37 °C. These samples were tested at predetermined times (up to 24 months) to detect loss of active drug during storage.
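The CCD that follows the screening step has a standard geometry, which can be sketched in coded units. This is a generic rotatable CCD construction, not the specific design of the formulation study:

```python
import numpy as np
from itertools import product

def central_composite(k, n_center=4):
    """Rotatable CCD in coded units: 2^k cube points, 2k axial points
    at alpha = (2^k)**0.25, plus n_center center-point replicates."""
    cube = np.array(list(product([-1, 1], repeat=k)), dtype=float)
    alpha = (2 ** k) ** 0.25
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([cube, axial, center])

design = central_composite(3)
print(design.shape)  # (18, 3): 8 cube + 6 axial + 4 center runs
```

The axial and center points are what let the CCD estimate the pure quadratic terms that a screening design such as the PB cannot, which is the logic of running the two designs sequentially.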
Split plot experiments. These experiments have hard-to-change factors that are difficult to randomize and can only be applied at the block level. Once the level of a hard-to-change factor is set, experiments are run with several other factors that keep that level fixed. The case study used in this column, which is from preclinical research on animal models where several methods for the treatment of severe chronic skin irritations are tested, demonstrates how to design and analyze such experiments.
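The restricted randomization of a split plot can be sketched as a layout-generation step. The factor names below are hypothetical (not those of the preclinical study): a hard-to-change factor is set once per whole plot, and the easy-to-change treatments are randomized within it.

```python
import random

random.seed(3)

# Hard-to-change factor: set once per whole plot (e.g., a housing condition).
hard_levels = ["low_temp", "high_temp"]
# Easy-to-change treatments: randomized within each whole plot.
treatments = ["ointment", "gel", "spray", "control"]

plan = []
for whole_plot, hard in enumerate(hard_levels * 2, start=1):  # 4 whole plots
    subplot_order = random.sample(treatments, k=len(treatments))
    for subplot, trt in enumerate(subplot_order, start=1):
        plan.append((whole_plot, hard, subplot, trt))

for row in plan:
    print(row)
# The analysis must use two error strata: a whole-plot error for the
# hard-to-change factor and a subplot error for the treatments
# (e.g., via a mixed model), rather than one pooled error term.
```

Ignoring the two error strata and analyzing such data as a completely randomized design understates the uncertainty on the hard-to-change factor, which is the trap the column warns against.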
The material and examples contained in the QbD columns are still relevant. Some additional developments account for the growing impact of sensor technology, big data, machine learning and hybrid models.
Ultimately, the challenge is to generate information from all the data gathered in the laboratory and on the factory floor; an even greater challenge is to ensure the quality of this information, which I discuss in this conversation with Anne Milley (a transcript is available here). A video of a more formal presentation at a JMP QbD discovery event in Copenhagen is available here.
In 2020, the Fourth Industrial Revolution continues to grow in advanced manufacturing, combining extensive sensor data with flexible manufacturing and advanced analytics. A new comprehensive book, covering state-of-the-art elements of systems engineering and the Fourth Industrial Revolution, is available here. I spoke with Anne Milley about "Quality Assurance in the Golden Age of Analytics" for the Analytically Speaking video series.