Utilizing Design of Experiment Models to Improve Assay Development in High-Throughput Biology

Diana Ballard, Senior Consulting Statistician, Predictum Inc.
Samuel Hasson, PhD, Pharmacology Research Fellow, National Institutes of Health
Wayne Levin, President, Predictum Inc.

High-throughput biology has become a ubiquitous platform for functional genomics, chemical biology and drug discovery. While a multitude of technologies, such as automated cellular imaging, have advanced to support microplate- and array-based exploration, there has been comparatively little innovation in the process of assay development itself. Prior to primary screening, assay design, miniaturization and optimization for high-throughput campaigns often involve a diverse landscape of potential variables. Scientists frequently approach assay development by adjusting experimental conditions one at a time, ignoring important principles such as randomization and varying experimental unit size. From a statistical standpoint, this approach is slow, expends large quantities of costly reagents, and risks missing or misattributing insights; conventional practice may also fail to identify complex interactions. Adapting modern design of experiments (DOE) approaches to high-throughput screening (HTS) offers a far more efficient and insightful framework for assay development. However, the application of contemporary DOE designs by biologists has been limited, due in part to the advanced mathematics required and to the lack of tools tailored to HTS environments. Those limitations were overcome in a DOE study successfully executed by biologists, yielding data on complex interactions within a biochemical reaction.
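
To illustrate the contrast with one-factor-at-a-time adjustment, here is a minimal sketch of a DOE-style assay plan: a small two-level full factorial with a randomized run order. The factor names and levels (enzyme_nM, substrate_uM, dmso_pct) are hypothetical placeholders for illustration only, not the factors or design used in the work described above.

import itertools
import random

# Hypothetical assay factors, each at a low and a high level.
factors = {
    "enzyme_nM":    [5, 20],
    "substrate_uM": [10, 50],
    "dmso_pct":     [0.5, 2.0],
}

# Full factorial: every combination of factor levels (2^3 = 8 runs).
# Unlike one-factor-at-a-time adjustment, this allows main effects and
# all two- and three-way interactions to be estimated.
runs = [dict(zip(factors, levels))
        for levels in itertools.product(*factors.values())]

# Randomize the run order so systematic drift (plate position, time,
# reagent aging) is not confounded with any factor.
random.seed(42)
random.shuffle(runs)

for order, run in enumerate(runs, start=1):
    print(f"run {order}: {run}")

In practice the design would be built and analyzed in software such as JMP, and a fractional or custom design would typically be used when the number of candidate factors is large; the sketch above only shows the basic structure of a randomized factorial plan.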