Meta-Modeling of Computational Models – Challenges and Opportunities
Cy Wegman, Section Head; Bill Worley, MS, Technology Leader – Procter & Gamble
Complex computer simulations are critical to the development of predictive models for many of today's products and processes. These models bring their own challenges: they become computationally expensive and time-consuming for expert users as the number of simulation “experiments” grows in a one-at-a-time simulation environment. Design of experiments and meta-modeling are being used to find holistic solutions to these problems. One would think that, given the deterministic nature of computer simulations, creating meta-models would be as easy as falling off a log. Our experience is that it’s more like being hit with a log. Finding a model that best approximates the holistic behavior of the system has been very difficult, and conventional wisdom may not lead to the best overall meta-model. For example, a Gaussian Process model that fits every computational data point exactly appears to give a “good” meta-model. Close examination of plots of jackknife residuals versus actual values, however, shows that the residuals look more like the noise seen in stochastic experiments. In a case like this, a neural net or even a response surface approach may lead to a better overall approximation model than the Gaussian Process. This talk gives an overview of the challenges and opportunities that are part of meta-modeling.
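
To make the diagnostic concrete, the sketch below (not from the talk) fits a Gaussian Process meta-model and a quadratic response surface to a deterministic toy “simulation” and computes leave-one-out (jackknife) residuals for each; the toy simulator, factor ranges, and kernel settings are illustrative assumptions, not the authors’ actual models.

```python
# Minimal sketch: jackknife (leave-one-out) residuals for two candidate
# meta-models of a deterministic computer simulation. All data and
# parameter choices here are hypothetical, for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 2))           # 30 simulation runs, 2 factors
y = np.sin(6 * X[:, 0]) + (X[:, 1] - 0.5) ** 2    # deterministic toy "simulation"

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-10)
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())

loo = LeaveOneOut()
for name, model in [("Gaussian Process", gp), ("Response surface", rsm)]:
    pred = cross_val_predict(model, X, y, cv=loo)  # jackknife predictions
    resid = y - pred
    print(f"{name}: jackknife RMSE = {np.sqrt(np.mean(resid ** 2)):.4f}")
    # In practice, plot resid (or pred) versus y: if the GP's jackknife
    # residuals look like random noise, its exact interpolation of the
    # training points is not evidence of a good holistic approximation.
```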