Original session date: 3 March 2022
Topics covered: DOE, space filling, Python, machine learning
Speaker: Nick Shelton, Systems Engineer Manager
Optimizing machine learning hyperparameters is an important and sometimes fraught process. Varying a single hyperparameter can greatly alter a model's efficacy, and when you have upwards of 40 hyperparameters, how can you possibly know the ideal settings? Historically, the standard approach has been to search for optimal settings with either a grid or a random framework. Here, Nick Shelton walks us through the pitfalls of both methodologies and offers a new solution: a space-filling Design of Experiments (DOE). He uses JMP Scripting Language (JSL) to call directly out to Python, varying multiple hyperparameters at once across 30 runs of the same model to rapidly identify the ideal settings. It is truly a time-saving and, dare I say, headache-saving approach that any machine learning programmer ought to employ.
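To make the idea concrete, here is a minimal sketch of a space-filling hyperparameter search in Python. It is not the presenter's JSL/JMP workflow; it assumes SciPy's Latin hypercube sampler and scikit-learn are available, and the three hyperparameters and their ranges are chosen purely for illustration.

```python
# Hypothetical sketch: a Latin hypercube (space-filling) design over three
# random-forest hyperparameters, evaluated in 30 runs of the same model.
import numpy as np
from scipy.stats import qmc
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Assumed ranges for illustration: n_estimators, max_depth, min_samples_leaf
lower = [50, 2, 1]
upper = [500, 20, 10]

sampler = qmc.LatinHypercube(d=3, seed=0)
design = qmc.scale(sampler.random(n=30), lower, upper)  # 30 space-filling points

results = []
for n_est, depth, leaf in design:
    model = RandomForestClassifier(
        n_estimators=int(n_est),
        max_depth=int(depth),
        min_samples_leaf=int(leaf),
        random_state=0,
    )
    score = cross_val_score(model, X, y, cv=5).mean()
    results.append((int(n_est), int(depth), int(leaf), score))

# The scored design table can then be modeled (for example, with JMP's
# profiler or a Gaussian process) to locate promising hyperparameter regions.
best = max(results, key=lambda r: r[-1])
print("Best settings found:", best)
```

Because the 30 points cover the hyperparameter space evenly rather than on a coarse grid, the resulting table lends itself to fitting a response surface and profiling it, which is where the surface profiler shown below comes in.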
Figure 1. The accuracy of a given machine learning algorithm when changing only a single parameter highlights the need to optimize hyperparameter selection.
Figure 2. JMP's Surface Profiler allows for 3D representation of optimum hyperparameter settings.
HP Optimization Journal.jrn