
Discussions

Solve problems, and share tips and tricks with other JMP users.
jpol
Level IV

How do I set up a constraint when using Bayesian Optimization?

Hi,

I have been evaluating the Bayesian Optimization platform with some success.

 

In order to run trials quickly, I have been using SigmaZone's virtual catapult: Six Sigma Virtual Catapult - SigmaZone

BO was very efficient at achieving the targets set in the exercises for up to 4 input factors (9 total runs for the 4-factor experiment).

 

I attempted to evaluate performance using 5 factors:

Bungee Position
Cup Elevation
Firing Angle
Pin Elevation
Release Angle

 

My historical trials were as follows:

jpol_0-1761755316110.png

As I advanced through the BO iterations, I noticed that a certain combination of factors resulted in the ball being dropped and not projected. I needed to set up a constraint:  

Constraint:  (Release Angle – Firing Angle) should never be less than 30

 

Something like this:  

jpol_1-1761755523665.png

 

I exported the Candidate set and enforced the constraint by removing the "forbidden" combinations. However, after modifying and uploading the Candidate set, BO continued with the original Candidate set on the next iteration. It did not use the uploaded, constrained Candidate set.
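For reference, the manual filtering step amounts to something like this JSL sketch (a reconstruction of what I did, assuming the candidate table is the current data table):

```jsl
// Remove candidate rows that violate the constraint:
// (Release Angle - Firing Angle) must be at least 30
dt = Current Data Table();
dt << Select Where( :Release Angle - :Firing Angle < 30 );
dt << Delete Rows;
```

This prunes the forbidden rows from the candidate table, but as described above, BO kept using the original candidate set afterwards.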

 

What is the correct way to modify and "freeze" the constrained Candidate set?

 

P.S. Should anybody want to run this same exercise, here are the Coding properties:

 

jpol_2-1761755984965.png

The target is to project a ball 545 mm ± 20 mm.

Have Fun!

 

- Philip

 

 

1 ACCEPTED SOLUTION

Victor_G
Super User

Re: How do I set up a constraint when using Bayesian Optimization?

Hi @jpol,

 

I think the easy way to include this constraint for BO is to add the script "Constraint" in your datatable with this code:

{1 * :Release Angle + -1 * :Firing Angle >= 30};

I have reproduced your dataset, and adding this constraint to the data table's scripts makes the Bayesian Optimization platform respect it, in the same way as for any DoE with constraint(s). Here is how the candidate set looks in a scatterplot matrix:
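If you prefer to attach the script programmatically rather than through the data table's Tables panel, a sketch using the standard New Script message (dt is assumed to reference your data table):

```jsl
// Attach a table script named "Constraint" holding the linear constraint
dt = Current Data Table();
dt << New Script(
	"Constraint",
	{1 * :Release Angle + -1 * :Firing Angle >= 30}
);
```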

Victor_G_1-1761757388480.png

 

Hope this answer will help you,

Victor GUILLER

"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)


10 REPLIES 10
jpol
Level IV

Re: How do I set up a constraint when using Bayesian Optimization?

Thanks Victor for your answer.

It not only enforces the constraint but also maintains a full Candidate set of 5000 trials. My first attempt reduced the Candidate set to 3962 trials.

Could you please explain how BayesOpt knows it should obey the  "Constraint" table script? Is the name "Constraint" reserved for that purpose?

 

- Philip

Victor_G
Super User

Re: How do I set up a constraint when using Bayesian Optimization?

Hi Philip,

 

As you may already have noticed, the JMP DoE and Bayesian Optimization platforms use similar metadata from the data table to make analysis or design creation easier: factor column properties (Design Role, Coding, Factor Changes, Mixture, etc.), Response Limits, and the Constraint script added to the data table as soon as you specify any constraint and create the DoE data table. The Bayesian Optimization platform uses much of this metadata, which makes it relatively well integrated into the JMP ecosystem.
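As an illustration, both kinds of metadata can be set from JSL (a sketch; the Coding range {90, 180} is just a placeholder, not the catapult's actual limits):

```jsl
// Set a factor's Coding column property and attach the Constraint
// table script; Set Property and New Script are standard JSL messages
dt = Current Data Table();
Column( dt, "Firing Angle" ) << Set Property( "Coding", {90, 180} );
dt << New Script(
	"Constraint",
	{1 * :Release Angle + -1 * :Firing Angle >= 30}
);
```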

There is also an "official" solution in the JMP 19.0 Help documentation: look for the "Factors" tab in the Optimization panel, which you can open from the red triangle menu via Augmented Acquisition Functions Profiler > Optimization and Desirability > Optimization Control Panel:

Victor_G_0-1761810312423.png
From here, you can add any constraint you want and save it in your datatable. See details here: Additional Example of the Bayesian Optimization Platform with Factor Constraints

The result of this option, after clicking "Apply Changes and Save to Script", is the same as my quick manual trick: it creates a script called "Constraint" in your data table with the constraint you specified:

{1 * :Release Angle + -1 * :Firing Angle >= 30};


Hope this complementary answer clarifies the inner workings of this platform's constraint handling,

Victor GUILLER

"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)
jpol
Level IV

Re: How do I set up a constraint when using Bayesian Optimization?

Thanks Victor,

Much appreciated.

 

Philip

statman
Super User

Re: How do I set up a constraint when using Bayesian Optimization?

You realize the algorithm for the optimum model is already known?  You are just trying to discover it.  Very unrealistic, but have fun...

"All models are wrong, some are useful" G.E.P. Box
jpol
Level IV

Re: How do I set up a constraint when using Bayesian Optimization?

Thank you statman for your insight.

Yes, I do realize that, but it is a fun exercise, excellent for teaching purposes, and it certainly beats waiting several hours for each trial to be run in a manufacturing environment.

statman
Super User

Re: How do I set up a constraint when using Bayesian Optimization?

I have mixed feelings about the utility for teaching DOE.  I think to teach DOE, you need real projects with real noise and challenges about different options to handle different situations.  Seems like this might be useful to demonstrate "proof of concept"...yes DOE is effective.  But teaching requires using more of your senses.

I believe Fisher nailed it:

"There is, frankly, no easy substitute for the educational discipline of whole-time personal responsibility for the planning and conduct of experiments designed for the ascertainment of fact, or the improvement of Natural Knowledge, I say "educational discipline" because such experience trains the mind and deepens the judgement for innumerable ancillary decisions, on which the value or cogency of an experimental program depends.  A man with five, or ten, or fifteen years experience given to such discipline has been himself profoundly modified in his capacity for the direction of such work.  He has, as we say, learnt by experience, and this effect will be more profound the more deeply his thought has been immersed in his problems.  Such men, if they have the taste and gift for exposition, should be the authors of outreach textbooks on Experimental Design, and the teachers and directors of our advanced schools of statistics." and

“The literature as it has grown up seems to be unbalanced in its comparative neglect of the Scientific aspects of the problem, and of its Logical aspects. This perhaps might have been expected, since many of the authors, albeit talented mathematicians, have evidently never submitted their minds to the specifically educational discipline of any one of the Natural Sciences, have never taken personal responsibility for experimentation at ground level, and have no direct experience of the kind of material involved…”

Sir Ronald A. Fisher (1962), Colloques Internationaux du Centre National de la Recherche Scientifique, Paris, No. 110: 13-19

"All models are wrong, some are useful" G.E.P. Box
jpol
Level IV

Re: How do I set up a constraint when using Bayesian Optimization?

Hi statman,

I appreciate your thoughtful response and the Fisher quotes—there’s certainly wisdom in them. I agree that real-world experimentation brings irreplaceable depth, especially in developing judgment and intuition.
That said, I see virtual tools like the SigmaZone catapult as complementary rather than substitutes. They allow learners to grasp core DOE concepts quickly, iterate without resource constraints, and build confidence before tackling messier real-world scenarios. In my experience, this scaffolding can make the transition to physical experimentation more effective.

I’d be curious to hear if you’ve found any hybrid approaches that balance realism with accessibility in teaching DOE. Especially, as nowadays, when so many of our colleagues still work remotely.

- Philip

statman
Super User

Re: How do I set up a constraint when using Bayesian Optimization?

The key is to run "mock" experiments that include noise.  Have you seen Box's helicopter experiments?  I run several experiments in my classes that are "hands-on".  Balsa gliders, aspirin dissolution, penny drop, etc.  Much better than a simulation program because they involve more senses.  Even the Statapult (catapult) is better, though it isn't very noisy.

"All models are wrong, some are useful" G.E.P. Box
