Bayesian Optimization in Innovation: Key Strategies


Bayesian Optimization is an integrated methodology with the potential to drastically reduce time and resources for innovation. Through agile product development, it can lower R&D costs and expedite time-to-market. The iterative approach of Bayesian Optimization aligns well with the research and development culture in certain companies, making it a natural fit for enhancing innovation processes. This technique can facilitate the filling of knowledge gaps, fostering competitive advantage in rapidly evolving markets. Thank you to the contributors and supporters who have enabled the advancement of Bayesian Optimization in various fields of innovation.

  • Bayesian Optimization
  • Innovation
  • Agile Development
  • R&D
  • Competitive Advantage


Presentation Transcript


  1. Introduction to Bayesian Optimization Chris Gotwalt, JMP

  2. BAYESIAN OPTIMIZATION INTRO Innovation is the key to economic growth in advanced economies:
     • Product formulation for pharma, consumer products, materials
     • Process innovation in semiconductor manufacturing
     • Design of mechanical devices: engines, machines, motors
     Competitive innovation pressure demands that companies innovate quickly to be first to market.

  3. BAYESIAN OPTIMIZATION INTRO Bayesian Optimization:
     1) Is an integrated modeling/data-augmentation methodology
     2) Has the potential to dramatically reduce the time and resources needed for innovation: agile product development, reduced R&D costs, reduced time-to-market
     3) BO's iterative "filling in the knowledge gaps" approach may be a very natural fit to the R&D culture in some companies

  4. BAYESIAN OPTIMIZATION INTRO Many thanks:
     • Mark Bailey, who created the BO JMP Pro add-in
     • Kasia Dobrzycka, collaborator on the new BO platform
     • Ron Kenett, for feedback and discussions on the material
     • Several key JMP customers, for sharing their needs and data
     • Phil Kay, who has been the champion of new JMP functionality on our business side

  5. BAYESIAN OPTIMIZATION INTRO [Flowchart] Cold Start, No Existing Data: Starting Design -> Collect Responses -> Model Responses -> Goals Met? If yes, Finished; if no, Determine Next Run(s) and repeat.

  6. BAYESIAN OPTIMIZATION INTRO [Flowchart] Warm Start with Existing Data: Model Responses -> Goals Met? If yes, Finished; if no, Determine Next Run(s) -> Collect Responses and repeat.

  7. BAYESIAN OPTIMIZATION INTRO [Flowchart] The same warm-start loop, with the key components labeled: Model Responses uses a Gaussian Process Model, and Determine Next Run(s) uses a BO Acquisition Function.

  8. POPULAR INNOVATION STRATEGIES IN INDUSTRY Existing data-centric procedures for product/process innovation:
     • Subject Matter Expert guided, ad hoc artisanal approaches
     • Design of Experiments (DoE)

  9. BAYESIAN OPTIMIZATION INTRO Subject Matter Expert guided, artisanal approaches:
     • Trial and error, using the knowledge and experience of an expert
     • Often falls into the trap of one-factor-at-a-time experimentation
     • Uncertainty is not quantified

  10. BAYESIAN OPTIMIZATION INTRO Design of Experiments (DoE), e.g., Box, Hunter & Hunter (BH2), further developed by Jones, Goos, and Nachtsheim:
     1) Consult with the expert to define goals and factor ranges
     2) Run a screening design to identify important factors (see the sketch after this list)
     3) Augment via replicates, foldovers, or optimality criteria to
        a) Resolve interactions
        b) Accommodate curvature
        c) Improve precision in estimates
     4) Fit a final response surface model and optimize
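As a concrete illustration of step 2, a small two-level screening design can be generated in a few lines of code. This is a minimal Python sketch, assuming a 2^(3-1) fractional factorial with generator C = AB plus two center points; the factor coding and the particular fraction are illustrative choices, not anything prescribed in the slides.

```python
import itertools
import numpy as np

# Minimal sketch: a 2^(3-1) fractional factorial screening design for three
# two-level factors in coded units (-1/+1), using the generator C = A*B,
# plus two center points. The fraction and the center runs are illustrative.
full_ab = list(itertools.product([-1, 1], repeat=2))            # full factorial in A, B
design = np.array([[a, b, a * b] for a, b in full_ab], float)   # C aliased with A*B
center_points = np.zeros((2, 3))                                # replicated center runs
screening_design = np.vstack([design, center_points])
print(screening_design)
```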

  11. BAYESIAN OPTIMIZATION INTRO Classical BH2 Design of Experiments (DoE):
     1) Built-in handling of variation and measurement error
     2) Augmentation addresses issues with a linear model:
        a) Estimability of interactions and curvature
        b) Improving precision via functions of the information matrix X'X
     3) Augmentation is statistically motivated:
        a) Does not leverage the response values!
        b) Does not incorporate the goals
        c) Places points at extremes and centers of the factor ranges (because of the assumed linear model)
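To make point 3a concrete, here is a small numeric sketch showing that a classical, D-optimality-style augmentation criterion depends only on the design matrix X, never on the observed responses. The candidate runs and the determinant-gain criterion are illustrative assumptions.

```python
import numpy as np

# Rank candidate augmentation runs by how much they increase det(X'X), a
# D-optimality-style criterion. The observed responses never enter the
# calculation: only the design matrix X (intercept + two coded factors) does.
X = np.array([[1, -1, -1],
              [1, -1,  1],
              [1,  1, -1],
              [1,  1,  1]], dtype=float)
candidates = np.array([[1, 0, 0], [1, -1, -1], [1, 0.5, 0.5]], dtype=float)

def d_gain(X, x_new):
    """Increase in det(X'X) from adding the run x_new to the design X."""
    X_aug = np.vstack([X, x_new])
    return np.linalg.det(X_aug.T @ X_aug) - np.linalg.det(X.T @ X)

for x in candidates:
    print(x, d_gain(X, x))
```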

  12. BAYESIAN OPTIMIZATION INTRO Classical BH2 Design of Experiments (DoE):
     1) Has had tremendous success for 50+ years, which will certainly continue
     2) Many potential R&D practitioners have had criticisms:
        a) Large up-front sample size requirements
        b) Focus on statistical issues as opposed to problem solving

  13. BAYESIAN OPTIMIZATION Bayesian Optimization is a mathematical optimization algorithm: find settings, x, that maximize (or minimize) y(x). BO selects new x's using Acquisition Rules that combine:
     • ŷ(x | X, y, θ), a data-derived model of y(x)
     • Var[ŷ(x | X, y, θ)], the estimated precision of ŷ(x | X, y, θ)
     Augmentation is problem-solving oriented, directly incorporating the y(x) goal.
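One widely used acquisition rule that combines ŷ(x | X, y, θ) and Var[ŷ(x | X, y, θ)] in exactly this way is Expected Improvement. The slides do not name a specific rule, so the Python sketch below is an illustrative choice rather than the method used in the JMP platform.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best):
    """Expected Improvement acquisition for maximization.

    mu, sigma : model prediction y_hat(x | X, y, theta) and its standard
                deviation at candidate settings x (arrays of equal shape).
    y_best    : best response observed so far.
    """
    sigma = np.maximum(sigma, 1e-12)          # guard against division by zero
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)
```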

  14. BASIC BAYESIAN OPTIMIZATION
     • y(x) is a real-valued (1-d), continuous function of a real-valued input x
     • y(x) is deterministic; its observations are without random error
     • The goal is to minimize or maximize y(x)
     • Augmentation consists of only a single run at a time, but uses the current response values

  15. BASIC BAYESIAN OPTIMIZATION
     • Have data (X, y_X)
     • Obtain a model ŷ(x) along with Var[ŷ(x) | X, y, θ]
     • The Acquisition Function A(ŷ(x), Var[ŷ(x) | X, y, θ]) places a numeric value on a candidate new x to try
     • We perform a search (somehow) to find the x_new with the best value of the Acquisition
     • Obtain y(x_new), update the model, and repeat until satisfied
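A self-contained sketch of this basic loop in Python, assuming scikit-learn's Gaussian process regression as the data-derived model, a fixed candidate grid as the "somehow" search, and an upper-confidence-bound acquisition; the test function, kernel, and settings are illustrative assumptions, not the JMP implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def y_true(x):
    # Deterministic 1-d test function to maximize (illustrative assumption).
    return np.sin(3.0 * x) - 0.5 * x

X = np.array([[0.2], [1.5], [2.8]])                 # tiny cold-start design
y = y_true(X).ravel()
candidates = np.linspace(0.0, 3.0, 301).reshape(-1, 1)

for _ in range(10):                                 # one augmentation run per iteration
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  alpha=1e-6, normalize_y=True)
    gp.fit(X, y)                                    # the data-derived model y_hat(x)
    mu, sigma = gp.predict(candidates, return_std=True)
    acq = mu + 2.0 * sigma                          # upper-confidence-bound acquisition
    x_new = candidates[[np.argmax(acq)]]            # grid search stands in for the "somehow"
    X = np.vstack([X, x_new])                       # obtain y(x_new) and update the data
    y = np.append(y, y_true(x_new).ravel())

print("best setting:", X[np.argmax(y)].item(), "best response:", y.max())
```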

  16. EXOTIC BAYESIAN OPTIMIZATION
     • y(x) observed with random error
     • Multiple responses with competing goals and complicated sparsity
     • Categorical inputs or outputs
     • Match-target/spec-limit goals: y(x) = T, or y(x) in [L, U]
     • Missing inputs
     • Batch augmentation of more than one run
     • Randomization restrictions (blocking & split plotting)
     • Many inputs: dim(x) > 20
     • Messy training data of unknown and dubious value

  17. EXOTIC BAYESIAN OPTIMIZATION Every example I have seen has been Exotic in multiple ways

  18. BAYESIAN OPTIMIZATION
     • I've been working with several clients who are very motivated to use BO
     • The level of basic knowledge in industry about BO & GaSP modeling is low
     • Existing material is (mostly) oriented towards academic research
     • The goal of this workshop is to come away with an applied understanding of the essential concepts and tools of BO for use in product/process design

  19. BAYESIAN OPTIMIZATION (POTENTIAL) ADVANTAGES
     • Incorporates the goals, model predictions, and uncertainty into the choice of the next run (it learns from the responses)
     • Potentially speeds up innovation (by using data efficiently)
     • Feels more natural than linear-model augmentation
     • Gives clearer guidance on when we can stop experimenting
     • Cold start: the initial data/design can be very small, so startup costs are low
     • Warm start: the data are often messy, and BO tells you where to fill the gaps
     • Using ML models could ultimately simplify training (fewer linear statistical models)

  20. BAYESIAN OPTIMIZATION (POTENTIAL) DISADVANTAGES
     • The methodology for common types of exoticness is not mature
     • I know of several companies that have spent $100k-$250k on consulting for single BO projects, and all have failed miserably
     • Diagnostics for when things are going wrong, and what to do about it, are not mature (how do we know the existing data are relevant?)
     • Gaussian Process models are often the base learner; these are numerically slow and temperamental
     • Inference is not the focus (but uncertainty quantification still is)

  21. THIS WORKSHOP We will be focusing on Basic BO, but with error in y:
     • A single continuous response
     • Continuous inputs
     • The modeling family for BO will be Gaussian Process (GaSP) regression
     • GaSP models are flexible and have a convenient, natural extension to batch augmentation
     • GaSP interpretability is less than that of linear models, but higher than pure ML models like NNs and tree-based models
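A minimal sketch of fitting a GaSP model when y is observed with error, using scikit-learn's Gaussian process regression with a white-noise (nugget) term; the kernel choice and the simulated data are illustrative assumptions, not the estimation approach planned for JMP Pro.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 3.0, size=(15, 2))                        # two continuous inputs
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 15)   # response observed with error

# The RBF term captures the smooth signal; the WhiteKernel term estimates
# the nugget (random error in y).
kernel = RBF(length_scale=[1.0, 1.0]) + WhiteKernel(noise_level=0.01)
gasp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

x_new = np.array([[1.5, 2.0]])
mu, sd = gasp.predict(x_new, return_std=True)                  # prediction and its uncertainty
print(mu, sd)
```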

  22. JMP Pro 19 Roadmap
     • Empower SMEs in industrial R&D (like 6σ Black Belts) to design products and processes with BO efficiently, effectively, and safely
     • Decision process: comprehensible, transparent, reproducible, trustworthy, and easily trainable to non-expert, intermittent users
     • Handling of multiple responses with complex sparsity and different goals (including matching targets)
     • Batch augmentation
     • Faster and more robust estimation with a prediction orientation, with sanity priors on the nugget and GaSP parameters

  23. WE ARE AT THE BEGINNING
     • GaSP/BO fits squarely in the industrial product/process innovation space
     • The techniques are squarely in the realm of statistical modeling and design (a GaSP is both a neural network and a linear mixed model, and the augmentation principles are closely related to G-optimality)
     • This will have a transformative, disruptive effect; if it isn't a total revolution, it will occupy an important place in the future
     • This is a call to action for the industrial statistics community: if we don't take the lead, others will, and already are
