Design optimization under uncertainty is notoriously difficult when the objective function is expensive to evaluate. State-of-the-art techniques, e.g., stochastic optimization or sample average approximation, fail to learn exploitable patterns from the collected data and require many objective function evaluations. There is a need for techniques that alleviate the high cost of information acquisition and select sequential simulations optimally. In the field of deterministic single-objective unconstrained global optimization, the Bayesian global optimization (BGO) approach has been relatively successful in addressing the information acquisition problem. BGO builds a probabilistic surrogate of the expensive objective function and uses it to define an information acquisition function (IAF) that quantifies the merit of making new objective evaluations. In this work, we reformulate the expected improvement (EI) IAF to filter out parametric and measurement uncertainties. The method bypasses the curse of dimensionality because it does not require learning the response surface as a function of the stochastic parameters. To increase robustness, we employ a fully Bayesian interpretation of Gaussian processes (GPs), constructing a particle approximation of the posterior of their hyperparameters using adaptive Markov chain Monte Carlo (MCMC). Our approach also quantifies the epistemic uncertainty in the location of the optimum and the optimal value that is induced by the limited number of objective evaluations used to obtain them. We verify and validate our approach by solving two synthetic optimization problems under uncertainty and demonstrate it by solving the oil-well placement problem (OWPP) with uncertainties in the permeability field and the oil price time series.
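To make the fully Bayesian BGO loop described above concrete, the following is a minimal sketch in Python of sequential optimization with an expected-improvement acquisition averaged over a particle approximation of the GP hyperparameter posterior. It is not the paper's implementation: the squared-exponential kernel, the weak Gaussian priors on the log-hyperparameters, the plain random-walk Metropolis sampler (standing in for the adaptive MCMC used in the paper), and the 1D noise-free toy objective are all assumptions made for illustration, and the reformulated EI that filters out parametric and measurement uncertainties is not modeled here.

```python
# Sketch of BGO with expected improvement (EI) averaged over MCMC hyperparameter
# particles. Illustrative only; does NOT implement the paper's uncertainty-filtering EI.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.stats import norm

def rbf_kernel(X1, X2, ell, s2):
    """Squared-exponential covariance with length-scale ell and signal variance s2."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return s2 * np.exp(-0.5 * d2 / ell ** 2)

def gp_posterior(Xs, X, y, theta, jitter=1e-8):
    """GP predictive mean and standard deviation at Xs for hyperparameters theta."""
    ell, s2, noise = theta
    K = rbf_kernel(X, X, ell, s2) + (noise + jitter) * np.eye(len(X))
    L = cho_factor(K, lower=True)
    Ks = rbf_kernel(Xs, X, ell, s2)
    mu = Ks @ cho_solve(L, y)
    var = s2 - np.sum(Ks * cho_solve(L, Ks.T).T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def log_post(phi, X, y, jitter=1e-8):
    """Log posterior of phi = log(ell, s2, noise): GP marginal likelihood
    plus a weak Gaussian prior on phi (an assumed choice)."""
    ell, s2, noise = np.exp(phi)
    K = rbf_kernel(X, X, ell, s2) + (noise + jitter) * np.eye(len(X))
    try:
        L = cho_factor(K, lower=True)
    except np.linalg.LinAlgError:
        return -np.inf
    ll = -0.5 * y @ cho_solve(L, y) - np.sum(np.log(np.diag(L[0])))
    return ll + np.sum(norm.logpdf(phi, 0.0, 2.0))

def mcmc_particles(X, y, n=200, step=0.2, seed=0):
    """Random-walk Metropolis on log-hyperparameters; a simple stand-in for adaptive MCMC."""
    rng = np.random.default_rng(seed)
    phi = np.log(np.array([0.5, 1.0, 1e-2]))
    lp = log_post(phi, X, y)
    samples = []
    for _ in range(n * 10):
        prop = phi + step * rng.normal(size=3)       # symmetric Gaussian proposal
        lp_prop = log_post(prop, X, y)
        if np.log(rng.uniform()) < lp_prop - lp:
            phi, lp = prop, lp_prop
        samples.append(np.exp(phi))
    return np.array(samples[::10])                   # thin to n hyperparameter particles

def averaged_ei(Xs, X, y, particles):
    """EI for minimization, averaged over hyperparameter particles (fully Bayesian acquisition)."""
    y_best = y.min()
    ei = np.zeros(len(Xs))
    for theta in particles:
        mu, sd = gp_posterior(Xs, X, y, theta)
        z = (y_best - mu) / sd
        ei += (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    return ei / len(particles)

# Usage: minimize a hypothetical 1D test objective with a handful of sequential evaluations.
f = lambda x: np.sin(3 * x[:, 0]) + 0.1 * x[:, 0] ** 2
X = np.random.default_rng(1).uniform(-2, 2, size=(4, 1))
y = f(X)
grid = np.linspace(-2, 2, 400)[:, None]
for it in range(10):
    particles = mcmc_particles(X, y)
    x_next = grid[np.argmax(averaged_ei(grid, X, y, particles))]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[None, :]))
print("best x:", X[np.argmin(y)], "best value:", y.min())
```

Averaging the acquisition over hyperparameter particles, rather than plugging in a single maximum-likelihood estimate, is what makes the loop robust to the limited data available in the early iterations.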
