Metamodels, or surrogate models, have been proposed in the literature to reduce the resources (time/cost) invested in the design and optimization of engineering systems whose behavior is modeled using complex computer codes, in an area commonly known as simulation-based design optimization. Following the seminal paper of Sacks et al. (1989, “Design and Analysis of Computer Experiments,” Stat. Sci., 4(4), pp. 409–435), researchers have developed the field of design and analysis of computer experiments (DACE), focusing on different aspects of the problem such as experimental design, approximation methods, model fitting, model validation, and metamodeling-based optimization methods. Among these, model validation remains a key issue, as the reliability and trustworthiness of the results depend greatly on the quality of approximation of the metamodel. Typically, model validation involves calculating prediction errors of the metamodel using a data set different from the one used to build the model. Due to the high cost associated with computer experiments with simulation codes, validation approaches that do not require additional data points (samples) are preferable. However, it is documented that methods based on resampling, e.g., cross validation (CV), can exhibit oscillatory behavior during sequential/adaptive sampling and model refinement, thus making it difficult to quantify the approximation capabilities of the metamodels and/or to define rational stopping criteria for the metamodel refinement process. In this work, we present the results of a simulation experiment conducted to study the evolution of several error metrics during sequential model refinement, to estimate prediction errors, and to define proper stopping criteria without requiring additional samples beyond those used to build the metamodels. 
Our results show that it is possible to accurately estimate the predictive performance of Kriging metamodels without additional samples, and that leave-one-out CV errors perform poorly in this context. Based on our findings, we propose guidelines for choosing the sample size of computer experiments that use a sequential/adaptive model refinement paradigm. We also propose a stopping criterion for sequential model refinement that does not require additional samples.
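To illustrate the leave-one-out CV error discussed above: the metamodel is refit n times, each time holding out one sample and predicting it from the remaining n-1 points, and the held-out prediction errors are aggregated into an error estimate. The sketch below is illustrative only, not the paper's method; `rbf_predict` is a hypothetical Gaussian-kernel interpolator used as a simplified stand-in for a Kriging predictor, and the test function and sample size are arbitrary choices.

```python
import numpy as np

def rbf_predict(X_train, y_train, X_test, length_scale=0.5):
    """Gaussian-kernel interpolator (a simplified stand-in for Kriging)."""
    def kernel(A, B):
        # pairwise squared Euclidean distances -> Gaussian correlations
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * length_scale ** 2))
    # small jitter on the diagonal for numerical stability
    K = kernel(X_train, X_train) + 1e-10 * np.eye(len(X_train))
    w = np.linalg.solve(K, y_train)
    return kernel(X_test, X_train) @ w

def loo_rmse(X, y, predict):
    """Leave-one-out CV RMSE: refit n times, each time predicting the
    held-out sample from the remaining n-1 points."""
    sq_errs = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        y_hat = predict(X[mask], y[mask], X[i:i + 1])
        sq_errs.append((y_hat[0] - y[i]) ** 2)
    return np.sqrt(np.mean(sq_errs))

# toy 1-D "computer experiment": f(x) = sin(2*pi*x) sampled at 8 points
X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
print(loo_rmse(X, y, rbf_predict))
```

Because each fold reuses the existing design points, this estimate costs no additional simulation runs, which is exactly why it is attractive for expensive computer experiments despite the oscillatory behavior reported here.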

References
1. Queipo, N., Haftka, R., Shyy, W., Goel, T., Vaidyanathan, R., and Tucker, P., 2005, "Surrogate-Based Analysis and Optimization," Prog. Aerospace Sci., 41(1), pp. 1–28. doi:10.1016/j.paerosci.2005.02.001
2. Sacks, J., Welch, W., Mitchell, T., and Wynn, H., 1989, "Design and Analysis of Computer Experiments," Stat. Sci., 4(4), pp. 409–435. doi:10.1214/ss/1177012413
3. Wang, G., and Shan, S., 2007, "Review of Metamodeling Techniques in Support of Engineering Design Optimization," ASME J. Mech. Des., 129(4), pp. 370–380. doi:10.1115/1.2429697
4. Forrester, A., and Keane, A., 2009, "Recent Advances in Surrogate-Based Optimization," Prog. Aerospace Sci., 45(1–3), pp. 50–79. doi:10.1016/j.paerosci.2008.11.001
5. McKay, M., Beckman, R., and Conover, W., 1979, "A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output From a Computer Code," Technometrics, 21(2), pp. 239–245.
6. Jones, D., Schonlau, M., and Welch, W., 1998, "Efficient Global Optimization of Expensive Black-Box Functions," J. Global Optim., 13(4), pp. 455–492. doi:10.1023/A:1008306431147
7. Sasena, M., Papalambros, P., and Goovaerts, P., 2002, "Exploration of Metamodeling Sampling Criteria for Constrained Global Optimization," Eng. Optim., 34(3), pp. 263–278. doi:10.1080/03052150211751
8. Jin, R., Chen, W., and Simpson, T., 2001, "Comparative Studies of Metamodeling Techniques Under Multiple Modeling Criteria," Struct. Multidisc. Optim., 23(1), pp. 1–13. doi:10.1007/s00158-001-0160-4
9. Goel, T., Haftka, R., and Shyy, W., 2009, "Comparing Error Estimation Measures for Polynomial and Kriging Approximation of Noise-Free Functions," Struct. Multidisc. Optim., 38(5), pp. 429–442. doi:10.1007/s00158-008-0290-z
10. Goel, T., and Stander, N., 2009, "Comparing Three Error Criteria for Selecting Radial Basis Function Network Topology," Comput. Methods Appl. Mech. Eng., 198(27), pp. 2137–2150. doi:10.1016/j.cma.2009.02.016
11. Viana, F., Picheny, V., and Haftka, R., 2010, "Using Cross-Validation to Design Conservative Surrogates," AIAA J., 48(10), pp. 2286–2298. doi:10.2514/1.J050327
12. Viana, F., Haftka, R., and Steffen, V., 2009, "Multiple Surrogates: How Cross-Validation Errors Can Help Us to Obtain the Best Predictor," Struct. Multidisc. Optim., 39(4), pp. 439–457. doi:10.1007/s00158-008-0338-0
13. Viana, F., and Haftka, R., 2009, "Cross Validation Can Estimate How Well Prediction Variance Correlates With Error," AIAA J., 47(9), pp. 2266–2270. doi:10.2514/1.42162
14. Loeppky, J., Sacks, J., and Welch, W., 2009, "Choosing the Sample Size of a Computer Experiment: A Practical Guide," Technometrics, 51(4), pp. 366–376. doi:10.1198/TECH.2009.08040
15. Bischl, B., Mersmann, O., and Trautmann, H., 2010, "Resampling Methods in Model Validation," Workshop on Experimental Methods for the Assessment of Computational Systems (WEMACS 2010), held in conjunction with the International Conference on Parallel Problem Solving From Nature (PPSN 2010), Krakow, Poland, Sept. 11, p. 14.
16. Martin, J., 2009, "Computational Improvements to Estimating Kriging Metamodel Parameters," ASME J. Mech. Des., 131(8), p. 084501. doi:10.1115/1.3151807
17. Lophaven, S., Nielsen, H., and Sondergaard, J., 2002, "Aspects of the MATLAB Toolbox DACE," Technical Report No. IMM-REP-2002-13.
18. Fang, K., Li, R., and Sudjianto, A., 2006, Design and Modeling for Computer Experiments, Chapman & Hall/CRC, Boca Raton, FL.
19. Jin, R., Chen, W., and Sudjianto, A., 2002, "On Sequential Sampling for Global Metamodeling in Engineering Design," ASME Paper No. DETC2002/DAC-34092. doi:10.1115/DETC2002/DAC-34092
20. Lin, Y., Mistree, F., Allen, J., Tsui, K., and Chen, V., 2004, "A Sequential Exploratory Experimental Design Method: Development of Appropriate Empirical Models in Design," ASME Paper No. DETC2004-57527. doi:10.1115/DETC2004-57527
21. Romero, D., Amon, C., and Finger, S., 2006, "On Adaptive Sampling for Single and Multi-Response Bayesian Surrogate Models," ASME Paper No. DETC2006-99210. doi:10.1115/DETC2006-99210
22. Wang, G., and Shan, S., 2006, "Review of Metamodeling Techniques in Support of Engineering Design Optimization," ASME Paper No. DETC2006-99412. doi:10.1115/DETC2006-99412
23. Osio, I. G., 1996, "Multistage Bayesian Surrogates and Optimal Sampling for Engineering Design and Process Improvement," Ph.D. thesis, Carnegie Mellon University, Pittsburgh, PA.
24. Romero, D., Amon, C., and Finger, S., 2012, "Multiresponse Metamodeling in Simulation-Based Design Applications," ASME J. Mech. Des., 134(9), p. 091001. doi:10.1115/1.4006996
25. Shewry, M., and Wynn, H., 1987, "Maximum Entropy Sampling," J. Appl. Stat., 14(2), pp. 165–170. doi:10.1080/02664768700000020
26. Sacks, J., Schiller, S., and Welch, W., 1989, "Designs for Computer Experiments," Technometrics, 31(1), pp. 41–47. doi:10.1080/00401706.1989.10488474
27. Santner, T., Williams, B., and Notz, W., 2003, The Design and Analysis of Computer Experiments (Springer Series in Statistics), Springer-Verlag, New York.
28. Martin, J., and Simpson, T., 2004, "On the Use of Kriging Models to Approximate Deterministic Computer Models," ASME Paper No. DETC2004-57300. doi:10.1115/DETC2004-57300
29. Emmerich, M., Giannakoglou, K., and Naujoks, B., 2006, "Single- and Multi-Objective Evolutionary Optimization Assisted by Gaussian Random Field Metamodels," IEEE Trans. Evol. Comput., 10(4), pp. 421–439. doi:10.1109/TEVC.2005.859463
30. Ginsbourger, D., Le Riche, R., and Carraro, L., 2007, "A Multi-Points Criterion for Deterministic Parallel Global Optimization Based on Kriging," International Conference on Nonconvex Programming (NCP07), National Institute for Applied Sciences, Rouen, France, Dec. 17–21.
31. Ponweiser, W., Wagner, T., and Vincze, M., 2008, "Clustered Multiple Generalized Expected Improvement: A Novel Infill Sampling Criterion for Surrogate Models," IEEE Congress on Evolutionary Computation, Hong Kong, June 1–6, pp. 3515–3522. doi:10.1109/CEC.2008.4631273
32. Wasserman, L., 2006, All of Nonparametric Statistics (Springer Texts in Statistics), Springer-Verlag, New York.
33. Meckesheimer, M., Booker, A., Barton, R., and Simpson, T., 2002, "Computationally Inexpensive Metamodel Assessment Strategies," AIAA J., 40(10), pp. 2053–2060. doi:10.2514/2.1538
34. Wasserman, L., 2006, All of Statistics: A Concise Course in Statistical Inference (Springer Texts in Statistics), Springer-Verlag, New York.
35. Trosset, M., and Padula, A., 2000, "Designing and Analyzing Computational Experiments for Global Optimization," Technical Report No. TR00-25.
36. Burnham, K., and Anderson, D., 2004, "Multimodel Inference: Understanding AIC and BIC in Model Selection," Sociolog. Meth. Res., 33(2), pp. 261–304.
37. Akaike, H., 1974, "A New Look at the Statistical Model Identification," IEEE Trans. Autom. Control, AC-19(6), pp. 716–723. doi:10.1109/TAC.1974.1100705