Abstract

Model validation is the process of determining the degree to which a model is an accurate representation of the real world for its intended use. The results of a model validation study can be used either to quantify the model-form uncertainty or to improve/calibrate the model. The model validation process becomes complex when there is uncertainty in the simulation and/or experimental outcomes. These uncertainties can take the form of aleatory uncertainties due to randomness or epistemic uncertainties due to lack of knowledge. Five approaches are used for addressing model validation and predictive capability: (1) the area validation metric (AVM), (2) a modified area validation metric (MAVM) with confidence intervals, (3) the validation uncertainty procedure from ASME V&V 20, (4) a calibration procedure interpreted from ASME V&V 20, and (5) identification of the model discrepancy term using Bayesian estimation. To provide an unambiguous assessment of these approaches, synthetic experimental data are generated from computational fluid dynamics simulations of an airfoil with a flap. A simplified model is then developed using thin airfoil theory, and its accuracy is assessed against the synthetic experimental data. The quantities examined are the two-dimensional lift and moment coefficients for the airfoil at varying angles of attack and flap deflection angles. Each approach is assessed for its ability to tightly encapsulate the true value, both at conditions where experimental results are provided and at prediction locations where no experimental data are available. The MAVM generally performed best in cases with sparse data and/or large extrapolations, while Bayesian estimation outperformed the other approaches when an extensive amount of experimental data covering the application domain was available.
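As a rough illustration of the first approach named above: the area validation metric quantifies model-form uncertainty as the area between the cumulative distribution function of the simulation outcomes and the empirical CDF of the experimental outcomes. The sketch below approximates that area from samples of each; it is an illustrative implementation under that definition, not code from the study, and the function and variable names are the author's own.

```python
import numpy as np

def area_validation_metric(model_samples, exp_samples):
    """Approximate the area validation metric: the area between the
    empirical CDFs of model and experimental outcomes, estimated
    from the two sample sets."""
    # Union of support points where either step-function CDF can change
    xs = np.sort(np.concatenate([model_samples, exp_samples]))
    # Empirical CDF values just to the right of each support point
    F_model = np.searchsorted(np.sort(model_samples), xs, side="right") / len(model_samples)
    F_exp = np.searchsorted(np.sort(exp_samples), xs, side="right") / len(exp_samples)
    # Both CDFs are constant between consecutive support points, so the
    # integral of |F_model - F_exp| is an exact sum of rectangles
    widths = np.diff(xs)
    return float(np.sum(np.abs(F_model - F_exp)[:-1] * widths))
```

For two identical sample sets the metric is zero, and for two degenerate distributions it reduces to the distance between their locations, which matches the intuition of the metric as an average disagreement in the units of the quantity of interest.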
