There has recently been intense interest in the verification and validation of large-scale simulations and in quantifying uncertainty, and several workshops have been organized to address this subject. Characterization of uncertainty is a complex subject in general, but it can be roughly divided into numerical uncertainty and physical uncertainty. The former includes spatiotemporal discretization errors, errors in numerical boundary conditions (e.g., outflow), errors in solvers or in the geometry description, etc. Physical uncertainty, on the other hand, includes errors due to unknown boundary and initial conditions, imprecise transport coefficients or interaction terms, insufficient knowledge of the flow geometry, approximate constitutive laws, etc. Coupled problems involving source and interaction terms tend to be particularly difficult to simulate even deterministically, so providing error bars for such solutions is an even more difficult task. Uncertainty can also be characterized as epistemic, i.e., reducible, or as irreducible. For example, if much finer simulations are performed, or if better experiments with higher-resolution instruments are conducted, more accurate boundary conditions will become available and the uncertainty level will be reduced. But even in such cases, and certainly in many simulations of realistic configurations, uncertainty is irreducible beyond some level due to insufficient detail in the input, e.g., background turbulence and random roughness. There are no absolutely quiet wind tunnels, and the ocean and the atmosphere are not quiet environments either.

Progress, however, has been made. With regard to numerical uncertainty, accuracy tests and error control have been employed in simulations for some time now, at least for the more modern discretizations. While fully adaptive simulations are limited to demonstration examples at the moment, the algorithmic framework and mesh generation technology exist for routine adaptive CFD in the near future. Also, a posteriori error bounds and other post-processing tools are available and are used (albeit not very often) in CFD. To this end, the editorial policy statement on the control of numerical accuracy that JFE pioneered in 1986, along with its more recent enhancements, has positively influenced the field.

With regard to physical uncertainty, it is only recently that a systematic effort has been made to address it. Most of the effort in CFD research so far has gone into developing efficient algorithms for different applications, assuming an ideal input with precisely defined computational domains. With the field now reaching some degree of maturity, we naturally pose the more general question of how to model uncertainty and stochastic input, and how to formulate algorithms so that the simulation output accurately reflects the propagation of uncertainty. That is, in addition to a posteriori error bounds, the new objective is to model uncertainty from the beginning of the simulation and not simply as an afterthought. Stochastic simulations have been carried out in solid mechanics for some time now, especially in the finite element community, so in that respect CFD is about ten years behind! But stochastic CFD simulations have begun, and the first results have provided very valuable information, as shown in the figure.

This special issue of JFE on quantifying uncertainty in CFD responds to such pressing needs: error bars in CFD, and reliable answers even for “nonsterilized” problems. The eight invited papers address both numerical accuracy and physical uncertainty issues. They present different techniques and sometimes diverse philosophies, as this is a new field and no absolute consensus exists at the moment.

The first paper, by Roache, presents a straightforward approach to the verification of codes using manufactured solutions. Roache has been a long-time advocate of verification and validation in CFD, and here he presents an overview and new clarifications of his previous work. Of particular interest is the blind study he reviews, in which his proposed method of manufactured solutions was employed to correct an intentionally sabotaged code.
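For readers unfamiliar with the technique, the idea can be sketched in a few lines (the notation below is mine, chosen for illustration, and is not taken from Roache's paper). One picks an arbitrary smooth “manufactured” solution, substitutes it into the governing operator to obtain the source term it implies, and then checks that the code, driven by that source, recovers the manufactured solution at the formal order of accuracy:
\[
S = \mathcal{L}\, u_M, \qquad \mathcal{L}_h\, u_h = S, \qquad \| u_h - u_M \| \le C\, h^{p},
\]
with \( u_M \) an arbitrary smooth manufactured solution (e.g., a product of trigonometric and exponential functions), \( \mathcal{L} \) the continuous operator, \( \mathcal{L}_h \) its discretization, \( h \) the grid size, and \( p \) the formal order of accuracy. Failure to observe the expected convergence rate signals a coding error.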

The second and third papers deal with numerical accuracy as well. In particular, the paper by Cadafalch et al. is a comprehensive study of solution errors in finite-volume methods, addressing laminar flows, turbulent flows with two-equation models, and reactive flows at steady state. Generalized Richardson extrapolation is employed, and the grid convergence index is used along with other metrics to provide local and global estimators. The next paper, by Pozrikidis, addresses the convergence of boundary element methods in the presence of sharp corners. This is a notoriously hard problem for any discretization (e.g., the Motz problem in finite elements), and the paper presents an effective way of distributing the elements in a geometric fashion to restore uniform convergence.
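As a rough reminder of the machinery involved (in notation of my own choosing rather than that of the paper), given fine- and coarse-grid solutions \( f_1 \) and \( f_2 \) obtained with refinement ratio \( r \) and observed order \( p \), Richardson extrapolation and the grid convergence index read approximately
\[
f_{\mathrm{exact}} \approx f_1 + \frac{f_1 - f_2}{r^{p} - 1}, \qquad
\mathrm{GCI} = F_s \, \frac{\left| (f_2 - f_1)/f_1 \right|}{r^{p} - 1},
\]
where \( F_s \) is a safety factor, commonly taken between 1.25 and 3 depending on the number of grids used. Local and global estimators of this general type are what provide the quantitative error assessment.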

The fourth paper, by DeVolder et al., presents a general framework for quantifying uncertainty in multiscale simulations. It is based on the Bayesian approach to statistical inference and presents methods for determining the Bayesian likelihood and for fast integration. The case of flow in porous media is addressed first, and shock wave dynamics is then considered. In particular, the stochastic Riemann problem is identified as a fundamental paradigm and analyzed in some detail.
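In its simplest form (the notation here is illustrative and not taken from the paper), the Bayesian framework combines a prior on the uncertain parameters \( \theta \) with a likelihood of the observed data \( d \) to obtain the posterior
\[
p(\theta \mid d) = \frac{p(d \mid \theta)\, p(\theta)}{\int p(d \mid \theta')\, p(\theta')\, d\theta'},
\]
and the fast integration mentioned above is, roughly speaking, devoted to evaluating such high-dimensional integrals efficiently.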

The next two papers address the modeling of physical uncertainty and its propagation through the simulation using polynomial chaos expansions. In particular, the paper by Ghanem and Hayek considers the dynamics of overland flow as a model problem. Ghanem pioneered the use of Wiener-Hermite expansions in modeling random input and solving stochastic partial differential equations. In this paper, the use of the Karhunen-Loeve expansion to represent stochastic input significantly reduces the dimensionality of the problem, thus requiring substantially less computational effort than Monte Carlo simulation. In the next paper, by Xiu et al., a generalized polynomial chaos approach is introduced that extends the work of Wiener to optimal representations of non-Gaussian distributions. A new class of polynomial functionals, the Askey family, is introduced, and the Wiener-Askey chaos is formulated for the Navier-Stokes equations. The method is used to model flow-structure interaction problems.
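To fix ideas (the notation is mine, not the authors'), a second-order random input field \( k(x,\omega) \) with known covariance kernel admits the Karhunen-Loeve representation, and the solution itself is expanded in a (generalized) polynomial chaos:
\[
k(x,\omega) \approx \bar{k}(x) + \sum_{i=1}^{M} \sqrt{\lambda_i}\, \phi_i(x)\, \xi_i(\omega), \qquad
u(x,t,\omega) \approx \sum_{i=0}^{P} \hat{u}_i(x,t)\, \Phi_i\!\left(\boldsymbol{\xi}(\omega)\right),
\]
where \( (\lambda_i, \phi_i) \) are eigenpairs of the covariance kernel, the \( \xi_i \) are uncorrelated random variables, and the \( \Phi_i \) are orthogonal polynomials: Hermite polynomials in Wiener's original construction, or members of the Askey family matched to the input distribution in the generalized approach. A Galerkin projection then yields a coupled set of deterministic equations for the coefficients \( \hat{u}_i(x,t) \); truncating the Karhunen-Loeve expansion at a modest \( M \) keeps the number of random dimensions, and hence the cost, manageable.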

The seventh paper, by Putko et al., addresses robust design, with the uncertainties in the input incorporated into the optimization procedure. Specifically, the approximate statistical moment method is employed for uncertainty propagation, and statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. The method is applied to the shape optimization of a nozzle using a one-dimensional Euler code.
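The first-order version of such a moment method can be sketched as follows (again, the notation is mine): if \( f \) is an output of interest and the \( b_i \) are uncertain inputs with means \( \bar{b}_i \) and standard deviations \( \sigma_{b_i} \), assumed independent here for simplicity, then
\[
\mu_f \approx f(\bar{b}), \qquad
\sigma_f^2 \approx \sum_i \left( \frac{\partial f}{\partial b_i} \bigg|_{\bar{b}} \right)^{\!2} \sigma_{b_i}^2 ,
\]
so that the sensitivity derivatives supplied by the flow solver propagate the input variances directly into the objective function and constraints of the robust design problem.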

The final paper, by Prud’homme et al., is more mathematical and is one of the first attempts to rigorously quantify uncertainty in simulations based on reduced-basis representations. Specifically, this new method involves the a priori generation of several solutions corresponding to different values of the uncertain parameter, followed by a fast solution at a specified parameter value based on a projection onto the pre-computed solution space. This approach is demonstrated for heat conduction problems, but a formulation for the convection-diffusion problem is also discussed.
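Schematically (in notation that is mine rather than the authors'), if \( u(\mu_1), \dots, u(\mu_N) \) are pre-computed solutions at selected values of the uncertain parameter \( \mu \), the reduced-basis approximation at a new value is sought in their span,
\[
u_N(\mu) = \sum_{n=1}^{N} \alpha_n(\mu)\, u(\mu_n),
\]
with the coefficients \( \alpha_n(\mu) \) determined by Galerkin projection of the governing equations onto this low-dimensional space; the rigor referred to above comes, as I understand it, from a posteriori bounds on the error of this approximation.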

I hope that the readers of JFE enjoy these exciting papers collected in this volume, as well as similar future publications that address uncertainty quantification. I would like to thank all the authors and referees for their contributions, and also the editor, Joe Katz, for recognizing how timely and important this subject is.