Over the past twenty years, computer-based simulation codes have emerged as the leading tools for assessing the risks of severe events such as fire. The results of such simulation codes are usually used to estimate the magnitude, frequency, and consequences of hazards. A typical simulation code comprises many different sub-models, each characterizing a physical or chemical process that contributes to exposure to the hazard or to the occurrence of adverse failure events. The final prediction made by such codes can be temporal, spatial, or simply a single point estimate of the measure of interest. These predictions are subject to several contributing uncertainties, including uncertainty about the inputs to the code, uncertainty in the sub-models used within the code, and uncertainty in the parameters of any probabilistic models used in the code to characterize (e.g., validate) code outputs. A primary way to measure model uncertainty is to perform independent experiments and assess the conformance of the models to the resulting observations. The experimental results themselves may also involve uncertainties, for example due to measurement error and limited instrument precision. In this research, experimental data collected as part of the Fire Model Verification and Validation effort [1] are used to characterize the share of model uncertainty in the total code output uncertainty when experimental data are compared with code predictions. In this case, the uncertainty of the experiments (e.g., due to sensor or material variability) is assumed to be available from independent sources. The outcome of this study is a probabilistic estimate of the uncertainty associated with the model and the corresponding uncertainty in the predictions made by the simulation code.
A Bayesian framework was developed in this research to assess fire model prediction uncertainties in light of uncertain experimental observations. The complexity of the resulting Bayesian inference equations was overcome by adopting a Markov Chain Monte Carlo (MCMC) simulation technique. This paper presents the Bayesian framework, examples of its use in assessing fire model uncertainties, and a discussion of how the results can be used in risk-informed analyses.
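To illustrate the kind of computation the abstract describes, the sketch below infers a code-to-experiment bias factor with a Metropolis-Hastings MCMC sampler. This is a minimal illustration, not the paper's actual method or data: the paired prediction/measurement values are hypothetical, and the lognormal bias model and weak priors are assumptions made here for the example.

```python
import math
import random

random.seed(42)

# Hypothetical paired data: code predictions vs. experimental measurements
# (e.g., peak gas temperatures in degC). Values are illustrative only.
predicted = [310.0, 285.0, 402.0, 350.0, 298.0]
measured = [325.0, 270.0, 430.0, 365.0, 310.0]

# Assume the bias factor F = measured/predicted is lognormal:
# ln F ~ Normal(mu, sigma). Infer (mu, sigma) by MCMC.
log_f = [math.log(m / p) for m, p in zip(measured, predicted)]

def log_likelihood(mu, sigma):
    if sigma <= 0:
        return float("-inf")
    return sum(-math.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2
               for x in log_f)

def log_prior(mu, sigma):
    # Weak priors: mu ~ Normal(0, 1), sigma ~ half-Normal(1).
    if sigma <= 0:
        return float("-inf")
    return -0.5 * mu ** 2 - 0.5 * sigma ** 2

# Random-walk Metropolis-Hastings sampler.
mu, sigma = 0.0, 0.5
samples = []
for i in range(20000):
    mu_p = mu + random.gauss(0, 0.05)
    sigma_p = sigma + random.gauss(0, 0.05)
    log_accept = (log_likelihood(mu_p, sigma_p) + log_prior(mu_p, sigma_p)
                  - log_likelihood(mu, sigma) - log_prior(mu, sigma))
    if math.log(random.random()) < log_accept:
        mu, sigma = mu_p, sigma_p
    if i >= 5000:  # discard burn-in
        samples.append((mu, sigma))

mu_hat = sum(s[0] for s in samples) / len(samples)
print(f"posterior mean of ln(bias factor): {mu_hat:.3f}")
```

The posterior samples of (mu, sigma) characterize both the systematic bias of the code and the spread of the model uncertainty, which can then be propagated into the code's predictions for risk-informed analyses.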
17th International Conference on Nuclear Engineering
July 12–16, 2009
Brussels, Belgium
Conference Sponsors:
- Nuclear Engineering Division
ISBN:
978-0-7918-4352-9
PROCEEDINGS PAPER
A Bayesian Framework for Model Uncertainty Considerations in Fire Simulation Codes
Mohammadreza Azarkhail
University of Maryland, College Park, MD
Victor Ontiveros
University of Maryland, College Park, MD
Mohammad Modarres
University of Maryland, College Park, MD
Paper No:
ICONE17-75684, pp. 649-656; 8 pages
Published Online:
February 25, 2010
Citation
Azarkhail, M, Ontiveros, V, & Modarres, M. "A Bayesian Framework for Model Uncertainty Considerations in Fire Simulation Codes." Proceedings of the 17th International Conference on Nuclear Engineering. Volume 2: Structural Integrity; Safety and Security; Advanced Applications of Nuclear Technology; Balance of Plant for Nuclear Applications. Brussels, Belgium. July 12–16, 2009. pp. 649-656. ASME. https://doi.org/10.1115/ICONE17-75684