Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, progress in translating modeling research to patient care has been limited. One major difficulty that often arises with biomedical computational models is the inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes clinically relevant in vivo predictions, direct validation of those predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process make it challenging to evaluate the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. Here, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. The proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.
