Verification, validation, and predictive capability in computational engineering and physics

Author:

Oberkampf, William L. (1); Trucano, Timothy G. (2); Hirsch, Charles (3)

Affiliation:

1. Validation and Uncertainty Estimation Department, MS 0828, Sandia National Laboratories, PO Box 5800, Albuquerque NM 87185-0828; wloberk@sandia.gov

2. Optimization and Uncertainty Estimation Department, Sandia National Laboratories, Albuquerque NM; tgtruca@sandia.gov

3. Department of Fluid Mechanics, Vrije Universiteit Brussel, Brussels, Belgium; hirsch@stro.vub.ac.be

Abstract

Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, ie, experimental data, is the issue. This paper presents our viewpoint of the state of the art in V&V in computational physics. (In this paper we refer to all fields of computational engineering and physics, eg, computational fluid dynamics, computational solid mechanics, structural dynamics, shock wave physics, computational chemistry, etc, as computational physics.) We describe our view of the framework in which predictive capability relies on V&V, as well as other factors that affect predictive capability. Our opinions about the research needs and management issues in V&V are very practical: What methods and techniques need to be developed and what changes in the views of management need to occur to increase the usefulness, reliability, and impact of computational physics for decision making about engineering systems? 
We review the state of the art in V&V over a wide range of topics, for example, prioritization of V&V activities using the Phenomena Identification and Ranking Table (PIRT), code verification, software quality assurance (SQA), numerical error estimation, hierarchical experiments for validation, characteristics of validation experiments, the need to perform nondeterministic computational simulations in comparisons with experimental data, and validation metrics. We then provide an extensive discussion of V&V research and implementation issues that we believe must be addressed for V&V to be more effective in improving confidence in computational predictive capability. Some of the research topics addressed are development of improved procedures for the use of the PIRT for prioritizing V&V activities, the method of manufactured solutions for code verification, development and use of hierarchical validation diagrams, and the construction and use of validation metrics incorporating statistical measures. Some of the implementation topics addressed are the needed management initiatives to better align and team computationalists and experimentalists in conducting validation activities, the perspective of commercial software companies, the key role of analysts and decision makers as code customers, obstacles to the improved effectiveness of V&V, effects of cost and schedule constraints on practical applications in industrial settings, and the role of engineering standards committees in documenting best practices for V&V. There are 207 references cited in this review article.
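As an illustration of the code-verification idea discussed above, the method of manufactured solutions (MMS) chooses an analytic solution, derives the forcing term that makes it exact, and then checks that the code's observed order of accuracy matches the theoretical order of the discretization. The following is a minimal sketch we constructed for a 1D Poisson solver; the chosen manufactured solution u_m(x) = sin(πx), the function names, and the grid sizes are our own assumptions for illustration, not a procedure given in the paper.

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0 using 2nd-order central
    differences; f is manufactured from u_m(x) = sin(pi x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)           # interior grid nodes
    f = np.pi**2 * np.sin(np.pi * x)       # forcing derived from -u_m''
    # Tridiagonal operator: (-u_{i-1} + 2 u_i - u_{i+1}) / h^2 = f_i
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f)
    err = np.max(np.abs(u - np.sin(np.pi * x)))  # error vs manufactured solution
    return h, err

# Systematic grid refinement: the observed order should approach the
# theoretical order (2) if the code is free of order-reducing mistakes.
h1, e1 = solve_poisson(32)
h2, e2 = solve_poisson(64)
p_obs = np.log(e1 / e2) / np.log(h1 / h2)
print(f"observed order of accuracy: {p_obs:.2f}")
```

An observed order well below the theoretical one on sufficiently fine grids is the signal MMS is designed to produce: it indicates a coding mistake (or an unintended loss of accuracy) even though the computed fields may look plausible.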

Publisher

ASME International

Subject

Mechanical Engineering

References (207 articles; first 5 shown).

1. DoD (1994), DoD directive No 5000.59: Modeling and Simulation (M&S) Management, Defense Modeling and Simulation Office, Office of the Director of Defense Research and Engineering.

2. DoD (1996), Verification, Validation, and Accreditation (VV&A) Recommended Practices Guide, Defense Modeling and Simulation Office, Office of the Director of Defense Research and Engineering.

3. Cohen ML, Rolph JE, and Steffey DL (eds) (1998), Statistics, Testing, and Defense Acquisition: New Approaches and Methodological Improvements, National Academy Press, Washington DC.

4. IEEE (1984), IEEE Standard Dictionary of Electrical and Electronics Terms, ANSI/IEEE Std 100-1984, New York.

5. ANS (1987), Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, Am Nuc Soc, ANSI/ANS-10.4-1987, La Grange Park IL.

Cited by 403 articles.
