Affiliations:
1. Validation and Uncertainty Estimation Department, MS 0828, Sandia National Laboratories, PO Box 5800, Albuquerque NM 87185-0828; wloberk@sandia.gov
2. Optimization and Uncertainty Estimation Department, Sandia National Laboratories, Albuquerque NM; tgtruca@sandia.gov
3. Department of Fluid Mechanics, Vrije Universiteit Brussel, Brussels, Belgium; hirsch@stro.vub.ac.be
Abstract
Developers of computer codes, analysts who use the codes, and decision makers who rely on the results of the analyses face a critical question: How should confidence in modeling and simulation be critically assessed? Verification and validation (V&V) of computational simulations are the primary methods for building and quantifying this confidence. Briefly, verification is the assessment of the accuracy of the solution to a computational model. Validation is the assessment of the accuracy of a computational simulation by comparison with experimental data. In verification, the relationship of the simulation to the real world is not an issue. In validation, the relationship between computation and the real world, ie, experimental data, is the issue. This paper presents our viewpoint of the state of the art in V&V in computational physics. (In this paper we refer to all fields of computational engineering and physics, eg, computational fluid dynamics, computational solid mechanics, structural dynamics, shock wave physics, computational chemistry, etc, as computational physics.) We describe our view of the framework in which predictive capability relies on V&V, as well as other factors that affect predictive capability. Our opinions about the research needs and management issues in V&V are very practical: What methods and techniques need to be developed and what changes in the views of management need to occur to increase the usefulness, reliability, and impact of computational physics for decision making about engineering systems? 
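To make the verification/validation distinction concrete, the idea of quantitatively comparing a computation with experimental data can be sketched as follows. This is a minimal illustration, not the specific metrics developed in the article: it assumes a single scalar system response, repeated experimental measurements with approximately normal scatter, and an illustrative function name; it simply standardizes the computation-experiment difference by the standard error of the experimental mean.

```python
import math
import statistics

def validation_metric(sim, exp_replicates):
    """Standardized difference between a simulation prediction and experiment.

    sim            -- simulation prediction of one scalar response quantity
    exp_replicates -- repeated experimental measurements of that quantity

    Returns |sim - exp_mean| / (s / sqrt(n)): a value near 0 indicates the
    computation agrees with the experiment to within measurement uncertainty,
    while large values flag disagreement that the model must account for.
    """
    exp_mean = statistics.mean(exp_replicates)
    s = statistics.stdev(exp_replicates)          # sample standard deviation
    n = len(exp_replicates)
    return abs(sim - exp_mean) / (s / math.sqrt(n))
```

For example, a prediction of 10.0 against measurements [9.8, 10.1, 10.2, 9.9] yields a metric near zero, whereas a prediction of 12.0 against the same data yields a large value. Real validation metrics, as discussed later in the article, must also propagate the nondeterministic character of the simulation itself, not only experimental scatter.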
We review the state of the art in V&V over a wide range of topics, for example, prioritization of V&V activities using the Phenomena Identification and Ranking Table (PIRT), code verification, software quality assurance (SQA), numerical error estimation, hierarchical experiments for validation, characteristics of validation experiments, the need to perform nondeterministic computational simulations in comparisons with experimental data, and validation metrics. We then provide an extensive discussion of V&V research and implementation issues that we believe must be addressed for V&V to be more effective in improving confidence in computational predictive capability. Some of the research topics addressed are development of improved procedures for the use of the PIRT for prioritizing V&V activities, the method of manufactured solutions for code verification, development and use of hierarchical validation diagrams, and the construction and use of validation metrics incorporating statistical measures. Some of the implementation topics addressed are the needed management initiatives to better align and team computationalists and experimentalists in conducting validation activities, the perspective of commercial software companies, the key role of analysts and decision makers as code customers, obstacles to the improved effectiveness of V&V, effects of cost and schedule constraints on practical applications in industrial settings, and the role of engineering standards committees in documenting best practices for V&V. There are 207 references cited in this review article.
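The method of manufactured solutions mentioned above can be illustrated with a small self-contained sketch (our own example, not code from the article): choose an exact solution u(x) = sin(πx) for the model problem -u'' = f on [0, 1], substitute it to manufacture the source term f = π² sin(πx), solve the discretized problem on two grids, and confirm that the observed order of accuracy matches the formal second order of the central-difference scheme.

```python
import math

def mms_error(n):
    """Max discretization error for -u'' = f with the manufactured
    solution u(x) = sin(pi x), u(0) = u(1) = 0, on an n-cell grid."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    # Manufactured source term: substituting u into -u'' gives f.
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]
    # Second-order central difference: 2u_i - u_{i-1} - u_{i+1} = h^2 f_i,
    # a tridiagonal system in the interior unknowns u_1 .. u_{n-1},
    # solved here with the Thomas algorithm.
    m = n - 1
    a, b, c = [-1.0] * m, [2.0] * m, [-1.0] * m   # sub-, main, super-diagonal
    d = [h * h * f[i + 1] for i in range(m)]
    for i in range(1, m):                          # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * m
    u[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):                 # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(u[i] - math.sin(math.pi * x[i + 1])) for i in range(m))

# Observed order of accuracy from two grid levels: should approach 2
# for a correctly implemented second-order scheme.
e_coarse, e_fine = mms_error(32), mms_error(64)
observed_order = math.log(e_coarse / e_fine) / math.log(2.0)
```

The point of the exercise is that the exact solution is known by construction, so coding mistakes reveal themselves as a degraded observed order of accuracy; an observed order well below 2 would indicate an error in the implementation rather than in the mathematics.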