2018 V&V Program
CONFERENCE
May 16 – 18, 2018
Hyatt Regency Minneapolis, Minneapolis, MN
www.asme.org/events/vandv
Our goal is to provide you and your fellow engineers and scientists—who might normally never cross paths—with the unique opportunity to interact: by exchanging ideas and methods for verification of codes and solutions, simulation validation, and assessment of uncertainties in mathematical models, computational solutions, and experimental data.

The presentations have been organized both by application field and by technical goal and approach. We are pleased that you are here with us and your colleagues to share verification and validation methods, approaches, successes and failures, and ideas for the future.

Thanks again for attending. We look forward to your valued participation.

Sincerely,
Ryan Crane
ASME

Program Committee:
Scott Doebling, Los Alamos National Laboratory
Kevin Dowding, Sandia National Laboratories
Luis Eça, IST
Tina Morrison, U.S. Food and Drug Administration
Christopher Roy, Virginia Tech
Contents
GENERAL INFORMATION......................................................................................................................4
WORKSHOPS ............................................................................................................................................ 6
SPONSORS ..............................................................................................................................................56
NOTES ........................................................................................................................................................58
General Information
ACKNOWLEDGEMENT
The Verification and Validation Symposium is sponsored by ASME. All technical sessions and conference
events will take place at the Hyatt Regency Minneapolis. Please check the schedule for event times and locations.
HOTEL BUSINESS SERVICES
The business center is located on the first floor in the lobby.
Business hours are:

TAX DEDUCTIBILITY
Expenses of attending a professional meeting, such as registration fees and costs of technical publications, are tax deductible as ordinary and necessary business expenses for U.S. citizens. Please note that tax code changes in recent years have affected the level of deductibility.

EMERGENCY
In case of an emergency in the hotel, pick up any house phone, which rings directly to Service Express. From there, the operator can dispatch assistance.
V&V 20 Subcommittee on Verification and Validation in Computational Fluid Dynamics and Heat Transfer, Great Lakes A1, 4th Floor

V&V 30 Subcommittee on Verification and Validation in Computational Simulation of Nuclear System Thermal Fluids Behavior, Great Lakes A3, 4th Floor

V&V 60 Computational Modeling in Energy Systems, Lake Harriet, 4th Floor

WEDNESDAY, MAY 16, 6:30PM–9:00PM
V&V Standards Committee on Verification and Validation in Computational Modeling and Simulation. A Networking Reception will follow, and all Symposium attendees are invited. Lake Harriet, 4th Floor

ASME V&V Standards Committee – Verification and Validation in Computational Modeling and Simulation
Interested applicants should contact Kate Hyam, [email protected]

ASME V&V 10 – Verification and Validation in Computational Solid Mechanics
Interested applicants should contact Michelle Pagano, [email protected]

ASME V&V 60 – Verification and Validation of Computational Modeling for Energy Systems
Interested applicants should contact Fred Constantino, [email protected]
Workshops
Presented by: Dr. William Oberkampf and Prof. Christopher Roy

This seminar presents modern terminology and effective procedures for verification of numerical simulations, validation of mathematical models, and an introduction to uncertainty quantification of nondeterministic simulations. The techniques presented in this course are applicable to a wide range of engineering and science applications, including fluid dynamics, heat transfer, solid mechanics, and structural dynamics.

AABME CONNECT: WHERE BIOMEDICINE AND ENGINEERING COME TOGETHER
(Space is limited – if you are interested in receiving an invitation, please email [email protected].)
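As an illustration of the kind of solution-verification procedure such a course covers (this sketch is not taken from the seminar itself; the discretization function `f` and grid spacings below are hypothetical), the observed order of accuracy can be computed from solutions on three systematically refined grids, and Richardson extrapolation then estimates the grid-converged value:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from three grids with constant refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the exact solution from the two finest grids and observed order p."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Manufactured example: discrete solution f(h) = 1.0 + 0.5*h**2,
# i.e., a second-order discretization error about the exact value 1.0.
f = lambda h: 1.0 + 0.5 * h**2
fc, fm, ff = f(0.4), f(0.2), f(0.1)          # grid spacings with ratio r = 2
p = observed_order(fc, fm, ff, 2.0)          # recovers the formal order, 2
est = richardson_extrapolate(fm, ff, 2.0, p) # recovers the exact value, 1.0
```

If the observed order disagrees with the formal order of the scheme, that discrepancy is itself a verification finding worth investigating before any validation comparison is attempted.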
PLEASE JOIN US
WEDNESDAY, MAY 16
HYATT REGENCY
MEETING 6:30PM – 7:30PM
GREAT LAKES B, 4TH FLOOR
RECEPTION 7:30PM – 9:00PM
GREAT LAKES C, 4TH FLOOR
09:00 AM – 05:00 PM V&V 10 Subcommittee on Verification and Validation in Computational Solid Mechanics
08:00 AM – 05:00 PM V&V 20 Subcommittee on Verification and Validation in Computational Fluid Dynamics and
Heat Transfer
10:25 AM to 12:30 PM 2-1 Development and Application of Verification and Validation Standards. Session 1
10:25 AM to 12:30 PM 4-1 Uncertainty Quantification, Sensitivity Analysis, and Prediction: Session 1
01:30 PM to 03:35 PM 2-2 Development and Application of Verification and Validation Standards. Session 2
04:00 PM to 06:05 PM 11-1 Verification for Fluid Dynamics and Heat Transfer
04:00 PM to 06:05 PM 8-1 Verification and Validation for Impact, Blast, and Material Response
10:25 AM to 12:30 PM 1-5 ASME 2018 V&V Verification, Validation, and Uncertainty Quantification: The Unanswered
Questions
10:25 AM to 12:30 PM 13-2 ASME V&V 40 Subcommittee -- Verification and Validation in Computational Modeling of
Medical Devices
01:30 PM to 03:35 PM 4-3 Uncertainty Quantification, Sensitivity Analysis, and Prediction: Session 3
01:30 PM to 03:35 PM 5-1 Validation for Fluid Dynamics and Heat Transfer
04:00 PM to 06:05 PM 1-2 V&V Benchmark Problem — Twin Jet Computational Fluid Dynamics (CFD) Numeric Model Validation
04:00 PM to 06:05 PM 9-1 Validation Methods for Solid Mechanics and Structures
10:30 AM to 12:35 PM 1-3 Industry Challenges in Uncertainty Quantification: Bridging the Gap Between Simulation
and Test
10:30 AM to 12:35 PM 4-2 Uncertainty Quantification, Sensitivity Analysis, and Prediction: Session 2
Schedule at a Glance
Session Chairs
TRACK-1 CHALLENGE PROBLEM WORKSHOPS AND PANEL SESSIONS

1-1 Unsteady Flow Workshop
Luis Eca, IST, Lisbon, Portugal
Michelle Pagano, ASME, New York, NY, United States

1-2 V&V Benchmark Problem — Twin Jet Computational Fluid Dynamics (CFD) Numeric Model Validation
Hyung Lee, Bettis Laboratory, West Mifflin, PA, United States
Richard Schultz, Consultant, Pocatello, ID, United States

1-3 Industry Challenges in Uncertainty Quantification: Bridging the Gap Between Simulation and Test
Mark Andrews, SmartUQ, Madison, WI, United States
Peter Chien, SmartUQ, Madison, WI, United States

1-4 Code Verification for Applicants in a Regulated Field
Marc Horner, ANSYS, Inc., Evanston, IL, United States
David Moorcroft, Federal Aviation Administration, Oklahoma City, OK, United States
Michelle Pagano, ASME, New York, NY, United States

1-5 ASME 2018 V&V Verification, Validation, and Uncertainty Quantification: The Unanswered Questions
Scott Doebling, Los Alamos National Lab, Los Alamos, NM, United States
Daniel Israel, Los Alamos National Lab, Los Alamos, NM, United States

TRACK-4 UNCERTAINTY QUANTIFICATION, SENSITIVITY ANALYSIS, AND PREDICTION

4-1 Uncertainty Quantification, Sensitivity Analysis, and Prediction: Session 1
Kyung Choi, University of Iowa, College of Engineering, Iowa City, IA, United States
Rafael Ruiz, Universidad de Chile, Santiago, Chile

4-2 Uncertainty Quantification, Sensitivity Analysis, and Prediction: Session 2
Ilias Bilionis, Purdue University, West Lafayette, IN, United States
Paul Gardner, University of Sheffield, Sheffield, United Kingdom

4-3 Uncertainty Quantification, Sensitivity Analysis, and Prediction: Session 3
Joseph Beck, Perceptive Engineering Analytics, LLC, Minneapolis, MN, United States
Kevin O'Flaherty, SmartUQ, Madison, WI, United States

TRACK-5 VALIDATION FOR FLUID DYNAMICS AND HEAT TRANSFER

5-1 Validation for Fluid Dynamics and Heat Transfer
V. Gregory Weirs, Sandia National Laboratories, Albuquerque, NM, United States
Kenneth Aycock, US Food and Drug Administration, Silver Spring, MD, United States
Donna Guillen, Idaho National Laboratory, Idaho Falls, ID, United States
William Oberkampf, W L Oberkampf Consulting, Georgetown, TX, United States

TRACK-9 VALIDATION METHODS FOR SOLID MECHANICS AND STRUCTURES

9-1 Validation Methods for Solid Mechanics and Structures
Zhong Hu, South Dakota State University, Brookings, SD, United States
Duane Cronin, University of Waterloo, Waterloo, ON, Canada

TRACK-13 VERIFICATION AND VALIDATION FOR BIOMEDICAL ENGINEERING

13-2 ASME V&V 40 Subcommittee — Verification and Validation in Computational Modeling of Medical Devices
Tina Morrison, Food and Drug Administration, Silver Spring, MD, United States
Marc Horner, ANSYS, Inc., Evanston, IL, United States
Plenary Sessions
WEDNESDAY, MAY 16 • 8:00AM – 10:00AM • GREAT LAKES B

PLENARY 1: VERIFICATION, VALIDATION, AND UNCERTAINTY QUANTIFICATION – ARE WE MAKING ANY PROGRESS?

Dr. Ben H. Thacker
Director of the Materials Engineering Department, Southwest Research Institute

Dr. Ben H. Thacker is the Director of the Materials Engineering Department at Southwest Research Institute. The department is quite diverse, with many active applied R&D projects in surface engineering, materials development, failure analysis, computational materials, additive manufacturing, environmental effects, biomechanics, life prediction, energy storage, mechanical performance, and uncertainty quantification. His technical contributions have primarily been in the development and implementation of advanced probabilistic methods, the application of probabilistic methods to high-consequence engineering problems, and model verification and validation (V&V). He has over 130 publications in these general areas. He is a Registered Professional Engineer in the State of Texas, a Fellow of AIAA, and a founding member of the ASME Subcommittee for V&V of Computational Solid Mechanics. He currently serves as Vice Chair of the ASME V&V Standards Committee. Dr. Thacker obtained his B.S. from Iowa State University, his M.S. from the University of Connecticut, and his Ph.D. from the University of Texas at Austin.

Abstract: A huge amount of R&D has been and continues to be focused on reducing the time and cost required to develop new materials, products and systems. Augmenting testing with modeling and simulation clearly enables this; however, that is the easy part. The much harder part is trusting model predictions. This is precisely what motivated the development of formal standards for model VVUQ. In addition to our ASME standards, NASA, AIAA, FAA, and USAF have all produced similar or complementary documents. We collectively should be proud of this progress because the body of knowledge represents a high level of understanding and consensus on VVUQ, but what impact has it had? Quantifying the confidence and predictive accuracy of a model is clearly important, but does it establish trust or credibility in the predictions? Are we making any progress? This presentation will review a few of the more salient points associated with UQ in V&V, and conclude with several real-world examples that utilized VVUQ concepts during the model development process.

PLENARY 2: VALIDATION AND PREDICTIVE CAPABILITY OF IMPERFECT MODELS WITH IMPRECISE DATA

Dr. Scott Ferson
Professor, University of Liverpool; Director of the Institute for Risk and Uncertainty

Scott Ferson is director of the Institute for Risk and Uncertainty at the University of Liverpool in the UK. For many years he was senior scientist at Applied Biomathematics and an adjunct professor at the School of Marine and Atmospheric Sciences at Stony Brook University. He was recently a visiting fellow at the Université de Technologie de Compiègne in France. He holds a Ph.D. in Ecology and Evolution from the State University of New York (SUNY) and an A.B. in biology from Wabash College. His professional interests include statistics when empirical information is very sparse, medical risks and population biology, and risk analysis. Ferson has five published books, ten commercially distributed software packages, and over a hundred scholarly publications, mostly in environmental risk analysis, uncertainty propagation, and conservation biology. He is a fellow of the Society for Risk Analysis and was recently named Distinguished Educator by the Society. Ferson has been the central figure in the development of probability bounds analysis, an approach to reliably computing with imprecisely specified probabilistic models. His research over the last decade, funded primarily by the National Institutes of Health, NASA, and Sandia National Laboratories, has focused on developing reliable mathematical and statistical tools for risk assessments and uncertainty analysis when empirical information is very sparse, including methods for quality assurance for Monte Carlo assessments, exact methods for detecting clusters in very small data sets, backcalculation methods for use in remediation planning, and distribution-free methods of risk analysis. He serves on editorial boards for several journals, and he has served on many expert panels in the United States and internationally.

Abstract: Many sophisticated models in engineering today incorporate randomness or stochasticity and make predictions in the form of probability distributions or other structures that express predictive uncertainty. Validation of such models must contend with observations that are often sparse or imprecise, or both. The predictive capability of these models, which determines what we can reliably infer from them, is assessed by whether and how closely the model can be shown to yield predictions conforming with available empirical observations beyond those data used in the model calibration process. Interestingly, a validation match between the model and data can be easier to establish when the predictions or observations are uncertain, but the model's predictive capability is degraded by either uncertainty. It is critical that measures used for validation and estimating predictive capability not confuse variability with lack of knowledge, but rather integrate these two kinds of uncertainties (sometimes denoted 'aleatory' and 'epistemic') in a way that leads to meaningful statements about the fit of the model to data and the reliability of predictions it generates.
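The distinction the abstract draws between variability and lack of knowledge can be sketched with a two-level Monte Carlo loop (a simplified illustration only, not Ferson's probability bounds analysis): the outer loop sweeps an epistemic interval on a distribution parameter, the inner loop samples aleatory scatter, and the answer is an interval of probabilities rather than a single number. All numbers here are hypothetical:

```python
import random

def failure_prob_bounds(mu_interval, sigma, threshold,
                        n_epistemic=200, n_aleatory=2000, seed=0):
    """Bound P(response > threshold) when the mean is known only to an
    interval (epistemic) while scatter about the mean is aleatory."""
    rng = random.Random(seed)
    lo, hi = 1.0, 0.0
    for i in range(n_epistemic):
        # outer loop: sweep candidate means across the epistemic interval
        mu = mu_interval[0] + (mu_interval[1] - mu_interval[0]) * i / (n_epistemic - 1)
        # inner loop: sample aleatory variability for this candidate mean
        p = sum(rng.gauss(mu, sigma) > threshold for _ in range(n_aleatory)) / n_aleatory
        lo, hi = min(lo, p), max(hi, p)
    return lo, hi

lo, hi = failure_prob_bounds(mu_interval=(0.9, 1.1), sigma=0.1, threshold=1.2)
# [lo, hi] brackets the failure probability; its width reflects lack of knowledge,
# which more data could narrow, unlike the aleatory scatter itself.
```

Averaging the two loops into a single one would collapse the interval to a point and hide exactly the epistemic uncertainty the abstract warns against discarding.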
Keynotes
THURSDAY, MAY 17 • 8:00AM – 10:00AM • GREAT LAKES B

PLENARY 3: STRATEGIC UTILIZATION OF MODELING TO IMPACT BD

Anita Bestelmeyer
Director, Corporate Computer-Aided Engineering (CAE)
BD Technologies and Innovation (BDTI)

Anita Bestelmeyer currently serves as the Director of BD's Corporate Computer-Aided Engineering (CAE) group in Research Triangle Park, NC at BD Technologies and Innovation (BDTI) and has been in this role since June 2012. The Corporate CAE group is the center of expertise in simulation, optimization, advanced prototyping, CT scanning, material testing coordination, web-enabled tools and exponential technologies. The group's mission is to deliver innovative, optimized and robust products to market faster.

Anita started her career at BD in 1991 and has taken on increasing levels of responsibility within the company over the last 26 years. Anita has been a champion in deploying CAE technologies to impact the company and successfully built a BDX network across businesses, regions and functions to leverage these tools. She is actively involved in cross-company initiatives such as the Technology Leadership Development Program (TLDP) Steering Committee, the "Build Pipeline" Diversity Action Team, the Product Lifecycle Management (PLM) initiative, BDU teaching for Manager Essentials, and more. She was the recipient of the BD Becton Quality Award in 2014 and the BD Howe Technology Award in 2017.

Anita is recognized externally for her in-depth expertise and leadership through invited keynote and panel presentations at industry conferences such as the Society of Women Engineers, Design of Medical Devices, and the Dassault Systemes Simulia Customer Conference. She also serves on the Medical Device Innovation Consortium Modeling and Simulation Steering Committee, the ASME V&V 40 Committee setting guidelines for computational modeling used in regulatory submissions, and the leadership of the BMES/FDA Frontiers in Medical Devices: Innovations in Modeling and Simulation conference. This participation is important to staying at the forefront of the industry in advancing translational and regulatory science.

Prior to BD, Anita worked in the aerospace industry at the TRW Inc. Space and Technology group in Redondo Beach, CA from 1984 to 1991. Anita received a B.S. in Civil Engineering from the University of Illinois, Urbana-Champaign in 1983 and an M.S. in Civil Engineering, Structural Mechanics from the University of California, Berkeley in 1984.

BD's Corporate CAE group in Research Triangle Park, NC has been utilizing simulation for over 30 years to impact the company. The three primary modeling areas include injection molding simulation, structural analysis and computational fluid dynamics. Historically, the use of modeling for BD's high-volume disposable medical devices started with injection molding simulation to ensure design for manufacturability, part quality and manufacturing robustness. The group then implemented structural analyses due to the need to include the effects of nonlinear materials, large deformations, fracture and damage, and contact between multiple components. Finally, the use of computational fluid dynamics has been essential to predicting fluid flow effectiveness through BD's products and the human body. Each of these modeling areas is leveraged on a day-to-day basis by the Corporate CAE group, and successful case studies and the approach to verification and validation will be shared.

BD has been at the forefront of promoting the acceptance and use of simulation in the medical industry. The Corporate CAE group has been an active participant and voting member on the Verification and Validation (V&V) 40 committee in Computational Modeling of Medical Devices. The goal of this committee is to coordinate, promote, and foster the development of standards that provide procedures for assessing and quantifying the accuracy and credibility of computational models and simulations by device manufacturers and regulatory agencies. BD has also been a strong partner in the Medical Device Innovation Consortium (MDIC) since its foundation in 2012. This first-ever public-private partnership was created with the sole objective of advancing medical device regulatory science for patient benefit.

Over the years, BD has effectively leveraged simulation for medical device development throughout the various stages of product development. Simulation is generally used to evaluate new design concepts, identify optimal solutions and realistically predict potential outcomes. This simulation-based approach enables BD to drive informed decisions in order to mitigate risk and minimize the number of physical design iterations required. Ultimately, the final selected designs are tested experimentally, and simulation results are not generally submitted as regulatory-grade evidence during the regulatory submission process.

In the future, BD's Corporate CAE group plans to grow how simulation-based evidence is used to impact the translational science process, regulatory submissions and ultimately patient outcomes. With BD's recently expanded product portfolio, the opportunities to tap into the benefits of modeling and simulation are quite significant. The group is excited about driving change in the industry and revolutionizing how modeling and simulation can play a role in advancing the world of health.
Plenary Sessions
Early work with Dr. Arthur Guyton characterized the interaction of renal
function, body fluids and hemodynamics in the genesis of chronic
hypertension. The lead publication became a Citation Classic in 1986.
Drs. Coleman and Guyton were among the first to analyze complex
physiological systems using mathematical models and computer
simulation. In his American Physiological Society Bowditch Lecture in
1975, Dr. Coleman described the role of these methods in the modern
scientific method.
(2) Karl Popper has argued that a scientific theory can never be proven to
be unquestionably true. But it can easily be proven to be unquestionably
false. If we consider a mathematical model to be only a theory, then this
argument can be applied to modeling. A mathematical model can never be
fully validated, but it can readily be invalidated.
(4) Is hearsay admissible data? Some very important insights don’t make it
all the way to refereed, published literature. But the formal literature is
often not physiologically relevant and can simply be wrong. So hearsay
data may not be attractive but it can be important.
(5) What about important physiological processes that are just not visible?
The model builder can use his or her imagination in building the invisible
component, but validation can involve no more than the visible (potential)
implications of the invisible.
Wednesday Technical Program

TRACK 2 DEVELOPMENT AND APPLICATION OF VERIFICATION AND VALIDATION STANDARDS

2-1 DEVELOPMENT AND APPLICATION OF VERIFICATION AND VALIDATION STANDARDS: SESSION 1
4TH FLOOR, GREAT LAKES A1 • 10:25AM – 12:30PM

Process Evaluation for Certification by Analysis
Oral Presentation. VVS2018-9379 • 10:25AM – 10:50AM
David Moorcroft, Federal Aviation Administration, Oklahoma City, OK, United States

The Federal Aviation Administration (FAA) has standards and regulations that are designed to protect aircraft occupants in the event of a crash. For many components, compliance is demonstrated via physical testing; however, the FAA permits numerical modeling results to also be used to show compliance. While these data fall under existing process controls for certification data, new challenges are arising that require additional guidance. In general, there are two approvals required: approval to manufacture and approval to install. The evaluation for approval to manufacture involves an evaluation of both the design and the materials, including material variability and factors of safety. For physical testing, there are evaluations of the laboratory that generates the data. This includes an evaluation of the technical capabilities (hardware/software), human capabilities, and the documented processes. For modeling data, similar evaluations are necessary to build confidence in the results. This process evaluation of the modeling capabilities is a corollary to the testing capability evaluation. The applicant should have a developed quality procedure that controls how models will be built, sources of data, verification and validation activities, quality control, acceptability requirements, and how the build process is documented. Of particular interest to regulators is the internal decision-making process; for example, what happens if the model doesn't meet an accuracy requirement? The question that the FAA is asking applicants is whether their analytical process results in a similar decision to their physical process. As with other regulatory agencies, the FAA is also considering how applicants can demonstrate their abilities, with site inspections, challenge problems, and pilot projects. Overall, the FAA wants to know that a company has a quality procedure and follows it, leading to the design of safe systems.

Simulation Governance across the Product Lifecycle
Oral Presentation. VVS2018-9407 • 10:50AM – 11:15AM
William Oberkampf, W L Oberkampf Consulting, Georgetown, TX, United States; Jean Francois Imbert, SIMconcept Consulting, Toulouse, France

Computer simulations are increasingly relied upon to inform business management, project managers, and product designers regarding the quality of engineering products produced by their organization. Here we take a broad view of product quality by including whether the product meets specifications for performance, reliability, safety, and producibility, as well as meeting delivery schedules and cost objectives. To achieve these quality objectives, an organization must integrate a wide range of disparate capabilities and resources across the enterprise, for example, computer-aided design, simulation software capability, simulation data management, technical staff competencies, computer resources, experimental characterization of materials, testing of subsystems and products, supplier and manufacturing capabilities, meeting regulatory requirements, and product delivery, support and maintenance capabilities. As simulation takes more responsibility for product quality, organizations must learn how to incorporate and adapt to the specialized needs and data integration needed for simulation. The concept of simulation governance as discussed in this presentation addresses how simulation must be integrated over a wide range of existing business operations so it can yield the benefits trumpeted by marketing phrases such as "virtual product development" and "digital twins". A key success factor is the development by top management of a strategic vision of the simulation role for future and legacy products. We will discuss the integration of diverse elements of simulation, for example, conceptual and mathematical model formulations, code and solution verification, model validation, uncertainty quantification, risk assessment, and the integration of experimental testing and model parameter calibration. Finally, simulation governance includes the estimation of model predictive uncertainty, including both aleatoric and epistemic uncertainties, and an assessment of the attributes of model predictive capability.

An Agile Verification and Validation Process for Generating Regulatory-Grade Evidence
Oral Presentation. VVS2018-9377 • 11:15AM – 11:40AM
Paulina Rodriguez, Seyed Ahmad Reza Dibaji, Matthew Myers and Tina Morrison, U.S. Food and Drug Administration, Silver Spring, MD, United States; Bruce Murray, SUNY at Binghamton, Binghamton, NY, United States

Computational modeling is a promising tool for advancing medicine and healthcare. One key aspect of ensuring its adoption by regulators is trust in the predictive capability of the modeling, as outlined in the new ASME V&V 40 standard. Credibility is established through verification and validation (V&V): demonstrating that the computational model is solved correctly and accurately and that it correctly represents the reality of interest. In many cases, demonstrating these aspects is one "obligatory" step before using the computational model the way the developer intended. V&V is typically performed as a linear, step-by-step process near project completion. Iterative V&V can be more efficient, but a framework is lacking. Therefore, we gathered methods from the software arena called "Agile" and established a truly iterative V&V process in accordance with the FDA's guidance on computational modeling. Our team developed and applied an agile management approach to the V&V process on a single-phase flow and heat transfer computational model of an electronic drug delivery system using commercial software; using agile will ensure a truly iterative model development and V&V. Our modified agile approach includes four key components: Adaptive Software Development (ASD) methods to manage the project through consecutive iteration cycles; Scrum to manage the team; a modified Phenomenological Identification and Ranking Table (PIRT) that monitors the flux in knowledge through the iteration cycles, guides model development and experimental design, and drives project decisions; and lastly, Trello, a web-based management tool which enables flexible organization and project tracking. We integrated critical elements of the ASME V&V 40 framework into the agile V&V process to enhance decision-making and develop credibility for the context of use (COU) of the model, especially as the COU evolves with the addition of credible evidence. While the agile V&V process required continuous planning, it reduced the overall time and resources spent. The time spent on model development, experimental design, troubleshooting, and documentation was reduced due to frequent interactive communication by the team. Additionally, unnecessary simulations and experiments were avoided by iterative decision-making driven by our modified PIRT. The integration of the new V&V 40 standard with agile methods has increased the quality of communication amongst the team, improved knowledge management with PIRT and Trello, supported smart decision-making about limited resources, and is leading us toward an iterative end-to-end regulatory-grade computational model.

Development of V2UP (Verification & Validation plus Uncertainty Quantification and Prediction) Procedure – Implementation of Quality Management Process for Modeling and Simulation
Oral Presentation. VVS2018-9405 • 11:40AM – 12:05PM
Masaaki Tanaka, Japan Atomic Energy Agency, O-Arai, Ibaraki, Japan

In order to enhance simulation credibility, implementation of verification and validation (V&V), including uncertainty quantification, is an indispensable process in the development of numerical simulation codes. A procedure named V2UP (Verification and Validation plus Uncertainty quantification and Prediction), which refers to existing guidelines on V&V (ASME V&V 10 and 20) and the methodologies of safety assessment (CSAU, ISTIR, EMDAP), has been developed at the Japan Atomic Energy Agency. In July 2016, the Guideline for Credibility Assessment of Nuclear Simulations (AESJ-SC-A008:2015) was published by the Atomic Energy Society of Japan (AESJ). The AESJ guideline describes a fundamental concept of V&V for modeling and simulation in four elements: (1) development of the conceptual model, (2) mathematical modeling, (3) physical modeling, and (4) assessing the predictability of the simulation model. In addition, the fundamental concept of the prediction process after the V&V, and the implementation of quality management (QM) based on ISO 9001, are prescribed. Previously, the methodology for uncertainty quantification in the V&V process of the V2UP was investigated. In this study, implementation of the QM process for the activities in the V&V of the V2UP was investigated. The standards of V&V for QM published by the Japan Society for Computational Engineering and Science (JSCES) and the guideline for a model procedure of QM in the safety assessment of nuclear power plants published by the Japan Nuclear Safety Institute (JANSI) were consulted. In order to maintain high quality when the V&V is implemented by outsourcing, a consecutive QM process consisting of the activities from the order to the delivery in the purchasing process is defined in these documents. In the V2UP, the QM process in the verification and that in the validation have to be separately defined because the verification and the validation are independently carried out. The definitions of the verification and the validation for the QM were reconsidered, and the verification of the simulation process and the validation of the simulation process for the QM were defined in addition to the code verification, the solution verification, and the validation of the simulation for the M&S. By using these definitions, the procedures for the QM in the verification and in the validation were successfully defined in the V2UP in accordance with the requirements in the guideline of the AESJ and the standards of the QM.

It is Too Complex to Validate!
Oral Presentation. VVS2018-9419 • 12:05PM – 12:30PM
Chris Rogers, Crea Consultants Ltd., Buxton SK17 6AY, United Kingdom; William Oberkampf, W L Oberkampf Consulting, Georgetown, TX, United States; Joshua Kaizer, U.S. Nuclear Regulatory Commission, Abingdon, MD, United States; Ryan Crane, ASME, New York, NY, United States

This is the response that has been given to many regulatory, safety and licensing assessors when asking for validation of simulation.

Validation is a process which has significant importance with respect to measuring how correct a simulation is, setting bounds of applicability, and predicting potential applicable variances. Validation is an end-user requirement, as it is impossible for the software developer to know how the user is going to use the software. That is, the software developer does not have the insight to prescribe when valid results are obtained for specific applications of the software.

The applicable international quality standard is ISO 9001:2015, Quality management systems – Requirements. For a simulation to be ISO 9001 compliant, Clause 8.3.4 is mandatory, namely:

8.3.4 Design and development controls

The organization shall apply controls to the design and development process to ensure that:

d) Validation activities are conducted to ensure that the resulting products and services meet the requirements for the specified application or intended use.

This presentation will provide the results of work carried out by the NAFEMS Analysis Management Working Group, supported by ASME V&V committee members, to demonstrate that all engineering simulation can be validated to the satisfaction of ISO 9001.

It recognizes that any engineering calculation, however simple it might be, is a simulation. The results of this work therefore apply to all physical engineering simulations, ranging from pencil-and-paper hand calculation through to multi-physics, multi-processor supercomputing. This work also covers extreme conditions such as space vehicles, where solid evidence of pre-flight conditions is not available.
TRACK 4 UNCERTAINTY QUANTIFICATION, SENSITIVITY ANALYSIS, AND PREDICTION

estimation (AKDE) are obtained by considering limited output test data. As a result, the reliability becomes uncertain and thus follows a probability distribution. This distribution provides quantitative bounds on the reliability, i.e., the assessed reliability as a function of confidence level. Once the epistemic uncertainty distribution of the reliability is obtained, the user can select a target confidence level. At the target confidence level, the confidence-based target output PDF and reliability can be obtained, which are confidence-based estimations of the true output PDF and reliability. Finally, the target output PDF (i.e., UQ) is used to measure the bias of the simulation and surrogate models, i.e., specific connections between individual inputs and outputs, of the two models.

Sensitivity Analysis of a Nuclear Reactor System Finite Element Model

Technical Publication. VVS2018-9306 11:15AM - 11:40AM

Gregory Banyay, Jason Young, Stephen Smith, Westinghouse Electric Company

The structures associated with the nuclear steam supply system (NSSS) of a pressurized water reactor (PWR) warrant evaluation of the various non-stationary loading conditions which could occur over the life of a nuclear power plant. These loading conditions include those associated with a loss-of-coolant accident and a seismic event. The dynamic structural system is represented by a finite element model containing significant epistemic and aleatory uncertainties in the physical parameters. To provide an enhanced understanding of the influence of these uncertainties on model results, a sensitivity analysis is performed. This work demonstrates the construction of a computational design of experiments which runs the finite element model a sufficient number of times to train and verify a unique aggregate surrogate model. Adaptive sampling is employed in order to reduce the overall computational burden. The surrogate model is then used to perform both global and local sensitivity analyses.

Bayesian Framework to Quantify Uncertainties in Piezoelectric Energy Harvesters

Technical Presentation. VVS2018-9318 11:40AM - 12:05PM

Patricio Peralta, Viviana Meruane and Rafael Ruiz, Universidad de Chile, Santiago, Chile

The dynamic description of piezoelectric energy harvesters (PEHs) has been widely studied in the last decade. Different deterministic modelling techniques and simplifications have been adopted to describe their electromechanical coupling effect in order to increase the accuracy of the output power estimation. Although it is common practice to use deterministic models to predict the input-output behavior of PEHs, perfect predictions are not expected, since these devices are not exempt from uncertainties. The accuracy of the output estimation is affected mainly by three factors: (1) the mathematical model used, (2) the uncertainties in the mathematical model parameters, and (3) the uncertainties related to the excitation. These uncertainties should be taken into account in order to generate robust and more plausible predictions. Nevertheless, only limited attention has been paid to the uncertainty quantification related to model parameters in piezoelectric energy harvesters. The interest of this work is to describe a framework that allows the use of the well-known dynamic estimators for piezoelectric harvesters (deterministic performance estimators) while taking into account the random error associated with the mathematical model and the uncertainties in the model parameters. The framework presented could be employed to perform Posterior Robust Stochastic Analysis, which is the case when the harvester can be tested or is already installed and experimental data are available. In particular, a procedure is introduced to update the electromechanical properties of PEHs based on Bayesian updating techniques. The means of the updated electromechanical properties are identified by adopting a maximum a posteriori estimate, while the associated probability density function is obtained by applying Laplace's asymptotic approximation (updated properties can be expressed as a mean value together with a confidence band). The procedure is exemplified using the experimental characterization of 20 PEHs, all of them with the same nominal characteristics. Results show the capability of the procedure to update not only the electromechanical properties of each PEH (mandatory information for the prediction of a particular PEH) but also the characteristics of the whole sample of harvesters (mandatory information for design purposes). The results reveal the importance of including the model parameter uncertainties in order to generate robust predictive tools in energy harvesting. In that sense, the present framework constitutes a powerful tool in the robust design and prediction of piezoelectric energy harvesters' performance.

Separability of Mesh Bias and Parametric Uncertainty for a Full System Thermal Analysis

Technical Presentation. VVS2018-9339 12:05PM - 12:30PM

Benjamin Schroeder, Humberto Silva III and Kyle D. Smith, Sandia National Laboratories, Albuquerque, NM, United States

When making computational simulation predictions of multi-physics engineering systems, sources of uncertainty in the prediction need to be acknowledged and included in the analysis within the current paradigm of striving for simulation credibility. A thermal analysis of an aerospace geometry was performed at Sandia National Laboratories. For this analysis a verification, validation and uncertainty quantification workflow provided structure for the analysis, resulting in the quantification of significant uncertainty sources, including spatial numerical error and material property parametric uncertainty. It was hypothesized that the parametric uncertainty and numerical errors were independent and separable for this application. This hypothesis was supported by performing uncertainty quantification simulations at multiple mesh resolutions, while being limited by resources to minimize the number of medium- and high-resolution simulations. Based on this supported hypothesis, a prediction including parametric uncertainty and a systematic mesh bias is used to make a margin assessment that avoids unnecessary uncertainty obscuring the results and optimizes computing resources.
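The margin assessment described in the separability abstract above can be sketched in a few lines: propagate parametric uncertainty on the affordable coarse mesh only, then shift the resulting distribution by a systematic mesh bias estimated from a handful of finer-mesh runs. The response function, bias value, sample count and design limit below are invented for illustration and are not from the talk.

```python
import random

# Illustrative stand-in for the expensive thermal simulation on the coarse
# mesh: peak temperature (K) as a function of an uncertain conductivity-like
# parameter k.  The response, numbers and limit below are all assumptions.
def coarse_model(k):
    return 600.0 + 35.0 / k

random.seed(0)

# Parametric uncertainty: propagate the uncertain input through the cheap
# coarse-mesh model only.
samples = [coarse_model(random.gauss(1.0, 0.08)) for _ in range(5000)]

# Systematic mesh bias: estimated once from a few medium/fine-mesh runs
# (e.g., by Richardson-style extrapolation); treated here as a known shift.
mesh_bias = 4.0  # assumed: coarse mesh under-predicts peak temperature by 4 K

# If bias and parametric uncertainty are separable, the fine-mesh prediction
# distribution is simply the coarse-mesh distribution shifted by the bias.
corrected = sorted(t + mesh_bias for t in samples)
p95 = corrected[int(0.95 * len(corrected))]

requirement = 650.0  # hypothetical design limit, K
margin = requirement - p95
print(f"95th percentile = {p95:.1f} K, margin = {margin:.1f} K")
```

Separability is what allows the 5000-sample loop to run only on the coarse mesh; without it, the parametric study would have to be repeated at every mesh resolution.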
TRACK 10 VERIFICATION AND VALIDATION OF NUCLEAR POWER APPLICATIONS

10-1 VERIFICATION AND VALIDATION OF NUCLEAR POWER
4TH FLOOR, GREAT LAKES A3 10:25AM - 12:30PM

Introducing V&V&C In Nuclear Thermal-Hydraulics

Technical Presentation. VVS2018-9321 10:25AM - 10:50AM

Francesco D'Auria, University of Pisa, Pisa, Italy, Marco Lanfredini, GRNSPG-UNIPI, San Piero a Grado (PI), Italy

V&V constitutes a powerful framework to demonstrate the capability of computational tools in several technological areas. Passing V&V requirements is a necessary step before applications. Let us focus hereafter on the area of (transient) nuclear thermal-hydraulics (NTH) and identify V1 and V2 as acronyms for Verification and Validation, respectively.

Now, V1 is performed within NTH according to the best available techniques and may not suffer from important deficiencies compared with other technological areas. This is not the case for V2. Three inherent limitations shall be mentioned in the case of validation in NTH:

1. Validation implies comparison with experimental data: available experimental data cover a (very) small fraction of the parameter range space expected in applications of the codes; this can easily be seen if one considers data for large-diameter pipes, high velocity and high pressure, or high power and power density. Notably, the scaling issue must be addressed in the framework of V2, which may result in controversial findings.

2. Water is at the center of attention: the physical properties of water are known to a reasonable extent, and large variations in values of quantities like density or various derivatives are expected within the range of variation of pressure inside application fields. Although not needed for current validation purposes (e.g., validation ranges may not include a situation of critical pressure and large heat flux), physically inconsistent values predicted by empirical correlations outside validation ranges shall not be tolerated.

3. Occurrence of complex situations like the transition from two-phase critical flow to 'Bernoulli flow' (e.g., towards the end of blow-down) and from film boiling to nucleate boiling, possibly crossing the minimum film boiling temperature (e.g., during reflood).

Therefore, whatever can be mentioned as classical V2 is not, or cannot be, performed in NTH. So, the idea of the present paper is to add a component to the V&V. This component, or step in the process, is called 'Consistency with Reality', or consistency with the expected phenomenological evidence. The new component may need to be characterized in some cases and is indicated by the letter 'C'. Then, the V&V becomes V&V&C. V&V&C aims at increasing the robustness and the capabilities of the concerned models, addressing topics like those mentioned above.

The purpose of the paper is to clarify the motivations at the basis of the V&V&C.

Estimation of Threshold Parameter and Its Uncertainty Using Multi-Variable Modeling Framework for Response Variable with Binary Experimental Outcomes

Oral Presentation. VVS2018-9353 10:50AM - 11:15AM

Leonid Gutkin, Douglas Scarth, Kinectrics Inc., Toronto, ON, Canada

A number of different approaches are used in computational modeling to estimate model parameters and their uncertainties. In some cases, direct statistical assessment of relevant experimental data obtained for the parameter of interest may be possible. In other cases, it may be necessary to estimate the model parameter and its uncertainty from the response variable of another model containing the parameter of interest and developed for this purpose. An example of the latter approach is discussed in this presentation, which outlines the recently developed framework for estimation of a threshold parameter and its uncertainty in probabilistic evaluations of crack initiation from in-service flaws in CANDU nuclear reactors.

Each one of several hundred fuel channels in the core of a CANDU reactor includes a Zr-2.5%Nb pressure tube containing nuclear fuel and pressurized heavy water coolant. During operation, the pressure tubes may become susceptible to delayed hydride cracking (DHC) due to the increasing content of hydrogen, in the form of deuterium, generated by the corrosion reaction of the Zr-based material with the heavy water. Therefore, in-service flaws in pressure tubes are evaluated for DHC initiation. The threshold stress for DHC initiation at the flaw tip depends on the flaw geometry and the material resistance to DHC initiation, and is predicted using models based on the process-zone approach. One of the material parameters required to apply the process-zone predictive models is the threshold stress for DHC initiation at planar surfaces.

In DHC initiation experiments, a surface flaw is required to produce local stress concentration and ensure predictable and reproducible precipitation of hydrides. Therefore, obtaining reliable experimental data for DHC initiation at planar surfaces is extremely challenging. This problem has been addressed by developing a multi-variable modeling framework based on the closed-form process-zone representation of the threshold stress for DHC initiation. The developed modeling framework predicts a higher probability of DHC initiation for more severe flaws and for lower material resistance to DHC initiation, and it can be applied to statistically assess the binary outcomes of DHC initiation experiments performed on specimens containing flaws of varying severity. Using this framework, the threshold stress for DHC initiation at planar surfaces can be derived as a distributed parameter for the probabilistic evaluations of crack initiation. The developed framework also allows for potential correlation between the threshold stress for DHC initiation at planar surfaces and the threshold stress intensity factor for DHC initiation from a crack.
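The binary-outcome estimation described above can be illustrated with a probit-style maximum-likelihood fit, in which a distributed threshold stress is inferred from pass/fail initiation outcomes. Everything below (the single-variable Gaussian threshold model, the stress levels, and the grid search) is an illustrative stand-in, not the closed-form process-zone framework of the talk.

```python
import math
import random

# Probit-style stand-in: assume each specimen has a latent threshold stress
# drawn from N(mu, sigma); DHC initiates when the applied stress exceeds it.
# Model form and all numbers here are invented for illustration.
def neg_log_likelihood(mu, sigma, data):
    """data: (applied_stress_MPa, initiated) pairs with binary outcomes."""
    nll = 0.0
    for s, initiated in data:
        # P(initiation at stress s) = Phi((s - mu) / sigma)
        p = 0.5 * (1.0 + math.erf((s - mu) / (sigma * math.sqrt(2.0))))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        nll -= math.log(p) if initiated else math.log(1.0 - p)
    return nll

def fit_threshold(data):
    """Coarse grid-search maximum-likelihood estimate of (mu, sigma)."""
    best = (float("inf"), 0.0, 0.0)
    for mu in range(300, 501, 2):      # candidate mean thresholds, MPa
        for sigma in range(5, 51):     # candidate standard deviations, MPa
            nll = neg_log_likelihood(float(mu), float(sigma), data)
            if nll < best[0]:
                best = (nll, float(mu), float(sigma))
    return best[1], best[2]

# Synthetic binary experiment: true threshold distribution N(400, 20) MPa,
# 20 specimens at each of nine applied stress levels.
random.seed(1)
levels = [340, 360, 380, 390, 400, 410, 420, 440, 460]
data = [(s, random.gauss(400.0, 20.0) < s) for s in levels for _ in range(20)]

mu_hat, sigma_hat = fit_threshold(data)
print(f"estimated threshold distribution: N({mu_hat:.0f}, {sigma_hat:.0f}) MPa")
```

In the talk's actual framework the initiation probability also depends on flaw severity and material resistance; the single-variable sketch only shows how binary outcomes can constrain a threshold that is treated as a distributed parameter.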
Validation of MARS-KS Code for the Analysis of Pressure Transition in Passive Safety Injection System Using Pressure Balancing Line

Oral Presentation. VVS2018-9367 11:15AM - 11:40AM

Yu-na Kim, Sunil Lee, Sung-Jae Yi and Sung Uk Ryu, Korea Atomic Energy Research Institute, Daejeon, Korea (Republic)

Recently, a great deal of research on inherent safety and passive safety systems of nuclear power plants has been conducted in the nuclear community. The H-SIT (Hybrid Safety Injection Tank), which is being developed for APR+ and iPOWER, has been enhanced to allow passive safety injection without significantly changing the design of the APR1400 by adding a PBL (Pressure Balancing Line). It is important to predict and verify the system pressure tendency through the PBL, since the performance of passive safety injection depends on the pressure balance between the primary side and the upper side of the H-SIT. Therefore, this study aims to confirm that the MARS-KS code, a validation code commonly used in the nuclear community, predicts the pressure balance in this system well with appropriate models. When the accident scenario is assumed to be a SBLOCA (Small Break Loss Of Coolant Accident), the code predicts that the pressure balance will occur through the following processes. As soon as the accident starts, hot water extracted from the cold leg is injected into the H-SIT through the PBL. In the beginning, a flashing phenomenon occurs briefly due to the sudden pressure difference, so the H-SIT is rapidly pressurized. However, after the pressure is almost equalized, only hot water, which is ineffective for pressurization, is supplied through the PBL, so that the upper pressure of the H-SIT cannot achieve pressure equilibrium with the primary side. In order to represent this complex pressure balance mechanism, all models should accurately reflect pressure drop, phase change, flashing phenomena and direct condensation. For validation of MARS-KS, we conducted the validation test and produced experimental data to compare with the analytic results. The experimental facility was established by adding the PBL design to the ATLAS (Advanced Thermal-hydraulic Test Loop for Accident Simulation) facility simulating the APR1400. The main parameters, such as local pressure, temperature and water level, were measured. As a result of the comparison, the MARS-KS code predicts the pressure tendency measured in the experiment, and it is expected that it can be used for H-SIT performance validation if the heat loss and break flow predictions are supplemented.

Residual radioactive contamination from decommissioning nuclear power plants is an important issue in radiological safety. Many nuclear power plants worldwide use the RESRAD computer code to evaluate the potential radiation dose incurred by an individual who works in a building contaminated with radioactive material during the decommissioning process. The purpose of this study is to use the RESRAD computer code to investigate the verification and validation of simulation models for residual radioactive contamination from decommissioning nuclear power plants. The results of this study can provide practical experimental techniques and simulation methodologies using the RESRAD computer code, in support of protecting human health and the environment with respect to radiological safety during the decommissioning process.

In this study, a characterization survey of an actual building contaminated with radioactive material is performed first. The characterization survey includes the dimensions of the building, the density of the structures, the distribution of the radioactive contamination, the radiation dose of the workers, and backgrounds. The result of the characterization survey in a room of the building shows that the east and south walls of the room are contaminated, while the west and north walls, the ceiling and the floor of the room are uncontaminated. This study aims to investigate the contaminated east and south walls through analysis of radionuclide source terms, analysis of radioactive source strength impact, analysis of core samples from the contaminated walls, and analysis of radionuclide activity and concentration.

The analysis of radioactive source strength impact shows a non-uniform distribution of radioactive contamination on the surfaces of the walls. The analysis of core samples from the contaminated walls shows a non-uniform distribution of radioactive contamination within the structures of the walls. Furthermore, we carry out decontamination work inside the building in order to decrease the radiation dose incurred by an individual who works in the building. We also design radiation shielding outside the building in order to avoid radiation leakage to the environment. In this study, we use the RESRAD computer code to design simulation models of an actual building with radioactive contamination, and then use the results of actual experiments to verify and validate the computer simulation models. We also compare the radiation doses of the computer simulation results with the radiation doses of the actual measurement results; the minimum relative error of the verification result is 10.10%.

Gregory Banyay, Westinghouse Electric Company, Cranberry Township, PA, United States

The effects of flow-induced vibration (FIV) are a major design consideration for the newer generations of commercial nuclear power plants that are currently coming on line, as well as for operating-plant life extension analyses and next-generation plant designs in development. FIV effects are particularly important relative to reactor vessel internals (RVI) and steam system component low-cycle and high-cycle fatigue design and analyses. The U.S. Nuclear Regulatory Commission (NRC) Regulatory Guide (RG) 1.20 outlines specific guidance to conduct a Comprehensive Vibration Assessment Program (CVAP), which is used to confirm the fatigue performance of the RVI and, with recent revisions to the RG, the fatigue performance of the steam system. The RG outlines four primary aspects of the CVAP: vibration and fatigue analyses, vibration measurement, component inspections, and the correlation of all results. While full RVI CVAPs have been completed in the past, along with a few smaller CVAPs for specific RVI components, the last full RVI design CVAP was completed over 20 years ago, in engineering and nuclear regulatory environments that were notably different from today's restrictive environments.

This presentation reviews these four primary aspects of a CVAP from both a practical engineering perspective and a regulatory perspective, in light of changes over the past two decades. While differences between historical and current CVAPs are discussed, the emphasis of the discussion is on the current engineering methods used for both the analytical/numerical and measurement portions of the work needed for a successful program. Of particular interest for discussion is the need to analyze and correlate numerical predictions and measurement data from a system perspective, as well as to consider interactions between major component subsystems.

Much of this review is based on the successful experience with the CVAP for the new Westinghouse AP1000® PWR RVI. However, the review is equally applicable generically to other reactor internals design-type CVAPs (e.g., BWR, SMR).

Fluid Dynamics and Heat Transfer (ASME V&V 20) to quantify the laser package validation uncertainty.

The default implementation of the laser package is unable to predict the experimental validation metrics (i.e., absorbed and scattered laser energies from the capsule surface, and ablation front position and velocity with time), and the validation uncertainty is dominated by experimental uncertainties. However, the validation comparison error is consistently larger than the validation uncertainty, indicating a systematic model form error (i.e., missing physics). We argue that cross-beam energy transfer (CBET), one of many un-modeled laser-plasma interactions from the laser system, is a significant source of the missing physics. High priority should be placed on updating the laser package to include CBET predictive capability.

The ICF community commonly employs nonphysical calibration methods to account for the effects of CBET by tuning the thermal flux limiter, tuning the incident laser energy, or both. Although validation is at odds with calibration, we assess both methods to inform the community of their strengths, weaknesses, and modeling subtleties. Both methods show increased model accuracy and support CBET as a source of missing physics. However, the incident laser energy tuning method more accurately overcomes CBET deficiencies.

Sensitivities to other model physics are also assessed. Preliminary results show insignificant sensitivity to most code setup parameters, including calculation setup, mesh resolution, material LTE opacities, ionization models, laser configuration, and material EoS.

The usefulness of credibility assessment frameworks

Oral Presentation. VVS2018-9347 2:20PM - 2:45PM
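The ASME V&V 20 comparison invoked in the laser-package abstract above reduces, at each validation point, to a comparison error E = S - D and a validation standard uncertainty u_val combining numerical, input and experimental contributions; |E| substantially exceeding u_val signals model-form error. A minimal sketch with invented numbers:

```python
import math

# Hypothetical validation point: simulated vs. measured absorbed laser energy.
S = 24.3   # simulation result (kJ), illustrative value
D = 26.1   # experimental measurement (kJ), illustrative value

# Standard uncertainties feeding u_val (all invented for this sketch):
u_num   = 0.3   # numerical (discretization/iterative) uncertainty
u_input = 0.4   # uncertainty propagated from model inputs
u_D     = 0.5   # experimental measurement uncertainty

E = S - D                                     # validation comparison error
u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)

# If |E| clearly exceeds u_val, the unexplained difference points at
# model-form error (here, the un-modeled CBET physics argued in the abstract).
model_form_suspected = abs(E) > 2.0 * u_val
print(f"E = {E:+.2f} kJ, u_val = {u_val:.2f} kJ, "
      f"suspect model form: {model_form_suspected}")
```

V&V 20 itself interprets E together with u_val as characterizing the model error; the factor of 2 used for the flag is a choice made for this sketch, not part of the standard.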
a surrogate model for the observed outputs in terms of the model inputs; both methods use data collected from a set of previously conducted experiments. The bias surrogate method implements a predictor-corrector approach where a surrogate model is built for the bias term, which is then used to correct the simulation model prediction at each time step. Alternatively, in the observation surrogate method, the bias term is combined with the simulation model, and a single surrogate model is built for the experimental output. A neural network-based surrogate modeling technique is employed to implement the proposed methodology. From the experiments, high-volume data are available for surrogate model training. The neural network trains on the entire dataset to produce either a static or a dynamic surrogate model, depending on whether training is performed in a batch manner or not. The proposed methodology is illustrated for an air cycle machine (ACM), a refrigeration unit commonly on board aircraft.

This work is funded by the Air Force Research Lab (AFRL).

Research on a Functional Validation Method for Multivariate Dynamic Systems from the Perspective of Function

Oral Presentation. VVS2018-9401 2:20PM - 2:45PM

Yudong Fang, Jun Lu, Junqi Yang and Zhenfei Zhan, Chongqing University, Chongqing, China

Computer modeling and simulations are playing an increasingly important role in complex engineering system applications, such as reducing vehicle prototype tests and shortening product development time. A growing number of computer models are developed to simulate vehicle crashworthiness; noise, vibration and harshness; and fuel efficiency. As the process of assessing the validity and predictive capabilities of computer models in their intended usage by comparing computer output with test data, model validation needs to be conducted before these computer models are applied to product development. In the virtual prototype environment, validation of computational models with multiple, correlated functional responses must address several difficult issues: the nonlinear correlation between different functional responses, decision-making under conflicting validation results for multivariate responses, and objective, robust metrics. In addition, the responses of a complex dynamic system are continuous in the time domain, and existing validation methods based on discrete comparisons may disregard the functional features of the data. Aiming to solve these problems, this paper proposes an integrated validation method for multivariate dynamic systems in a virtual prototype environment, based on Bayesian interval hypothesis testing theory, functional data analysis, functional kernel principal component analysis, and subject-matter-expert-based threshold definition and transformation. In the proposed method, all of the time history responses of the computer simulation model and the physical test are first represented as functions. These functional representations are transformed to a lower-dimensional feature space using functional kernel principal component analysis. Functional kernel principal component analysis handles the multivariate nonlinear correlation, and it also improves the efficiency of the subsequent model validation decision-making. The subject-matter-expert-based threshold definition and transformation is used to decide the threshold interval in the reduced data space. Three independent error measures associated with physically meaningful characteristics (phase, magnitude, and slope) are extracted from the dimension-reduced function, and Bayesian interval hypothesis testing is then performed on the reduced difference data to make an objective decision that accounts for conflicting validation results between the different principal components and to assess the model validity. The proposed method resolves some critical drawbacks of previous methods and adds desirable properties of a model validation metric for multivariable dynamic systems, such as symmetry and functionality. A real-world dynamic system with multiple functional responses is used to demonstrate this new approach and shows its potential in promoting the continual improvement of virtual prototype testing.

Assessment of Model Validation and Calibration Approaches in the Presence of Uncertainty

Christopher Roy, Nolan Whiting, Virginia Tech, Blacksburg, VA, United States

Model validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. The results of a model validation study can be used either to quantify the model form uncertainty or to improve/calibrate the model. However, the model validation process can become complicated if there is uncertainty in the simulation and/or experimental outcomes. These uncertainties can be in the form of aleatory uncertainties due to randomness or epistemic uncertainties due to lack of knowledge. We will use three different approaches for addressing model validation: 1) the area validation metric, 2) a modified area validation metric with confidence intervals, and 3) the standard validation uncertainty from ASME V&V 20. To provide an unambiguous assessment of these different approaches, synthetic experimental values were generated from computational fluid dynamics simulations of a multi-element aircraft wing. A simplified model was then developed using a combination of thin airfoil theory and lifting line theory. This simplified model was then assessed using the synthetic experimental data. The quantities examined include the lift and moment coefficients for the wing with varying angles of attack and flap deflection angles.

An Evaluation of Validation Metrics for Probabilistic Model Outputs

Technical Publication. VVS2018-9327 3:10PM - 3:35PM

Paul Gardner, Charles Lord, Robert J. Barthorpe, University of Sheffield, Sheffield, United Kingdom

Probabilistic modelling methods are increasingly being employed in engineering applications. These approaches make inferences about the distribution, or summary statistical moments, for output quantities. A challenge in applying probabilistic models is validating the output distributions. This is of particular importance when using uncertainty quantification (UQ) methods that incorporate a method for determining model discrepancy. This term defines the difference between a physical
model with true parameters and the observed physical process. Methods for learning model discrepancy are often formulated as non-parametric regression. This results from the assumption that the functional form of the model discrepancy is unknown a priori; otherwise it would be included in the physical model. The inferred model discrepancy distribution will affect the predictive output distribution and can be a good indicator of model form errors within the physical model. Additionally, approaches to UQ often assume a distribution for output predictions, e.g. that the output distribution is Gaussian. Hence, validation of the predictive distributions is important in determining the limitations of these assumptions, and whether an alternative method with different assumptions is required. For these reasons, an ideal validation metric is one that intuitively provides information on key divergences between the output and validation distributions. Furthermore, it should be interpretable across different problems in order to informatively select the ideal UQ method. A difficulty in validating probabilistic models is that the number of validation samples obtained must be sufficient to accurately understand the underlying distribution of the physical process. Consequently, validation metrics that require density estimation prior to calculation may be more susceptible to errors caused by limited validation data. In this paper, two families of metrics for quantifying differences between distributions are compared: f-divergences and integral probability metrics (IPMs). Traditionally, f-divergence metrics have been more widely used, most notably the Kullback-Leibler divergence, whilst IPMs have been mainly confined to kernel machine methods. In order to compare these families of metrics, a case study on a representative five-story building structure is presented. This case study applies Bayesian history matching, a UQ method with model discrepancy, to identify the distribution of natural frequencies under different pseudo-damage scenarios. The output from this approach is validated against specific metrics from each of the two families of metrics. Discussion and evaluation of these metrics are provided, with comments on ease of computation, interpretability and quantity of information provided.

TRACK 11 VERIFICATION FOR FLUID DYNAMICS AND HEAT

In the present work, six cases (flow over a flat plate for Reynolds numbers of 10^7, 10^8 and 10^9; flow around the NACA 0012 hydrofoil at a Reynolds number of 6×10^6 and angles of attack of 0°, 4° and 10°) are simulated using OpenFOAM. All numerical cases are statistically steady flows of an incompressible fluid, simulated on several sets of geometrically similar grids. For the flow over a plate, 10 grid sets, each with 13 levels of grid refinement, are simulated with three eddy-viscosity turbulence models based on the SIMPLE algorithm: the Spalart & Allmaras one-equation model, the shear-stress transport (SST) k-ω two-equation model, and the k-kl-ω three-equation model. For the flow around the NACA 0012 hydrofoil, 4 grid sets, each with 9 levels of grid refinement, are used with the Spalart & Allmaras one-equation model and the same algorithm.

For solution verification, the uncertainty estimators applied to evaluate the uncertainty of the results are the Grid Convergence Index (GCI) method, the Factor of Safety (FS) method and the Least Squares Root (LSR) method. This paper contains five sections. The first section is the introduction, and the mathematical formulation of the RANS equations and the uncertainty verification methods are presented in the second section. The third section covers the numerical model; the simulation models and detailed information on the grid groups are introduced there. The fourth section presents the results: the iterative errors and the uncertainty analysis results from the above three uncertainty estimators are shown in this section, and the estimates of the discretization errors and the discretization error bars are also presented. Finally, the conclusions and discussion are drawn in the last section.

ASSESSMENT OF DISCRETIZATION UNCERTAINTY ESTIMATORS BASED ON GRID REFINEMENT STUDIES

Oral Presentation. VVS2018-9340 1:55PM - 2:20PM

Luis Eca, IST, Lisbon, Portugal, Guilherme Vaz, MARIN, Wageningen, Netherlands, Martin Hoekstra, Maritime Research Institute Netherlands, Wageningen, Netherlands, Scott Doebling, Los Alamos National Lab, Los Alamos, NM, United States, V. Gregory Weirs, Sandia National
TRANSFER Laboratories, Albuquerque, NM, United States, Tyrone Phillips, University
of British Columbia, Blacksburg, VA, United States, Christopher Roy,
11-1 VERIFICATION FOR FLUID DYNAMICS AND HEAT TRANSFER Virginia Tech, Blacksburg, VA, United States
4TH FLOOR, GREAT LAKES A3 1:30PM - 3:35PM
Recently, a set of data from the calculation of the flow over a flat plate
Uncertainty Study Of Flow Over Plate And Around Hydrofoil with and around the NACA 0012 airfoil has been proposed for the assessment
OpenFOAM of the discretization uncertainty estimators based on grid refinement
studies [1].
Oral Presentation. VVS2018-9333 1:30PM - 1:55PM
In this paper, we assess the performance of 8 discretization uncertainty
Shanqin Jin, Memorial University of Newfoundland, St. John’s, estimates: the method proposed by Xing and Stern in [2]; a revised
NL, Canada version of the previous method [3]; the implementation of the GCI method
of Roache [4] of the Workshop organizers; the GCI as described in the
Computational Fluid Dynamics (CFD) has developed into an important
ASME V&V Standard of 2009 [5]; a generalization of the GCI method
engineering tool that is currently applied to make project decisions. The
proposed in [6] and its most recent version that incorporates weighted fits
popularity of OpenFOAM for various CFD applications is rapidly growing
taken from robust statistics; the method proposed in [7] and the Robust
in recent years. As consequence, reliability and credibility of numerical
Verification Analysis proposed in [8].
simulations with OpenFOAM is an unavoidable issue, a logical step forward
is then to pay due attention to the assessment of numerical uncertainties The performance of the different methods is evaluated determining the
27
(Verification). This paper presents solution verification exercise for the flow number of cases where the uncertainty cannot be estimated and the ratio
over a flat plate and around the NACA 0012 hydrofoil with OpenFOAM. R between the estimated uncertainty and the error E, which is based on
Technical Program Wednesday
an estimate of the exact solution presented in [1, 9]. code a set of 2D transient heat conduction problems approximating the
3D problem in the casing, which encloses the WP; 3) the cryogenic circuit
Keywords: Discretization uncertainty, Grid refinement. module provides self-consistent boundary conditions to the WP: it solves
transient non-linear Euler-like equations in the 1D components (pipes) and
References
mass and energy balance in the 0D components (manifolds, etc.), using
[1] http://web.tecnico.ulisboa.pt/ist12278/Discretization/Workshop_ OpenModelica. Although the 4C code has already been successfully
discretization_2017.htm validated against experimental data covering a wide range of transients
and time-scales, a systematic verification exercise has never been
[2] Xing T., Stern F. - Factors of Safety for Richardson Extrapolation ASME pursued so far.
Journal of Fluids Engineering, Volume 132, June 2010, pp. 061403:1-13.
In this work, we apply the method of the manufactured solutions to 4C, to
[3] Xing T., Stern F. - Closure to Discussion of Factors of Safety for achieve its first solution verification. We adopt a systematic procedure,
Richardson Extrapolation ASME Journal of Fluids Engineering, Volume 133, verifying first every single module and then the coupling between them. In
Dec 2011, pp. 115502:1-6. particular, we begin verifying the multi-conductor module starting from a
single conductor under isothermal conditions: we show that the computed
[4] Roache, P.J. - A Method for Uniform Reporting of Grid Refinement solution agrees with the analytical (manufactured) solution. Then, the
Studies - Proc. of Quantification of Uncertainty in Computation Fluid verification of the energy equation is performed, still for a single
dynamics, Edited by Celik, et al., June 1993, ASME Publ. No. FED-Vol. 158. conductor. The multi-conductor module verification is achieved
considering the inter-conductors thermal coupling. The method is applied
[5] ASME, Standard for Verification and Validation in Computational Fluid
separately to the stand-alone circuit module, and then to the conductor
Dynamics and Heat Transfer, 2009.
and circuit modules coupled together. Finally, after the verification of the
[6] Eça L., Hoekstra M., A Procedure for the Estimation of the Numerical structure module, the coupling between the WP and the structures is
Uncertainty of CFD Calculations Based on Grid Refinement Studies, verified, completing the 4C solution verification journey.
Journal of Computational Physics, Vol. 261, 2014, pp:104-130.
Beyond our interest in applying the method of manufactured solutions for
[7] Phillips T.S., Roy C.J. - A New Extrapolation-Based Uncertainty the 4C code verification, this work also aims at raising awareness in
Estimator for Computational Fluid Dynamics - ASME Journal of Verification, developers and analysts in our field about the feasibility of and need for
Validation and Uncertainty Quantification, Volume 1, Number 4, 2017. such verification exercise, being it an indispensable step for confirming
the reliability of any code.
[8] Rider, W. J., Witkowski, W., Kamm, J. R. and Wildey,T. - Robust
Verification Analysis - Journal of Computational Physics, Volume 307,
February 2016, pp. 146-163.
Verification for Hypersonic, Reacting Turbulent Flow
[9] Eça L., Vaz G. and Hoekstra M. - RANS Benchmark Solutions for the
Oral Presentation. VVS2018-9406 2:45PM - 3:10PM
Assessment of Discretization Error Estimators - submitted to the ASME
Journal of Verification, Validation and Uncertainty Quantification, 2017. Brian Carnes, Brian Freno, Tom Smith, V. Gregory Weirs Sandia National
Laboratories, Albuquerque, NM, United States, Marco Arienti, Sandia
Labs, Livermore, U S Minor Island, Erin Mussoni, Sandia Labs, Livermore,
4C code solution verification CA, United States
Oral Presentation. VVS2018-9390 2:20PM - 2:45PM A new simulation code for hypersonic, reacting turbulent flow (SPARC) is
being developed at Sandia National Laboratories. This talk discusses
Roberto Zanino, Roberto Bonifetto, Laura Savoldi, Dipartimento Energia, recent work to verify the correctness of the implementation of physics and
Politecnico Di Torino, Torino, Italy, Andrea Zappatore, Politecnico di Torino, algorithms in SPARC in the context of significant validation efforts. We
Torino, Torino, Italy discuss several code verification activities designed to support the
validation work. These include tests with exact solutions for inviscid flow
The worldwide research on nuclear fusion reactors is currently relying on features such as shocks and expansion regions, and tests using
several experimental devices and computational tools that should help in manufactured solutions to verify design order accuracy. Of particular
the design and realization of the first demonstrator power plant, like, e.g., interest are verification of boundary conditions, reacting flow, and laminar
the European DEMO. The 4C code is the state-of-the-art tool for the and turbulent boundary layers. For turbulent flow, some verification
thermal-hydraulic simulation of tokamak superconducting magnet systems problems are based on benchmark problems from the NASA Turbulence
based on Low Critical Temperature Cable-in-Conduit Conductors (CICC) Modeling Resource website. We also provide examples of solution
cooled by forced-flow supercritical helium. It was developed at Politecnico verification applied to simulations of validation experiments with
di Torino over the last ~10 years and it includes three different modules: 1) hypersonic laminar flow over a double cone. Here careful consideration
the multi-conductor module solves the 1D transient non-linear Euler-like is given to iterative convergence, mesh design for 2D axisymmetric flow
set of partial differential equations in each cooling channel of the winding in a 3D code, and extrapolation of surface quantities such as heat flux
28
pack (WP), coupled with the 1D transient heat conduction equation for the and pressure.
solids; 2) the structure module solves with the open-source Freefem++
Wednesday Technical Program
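The grid-refinement estimators compared in the abstracts above (GCI, factor-of-safety, and related methods) all build on the same Richardson-extrapolation bookkeeping. As a minimal sketch only, with made-up numbers and not any of the cited implementations, the observed order of accuracy, the extrapolated solution, and the fine-grid GCI for three systematically refined grids can be computed as:

```python
import math

def richardson_gci(f1, f2, f3, r, fs=1.25):
    """Observed order p, extrapolated value, and fine-grid GCI for solutions
    f1 (fine), f2 (medium), f3 (coarse) on grids with constant refinement
    ratio r. Assumes monotone convergence in the asymptotic range."""
    p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)   # observed order
    f_exact = f1 + (f1 - f2) / (r ** p - 1.0)                 # Richardson extrapolation
    gci_fine = fs * abs((f1 - f2) / f1) / (r ** p - 1.0)      # relative uncertainty
    return p, f_exact, gci_fine

# Illustrative numbers only (not data from either paper): a drag-like
# quantity computed on three grids refined by a factor of 2
p, f_ext, u = richardson_gci(0.9713, 0.9704, 0.9668, 2.0)
```

With these sample values the observed order comes out close to 2, as expected for a second-order discretization; the safety factor fs = 1.25 is the value commonly used when three or more grids are available.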
Contact Discontinuities in xRage

Oral Presentation. VVS2018-9410 3:10PM - 3:35PM

Robert Singleton, Los Alamos National Laboratory, Los Alamos, NM, United States

Banks, Aslam, and Rider (2008) have shown that contact discontinuities have a convergence rate of p/(p+1) for a p-th order accurate code. The Los Alamos code xRage is 2nd-order accurate, and we therefore expect to see a convergence rate of 2/3 for the contact. However, in simulations of the Sod problem, one must run at extremely fine resolutions before this theoretical convergence rate is observed. This resolution is much smaller than one would typically use in a numerical simulation of an engineering or physics problem, which leads to the conclusion that one almost always runs far outside the range of convergence in most simulations of interest. This also seems to hold true for shocks and rarefactions, both of which have a theoretical convergence rate of 1. However, by choosing a small window about the contact over which to perform the convergence analysis, rather than the entire domain of the problem, the density variable reaches the theoretical limit of 2/3 almost immediately, at quite coarse resolutions. This windowing method, therefore, might be useful for observing the theoretical rates for discontinuities in other problems as well, although there is a catch. While the density appears to converge as expected, the velocity and pressure show anomalous convergence behavior at all resolutions. Application of the windowing method will be shown for density, velocity, and pressure, with hypotheses for the anomalous convergence behavior tested.

TRACK 1 CHALLENGE PROBLEM WORKSHOPS AND PANEL SESSIONS

1-4 CODE VERIFICATION FOR APPLICANTS IN A REGULATED FIELD
4TH FLOOR, GREAT LAKES A1 4:00PM - 6:05PM

Code Verification for Applicants in a Regulated Field [Panel]

Oral Presentation. VVS2018-9395 4:00PM - 6:05PM

Marc Horner, ANSYS, Inc., Evanston, IL, United States, David Moorcroft, Federal Aviation Administration, Oklahoma City, OK, United States

2. What needs to be done if using commercial-off-the-shelf software? How much is enough? Who is responsible (developer vs. applicant)? Does it need to be evaluated on the applicant's system?

3. What documentation should be provided to a regulatory body for acceptance? How much documentation?

The panel will include members of regulatory agencies providing research opinions and published agency policy, code developers discussing the challenges they face, members of research organizations providing guidance based on their experience, and commercial organizations providing their experience and challenges with code verification.

Panel Participants:
Prasanna Hariharan, US Food and Drug Administration
Josh Kaizer, US Nuclear Regulatory Commission
David Moorcroft, US Federal Aviation Administration
Marc Horner, Ansys
Kristian Debus, Siemens PLM Software
Ashley Peterson, Medtronic
John Dong, The Boeing Company

TRACK 3 TOPICS IN VERIFICATION AND VALIDATION

3-1 TOPICS IN VERIFICATION AND VALIDATION
4TH FLOOR, GREAT LAKES A2 4:00PM - 6:05PM

Models, Uncertainty, and the Sandia V&V Challenge Problem

Technical Publication. VVS2018-9308 4:00PM - 4:25PM

George Hazelrigg, Independent Author, Vienna, VA, United States, Georgia-Ann Klutke, National Science Foundation, Alexandria, VA, United States

In this paper, we argue that the Sandia V&V Challenge Problem is ill-posed in that the answers sought do not, mathematically, exist. This effectively discredits both the methodologies applied to the problem and the results, regardless of the approach taken. We apply our arguments to show the types of mistakes present in the papers presented in the Journal of VVUQ along with the Challenge Problem. Further, we show that, when the problem is properly posed, both the applicable methodology and the solution techniques are easily drawn from the well-developed mathematics of probability and decision theory. The unfortunate aspect of the Challenge Problem as currently stated is that it leads to incorrect and inappropriate mathematical approaches that should be avoided and corrected in the current literature.

Shantanu Shahane, Soham Mujumdar, Namjung Kim, Pikee Priya, Narayana Aluru, Shiv G. Kapoor, Placid Ferreira, Surya Vanka, University of Illinois at Urbana-Champaign, Urbana, IL, United States

Die casting is a type of metal casting in which liquid metal is solidified in a reusable die. The automotive and housing industries are the main consumers of die-cast products. Simulations and experiments are used to understand the physics and improve product quality in manufacturing. Computer simulations are convenient and financially viable compared to full-scale experiments. Alloy material properties, interface conditions at the mold, thermal boundary conditions, etc. affect product quality in die casting; hence all of these are used as inputs to the simulations. However, in such a complex process, measuring and controlling these process parameters is difficult. Conventional deterministic simulations are insufficient to completely estimate the effect of stochastic variation in the process parameters on product quality. In this research, a framework to simulate the effect of stochastic variation, together with verification, validation, sensitivity analysis, and uncertainty quantification, is proposed. This framework includes high-speed numerical simulations and micro-structure and mechanical-property prediction models, along with experimental inputs for calibration and validation.

Three-dimensional finite-volume-based software to solve the Navier-Stokes and energy equations is developed. Fluid flow, natural convection, heat transfer, and solidification physics are modeled. Since complex geometries can be meshed with a smaller number of unstructured elements and less stair-casing error compared to a Cartesian orthogonal structured mesh, an unstructured mesh is used in this software. An algebraic multigrid method is used to accelerate convergence. Parallelization is done on multiple CPU cores using MPI. The temperature gradients and cooling rates thus estimated are used as input to empirical models of micro-structure parameters, including dendrite arm spacing and grain size. Experimental data help to calibrate these empirical models for a given alloy. Published numerical and experimental results are used to verify and validate, respectively, the entire framework. Practical engineering problems have hundreds of process parameters, and performing simulations or experiments to quantify the effect of stochastic variation of each process parameter on the product quality is difficult. Hence sensitivity analysis is performed to identify the critical process parameters which affect product quality significantly, so that they can be controlled within desired tolerances.

This framework combines experimental data and stochastic variation in process parameters with numerical modeling, and thus enhances the utility of traditional numerical simulations used in die casting for better prediction of product quality. Although the framework is being developed and applied to die casting, it can easily be generalized to any manufacturing process or other engineering problems.

reinhardtii strain cc125 were cultured in triplicate with different culture media via indirect biophotolysis. Experimental biomass and hydrogen concentrations were then used to adjust the specific microalgae growth and hydrogen production coefficients based on the residual sum of squares and the direct search method.

Thermal Response of Open-Cell Porous Materials: A Numerical Study and Model Assessment

Technical Publication. VVS2018-9317 5:15PM - 5:40PM

Kevin Irick, Nima Fathi, The University of New Mexico, Albuquerque, NM, United States

The evaluation of effective material properties in heterogeneous materials (e.g., composites or multicomponent structures) has direct relevance to a vast number of applications, including nuclear fuel assembly, electronic packaging, municipal solid waste, and others. The work described in this paper is devoted to the numerical verification assessment of the thermal behavior of porous materials obtained from thermal modeling and simulation. Two-dimensional, steady-state analyses were conducted on unit-cell nano-porous media models using the finite element method (FEM). The effective thermal conductivity of the structures was simulated, encompassing a range of porosity. The geometries of the models were generated based on ordered cylindrical pores in four different porosities. The dimensionless effective thermal conductivity was compared in all simulated cases. In this investigation, the method of manufactured solutions (MMS) is used to perform code verification, and the grid convergence index (GCI) is employed to estimate discretization uncertainty (solution verification). The system response quantity (SRQ) under investigation is the dimensionless effective thermal conductivity across the unit cell. Code verification indicates a roughly second-order accurate solver. It was found that the introduction of porosity to the material reduces effective thermal conductivity, as anticipated. The approach can be readily generalized to study a wide variety of porous solids, from nano-structured materials to geological structures.
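The windowed convergence measurement described in the xRage abstract above can be sketched in a few lines. The snippet below is illustrative only: the "numerical" profile is a synthetic smeared step whose width is chosen to reproduce the Banks-Aslam-Rider 2/3 rate for a second-order code, not output from xRage.

```python
import numpy as np

def windowed_l1_error(x, u_num, u_exact, window):
    """L1 error of a discrete profile, restricted to a window (xlo, xhi)."""
    h = x[1] - x[0]
    mask = (x >= window[0]) & (x <= window[1])
    return h * np.sum(np.abs(u_num - u_exact)[mask])

def observed_rate(err_coarse, err_fine, refinement=2.0):
    """Observed convergence rate from errors at two resolutions."""
    return np.log(err_coarse / err_fine) / np.log(refinement)

def smeared_step(n):
    """Synthetic stand-in for a computed contact: a step at x = 0.5 smeared
    linearly over a width ~ h**(2/3), mimicking the theoretical rate."""
    x = np.linspace(0.0, 1.0, n)
    w = (1.0 / (n - 1)) ** (2.0 / 3.0)
    return x, np.clip((x - 0.5) / w + 0.5, 0.0, 1.0)

exact = lambda x: (x > 0.5).astype(float)
x1, u1 = smeared_step(201)
x2, u2 = smeared_step(401)
# Measure the error only in a small window about the contact at x = 0.5
e1 = windowed_l1_error(x1, u1, exact(x1), window=(0.4, 0.6))
e2 = windowed_l1_error(x2, u2, exact(x2), window=(0.4, 0.6))
rate = observed_rate(e1, e2)  # close to the theoretical 2/3
```

The same two helper functions apply unchanged when `u_num` comes from an actual hydrocode; only the window bounds (centered on the tracked contact position) change per problem.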
TRACK 8 VERIFICATION & VALIDATION FOR IMPACT, BLAST, AND MATERIAL RESPONSE

8-1 VERIFICATION & VALIDATION FOR IMPACT, BLAST, AND MATERIAL RESPONSE
4TH FLOOR, GREAT LAKES A3 4:00PM - 6:05PM

Laboratory system testing uncertainty of a twelve accelerometer channel ("12a") impact monitoring mouthguard to quantify head impacts for athletes and soldiers

Oral Presentation. VVS2018-9330 4:00PM - 4:25PM

A non-trivial processing algorithm, mouthguard customization, knowledge of accelerometer positions and orientations, and linearization of individual accelerometer outputs were required to achieve this level of certainty.

The 12a impact monitor is currently optimized for computation of skull kinematics during the primary acceleration-deceleration phase of impact in helmeted sports, where the nominal frequency content is 20-80 Hz. Higher measurement uncertainty, especially in unpadded testing, was due to high-frequency signals above 100 Hz.

The 12a Impact Monitoring Mouthguard scalar outputs are suitable for low-uncertainty measurement and computational modeling of single head impact scalar PLA and PAA in American football and bare-head activities, so long as human on-field impact conditions are within laboratory calibration ranges.

12a impact monitoring mouthguard, verifying these doses fall within the laboratory calibration ranges, understanding the athlete's individual dose-response profile, and creating data analysis firmware that more accurately computes skull kinematics from high-frequency (100 Hz+) bare-head impacts. The ultimate goal of our work is to correlate trustworthy spatial and temporal estimates of single and cumulative head impact doses with computational models of tissue-level brain damage mechanisms.

Simulation and Assessment of a Novel Sphere-on-Glass Ballistic Impact Experiment

A finite element model of the experimental tests was created in a commercial explicit finite element code (LS-DYNA). Three formulations, including Lagrangian with erosion, Smoothed Particle Hydrodynamics (SPH), and Element-Free Galerkin (EFG), were investigated with the widely used Johnson-Holmquist 2 constitutive model. The discretization error was quantified using a mesh convergence study at three finite element mesh sizes: 1.00 mm, 0.50 mm, and 0.25 mm. The residual projectile velocity was used as the assessment parameter. A Richardson extrapolation was used to estimate the converged solution. It is accepted that discontinuities such as element erosion and shock waves may invalidate the underlying assumptions used in the Richardson extrapolation. To improve interpretation of the results, intermediate mesh sizes were also assessed to confirm the asymptotic behavior of the final solution with respect to mesh size.

contained the measured residual velocity. Both the SPH and EFG models had tighter confidence intervals (219±26 m/s and 444±15 m/s, respectively), but did not contain the measured residual velocity. The models predicted regions of comminution consistent with test results, but did not accurately predict the radial fracture patterns seen in the experiments, which is a known limitation of the constitutive model. The Sphere-on-Glass tile tests proved to be a novel validation dataset, incorporating real-time visualization and tracking of damage propagation, and provided key quantitative data which can be used to improve future ballistic models.

V&V of Under-body Blast Analysis Methodology for Army Ground Vehicles

Oral Presentation. VVS2018-9371 4:50PM - 5:15PM

Andrew Drysdale, Douglas Howle, US Army Research Laboratory, Aberdeen Proving Ground, MD, United States

The US Army Research Laboratory (ARL) has developed, through a multi-year program of dedicated funding, an analysis process for consideration of ground vehicle survivability against buried bare-charge threats. The key, and most innovative, products of ARL's under-body blast methodology (UBM) process are occupant injury predictions along the most relevant injury modes. Additionally, UBM is designed to be multi-stage and modular so that it can be customized to the application as much as possible. Modeling challenges include the complex geometries and failure modes of relevant target vehicles; the violent, impulsive loading of the threat environment; non-linear energy-absorbing components along key load paths; difficulty in characterization of occupant response to known loading; and the high degree of sensitivity of outputs to small changes in input conditions that are difficult to measure. Because of these sources of uncertainty, and in response to regulatory mandates from the Army, a comprehensive V&V of the process is essential. Both verification and validation of the UBM process encountered stubborn challenges. Reliance on commercial finite-element modeling software limited the scope of verification in most stages of the analysis process. The inherent unrepeatability and expense of the phenomena under investigation made the acquisition of robust data sets for validation difficult. The pioneering nature of the model's application meant that a consensus regarding appropriate evaluation metrics and pass/fail thresholds was elusive. The eventual path to accreditation for use in Army live-fire test and evaluation (LFT&E) required creative solutions to each of these issues. This presentation gives an overview of Army LFT&E, how UBM fits into that program, how the UBM process is designed to work, and an example of nominal outputs. It delves into specific challenges faced in the V&V process and how they were addressed in order to satisfy accreditation requirements. Emphasis is placed on its application to evaluation of the Joint Light Tactical Vehicle, the context for which this initial V&V was conducted.

Importance of Finite Element Mesh Resolution and Response Metrics to Model Blunt Thoracic Impact

Oral Presentation. VVS2018-9392 5:15PM - 5:40PM

Jeffrey B. Barker, Duane Cronin, University of Waterloo, Waterloo, ON, Canada

Human body models (HBM) have become increasingly important in the assessment and development of human safety systems. A critical aspect of detailed finite element HBM is verification and validation of the models over a representative range of impact conditions; however, mesh sensitivity studies are often not undertaken due to pragmatic limitations such as computational time. In this study, the effect of the muscle tissue mesh density on blunt thoracic impact kinetics and kinematics was investigated using a previously developed detailed finite element thorax model.

The detailed thorax model (325,065 elements) comprised the outer muscle tissue, sternum, rib cage, costal cartilage, lungs, heart, mediastinum, and spine. Two loading regimes with corresponding PMHS experimental data were applied to the thorax model. The first load case was a frontal pendulum impact (23.4 kg, 150 mm diameter) with impact velocities of 4.3 m/s, 6.7 m/s, and 10.0 m/s. In the second case, three baton impacts were investigated: a 140 g, 37 mm diameter baton at 20 m/s and 40 m/s, and a 30 g, 37 mm diameter baton at 60 m/s. The impacts were centered on the sternum at the height of the 8th thoracic vertebra. The original finite element mesh was reduced in size twice by splitting the elements, resulting in three finite element mesh densities for the muscle tissue (Coarse: 26,142 elements; Intermediate: 209,108 elements; Fine: 1,672,836 elements). For each case, the model force and displacement responses were sampled at the same frequency as in the corresponding experimental cases.

The pendulum impacts showed similar responses for the plateau force with varying mesh density, but did have a higher initial peak force (20%) for the coarse mesh relative to the fine mesh. The model response was within the experimental response corridors. Changes in response were more pronounced for the smaller-diameter baton impacts. The peak force decreased with smaller element size; while the responses for the 140 g impact at 20 m/s and 40 m/s were within the published response corridors, the responses for the 30 g (60 m/s) impact were outside the response corridors. This may indicate a need for improved high-deformation-rate material properties and further mesh refinement. Refining the muscle tissue element size reduced the response stiffness for all impact cases, with larger differences identified for smaller impact areas and lower-mass projectiles. This was attributed to improved mass distribution with a refined mesh, and an improved prediction of local strain rate in the muscle tissue material. Thus, the required finite element size for a given problem depends on the impact scenario and must be evaluated to ensure meaningful numerical results.
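The impact studies above confirm asymptotic behavior by checking that the observed order of accuracy stays roughly constant across successive mesh triplets. A small sketch of that check, with hypothetical numbers rather than either paper's data:

```python
import math

def observed_orders(h, q):
    """Observed order of accuracy over each successive mesh triplet.
    h: element sizes sorted coarse-to-fine; q: the response quantity on each
    mesh. Assumes a constant refinement ratio within every triplet."""
    orders = []
    for i in range(len(h) - 2):
        r = h[i] / h[i + 1]
        assert abs(h[i + 1] / h[i + 2] - r) < 1e-9, "triplet needs a constant ratio"
        orders.append(math.log((q[i] - q[i + 1]) / (q[i + 1] - q[i + 2])) / math.log(r))
    return orders

# Hypothetical residual velocities on meshes halved three times
h = [1.00, 0.50, 0.25, 0.125]      # element size, mm
v = [450.0, 430.0, 420.0, 415.0]   # residual velocity, m/s
print(observed_orders(h, v))       # -> [1.0, 1.0]: roughly constant order
```

A roughly constant observed order across triplets suggests the solutions are in the asymptotic range, so a Richardson extrapolation of the finest results is defensible; a drifting or oscillating order is the warning sign the abstracts describe for eroding, discontinuous problems.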
Technical Program
Thursday, May 17, 2018
Thursday Technical Program
TRACK 1 CHALLENGE PROBLEM WORKSHOPS AND PANEL the Sierra/SolidMechanics code at Sandia National Laboratories. These
SESSIONS models have undergone extensive verification testing that synthesizes an
understanding of the models, continuum mechanics, and the finite
element formulation in the code to rigorously verify their implementation.
Rate independent and rate dependent models, with assorted hardening
1-5 ASME 2018 V&V VERIFICATION, VALIDATION, AND
laws and both isotropic and anisotropic yield surface descriptions, have
UNCERTAINTY QUANTIFICATION: THE UNANSWERED
QUESTIONS been implemented and verified through a series of uniaxial tension and
4TH FLOOR, GREAT LAKES A1 10:25AM - 12:30PM pure shear boundary value problems. The verification of the plasticity
models is presented in detail, but more importantly techniques and
Paul Deacon, Siemens PLM Software, Aaron Koskelo, Los Alamos approaches are described that we believe can be extended to other
National Lab, Sankaran Mahadevan, Vanderbilt University, Chris Roy, classes of constitutive models.
Virginia Tech University, Daniel Segalman, Michigan State University
Error Quantification for Large-Deformation Solid Mechanics with A Case Study in Mesh Verification
Adaptive Remeshing
Oral Presentation. VVS2018-9409 11:40AM - 12:05PM
Oral Presentation. VVS2018-9411 11:15AM - 11:40AM
Adam Johnson, Honeywell FM&T, Overland Park, KS, United States
Andrew Stershic, Lauren Beghini, Guy Bergel, Alex Hanson, Kevin
Manktelow, Sandia National Laboratories, Livermore, CA, United States At the Kansas City National Security Campus (KCNSC), simulation analysts
are often called upon to provide quick-turn solutions to manufacturing
Several manufacturing processes relevant to the aerospace industry, such as forging and welding, consist of high-temperature loading of stock material and result in large local deformation. When such processes are modeled computationally using the finite element method, high mesh distortion reduces the model's accuracy and robustness. Adaptive remeshing capabilities present an attractive solution to the problem of high mesh distortion without significantly increasing the model's computational cost.

In this work, we verify the simulation process that uses adaptive remeshing to model thermo-mechanical forging and welding in SIERRA, a scalable multi-physics finite element code produced by Sandia National Laboratories.

References:

[1] O. C. Zienkiewicz and J. Z. Zhu, "A simple error estimator and adaptive procedure for practical engineering analysis," International Journal for Numerical Methods in Engineering, vol. 24, no. 2, pp. 337-357, 1987.

…roadblocks. This environment presents challenges in the execution of model verification, such as mesh verification studies. A tool was developed to automate the process of performing Richardson's extrapolation to provide the analyst with a discretization error estimate. This tool was designed to balance the underlying pitfalls of Richardson's extrapolation and deliver feedback when certain requirements are not met. This presentation provides an example of how automation has been used to promote model verification in a manufacturing environment while highlighting the inherent challenges involved.

The Kansas City National Security Campus is operated by Honeywell Federal Manufacturing & Technologies, LLC for the United States Department of Energy under Contract No. DE-NA0002839.

Thursday Technical Program

In this presentation, we will summarize some code verification studies that address both classical and manufactured solutions for solid mechanics. Problems with hyperelastic material models are often amenable to manufacturing a solution, as the strong form of the boundary-initial-value problem is governed by differential equations. Starting simple, we consider quadratic displacement fields, which can be thought of as one step past a "patch test" for lower-order elements. Another approach has been to manufacture a solution starting with the displacement field
from a classical solution, and then applying the governing nonlinear differential equations to "manufacture the corresponding problem." The classical stress concentration problem for the stress around a circular hole is examined, for which convergence results using both the classical and a manufactured solution are compared.

When constitutive models include history dependence (e.g., hypoelastic and elastoplastic models), the governing strong form of the boundary-initial-value problem includes integrodifferential equations, which significantly complicate the manufacturing of solutions. When the problem also includes contact, the corresponding kinematic constraints can render a meaningful manufactured solution practically intractable. We consider a simplification to the manufacturing process (semi-manufactured) to make a hypoelastic problem (without contact) more tractable. For a problem with both contact and plasticity, we take a more pragmatic approach, applying solution verification to examine the convergence tendency of the code. The strength of the test is its closer proximity to the actual application space, but the verification cannot claim convergence to the correct solution, only to "a solution."

TRACK 13 VERIFICATION AND VALIDATION FOR BIOMEDICAL ENGINEERING

13-2 ASME V&V 40 SUBCOMMITTEE — VERIFICATION AND VALIDATION IN COMPUTATIONAL MODELING OF MEDICAL DEVICES
4TH FLOOR, GREAT LAKES A3 10:25AM - 12:30PM

ASME V&V 40 Subcommittee — Verification and Validation in Computational Modeling of Medical Devices

Oral Presentation. VVS2018-9426

Tina Morrison, Food and Drug Administration, Silver Spring, MD, United States, Marc Horner, ANSYS, Inc., Evanston, IL, United States, Ryan Crane, ASME, New York, NY, United States

Presentations will be 12-15 minutes, followed by a group Q&A. Speakers are presented in alphabetical order by last name.

Payman Afshari, DePuy Synthes: Application of a computational model to determine the MR Conditional radiofrequency labeling parameters for anterior cervical plates implanted in patients being scanned in 1.5T and 3T MRI machines; DePuy Synthes Spine received 510(k) clearance based on the submitted evidence from the computational model.

Christopher Basciano, BD Technologies and Innovations: An approach that BD Corporate Computer-Aided Engineering (CAE) has leveraged to tackle the challenge of establishing confidence in computational modeling and simulation results is to conduct and present verification studies to various teams within the company. The presentation will discuss verification case studies in computational fluid dynamics as well as some common terminology used to communicate the significance of the verification work to non-technical audiences.

Jeff Bischoff, Zimmer Biomet: Application of the V&V 40 philosophy to evaluate the primary stability of a shoulder prosthesis.

Jeff Bodner, Medtronic Restorative Therapies Group: Simulation and rigorous VVUQ in a regulatory submission that addressed a CAPA related to an implantable drug delivery pump.

Prasanna Hariharan, FDA: Application of computational modeling for a tumor ablation procedure. A computational heat transfer model was used to show that the temperature rise during the ablation will not cause damage to sensitive structures around the ablation site. The risk and credibility assessment process demonstrated by V&V 40 showed that the model was not credible enough to answer the question of interest. However, because of the credibility assessment, additional bench tests were conducted to satisfactorily address the question of interest.

Ali Kiapour, 4WEB Medical: Application of a validated FEA model of a lateral truss interbody spine fixation device for worst-case identification and design evaluation/assessment. The computational analysis aimed to show that the lateral cage is not a new worst case compared to a previously FDA-cleared truss device, thus not requiring additional mechanical testing for 510(k) submission. Computational modeling was used as an efficient tool to justify the mechanical performance of the lateral device in the 510(k) submission, eliminating the need for experimental testing; 4WEB Medical saved time and money and successfully launched the product into the U.S. market.

Tina Zhao & Paul Schmidt, Edwards Lifesciences: Three examples of V&V of structural FEA of cardiovascular medical devices, covering low, medium, and high modeling risk categories.

Principled Use of Expert Judgment for Uncertainty Estimation

Oral Presentation. VVS2018-9384 1:30PM - 1:55PM

William Rider, Sandia National Laboratories, Albuquerque, NM, United States

To avoid the sort of implicit assumption of ZERO uncertainty, one can use (expert) judgment to fill in the information gap. This can be accomplished in a distinctly principled fashion and always works better with a basis in
evidence. The key is the recognition that we base our uncertainty on a model (a model that is associated with error too). The models are fairly standard and need a certain minimum amount of information to be solvable, and we are always better off with too much information, making the problem effectively over-determined. Here we look at several forms of models that lead to uncertainty estimation, including discretization error, and statistical models applicable to epistemic or experimental uncertainty.

We recently had a method published that discusses how to include expert judgment in the determination of numerical error and uncertainty using models of this type. The model can be solved along with data using minimization techniques, with the expert judgment included as constraints on the solution for the unknowns. For both the over- and the under-determined cases, different minimizations can yield multiple solutions to the model, and robust statistical techniques may be used to find the "best" answers. This means that one needs to resort to more than simple curve fitting and least squares procedures; one needs to solve a nonlinear problem associated with minimizing the fitting error (i.e., residuals) with respect to other error representations.

A lot of this information is good to include as part of the analysis when you have enough information, too. The right way to think about this information is as constraints on the solution. If the constraints are active, they have been triggered by the analysis and help determine the solution. In this way the solution can be shown to be consistent with the views of the experts. A key to this entire discussion is the need to resist the default uncertainty of ZERO as a principle. It would be best if real, problem-specific work were conducted to estimate uncertainties: the right calculations, the right meshes, and the right experiments. If one doesn't have the time, money, or willingness, the answer is to call upon experts to fill in the gap using justifiable assumptions and information while taking an appropriate penalty for the lack of effort. This would go a long way toward improving the state of practice in computational science, modeling, and simulation.

Surrogate Modeling of Blade Mode Shape Spatial Variation Due to Geometric Uncertainties

Oral Presentation. VVS2018-9396 1:55PM - 2:20PM

Joseph Beck, Perceptive Engineering Analytics, LLC, Minneapolis, MN, United States, Jeff Brown, US Air Force Research Laboratory, Wright Patterson AFB, OH, United States, Alex Kaszynski, Universal Technology Corporation, Dayton, OH, United States

Mistuning due to manufacturing variations and uneven in-service wear of Integrally Bladed Rotors (IBRs) can result in rogue blade failures in the field. Predicting the response of mistuned IBRs requires a probabilistic approach due to the randomness of mistuning. To do this, reduced-order models (ROMs) are desired that are capable of running in a fraction of the time required by full finite element models. ROMs often employ substructuring approaches that divide the IBR into components. Solving each component can still require significant computational resources for geometric mistuning, since the mode shapes and natural frequencies are needed for each component. Therefore, creating surrogate models of component mode shapes and frequencies is desirable in ROM development. This effort focuses on the development of a surrogate model for predicting blade mode shapes for use in an IBR ROM. Surrogate modeling of mode shapes is particularly challenging due to the high dimensionality of the output data. Each blade component can still contain thousands of degrees of freedom (DOFs). Furthermore, the inputs to the surrogate are point cloud data of as-manufactured blades that also contain thousands of DOFs. Latent variable models are an attractive choice, out of the hope that a few latent variables not directly observed in the original data are capable of an input/output mapping. One such approach is Canonical Correlation Analysis (CCA). This method seeks to identify the linear relationship between N samples of two co-occurring multidimensional random variables. However, obtaining reliable CCA estimates requires N to be 40-60 times larger than the dimensionality of the mode shape and point cloud data. Since this is infeasible, a Bayesian CCA approach is utilized that augments the small sample size with prior distributions in the model. This approach creates a shared latent variable space between the mode shape and point cloud data and two latent variable spaces specific to each data set. In addition to providing accurate mode shape predictions, the Bayesian formulation allows quantification of the uncertainty in the prediction at each location in the mode shape. Results are presented for two industrial IBR geometries.

Uncertainty Quantification and Digital Engineering Applications in Turbine Engine System Design and Life Cycle Management

Oral Presentation. VVS2018-9400 2:20PM - 2:45PM

Kevin OFlaherty, Mark Andrews, SmartUQ, Madison, WI, United States

Essentially every government and private engineering group involved in the US Aerospace and Defense Industry has some form of ongoing digital engineering activity. The vision for full implementation of these digital engineering efforts is to connect research, development, production, operations, and sustainment to improve the efficiency, effectiveness, and affordability of aerospace systems over the entire life cycle. This presentation discusses the basic capabilities required of a model-based digital engineering approach to successfully achieve this vision:

-An end-to-end system model – the ability to transfer knowledge upstream and downstream and from program to program

-Application of reduced-order response surfaces and probabilistic analyses to quantify uncertainty and risks in cost and performance at critical decision points

-A single, authoritative digital representation of the system over the life cycle – the authoritative digital surrogate "truth source"

This presentation illustrates both conceptual and practical applications of using Uncertainty Quantification (UQ) techniques to perform probabilistic analyses. The application of UQ techniques to the output of engineering analyses using model-based approaches is essential to providing critical decision-quality information at key decision points in an aerospace system's life cycle. Approaches will be presented for the continued collection and application of UQ knowledge over each stage of a generalized life cycle framework covering system design, manufacture, and sustainment. The use of this approach allows engineers to quantify and reduce uncertainties systematically and provides decision makers with probabilistic assessments of performance, risk, and cost that are essential to critical decisions. As an illustration, a series of probabilistic analyses performed as part of the initial design of a turbine blade will be used to demonstrate the utility of UQ in identifying program risks and improving design quality. The application of UQ concepts to life cycle management will also be addressed, highlighting the benefits to decision makers of having actionable engineering information throughout a system's life cycle.

Sensitivity of Global Equation of State Validity to Calibrated Parameter Sets in the Simulation of Energetic Materials

Oral Presentation. VVS2018-9356 2:45PM - 3:10PM

Michael Crochet, Air Force Research Laboratory/University of Dayton Research Institute, Valparaiso, FL, United States

Engineering hydrodynamic computational codes are commonly used to simulate the behavior of energetic materials, where chemical reaction is initiated by a mechanical stimulus. The underlying continuum models require an algebraic relation among the material state properties, known as an equation of state (EOS). The Mie-Gruneisen EOS is used extensively in energetics modeling to characterize both the reactant and product materials. According to this description, the material mass-specific energy varies linearly with pressure for a fixed specific volume. These quantities are referenced to pressure and specific energy functions that correspond to isentropic expansion curves following a detonation event. The various EOS models employed in energetics modeling (e.g., Jones-Wilkins-Lee, Davis wide-ranging, etc.) therefore differ in the functional forms of the expansion isentropes and the manner in which these are calibrated to experiments.

In addition to the parameterization of the reference curves using experimental data, it is crucial that the square of the material sound speed satisfy c^2 > 0 throughout the relevant domain of phase space for both reactants and products. Otherwise, the calculation of complex sound speeds prevents the propagation of acoustic waves and causes computational codes to fail. However, this requirement is not explicitly considered during the calibration process, which instead focuses on ensuring consistency among the respective pressures and densities of the reactants and products under highly compressive, or overdriven, loading conditions. As a result, the domain over which the square of the sound speed is positive is sensitive to the parametrization of both the reactant and product EOS and to the uncertainties in these values.

In this work we present an analysis of the domains of validity for the explosive PBX-9502, showing results for two sets of calibrated parameters for the Davis wide-ranging EOS. Here, we define a strong condition for the positivity of c^2 by requiring the pressure and density to be non-negative, ensuring physically meaningful values. The results of the analysis indicate the existence of substantial invalid regions at high densities, consistent with overdriven detonation regimes. Such a result is somewhat expected, since the functional forms of the expansion isentropes are extrapolated from experimental data in this domain. However, the size of the invalid region changes significantly with the set of calibrated parameters used. The size of this region can have important numerical stability implications if the path of the post-detonation expansion wave travels through invalid regions of phase space. This work suggests that the positivity of c^2 should be included as an additional constraint in EOS calibration to minimize the risk of encountering invalid phase-space regimes.

DISTRIBUTION A. Approved for public release: distribution unlimited (96TW-2018-0022)

A Bayesian semi-parametric method for inferring physics models and associated uncertainty

Oral Presentation. VVS2018-9378 3:10PM - 3:35PM

Stephen A. Andrews, Andrew M. Fraser, Los Alamos National Laboratory, Los Alamos, NM, United States

We present a method for inferring functions to represent physics models which best predict the results of multiple independent experiments, as well as their associated uncertainty. As an example, we consider the equation of state for the products of detonation of high explosives. The relationship between pressure and specific volume along an isentrope is represented in a semi-parametric manner, where the function is the product of a vector of coefficients and a large set of basis functions. We develop a Bayesian algorithm which determines the best set of coefficients given data from multiple experiments. Our Bayesian analysis also allows us to determine both the total uncertainty in the function given all experimental data and the regime in which the data from each experiment most tightly constrain the function.

TRACK 5 VALIDATION FOR FLUID DYNAMICS AND HEAT TRANSFER

5-1 VALIDATION FOR FLUID DYNAMICS AND HEAT TRANSFER
4TH FLOOR, GREAT LAKES A2 1:30PM - 3:35PM

Implementation of Multiphase Particle-in-Cell (MP-PIC) Methodology in MFiX

Oral Presentation. VVS2018-9336 1:30PM - 1:55PM

MaryAnn Clarke, Avinash Vaidheeswaran, West Virginia University Research Corporation, Morgantown, WV, United States, Jordan Musser, National Energy Technology Laboratory, Morgantown, WV, United States, William Rogers, NETL, Morgantown, WV, United States

There is considerable need for efficiency and accuracy when modeling industrial-scale multiphase flows. Eulerian-only methodologies, where the solids phase is represented as a secondary continuum, are known to be computationally efficient. However, using a continuum approximation to describe discrete entities can be problematic, leaving solution verification of Eulerian-only codes far from fully addressed. On the other hand, the discrete element method (DEM), based on an Eulerian-Lagrangian mathematical approach, is known for solution
accuracy. However, DEM becomes computationally burdened at high particle counts. Consequently, for dense particle configurations, especially in the limit of close packing, neither of these approaches is considered ideal. A more desirable alternative is the multiphase Particle-in-Cell (MP-PIC) methodology, where particles are grouped into computational parcels. Intra-parcel interactions are managed with constitutive relations instead of detailed Newtonian mechanics, thereby accelerating computation. Some solution fidelity is lost, but for this concession, considerable computational speed is gained. In this work, the implementation of MP-PIC in MFiX is discussed along with its validation to illustrate computational and predictive performance.

Quantitative Assessment of Pulsatile Flow through a Sudden Contraction Using Computational Fluid Dynamics and Particle Image Velocimetry

Oral Presentation. VVS2018-9391 1:55PM - 2:20PM

Stephen Gent, Aaron Propst, South Dakota State University, Brookings, SD, United States, Tyler Remund, Patrick Kelly, Sanford Health, Sioux Falls, SD, United States

The objective of this study was to assess and compare the velocity profiles of an incompressible fluid traveling through a sudden contracting pipe in transient, pulsatile flow. A closed-loop benchtop experiment was constructed from clear, noncompliant acrylic with a proximal diameter of 46 mm and a distal diameter of 20 mm. A variable-flow-rate pump supplied a transient waveform to provide pulsatile flow through the test region. Particle Image Velocimetry (PIV) was used to measure the velocity profiles immediately proximal and distal to the contraction. A comparable CFD model was constructed with the same geometric features as the test section. The fluid properties in the CFD model were assigned to be the same as those of the test fluid, and the inlet boundary conditions were set to replicate the flow rate of the pump. A sensitivity analysis was performed to determine the effect of each variable on the velocity, including the uncertainties in flow rate, proximal and distal diameters, and fluid density and viscosity, to name a few. The velocity profiles of the experiment and the CFD models were compared at multiple locations proximal and distal to the contraction, for several time steps in the pulsatile flow. The results indicate agreement within five percent between the CFD and experimental results. The intention of this study is to validate the results of the CFD model against experimental data to demonstrate that the modeling approach employed is suitable for internal pulsatile flow. The overarching goal is to show the validity of the CFD modeling approach for simulating pulsatile flow in both native vessels and implantable cardiovascular devices.

Sensitivity Study of Foaming Behavior of Simulant Tank Waste during Vitrification Tests in a Laboratory-Scale Melter

Oral Presentation. VVS2018-9417 2:20PM - 2:45PM

Donna Guillen, Alexander Abboud, Idaho National Laboratory, Idaho Falls, ID, United States, Richard Pokorny, UCT Prague, Prague, Czech Republic

Laboratory experiments were performed in conjunction with computational fluid dynamics (CFD) simulations to provide insight into the fluid dynamic and heat transfer processes that affect the melt rate of waste slurry during vitrification. Large, Joule-heated melters will be used to vitrify radioactive tank waste generated over nearly five decades of nuclear weapons production at the Hanford site into a stable borosilicate glass waste form for disposal. Tank waste is mixed with silica and other glass-forming and glass-modifying additives to form a slurry that is fed to a melter operating at 1150°C. When the slurry comprised of tank waste (consisting of 40 to 60% water) and glass formers is poured into the melter from above, a cold cap (sometimes referred to as the batch blanket layer) forms, covering ~90-95% of the melt surface. A sensitivity study was performed using a small-scale, Inconel-lined melter to provide insight into the foaming behavior of the slurry during vitrification. Air is injected into the melter using a bubbler inserted from the top down to the base of the melter. Due to the high viscosity of the molten waste glass simulant, large bubbles form that rise and interact with the foam layer at the base of the cold cap. For simplicity, the cold cap is approximated as a rigid solid with a slip boundary condition, and a volumetric gas source below the cold cap is used to simulate foaming. Using the DAKOTA toolkit, Latin hypercube sampling was performed to assess the sensitivity of the foam layer to glass viscosity and thermal conductivity, bubbling flow rate, and foaming rate. Quantities of interest are the horizontal and vertical velocity, the temperature gradient beneath the cold cap, and the thickness of the foam layer. The computational results are validated by comparison with X-ray tomography images and theoretical calculations based upon boundary layer theory.

A Validation Study for a Hypersonic Flow Model

Oral Presentation. VVS2018-9414 2:45PM - 3:10PM

Brian Carnes, Derek Dinzl, Micah Howard, Sarah Kieweg, William Rider, Tom Smith, V. Gregory Weirs, Sandia National Laboratories, Albuquerque, NM, United States, Jaideep Ray, Sandia National Laboratories, Livermore, CA, United States

A new simulation code for hypersonic, reacting turbulent flow (SPARC) is being developed at Sandia National Laboratories. This presentation focuses on validation efforts for hypersonic reacting laminar flows over a double-cone configuration. This is work in progress and will be presented partly as a case study to highlight how plans are adjusted and objectives rescoped as new information and results are obtained.

Initial verification and validation plans were developed. The verification effort has seen only minor adjustments and is described in a companion talk. However, the evolution of the validation plan has been driven by the available experimental data: free-stream conditions as a simulation boundary condition, and surface pressure and heat transfer measurements as quantities of interest (QoIs). Validation evidence, through uncertainty characterization and propagation, sensitivity analysis, and validation metrics, is presented via the QoIs.

A naïve attempt to reproduce the measurements via simulation failed, and the freestream boundary conditions that modeled the experiment were found to be open to question. The assumption that the solution data should agree with the experimental data on the forward cone of a
double-cone configuration, upstream of any separated flow, provided the motivation to infer the freestream conditions from surface measurements via an inverse problem. A sensitivity analysis was undertaken to identify which of the freestream quantities could be estimated from the experimental measurements. Given the sparsity and uncertainty in the experimental data, a Bayesian inverse problem was formulated, and a three-dimensional joint probability density function (PDF) was computed for the freestream density, velocity, and temperature. Computations involved constructing a polynomial chaos expansion emulator for SPARC and performing Markov chain Monte Carlo sampling to realize the joint PDF. Posterior predictive tests showed a significant narrowing of prediction uncertainty post-calibration; further, the experimental measurements were fully bracketed by the posterior predictive ensemble.

Observations on how the context shapes the validation process and the conclusions that can be drawn from it will be presented.

TRACK 12 VERIFICATION METHODS

12-1 VERIFICATION METHODS
4TH FLOOR, GREAT LAKES A3 1:30PM - 3:35PM

Convergence Checks and Error Estimates for Finite Element Stresses at Stress Concentrations: Effects of Different Mesh Refinement Factors

References:

1. Sinclair, G. B., Beisheim, J. R., and Roache, P. J., "Effective Convergence Checks for Verifying Finite Element Stresses at Two-Dimensional Stress Concentrations," ASME J. Verification, Validation and Uncertainty Quantification, Vol. 1, pp. 041003-1-8 (2016).

2. ASME, Guide for Verification and Validation in Computational Solid Mechanics, American Society of Mechanical Engineers, New York, Standard No. ASME V&V 10 (2006).

Aaron Krueger, Yassin Hassan, Texas A&M University, College Station, TX, United States, Vincent Mousseau, Sandia National Laboratories, Albuquerque, NM, United States

The use of solution verification methods has rapidly increased within the past two decades. While these methods, such as GCI and Least Squares GCI, have matured and proven to be useful, they rely on solutions that are within the asymptotic range. The asymptotic range is defined as the range of discretization sizes that force the leading truncation error term to be dominant. While this definition might be perceived as straightforward, the practical implementation of determining the asymptotic range is not. This study shows some of the key variables that impact a simulation's ability to be within the asymptotic range. The first part of the study calculates the higher-order terms individually using modified equation analysis. This calculation is important when determining whether the leading truncated terms dominate the higher-order terms. The second part of the study assesses the ability of different solution verification methods to estimate the discretization error outside the asymptotic range, around the start of the asymptotic range, and inside the asymptotic range.

EOSlib: Software Reference Implementation for Equations of State

EOSlib is a software library written at Los Alamos National Laboratory to serve as a reference implementation of various analytic equations of state (EOSs). Core utilities include querying basic thermodynamic quantities, such as pressure, temperature, and entropy, and the library also provides tools for computing loci at constant entropy and temperature. EOSlib includes a user-extensible database of parameters for many different thermodynamic models, as well as tools for combining these into mixtures. EOSlib can also be used for calculations on nonequilibrium systems containing internal degrees of freedom, such as a reacting mixture governed by a rate law. EOSlib has a long track record of use for thermodynamic calculations involving high explosives. The library is implemented in C++ with both Python and command-line interfaces. We will present the core features of EOSlib and demonstrate examples of how it can be used as part of a verification workflow.

Finite Element Method Solution Uncertainty, Asymptotic Solution, and a New Approach to Accuracy Assessment (*)

Technical Publication. VVS2018-9320 3:10PM - 3:35PM
…and then developing numerical algorithms and easy-to-use metrics to assess the solution accuracy of all candidate solutions. In this paper, we present a new approach to FEM verification by applying three mathematical methods and formulating three metrics for solution accuracy assessment. The three methods are: (1) a 4-parameter logistic function to find an asymptotic solution of FEM simulations; (2) the nonlinear least squares method in combination with the logistic function to find an estimate of the 95% confidence bounds of the asymptotic solution; and (3) the definition of the Jacobian of a single finite element in order to compute the Jacobians of all elements in a FEM mesh. Using those three methods, we develop numerical tools to estimate (a) the uncertainty of a FEM solution at one billion d.o.f., (b) the gain in the rate of PRE per d.o.f. as the asymptotic solution approaches very large d.o.f.'s, and (c) the estimated mean of the Jacobian distribution (mJ) of a given mesh design. Those three quantities are shown to be useful metrics for assessing the accuracy of candidate solutions in order to arrive at a so-called "best" estimate with uncertainty quantification. Our results include calibration of those three metrics using problems of known analytical solutions and the application of the metrics to sample problems for which no theoretical solution is known to exist.

(*) Contribution of the National Institute of Standards & Technology. Not subject to copyright.

TRACK 1 CHALLENGE PROBLEM WORKSHOPS AND PANEL SESSIONS

1-2 V&V BENCHMARK PROBLEM – TWIN JET COMPUTATIONAL FLUID DYNAMICS (CFD) NUMERIC MODEL VALIDATION
4TH FLOOR, GREAT LAKES A1 4:00PM - 6:05PM

Ubiquitous application of CFD motivates the need to verify and validate CFD models and quantify uncertainty in the results. The objective of this paper is to present results from a verification and validation study of the ASME benchmark turbulent twin-jet problem using the commercial CFD code ANSYS Fluent. The twin jet is modeled in two and three dimensions, and results are compared with experimental data. Depending upon the solver set-up, the 2-D model converges to a non-physical solution, a result not found in the 3-D simulation. However, choosing the SIMPLEC algorithm in the 2-D model led to a realistic solution and was …

Computational Studies of Turbulent Flow Interaction Between Twin Rectangular Jets with OpenFOAM

Oral Presentation. VVS2018-9351

Han Li, Yassin Hassan, N.K. Anand, Texas A&M University, College Station, TX, United States

Systems of two or more parallel jets are an important flow structure that can accomplish rapid mixing, and the mixing features of parallel jets are found in many engineering applications. For example, in the Very-High-Temperature Reactor (VHTR), the coolant streams merge in the upper or lower plenum after passing through the reactor core; in the Sodium Fast Reactor (SFR), the mixing of jets of different temperatures can cause thermal stresses and flow-induced vibration in the rod bundle. Computational Fluid Dynamics (CFD) simulations are extensively employed to study the parallel-jet mixing phenomenon. Therefore, validation of the various turbulence models is important to ensure that the numerical results can be trusted and serve as a guide for future design. In past validation and verification studies, steady-state Reynolds-Averaged Navier-Stokes (RANS) simulations were performed to investigate boundary condition sensitivities with the realizable k-epsilon model. The results showed the importance of boundary conditions: not only the velocity profile but also turbulent quantities such as the k and epsilon profiles can affect the merging point. In this study, the open-source CFD library OpenFOAM was utilized to perform numerical simulations with a Partially-Averaged Navier-Stokes (PANS) model and a Large Eddy Simulation (LES) model.

PANS models are considered hybrid models. In the present study, with time-varying boundary conditions mapped from PIV measurements, a k-epsilon PANS model was used to perform transient simulations and was compared to an Unsteady Reynolds-Averaged Navier-Stokes (URANS) model. The PANS results showed good agreement in terms of the merging point (4.3%). Power spectral density (PSD) analysis was performed on the velocity at four sample locations to compare the resolved frequencies between the PANS and URANS models. It was observed that the PANS model presented better capability in resolving higher-frequency turbulent flow content than URANS, based on the PSD analysis.

The LES technique was also used, on a mesh with 32 million cells and with a fluctuating boundary condition. The LES simulation showed good agreement with PIV in the merging point. Proper Orthogonal Decomposition (POD) analysis was applied on a sample plane; the POD analysis visualized the interaction between the two jets and the multi-scale
subsequently applied in both the 2-D and 3-D models. A grid refinement
vortical structures. By comparing with the POD analysis from PIV
study was performed to estimate the numerical uncertainty via Richardson
experiment, the POD from LES showed similar structures and spectral
Extrapolation and the Grid Convergence Index method. The model
frequencies.
sensitivity to input parameters was addressed by ranking the importance
factors associated with the inputs to the k-epsilon turbulence model,
nozzle geometry, and mass-flow-rate at the inlet. In the near jet region,
results of this study suggest the turbulent length scale implemented
by the turbulence model will influence the re-circulation region and
merge point of the two jet flows, associated with this complex turbulent
flow regime.
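The grid-refinement procedure named in the Fluent twin-jet study above (Richardson extrapolation with the Grid Convergence Index) can be sketched in a few lines. The three grid values, the refinement ratio, and the safety factor below are illustrative assumptions, not results from that study.

```python
import math

# Hypothetical values of a monitored quantity (e.g., a merge-point metric)
# from three systematically refined grids, fine -> coarse, with a constant
# refinement ratio r. All numbers here are illustrative placeholders.
f_fine, f_medium, f_coarse = 2.031, 2.044, 2.085
r = 2.0    # assumed grid refinement ratio
Fs = 1.25  # GCI safety factor commonly used for three-grid studies

# Observed order of convergence from the three solutions
p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Richardson-extrapolated estimate of the grid-independent value
f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Grid Convergence Index on the fine grid (relative numerical uncertainty)
gci_fine = Fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)

print(f"observed order p = {p:.3f}")
print(f"extrapolated value = {f_exact:.4f}")
print(f"GCI_fine = {100 * gci_fine:.2f}%")
```

With monotonic convergence, the observed order p should approach the scheme's formal order as the grids are refined; a large gap between the two is itself a verification warning sign.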
Technical Program Thursday
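The snapshot POD analysis used in the twin-jet LES/PIV comparison above can be sketched with NumPy's singular value decomposition. The snapshot matrix here is synthetic (two prescribed modes plus noise), standing in for LES or PIV velocity data.

```python
import numpy as np

# Snapshot POD sketch: rows = spatial points, columns = time snapshots.
# The field is synthetic, standing in for measured or simulated velocity.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)
t = np.linspace(0.0, 10.0, 80)
snapshots = (np.outer(np.sin(x), np.cos(2.0 * np.pi * t))
             + 0.3 * np.outer(np.sin(2.0 * x), np.sin(4.0 * np.pi * t))
             + 0.01 * rng.standard_normal((x.size, t.size)))

# Subtract the temporal mean so the modes describe fluctuations only
fluct = snapshots - snapshots.mean(axis=1, keepdims=True)

# Thin SVD: columns of U are spatial POD modes; S**2 ranks their energy
U, S, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = S**2 / np.sum(S**2)

print("energy fraction captured by first two modes:", energy[:2].sum())
```

Because the synthetic field is built from two coherent structures, almost all of the fluctuation energy concentrates in the first two POD modes; with real jet data the energy spectrum decays more gradually.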
TRACK 9 VALIDATION METHODS FOR SOLID MECHANICS AND STRUCTURES

9-1 VALIDATION METHODS FOR SOLID MECHANICS AND STRUCTURES
4TH FLOOR, GREAT LAKES A2 4:00PM - 6:05PM

Modeling and Validating Residual Stresses in Thick-Walled Cylinders
Oral Presentation. VVS2018-9309 4:00PM - 4:25PM
Zhong Hu, South Dakota State University, Brookings, SD, United States

An ever-increasing industrial demand for pressurized thick-walled cylindrical components drives research and practice to increase their strength-to-weight ratio, extend their fatigue life, or increase their pressure-carrying capacity. This can be achieved through an energy-efficient and safe swage autofrettage process, by generating a favorable compressive residual hoop stress field in the inner layer of the cylinder prior to use. In this work, the swage autofrettage processes of thick-walled cylinders were numerically investigated based on finite element analysis. An elastic nonlinear strain-hardening plastic material model with the Bauschinger effect was adopted, and the residual stresses in swage-autofrettaged thick-walled cylinders were predicted. The results from computer modeling were compared with experimental results from the Sachs boring technique, a technique for measuring axisymmetric residual stresses from the analysis of strain relaxations during the incremental removal of layers of material from an axisymmetric component. Furthermore, in order to compare the modeling results with measurements from the neutron diffraction method (a high-resolution nondestructive measurement of the atomic structure of a material) taken on the disks cut from the cylinders, the ring-cutting procedures for preparing the neutron diffraction specimens were virtually realized in the modeling, and the residual stresses and strains were rearranged during the ring-cutting procedure. The modeling results were compared with the neutron diffraction measurements. Finally, the modeling method was validated.

Stress Error Affected Zone of Finite Element Results by Wavelet Multiscale Analysis
Walter Ponge-Ferreira, Escola Politecnica Da Universidade De Sao Paulo, São Paulo, São Paulo, Brazil

Wavelet multiscale analysis is used to decompose the stress field of structures with stress concentrations and detect the error-affected zone on the structure. First, the numerical solution of plane stress structures is decomposed by wavelet multiscale analysis to separate the near-field stress concentration from the far field. The wavelet decomposition separates the stress into different scales, weights the stress in different stress paths, and locates the stress concentration position on the structure. This approach was applied to a plate with different stress paths and stress concentration at circular holes. Modeling error is fit to the wavelet multiscale representation of the stress field by the least-squares method. Hence, the modeling error can be associated with different scales and locations, giving a spatial view of the error-influenced zone. Causality analysis is used to identify possible causes for the modeling errors. This technique is used for model validation and improves the understanding of, and confidence in, the analysis. Summary of presentation: motivation, wavelet multiscale analysis, wavelet decomposition of stressed structure, modeling error fitting, causality analysis, and conclusion.

The Dynamics of Fluid Conveying Hydraulic Hose
Oral Presentation. VVS2018-9352 4:50PM - 5:15PM
Jari Hyvarinen, Epiroc Rock Drills AB, Orebro, Sweden

Fatigue failure of hydraulic hoses and their connections, caused by violent vibrations, is a large contributor to operational and maintenance cost for the end user of rock drill equipment. Hydraulic hoses are used as part of the energy feeding system in rock drills for mining and civil construction operations developed by Epiroc Rock Drills AB in Sweden. The work presented here shows an approach taken to build an understanding of the dynamic behavior of a selected hydraulic hose. The investigation includes an evaluation of whether bending and torsional tests are sufficient to determine the equivalent stiffness properties of the steel-wire-reinforced rubber hose. The analysis approach includes a numerical analysis using the boundary element method (BEM) to describe the interface between the fluid and the structure, in order to evaluate the dynamics of a pressurized hose with conveying fluid. Experimental modal analysis was used to validate the numerical model. In addition, the damping characteristics of the pressurized hose with internal fluid flow at different flow rates were investigated. The current validation approach uses small accelerometers together with LMS Test.Lab-based experimental modal hammering and evaluation. Pre-tension and pressure-induced tension are monitored with an in-house developed strain-gauge-based load cell. High-speed camera and point laser measurements have also been used as backup. The analysis and experiments show that a complex coupling of pure structural bending modes appears when the hose is subjected to internal flow. Some of the mode shapes show a circular motion of the hose cross sections. As shown in this presentation, these coupled modes seem to become increasingly sensitive to external or internal excitation with increasing flow rate.

Code Verification for Solid Mechanics Problems including Superelastic Nitinol
Oral Presentation. VVS2018-9374 5:15PM - 5:40PM
Kenneth Aycock, US Food and Drug Administration, Silver Spring, MD, United States, Nuno Rebelo, Dassault Systemes Simulia Corp, Santa Clara, CA, United States, Brent Craven, US Food and Drug Administration, Silver Spring, MD, United States

Although much progress has been made in advancing and standardizing verification, validation, and uncertainty quantification (VVUQ) practices in recent years, examples of rigorous code verification for solid mechanics
problems in the literature are sparse, particularly for non-trivial, large-deformation analyses involving nonlinear materials.

We present code verification of the commercial finite element software ABAQUS for elastostatic solid mechanics problems relevant to medical devices. Specifically, method of manufactured solutions (MMS) verification is performed for infinitesimal and finite strain formulations with linear elastic and hyperelastic constitutive models. A separate method of exact solutions (MES) verification is also performed for the superelastic constitutive model in ABAQUS, which is commonly used to simulate nitinol medical devices. The MES is performed in lieu of MMS since the rate-based equations for superelasticity cannot be represented in closed form.

To perform the MMS verification, a three-dimensional unit cube composed of C3D8I elements is considered. Analytical source terms are generated by substituting a prescribed, three-dimensional displacement field composed of trigonometric functions into the governing equations using a symbolic math package (SymPy/Python or Mathematica). The generated source terms are then implemented in ABAQUS simulations as point loads with appropriate nodal volume weighting, a grid refinement study is performed, and the results are post-processed to extract error norms and the observed orders of convergence.

The MES verification of the superelastic constitutive model is instead carried out on an affine deformation problem using a single C3D8I element. Because the underlying equations for the constitutive model are cast in rate form and cannot be integrated analytically, numerical results are compared to surrogate analytical solutions for linear transformation behavior at points of theoretically exact equivalence: the initiation, midpoint, and endpoint of austenite-martensite phase transformation.

The linear and hyperelastic MMS results show excellent agreement between the observed and theoretical orders of convergence for both the small and large strain formulations. The MES results also show excellent agreement between analytical and numerical calculations, providing evidence of proper implementation of the superelastic constitutive model. Validation activities can now be performed with greater confidence that coding errors do not influence simulation predictions.

Acknowledgements: This study was funded by the U.S. FDA Center for Devices and Radiological Health (CDRH) Critical Path program. The mention of commercial products, their sources, or their use in connection with material reported herein is not to be construed as either an actual or implied endorsement of such products by the Department of Health and Human Services.

Critical derailment incidents associated with crude oil and ethanol transport have led to a renewed focus on improving the performance of tank cars against the potential for puncture under derailment conditions. Proposed strategies for improving accident performance have included design changes to tank cars, as well as operational considerations such as reduced speeds.

In prior publications, the authors have described the development of a novel methodology for quantifying and characterizing the reductions in risk that result from changes to tank car designs or the tank car operating environment. The methodology considers key elements that are relevant to tank car derailment performance, including variations in derailment scenarios, chaotic derailment dynamics, nominal distributions of impact loads and impactor sizes, operating speed differences, and variations in tank car designs, and combines these elements into a consistent framework to estimate the relative merit of proposed mitigation strategies.

The modeling approach involves detailed computer simulations of derailment events, for which typical validation techniques are difficult to apply. Freight train derailments are uncontrolled chain events, which are prohibitively expensive to stage and instrument, and their chaotic nature makes the unique outcome of each event extremely sensitive to its particular set of initial and bounding conditions. Furthermore, the purpose of the modeling was to estimate the global risk reduction expected in the U.S. from tank car derailments, not to predict the outcome of a specific derailment event.

These challenges call into question which validation techniques are most appropriate, considering both the modeling intent as well as the availability and fidelity of the data sets available for validation. This paper provides an overview of the verification and validation efforts that have been used to enhance confidence in this methodology.

TRACK 13 VERIFICATION AND VALIDATION FOR BIOMEDICAL ENGINEERING

13-1 VERIFICATION AND VALIDATION FOR BIOMEDICAL ENGINEERING
4TH FLOOR, GREAT LAKES A3 4:00PM - 6:05PM

A threshold-based approach for determining acceptance criteria during computational model validation
Oral Presentation. VVS2018-9416 4:00PM - 4:25PM
acceptability. The approach provides a well-defined acceptance criterion, which is a function of the proximity of the simulation and experimental results to the safety threshold. The acceptance criterion developed following the threshold approach is not only a function of the error, E (defined as the difference between experiments and simulations), but also considers the risk to patient safety because of E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices).

The applicability of this new validation approach was tested using the example of blood flow through the FDA-developed nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stresses present during flow through the nozzle at Reynolds numbers (Re) of 3500 and 6500 were below a commonly accepted threshold for hemolysis. The CFD results ("S") for velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard.

The credibility of the CFD models for both the Re=3500 and Re=6500 conditions could not be sufficiently established by performing a direct comparison between the CFD and experimental results using a statistical Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| for Re=3500 showed that the model could be considered sufficiently credible for the COU. For Re=6500, at certain geometrical locations of the flow domain where the shear stress values are near the hemolysis threshold, the CFD model could not be considered sufficiently credible for the COU. Analysis showed that the credibility of the model could be sufficiently established either by reducing the uncertainties in the experiments, the simulations, and the defined threshold value, or by increasing the sample size for the experiments and simulations. Our threshold approach can be applied to all types of computational models and provides an objective method for determining model credibility in the evaluation of medical devices.

Validation and Verification on Interventional Implantable Devices
Oral Presentation. VVS2018-9342 4:25PM - 4:50PM
Hui Zuo, Chenxi Wang, Suzhou Medical Implant Mechanics CO. Ltd., Suzhou, Jiangsu, China, Xiaoyan Gong, Suzhou InnoMed Medical Device Co. Ltd., Suzhou, Jiangsu, China

Two kinds of interventional products, a stent and an occluder, are used as examples for verification and validation of non-linear finite element analysis. Mesh density, element type, computational algorithms, implementations of non-linear constitutive laws, and experimental noise are considered during the study to gain confidence in the fatigue safety predictions of the devices.

Quantification of Uncertainties in System Properties for Prediction of Core Temperature During Unplanned Perioperative Hypothermia Using a Three-Dimensional Whole Body Model
Oral Presentation. VVS2018-9386 4:50PM - 5:15PM
Anup Paul, Mark Burchnall, Robert States, Harbinder Pordal, Clinton Haynes, Stress Engineering Services, Inc., Mason, OH, United States

Introduction: Unplanned hypothermia, i.e., core temperature less than 36 degrees C, in surgical patients receiving anesthesia can cause complications [1]. The physiological thermoregulation response of the body is altered in an anesthetized patient, thus exacerbating heat loss from the extremities. Hypothermia-related perioperative complications include wound infections, altered drug metabolism, impaired blood clotting and prolonged recovery time. Maintaining normothermia using active warming devices, especially during the intraoperative period, can help prevent complications and also reduce hospitalization costs. Current warming methods include blankets, fluid warmers, warmed IV fluids and forced-air warming blankets. Although forced-air warming blankets have demonstrated superior clinical performance, their effectiveness may be limited by the size and location of the surgical site. Therefore, it is necessary to continue developing and evaluating devices with improved efficiency in maintaining normal perioperative body temperature. Effective use of credible computational modeling and simulation can enable faster and safer pathways to market while reducing the size of animal and human clinical trials [2]. A key factor in the risk-informed credibility assessment of computational models is the quantification of uncertainties in model outputs due to uncertainties in system properties (input parameters).

Methods: In this study we utilize a computational whole-body model [3] to predict the drop in core temperature during the first 60 minutes of the surgical procedure. The model has two components: the Pennes bioheat equation to simulate tissue temperature and an energy balance equation to determine the change in blood temperature. The uncertainty in the predicted core temperature due to variabilities in the tissue parameters, metabolic rate and boundary conditions is calculated using the sensitivity coefficient (local) method for parameter sensitivity propagation. The sensitivity coefficients will be obtained using a second-order finite difference approximation.

Results: The expected results from this study will identify and quantify the uncertainties in the input parameters and the uncertainties propagated to the predicted core temperature drop. The importance factors will also be computed to assess the relative importance of the input parameters on the model uncertainty.

Conclusion: The assessment of model uncertainties is a key component of the verification and validation evidence to support the use of the computational model for medical device development.
References:
Immersion and Exercise Scenarios: Application of a Tissue Blood Interactive Whole-Body Model, Numerical Heat Transfer, Part A, 68(6), pp. 598-618.

A Framework for Generating Mitral Valve Finite Element Models that Match Diseased States
Oral Presentation. VVS2018-9393 5:15PM - 5:40PM
Reza Salari, Sella Yunjie Wang, Mahesh Kailasam, Yev Kaufman, Thornton Tomasetti, Cupertino, CA, United States

Finite element-based evaluation and prediction of the behavior of implanted medical devices, including the interactions of the devices with human body organs, has emerged as an essential technique in the march towards patient-specific treatments. One of the main challenges in this progress is the ability to create finite element models of organs or body parts that match various diseased states, with patient-specific finite element models being the ultimate goal. One approach to creating diseased-state finite element models of an organ or body part is to use imaging data sets with segmentation-based methods to generate corresponding 3D models and meshes. Despite significant improvements in segmentation software, this process can still be time consuming and, very importantly, may not capture all the details that would be required in a finite element model for the simulated behavior to match the diseased organ or body part behavior, especially when there is motion involved, such as with cardiac motion. An alternative approach, one that can even be used in conjunction with the segmentation approach, is to modify a pre-existing, reasonably representative finite element model in a manner that allows the simulated behavior of an organ or body part to match targeted behavior. In this study, we demonstrate that a shape-matching, inverse finite element framework can be applied to develop simulations of the mitral valve (MV) that are a closer match to observed behavior, such as diseased states and eventually patient-specific behaviors. We start with the Living Heart Human Model (LHHM) as the baseline finite element model fairly accurately representing a healthy heart, from which, for efficiency reasons, we create a submodel of the mitral region consisting of the mitral annulus, leaflets, and chordae. In the next step, the differences observed between the mitral valve leaflet positions in this baseline submodel and defined target leaflet positions are minimized through an optimization process which adjusts chordae lengths as design parameters, although additional parameters such as the material properties of the chordae or the locations of attachment points could also be considered. This automated and scalable (additional parameters can be included as needed) framework offers an efficient solution for generating finite element models that more closely match targeted disease states and observed patient-specific behavior.

Marc Horner, ANSYS, Inc., Evanston, IL, United States, Srinidhi Nagaraja, G.Rau Inc., Santa Clara, CA, United States, Andrew Baumann, U.S. Food and Drug Administration, Silver Spring, MD, United States, Galyna Loughran, Excelen: Center for Bone & Joint Research and Education, Minneapolis, MN, United States, Kumar Kartikeya, ANSYS, Inc., Pune, India, Jason Inzana, Zimmer Biomet, Broomfield, CO, United States, Anup Gandhi, Zimmer Biomet, Westminster, CO, United States

ASTM standardized test methods are used extensively to evaluate the mechanical performance of spinal devices, particularly to support their regulatory clearance. By contrast, finite element analysis (FEA) is primarily performed during spinal device development and is infrequently used as part of the regulatory submission. In this study, experimental results for ASTM F1717 (Standard Test Methods for Spinal Implant Constructs in a Vertebrectomy Model) were compared to interlaboratory computational simulations of the F1717 test method. In order to develop best practices for computational modeling of this test method, we investigated the influence of several modeling input parameters on the predicted compression-bending response.

ASTM F1717 testing was performed on generic pedicle screw constructs consisting of monoaxial pedicle screws (Ti-6Al-4V), set screws (Ti-6Al-4V), spinal rods (Ti-6Al-4V), and test blocks (UHMWPE) in compression-bending. An interlaboratory FEA study replicating the experimental testing was performed by FDA, ANSYS, and Excelen. Each lab was blinded to the experimental results. Material properties were obtained from the literature for the UHMWPE components. Titanium spinal rods were modeled as an isotropic material using either a multilinear isotropic hardening approach (MISO model) or by fitting the experimental test data to a bilinear elasto-plastic model, the parameters of which were determined from tensile testing per ASTM E8 of dogbone specimens. Initial simulations assumed perfectly-bonded contact at all interfaces in the spinal construct. After comparing the initial simulations to experimental results, various model parameters (e.g., element type, discretization, interconnection between construct components) were altered to understand the influence of these parameters on the mechanical behavior of the construct.

There was less than 10% difference in the ASTM F1717 force-displacement curves submitted by the three modeling groups. Compared to the experimental force-displacement curves, all simulations exhibited slightly stiffer behavior (less than 15% difference) throughout. In conclusion, it is critical to establish a standardized framework for computational stress analysis to increase the use of modeling in orthopedic regulatory submissions. This study is working towards that goal by outlining the influence of various model assumptions and parameters on the mechanical response of pedicle screw-rod constructs, along with their associated uncertainties. This information will further elucidate best practices for conducting FEA for this product area.
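The manufactured-solutions workflow described in the ABAQUS code-verification abstract above (VVS2018-9374), substituting a prescribed displacement field into the governing equations and solving symbolically for the source term, can be sketched in one dimension with SymPy. The 1-D bar equilibrium equation and the manufactured field below are illustrative stand-ins for the 3-D elasticity equations used in that study.

```python
import sympy as sp

# 1-D linear-elastic MMS sketch: pick a manufactured displacement u(x),
# substitute it into the equilibrium equation d(sigma)/dx + b = 0, and
# solve for the body-force source term b that makes u the exact solution.
x, E = sp.symbols("x E", positive=True)

u = sp.sin(sp.pi * x)          # manufactured displacement field
stress = E * sp.diff(u, x)     # 1-D Hooke's law: sigma = E * du/dx
residual = sp.diff(stress, x)  # divergence of stress

b = -residual                  # required body force: E*pi**2*sin(pi*x)
print(sp.simplify(b))
```

In the actual 3-D procedure the same substitution is done componentwise for the elasticity operator, and the resulting source terms are applied as volume-weighted nodal loads before the grid refinement study.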
Friday, May 18, 2018
Friday Technical Program
TRACK 1 CHALLENGE PROBLEM WORKSHOPS AND PANEL SESSIONS

1-1 UNSTEADY FLOW WORKSHOP
4TH FLOOR, GREAT LAKES A1 8:00AM - 10:05AM
Session Organizer: Luis Eca, IST, Lisbon, Portugal

Introduction to the Workshop on Iterative Error in Unsteady Flow Simulations
Oral Presentation. VVS2018-9310
Luis Eca, IST, Lisbon, Portugal, Guilherme Vaz, MARIN, Wageningen, Netherlands, Martin Hoekstra, Maritime Research Institute Netherlands, Wageningen, Netherlands

Workshop on Iterative Errors in Unsteady Flow Simulations: STAR-CCM+ results
Oral Presentation. VVS2018-9368
Laura Savoldi, Andrea Bertinetti, Roberto Zanino, Andrea Zappatore, Dipartimento Energia, Politecnico Di Torino, Torino, (TO), Italy, Rosa Difonzo, NEMO group, Dipartimento Energia, Politecnico di Torino, Torino, Italy

The simulation of a laminar two-dimensional external flow around a cylinder for an incompressible fluid with Reynolds number Re = 100 is the test case used in the 2nd Workshop on Iterative Error in Unsteady Flow Simulations. The Nuclear Engineering Modelling (NEMO) group of the Politecnico di Torino participates in the Workshop using the commercial software STAR-CCM+ v.12.06.010-R8. Different combinations of pre-set grids and time steps have been considered in the simulations, using for each combination four different numbers of inner iterations (i.e., 10, 50, 100 and 200) per time step. Following the requests of the workshop organizers, the drag, lift and pressure coefficients are monitored during the transient, up to the point when the solution reaches periodicity (after ~200 s from the initial condition of quiescent fluid). The angle of separation of the flow from the cylinder is also evaluated by monitoring the surface point where the shear stress becomes zero. The results of the simulations will be made available in due time before the workshop, to contribute to the comparison between different participants.

Vuko Vukcevic, Zeljko Tukovic, Hrvoje Jasak, Faculty of Mechanical Engineering and Naval Architecture, University of Zagreb, Zagreb, Croatia (Hrvatska)

The group from the Faculty of Mechanical Engineering and Naval Architecture (University of Zagreb, Croatia), which develops and maintains foam-extend, a community-driven fork of the OpenFOAM software, will perform unsteady computations for the 2nd Workshop on Iterative Errors in Unsteady Flow Simulations. The simulations will be performed for the 2D laminar flow past a cylinder according to the guidelines. Recently developed time- and
under-relaxation-consistent segregated solution algorithms will be used. The results will be submitted by May 1st, 2018.

…and non-linear regression models will be trained on measured training data and then validated using test data. The process of verification and validation of these models will be presented and then generalized in order to aid in developing the VVUQ guidelines.

Verification-Validation And Uncertainty Quantification Methods For Data-Driven Models In Advanced Manufacturing
Oral Presentation. VVS2018-9424 8:50AM - 9:15AM
Ronay Ak, Yung-Tsun Lee, Guodong Shao, National Institute of Standards and Technology, Gaithersburg, MD, United States, Rumi Ghosh, Robert Bosch, LLC, Palo Alto, CA, United States, Heather Reed, Thornton Tomasetti - Weidlinger Applied Science Practice, New York, NY, United States, Laura Pullum, Oak Ridge National Laboratory, Oak Ridge, TN, United States

The Verification and Validation of Computational Modeling for Advanced Manufacturing, or V&V 50, is one of the subcommittees under the American Society of Mechanical Engineers (ASME) Verification and Validation standards committee. The charter of V&V 50 is to provide procedures for verification, validation, and uncertainty quantification in computational models, including predictive models and simulations, for advanced manufacturing. The Verification, Validation, and Uncertainty Quantification (VV-UQ) methods in data-driven and hybrid models working group, or VV-UQ WG, under V&V 50, addresses applications in advanced manufacturing with a focus on VV-UQ methods for data-driven and hybrid models. The mission of the working group is to provide a framework and guidance for the VV-UQ issues and problems related to data-driven and hybrid models that the manufacturing industry tackles. The manufacturing industry has become significantly data-intensive in recent years. Continuous improvements in sensor technologies and data acquisition systems allow the manufacturing industry to effectively and efficiently collect large and diverse volumes of data. Data analytics has demonstrated its great potential for transforming raw data into information and knowledge for smart decision making during design, manufacturing, use, and post-use. The objective of this presentation is to describe the VV-UQ WG's ongoing activities, with the focus predominantly on the VV-UQ aspect of data-driven modeling. A data-driven model (DDM) in the manufacturing domain can be built using data analytics techniques to analyze the data generated by the manufacturing processes or system. Data analytics techniques include, but are not limited to, statistical, data mining, and machine learning descriptive and predictive models. The objective of a DDM is to find an empirical map between the input and output with or without explicit knowledge of the physical behavior of the process or system. For model credibility, the DDM must be verified and validated, and the uncertainties associated with the model should be quantified and their effects propagated to the outcome quantities of interest. In this presentation, we will discuss the technical approach and up-to-date progress of the VV-UQ WG. The working group aims to define the general guideline for VV-UQ by performing the following tasks: (i) investigate existing VV-UQ standards/procedures and data-mining process models like ASME V&V 10 and Cross-Industry Standard Process

…resolutions to each type of issue/problem. An activity flowchart for DDM based on supervised learning has been developed, and its associated documentation, including an example, is being created. The flowchart and document provide a clear VV-UQ process for DDM and provide the foundation for a general guideline, which will enable practitioners of VV-UQ to better assess and enhance the credibility of their data-driven and hybrid models built to solve advanced manufacturing problems.

ACKNOWLEDGMENTS
The authors would like to acknowledge the support of the ASME V&V 50 Subcommittee members Gaurav Ameta from NIST and Mahmood Tabaddor from UL in preparation of the framework/guidance, which is still in progress.

Terminology, Concepts, Relationships and Taxonomy for VVUQ in Manufacturing
Oral Presentation. VVS2018-9425 9:15AM - 9:40AM
Sankaran Mahadevan, Prof., Nashville, TN, United States, Yung-Tsun Lee, National Institute of Standards and Technology, Gaithersburg, MD, United States, Gaurav Ameta, Dakota Consulting, Silver Spring, MD, United States, Sanjay Jain, George Washington University, Washington, DC, United States

This presentation summarizes the ongoing work of the V&V 50 Subcommittee's task group on terminology, concepts, relationships, and taxonomy for VVUQ in advanced manufacturing applications. The task group is charged with the following activities: (1) survey the definitions in existing V&V standards and guides (e.g., ASME, IEEE, AIAA, ISO, DoD, etc.); (2) explore the applicability of existing definitions to advanced manufacturing; (3) suggest adaptations or extensions of existing definitions for advanced manufacturing; and (4) suggest definitions of new concepts unique to advanced manufacturing. The terminology being surveyed is divided into four groups: verification, validation, calibration, and uncertainty quantification. Within verification, the focus is on concepts related to code verification, solution verification, error estimation, and accuracy requirements. Within validation, the focus is on concepts related to system response quantities of interest, validation domain vs. application domain, accuracy requirements, validation metrics, and validation hierarchy. Within calibration, the concepts relate to model parameters, model discrepancy, physics-based vs. data-driven vs. hybrid models, calibration data issues, and fusion of heterogeneous data. Within uncertainty quantification, the focus is on both aleatory and epistemic uncertainty sources, uncertainty
for Data Mining (CRISP-DM), and adapt them to advanced manufacturing aggregation and roll up towards system level prediction, model predictive
VV-UQ where applicable; (ii) study different use cases of DDMs defined by capability assessment, and quantification of margins and uncertainty
industry and academia; (iii) uncover the commonalities in the patterns of (QMU). This activity will liaison with other task groups within V&V50, as
VV-UQ for advanced manufacturing and generalize issues/problems in well as build on previous and ongoing work by other V&V subcommittees.
advanced manufacturing; and (iv) provide generic recommendations/
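The verify-validate-quantify loop that these abstracts prescribe for a data-driven model can be illustrated with a minimal sketch: fit an empirical input-output map on training data, then check a validation metric on held-out data against an accuracy requirement. The data, model form, and tolerance below are hypothetical illustrations, not values from any of the talks.

```python
import math
import random

random.seed(1)

# Hypothetical synthetic "process" data: a noisy input-output relation.
xs = [random.uniform(0.0, 10.0) for _ in range(200)]
ys = [2.5 * x + 1.0 + random.gauss(0.0, 0.5) for x in xs]

# Hold out 25% of the data as a stand-in for independent validation tests.
split = 150
x_train, y_train = xs[:split], ys[:split]
x_val, y_val = xs[split:], ys[split:]

# Fit a simple empirical map (ordinary least squares for y = a*x + b).
n = len(x_train)
mx = sum(x_train) / n
my = sum(y_train) / n
a = sum((x - mx) * (y - my) for x, y in zip(x_train, y_train)) / \
    sum((x - mx) ** 2 for x in x_train)
b = my - a * mx

# Validation metric: RMSE on the held-out set, compared against a
# (hypothetical) accuracy requirement.
rmse = math.sqrt(sum((a * x + b - y) ** 2
                     for x, y in zip(x_val, y_val)) / len(x_val))
requirement = 1.0  # hypothetical accuracy requirement
print(f"a={a:.3f}, b={b:.3f}, validation RMSE={rmse:.3f}, "
      f"pass={rmse <= requirement}")
```

A fuller workflow would also propagate the fitted model's uncertainty to the output quantities of interest, as the abstract describes.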
51
Technical Program Friday
Computational and Experimental Efforts to Quantify Uncertainty of Turbomachinery Components

Jeff Brown, US Air Force Research Laboratory, Wright Patterson AFB, OH, United States

This presentation reviews recent activities to understand the variation in manufactured turbomachinery components, their structural and aerodynamic models, and the experimental data used for validation. Results from structured light geometry measurements for high pressure turbine and compressor airfoils are reviewed and assessed with principal component analysis. The modes from this analysis are used as variables to emulate the structural frequency, mode shape, steady aerodynamic efficiency, and unsteady surface pressure. These results are compared to structural bench testing and rotating rig aerodynamic testing. A new tetrahedral-element-based approach for constructing the as-manufactured components is discussed and compared with the results from more conventional hexahedral models. Efforts are also shown to improve structural emulator accuracy through inclusion of gradient information gathered from a computationally efficient eigensensitivity solution. New strategies to emulate spatial field responses, such as the airfoil structural mode shapes, are discussed. To conclude, future directions and challenges are outlined.

TRACK 1 CHALLENGE PROBLEM WORKSHOPS AND PANEL SESSIONS

1-3 INDUSTRY CHALLENGES IN UNCERTAINTY QUANTIFICATION: BRIDGING THE GAP BETWEEN SIMULATION AND TEST
4TH FLOOR, GREAT LAKES A1 10:30AM - 12:35PM

Session Organizer: Mark Andrews, SmartUQ, Madison, WI, United States
Session Co-Organizer: Peter Chien, SmartUQ, Madison, WI, United States

By applying advanced statistical methods such as Uncertainty Quantification (UQ), simulation models have become a trustworthy source of information for decision analytics. Unfortunately, there are significant cultural and technical challenges which prevent organizations from utilizing UQ methods and techniques in their engineering practice. This tutorial will provide an overview of UQ concepts and methodology and discuss strategies for addressing these challenges. One of the strategies is performing statistical calibration to understand how well numerical simulation represents reality. Using a case study for illustration, the tutorial will sequentially walk through the statistical calibration process used to quantify uncertainties for simulations and physical experiments. Attendees should leave the tutorial with an understanding of UQ concepts and techniques, how to apply statistical calibration to their combined simulation and testing environments, and the fundamental value that UQ brings. As a purely educational tutorial, SmartUQ software will only be used for illustration of the methods and examples presented.

TRACK 4 UNCERTAINTY QUANTIFICATION, SENSITIVITY ANALYSIS, AND PREDICTION

4-2 UNCERTAINTY QUANTIFICATION, SENSITIVITY ANALYSIS, AND PREDICTION: SESSION 2
4TH FLOOR, GREAT LAKES A2 10:30AM - 12:35PM

Optimal Information Acquisition Algorithms for Inferring the Order of Sensitivity Indices

Oral Presentation. VVS2018-9350 10:30AM - 10:55AM

Piyush Pandita, Ilias Bilionis, Purdue University, West Lafayette, IN, United States, Jesper Kristensen, General Electric Co, Niskayuna, NY, United States

Numerous engineering problems are characterized by objective functions where the evaluation of the objective via experiment and/or simulation comes at the expense of vast resources. In some problems, running a single instance of the objective can take days or weeks. The goal of these problems is to identify regions of design space which satisfy a set of criteria defined at the outset (which includes optimization cases). In many engineering applications, it is of key interest to understand which design variables drive changes in the objectives and to compute their relative order of importance. This question is answered via a combination of data-driven modeling and global sensitivity analysis, where so-called sensitivity indices are computed. A relative comparison of the size of the indices indicates which variables are more important than others in driving the objective. Towards this, Bayesian global sensitivity analysis (BGSA) constructs a computationally cheap probabilistic surrogate of the expensive objective function(s) using Gaussian Process (GP) regression. The GP surrogate(s) provides approximate samples, requiring relatively few resources, of the underlying function, from which the usual variance-based sensitivity index is computed for each sample. The Bayesian sensitivity indices are obtained by averaging over all the above samples. In this work, we develop an optimal acquisition strategy for obtaining the most relevant regions of the design space if one mainly cares about obtaining an accurate ranking of the sensitivity indices. We propose an algorithm that evaluates the merit of a hypothetical measurement towards the segregation of the individual sensitivity indices. This framework guides the designer towards evaluating the objective function to acquire information about the sensitivities sequentially. We verify and validate the proposed methodology by applying it to synthetic test problems with known solutions. We then demonstrate our approach on a real-world engineering problem of optimizing a compressor for oil applications. The problem is characterized by an expensive objective (lab tests, each taking 1-2 days to complete) and a high-dimensional input space with on the order of 30 input variables. This is an important problem since it provides an essential step towards enabling the customer to propose more competitive products in the market by more carefully analyzing and quantifying the compressor capabilities.
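The variance-based sensitivity indices underlying approaches like the one above can be sketched without the GP surrogate: the snippet below estimates first-order Sobol indices for a hypothetical cheap objective using Saltelli's sampling scheme (in BGSA, the GP would supply approximate samples of an expensive objective instead). The objective function and its coefficients are illustrative assumptions with a known importance ordering.

```python
import random

random.seed(2)
d, n = 3, 50000

# Hypothetical cheap stand-in for an expensive objective; the coefficients
# make the true importance ordering x1 > x2 > x3 (analytical first-order
# indices 16/21, 4/21, 1/21 for independent U(0,1) inputs).
def f(x):
    return 4.0 * x[0] + 2.0 * x[1] + 1.0 * x[2]

# Two independent input sample matrices (Saltelli's scheme).
A = [[random.random() for _ in range(d)] for _ in range(n)]
B = [[random.random() for _ in range(d)] for _ in range(n)]
fA = [f(x) for x in A]
fB = [f(x) for x in B]

mean = sum(fA) / n
var = sum((y - mean) ** 2 for y in fA) / n

# First-order index S_i = (1/n) * sum_k fB_k * (f(A_B^i) - fA_k) / Var,
# where A_B^i is A with column i replaced by column i of B.
S = []
for i in range(d):
    acc = 0.0
    for k in range(n):
        x = list(A[k])
        x[i] = B[k][i]
        acc += fB[k] * (f(x) - fA[k])
    S.append(acc / (n * var))

print([round(s, 3) for s in S])
```

Ranking the resulting indices is exactly the quantity whose accurate ordering the acquisition strategy above targets.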
52
Experimental Data UQ and QMU for Stochastic Systems Characterized through Sparse Unit Testing involving Variability and Uncertainties in Measurements, Loading, and System Properties

Oral Presentation. VVS2018-9373 10:55AM - 11:20AM

Vicente Romero, Sandia National Laboratories, Albuquerque, NM, United States

An extension to the Coleman & Steele [1] and ASME PTC 19.1 [2] experimental data uncertainty methodologies is illustrated for sparse replicate tests involving stochastically varying systems with small random variations in system properties (geometries, material properties, etc.). The tests also involve small load-control variations and measurement errors/uncertainties on experimental inputs and outputs. Some of the uncertainties are described by intervals and others by probability distributions. The methodology is demonstrated on the Data UQ portion of the Sandia Cantilever Beam End-to-End UQ problem [3,4]. A small number of beams are randomly drawn from a large population and then deflection-tested. The tests involve substantial aleatory and epistemic uncertainties from the sources mentioned above. Uncertainty of deflection response, and of the probability of exceeding a specified deflection threshold, are estimated for: A) the whole population of beams; and B) a random beam selected from the population. Results are compared to the truth quantities for several random trials involving different realizations of the uncertain quantities in the experiments.

[1] Coleman, H.W., and Steele, Jr., W.G., Experimentation and Uncertainty Analysis for Engineers, 2nd Edition, John Wiley & Sons, New York, NY, 1999.

[2] ASME PTC 19.1-2005, Test Uncertainty.

[3] Romero, V., B. Schroeder, M. Glickman, Cantilever Beam End-to-End UQ Test Problem: Handling Experimental and Simulation Uncertainties in Model Calibration, Model Validation, Extrapolative Prediction, and Risk Assessment, Sandia National Laboratories document SAND2017-4689 O, version BeamTestProblem-34.pdf, Jan. 2018.

[4] Romero, V., Cantilever Beam End-to-End UQ Test Problem and Evaluation Criteria for UQ Methods Performance Assessment, Sandia National Laboratories document SAND2017-4592 C, presented at 2017 ASME V&V Symposium, May 3-5, Las Vegas, NV (in ASME V&V Symposium proceedings archive).

- Sandia National Laboratories document SAND2018-0732 A. This abstract is a work of the United States Government and is not subject to copyright protection in the U.S.

- Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.

Quantify the Uncertainty: 95% Confidence in Aerodynamic Model Predictions

Oral Presentation. VVS2018-9375 11:20AM - 11:45AM

Seth S. Lawrence, Earl P.N. Duque, Intelligent Light, Rutherford, NJ, United States, Andrew Cary, John A. Schaefer, Boeing Research and Technology, St. Louis, MO, United States

The uncertainty in model results may be quantified through rigorous verification, validation and uncertainty quantification (VVUQ) procedures. This presentation describes the application of the Oberkampf and Roy uncertainty framework to several aerodynamics studies. The first UQ study was a NACA0012 airfoil case at zero lift, demonstrating the underlying principles behind a total uncertainty study (numerical, input, and model form uncertainty). Numerical uncertainty was found via grid refinement, using the OVERFLOW 2 solver to carry out computations, and Dakota was used to perform mixed UQ statistical input analysis of the solver input parameters. A comparative analysis was made using polynomial chaos expansion (PCE) to identify potential differences associated with uncertainty quantification methods. Model form uncertainty was found by validating the model results against experimental data using Area Validation Metric techniques. The second UQ study considered the two AIAA High Lift Prediction Workshop 3 configurations: the Common Research Model and the JAXA Standard Model. Simulations were performed to compute 95% confidence levels in lift, drag and pitching moment predictions by the OVERFLOW 2 solver. The implementation of VVUQ techniques on complex engineering problems was shown to necessitate new adaptive workflows that can take advantage of the large computational resources required for a thorough UQ analysis.

Discrete-Direct Calibration, Real-Space Validation, and Predictor-Corrector Extrapolation applied to the Cantilever Beam End-to-End UQ Problem

Oral Presentation. VVS2018-9376 11:45AM - 12:10PM

Vicente Romero, Sandia National Laboratories, Albuquerque, NM, United States

A novel set of coordinated methods comprising a systems approach to model calibration, validation, and extrapolative prediction will be illustrated on the Sandia Cantilever Beam End-to-End UQ problem [1,2]. The problem involves many challenging uncertainty treatment aspects, summarized below. It emphasizes difficult paradigm and strategy issues encountered in real end-to-end UQ problems while being computationally trivial, so that approaches and methodologies can be focused on. The talk will present a set of practical calibration, validation, and extrapolation approaches that suitably extend end-to-end, demonstrated on the Beam problem with standard EXCEL spreadsheet tools. Other approaches are sought that extend end-to-end in a satisfactory manner. Initial examination of methods in the literature has revealed a lack of other suitable end-to-end UQ frameworks for the Beam problem.
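Several of the abstracts above combine probabilistically described (aleatory) uncertainties with interval-described (epistemic) ones. A minimal two-loop sketch of that combination: sample the aleatory variability in an inner Monte Carlo loop while sweeping an interval-valued systematic bias in an outer loop, yielding an interval of exceedance probabilities rather than a single number. All distributions, the interval, and the threshold below are hypothetical, not the actual Beam problem values.

```python
import random

random.seed(3)

# Inner (aleatory) loop: hypothetical deflection of one randomly drawn unit.
def deflection(bias):
    stiffness = random.gauss(1.0, 0.05)  # unit-to-unit property variability
    load = random.gauss(10.0, 0.2)       # small load-control variation
    noise = random.gauss(0.0, 0.05)      # random measurement error
    return load / stiffness + bias + noise

threshold = 10.8  # hypothetical critical deflection threshold
n = 50000

# Outer (epistemic) loop: systematic measurement bias known only as an
# interval [-0.1, 0.1]; sweep its endpoints and midpoint.
probs = []
for bias in (-0.1, 0.0, 0.1):
    exceed = sum(1 for _ in range(n) if deflection(bias) > threshold)
    probs.append(exceed / n)

# The result is an interval-valued probability of exceedance.
print(f"P(exceed) in [{min(probs):.3f}, {max(probs):.3f}]")
```

Reporting the exceedance probability as an interval keeps the epistemic (interval) and aleatory (distributional) contributions separate, in the spirit of the methodologies these talks extend.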
53
The Beam problem is a simplified prototype problem for stochastic physical systems with scalar inputs and outputs. Responses are to be predicted for a population of nominally identical cantilever beams with small variations in material stiffness and geometry (beam lengths, widths, heights). To estimate material stiffness variability in the population of beams, model calibrations [3] are conducted for a small subset of tested beams selected at random from the larger population. The tests at temperature T1 measure deflection under a prescribed target load. Small control variations exist in the actual applied loads in the four tests. Random and systematic errors/uncertainties exist in the measurements of load magnitude, beam dimensions, and deflection. The random measurement errors vary from test to test according to a specified probability distribution. Systematic measurement errors are strongly correlated across the tests and are described by a prescribed interval range of uncertainty. Substantial epistemic uncertainty concerning the aleatory distribution of material stiffness properties exists due to the small number of tests. The material stiffness variability/uncertainty description from the calibrations is used with the differential-equation-based physics model to predict responses at different circumstances as follows. Model validation is conducted at two configurations: A) a single test at the calibration temperature T1 but different beam dimensions and configurationally different loading; B) two tests at temperature T2 (temperature is suspected to affect material stiffness) with the beam dimensions in A but different loading. Information from the validation assessments may be used in further predictions of beam population responses at the geometry and loading conditions in A and B but temperature extrapolation to T3. Predictions are sought for beam population tip displacements and the proportion of beams that exceed a critical displacement threshold.

[1] Romero, V., B. Schroeder, M. Glickman, Cantilever Beam End-to-End UQ Test Problem: Handling Experimental and Simulation Uncertainties in Model Calibration, Model Validation, Extrapolative Prediction, and Risk Assessment, Sandia National Laboratories document SAND2017-4689 O, version BeamTestProblem-34.pdf, Jan. 2018.

[2] Romero, V., Cantilever Beam End-to-End UQ Test Problem and Evaluation Criteria for UQ Methods Performance Assessment, Sandia National Laboratories document SAND2017-4592 C, presented at 2017 ASME V&V Symposium, May 3-5, Las Vegas, NV (in ASME V&V Symposium proceedings archive).

[3] Romero, V.J., Discrete-Direct Model Calibration and Propagation Approach addressing Sparse Replicate Tests and Material, Geometric, and Measurement Uncertainties, Sandia National Laboratories document SAND2017-12524 C, Soc. Auto. Engrs. 2018 World Congress (WCX18) paper 2018-01-1101, April 10-12, Detroit.

Deep UQ - Learning deep neural network surrogate models for uncertainty quantification

Oral Presentation. VVS2018-9381 12:10PM - 12:35PM

Rohit Tripathy, Ilias Bilionis, Purdue University, West Lafayette, IN, United States

State-of-the-art computer codes for simulating real physical phenomena are often characterized by vast numbers of input parameters. Often, these input parameters are uncertain, and one needs to rigorously assess the effect of these uncertainties on the outputs from the computer codes. Performing uncertainty quantification (UQ) tasks with Monte Carlo (MC) methods is almost always infeasible because of the need to perform hundreds of thousands or even millions of forward model evaluations in order to obtain convergent statistics. One thus tries to construct a cheap-to-evaluate surrogate model to replace the forward model solver. However, for systems with large numbers of input parameters, one has to deal with the curse of dimensionality: the exponential increase in the volume of the input space as the number of parameters increases linearly. This necessitates the application of suitable dimensionality reduction techniques. A popular class of dimensionality reduction methods are those that attempt to recover a low-dimensional representation of the high-dimensional feature space. Such methods, however, often tend to overestimate the intrinsic dimensionality of the input feature space. We demonstrate the use of deep neural networks (DNN) to construct surrogate models for numerical simulators. We parameterize the structure of the DNN in a manner that lends the DNN surrogate the interpretation of recovering a low-dimensional nonlinear manifold. The model response is a parameterized nonlinear function of the low-dimensional projections of the input. We think of this low-dimensional manifold as a nonlinear generalization of the notion of the active subspace. Our approach is demonstrated with a problem on uncertainty quantification in two-dimensional, single-phase, steady-state flow representing flow in an idealized oil reservoir. The dynamics of porous media flow are governed by Darcy's law, which is parameterized by a permeability tensor. Uncertainties in the soil structure make this high-dimensional permeability tensor uncertain. Mathematically, the task of performing UQ in this idealized oil reservoir reduces to the task of solving a stochastic elliptic partial differential equation (SPDE) with uncertain diffusion coefficient. We deviate from traditional formulations of the SPDE problem by not imposing a specific covariance structure on the random diffusion coefficient. Instead, we attempt to solve a more challenging problem of learning a map between an arbitrary snapshot of the permeability tensor and the pressure field.
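The "active subspace" notion that the Deep UQ abstract generalizes can be illustrated in its classical linear form: for a ridge function f(x) = g(w·x), the dominant eigenvector of the averaged outer product of gradients recovers the low-dimensional direction w. The function and direction below are hypothetical; the DNN approach in the talk learns a nonlinear analogue of this projection.

```python
import math
import random

random.seed(4)
d = 5

# Hypothetical one-dimensional active direction w (normalized to unit length).
w = [0.6, -0.3, 0.5, 0.2, -0.4]
nrm = math.sqrt(sum(c * c for c in w))
w = [c / nrm for c in w]

# Ridge function: f depends on x only through the projection w.x.
def f(x):
    return math.exp(sum(wi * xi for wi, xi in zip(w, x)))

# Central finite-difference gradient of f.
def grad(x, h=1e-5):
    g = []
    for i in range(d):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

# Monte Carlo estimate of C = E[grad f grad f^T] over U(-1,1)^d.
C = [[0.0] * d for _ in range(d)]
n = 500
for _ in range(n):
    x = [random.uniform(-1.0, 1.0) for _ in range(d)]
    g = grad(x)
    for i in range(d):
        for j in range(d):
            C[i][j] += g[i] * g[j] / n

# Power iteration for the dominant eigenvector of C.
v = [1.0] * d
for _ in range(100):
    v = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
    nv = math.sqrt(sum(c * c for c in v))
    v = [c / nv for c in v]

# The dominant eigenvector recovers the active direction (up to sign).
align = abs(sum(vi * wi for vi, wi in zip(v, w)))
print(f"|cos(v, w)| = {align:.4f}")
```

Because the gradient of a ridge function is always parallel to w, the matrix C is (nearly) rank one and the recovery is essentially exact; the nonlinear manifold case in the talk relaxes this linearity.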
ASME products and services include our renowned codes and standards, certification and accreditation programs, professional publications, technical conferences, risk management tools, government regulatory/advisory, continuing education and professional development programs. These efforts, guided by ASME leadership and powered by our volunteer networks and staff, help make the world a safer and better place today, and for future generations.

Visit www.asme.org
Visit www.asme.org/kb/standards

ASME STAFF

Ryan Crane, P.E., Director, S&C Initiatives, Standards & Certification
Kathryn Hyam, P.E., S&C Project Engineering Manager
Marian Heller, Business Development Manager, Healthcare
Michelle Pagano, S&C Engineer
55
Sponsors
SILVER SPONSOR
Medtronic
Medtronic is a mission-driven company dedicated to contributing to human welfare by application of biomedical engineering in the research, design, manufacture, and sale of instruments or appliances that alleviate pain, restore health, and extend life.
The world’s largest medical technology company, Medtronic was founded in a Minneapolis garage by Earl Bakken and his brother-in-law Palmer
Hermundslie in 1949. Our first life-changing therapy — a wearable, battery-powered cardiac pacemaker — was the foundation for many more Medtronic
therapies that use our electrical stimulation expertise to improve the lives of millions of people. Over the years, we developed additional core
technologies, including implantable mechanical devices, drug and biologic delivery devices, and powered and advanced energy surgical instruments.
Today, our 85,000 employees develop technologies that are used to treat over 60 medical conditions in 160 countries around the world. Our products
improve the lives of two people every second of every day.
We’re proud to sponsor this year’s conference. To learn more, visit us at www.medtronic.com
SMARTUQ
SmartUQ’s software provides powerful uncertainty quantification and engineering analytics solutions for simulation, testing, and complex systems. Using
cutting-edge statistics and probabilistic methods, SmartUQ dramatically accelerates analysis cycles by reducing design iterations, improves design
robustness, and maximizes insight into complex systems by quantifying uncertainties.
SmartUQ’s industry-leading software features an innovative suite of uncertainty quantification and analytics, including modern and unique design of
experiments, accurate and flexible emulations, statistical calibration, design space exploration, sensitivity analysis, optimization, uncertainty propagation
and inverse analysis. With world-class expertise in analytics and data science, SmartUQ creates and applies cutting edge technologies to meet clients’
needs where no existing solution can.
BRONZE SPONSOR
Wolf Star Technologies understands the challenges of Product Development and the pitfalls of structural and dynamic issues that plague every Product
Development project. The unique strength that Wolf Star Technologies brings to the table is our first-to-market solutions that meet the fundamental
needs of how an engineer or analyst works with their FEA tools. Our product offerings of True-Load, True-QSE and True-LDE bring understanding
to dynamic loading of structures and how to extract decision ready data from FEA models.
56
SUPPORTING SPONSOR
NAFEMS
NAFEMS focuses on the practical application of numerical engineering simulation techniques such as the finite element method for structural analysis,
computational fluid dynamics, and multibody simulation.
ANSYS, INC.
If you’ve ever seen a rocket launch, flown on an airplane, driven a car, used a computer, touched a mobile device, crossed a bridge, or put on wearable
technology, chances are you’ve used a product where ANSYS software played a critical role in its creation. ANSYS is the global leader in engineering
simulation. We help the world’s most innovative companies deliver radically better products to their customers. By offering the best and broadest
portfolio of engineering simulation software, we help them solve the most complex design challenges and engineer products limited only by imagination.
Founded in 1970, ANSYS employs nearly 3,000 professionals, many of whom are expert M.S. and Ph.D.-level engineers in finite element analysis,
computational fluid dynamics, electronics, semiconductors, embedded software and design optimization. Our exceptional staff is passionate about
pushing the limits of world-class simulation technology so our customers can turn their design concepts into successful, innovative products faster and
at lower cost. As a measure of our success in attaining these goals, ANSYS has been recognized as one of the world’s most innovative companies by
prestigious publications such as Bloomberg Businessweek and FORTUNE magazines.
LANYARD SPONSOR
BD
BD is one of the largest medical technology companies in the world and is advancing the world of health by improving medical discovery,
diagnostics and the delivery of care. The company develops innovative technology, services and solutions that help advance both clinical therapy for
patients and clinical process for health care providers. BD and its 65,000 employees have a passion and commitment to help improve patient outcomes,
improve the safety and efficiency of clinicians’ care delivery process, enable laboratory scientists to better diagnose disease and advance researchers’
capabilities to develop the next generation of diagnostics and therapeutics. BD has a presence in virtually every country and partners with organizations
around the world to address some of the most challenging global health issues. BD helps customers enhance outcomes, lower costs, increase
efficiencies, improve safety and expand access to health care.
57
Notes
58
Floor Plan
FOURTH FLOOR
59
V&V 2018
ASME VERIFICATION AND VALIDATION SYMPOSIUM