William J. Rider's research while affiliated with Sandia National Laboratories and other places


Publications (124)


Numerical Approximations Formulated as LES Models
  • Chapter

December 2022 · 79 Reads · 3 Citations

William J. Rider

Underresolved simulations are typically unavoidable in high Reynolds (Re) and Mach (Ma) number turbulent flow applications at scale. Implicit Large-Eddy Simulation (ILES) often becomes the effective strategy to capture the dominating effects of convectively driven flow instabilities. ILES can be based on effectively codesigned physics and numerical models solving the compressible conservation equations with nonoscillatory finite-volume algorithms. We evaluate three distinct numerical strategies for ILES and assess their impact on simulating the onset, development, and decay of turbulence: (i) the Harten–Lax–van Leer Riemann solver applying Strang splitting and a Lagrange-plus-Remap formalism to solve the directional sweeps, denoted split; (ii) the Harten–Lax–van Leer–Contact Riemann solver using a directionally unsplit strategy and parabolic reconstruction, denoted unsplit; (iii) the unsplit scheme with a Low-Ma Correction, denoted unsplit*, addressing the excessive numerical dissipation ∼1/Ma associated with upwinding in mixing applications driven by weakly compressible local dynamics. Modified equation analysis, a technique for generating approximate equations for the computed solutions, is used to elucidate the effective subgrid models associated with the algorithms underlying ILES. The case studies considered are the Taylor–Green Vortex, prototyping transition to turbulence, and Rayleigh–Taylor driven flow, prototyping the development of turbulent material mixing. For a given spatiotemporal resolution, significantly more accurate predictions (reduced numerical uncertainties) are provided by the unsplit discretizations, especially when augmented with the Low-Ma Correction. Relevant comparisons of ILES based on the Euler and Navier–Stokes equations are presented. Overall, the unsplit* scheme proves instrumental in capturing the spatiotemporal development of the Taylor–Green Vortex and Rayleigh–Taylor flows and enabling their validation at prescribed Re on coarser grids.
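The central idea of the modified equation analysis mentioned in the abstract can be illustrated numerically: the truncation error of an upwind scheme acts as an effective dissipation. The sketch below (an illustrative toy, not the chapter's specific algorithms) advects a sine wave with first-order upwinding and checks that the measured amplitude decay matches the modified-equation prediction of an effective viscosity ν = a·Δx·(1 − c)/2, where c is the CFL number.

```python
import cmath
import math

# First-order upwind for u_t + a u_x = 0 behaves, per its modified equation,
# like advection plus diffusion with nu = a*dx*(1 - c)/2, c = a*dt/dx.
# Advect a sine wave one period and compare measured vs. predicted decay.
N, a, c = 200, 1.0, 0.5
dx = 1.0 / N
dt = c * dx / a
u = [math.sin(2 * math.pi * i * dx) for i in range(N)]

steps = round(1.0 / dt)  # one advection period on the unit domain
for _ in range(steps):
    # Periodic upwind update; u[-1] wraps around in Python.
    u = [u[i] - c * (u[i] - u[i - 1]) for i in range(N)]

# Amplitude of the k = 2*pi mode via its discrete Fourier coefficient.
k = 2 * math.pi
coef = sum(u[i] * cmath.exp(-1j * k * i * dx) for i in range(N)) / N
measured = 2 * abs(coef)

nu = a * dx * (1 - c) / 2            # effective viscosity from the modified eq.
predicted = math.exp(-nu * k**2 * 1.0)
print(measured, predicted)
```

The agreement (to a few parts in 10⁴ here) is why such schemes can be read as carrying a built-in subgrid dissipation model, which is the viewpoint the chapter formalizes.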


High fidelity coupling methods for blast response of thin shell structures

December 2022 · 39 Reads · Finite Elements in Analysis and Design

Jesse D. Thomas · [...] · Martin Heinstein

Computational simulation of structures subjected to blast loads requires integration of computational shock physics and structural response with finite deformations. The authors’ particular application of interest is blast loads on thin shell structures, which often deform concurrently with the shock or pressure wave evolution over the structure. This necessitates two-way coupled algorithms. We combine state-of-the-art shock physics, structural modeling with shell finite elements, and an immersed boundary method that accommodates arbitrarily thin structures. Building on successful techniques in the literature, we focus on accuracy and convergence rates of novel extensions to time step coupling approaches and immersed boundary treatments. The new techniques are developed to easily integrate with typical, industry-standard production analysis codes. The final recommended technique combines centered-difference structural time integration, predictor–corrector fluid time integration, level set surface tracking, and the Half Riemann Immersed Boundary method. Examples are given that show robust and accurate modeling of shock and pressure wave problems, and verification problems highlight good convergence properties for both fluid and fluid–structure applications.
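The two-way, predictor–corrector coupling pattern described above can be sketched on a toy problem. In this minimal sketch the "structure" is a single-degree-of-freedom oscillator advanced with a centered-difference (velocity-Verlet) update, and the "fluid" is replaced by a simple radiation-damping pressure law p = −c·v so the loop is self-contained; all names and parameter values are illustrative assumptions, not the paper's interfaces.

```python
# Predictor-corrector coupling step: predict the structural state, evaluate
# the fluid load at the predicted interface, then correct the velocity with
# the averaged load (this is the two-way exchange).
def coupled_step(x, v, dt, m=1.0, k=40.0, c=0.5):
    p_old = -c * v                      # fluid load at the current state
    a_old = (p_old - k * x) / m
    x_pred = x + dt * v + 0.5 * dt**2 * a_old   # structural predictor
    v_pred = v + dt * a_old
    p_new = -c * v_pred                 # fluid solve at the predicted state
    a_new = (p_new - k * x_pred) / m
    v_new = v + 0.5 * dt * (a_old + a_new)      # corrector: averaged loads
    return x_pred, v_new

x, v = 1.0, 0.0
for _ in range(2000):
    x, v = coupled_step(x, v, dt=0.01)
print(x, v)  # radiation damping drains the oscillator's energy
```

The point of the pattern is that the structure never advances a full step against a stale fluid load; in the paper this role is played by the predictor–corrector fluid integration coupled to the immersed-boundary load evaluation.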


Exascale applications: Skin in the game
  • Article
  • Full-text available

January 2020 · 416 Reads · 94 Citations

Philosophical Transactions A

As noted in Wikipedia, skin in the game refers to having ‘incurred risk by being involved in achieving a goal’, where ‘skin is a synecdoche for the person involved, and game is the metaphor for actions on the field of play under discussion’. For exascale applications under development in the US Department of Energy Exascale Computing Project, nothing could be more apt, with the skin being exascale applications and the game being delivering comprehensive science-based computational applications that effectively exploit exascale high-performance computing technologies to provide breakthrough modelling and simulation and data science solutions. These solutions will yield high-confidence insights and answers to the most critical problems and challenges for the USA in scientific discovery, national security, energy assurance, economic competitiveness and advanced healthcare. This article is part of a discussion meeting issue ‘Numerical algorithms for high-performance computational science’.



The Foundations of Verification in Modeling and Simulation

April 2019 · 32 Reads · 4 Citations

The practice of verification is grounded in mathematics, which highlights the fundamental nature of the practice. Models of reality are fundamentally mathematical, and verification assures the connection between the model as intended and the model as achieved in code. Code verification is the process by which the correctness of a computer code for simulation and modeling is established. This “proof” is defined by the collection of evidence that the numerical approximations are congruent with the model of the physical phenomena. The key metric in code verification is the order of accuracy of the approximation, which should match theoretical expectations. In contrast, solution verification is an aspect of uncertainty estimation associated with numerical error in simulations. Solution verification uses many of the same approaches as code verification, but its principal outcome is an estimate of the numerical error; the order of convergence is a secondary outcome. Together these two practices form an important part of the foundation of quality and credibility in modeling and simulation.
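Both practices in the abstract reduce to a concrete calculation. A minimal sketch, using an illustrative second-order central difference (not any particular production code): code verification compares the observed order of accuracy against the theoretical order of 2, and solution verification uses that order in a Richardson-style estimate of the numerical error on the finest grid.

```python
import math

# Code verification: observed order of accuracy from grid halvings should
# match theory. Solution verification: Richardson estimate of the error.
def dfdx_central(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

x0 = 1.0
exact = math.cos(x0)                      # exact derivative of sin at x0
hs = [0.1 / 2**k for k in range(4)]
errors = [abs(dfdx_central(math.sin, x0, h) - exact) for h in hs]

# Observed order from successive halvings: p = log2(e_coarse / e_fine).
orders = [math.log2(errors[k] / errors[k + 1]) for k in range(len(errors) - 1)]

# Richardson-style error estimate on the finest grid, using the observed p.
p = orders[-1]
f_fine = dfdx_central(math.sin, x0, hs[-1])
f_coarse = dfdx_central(math.sin, x0, hs[-2])
err_est = abs(f_fine - f_coarse) / (2**p - 1)
print(orders, err_est)
```

Here the observed orders sit near 2 and the Richardson estimate tracks the true fine-grid error closely, which is exactly the evidence the two verification practices collect: a matched order for code verification, and an error magnitude for solution verification.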




Uncertainty Quantification’s Role in Modeling and Simulation Planning, and Credibility Assessment Through the Predictive Capability Maturity Model

June 2017 · 42 Reads · 1 Citation

The importance of credible, trustworthy numerical simulations is obvious, especially when the results are used to make high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, the associated uncertainties, and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use, commonly done through a Phenomena Identification Ranking Table (PIRT). An assessment of the evidence basis supporting the ability to computationally simulate these physics can then be performed using frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation, and uncertainty quantification, which are described in detail in the following sections. The subject matter is introduced for general applications, but specifics are given for the failure prediction project. The first task in the verification and validation procedure is a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to perform such assessments in a consistent manner. Ideally, all stakeholders should be represented and contribute so that the credibility assessment is accurate. PIRTs and PCMMs are both described briefly below, and the resulting assessments for an example project are given.


Arbitrary Lagrangian-Eulerian methods for modeling high-speed compressible multimaterial flows

July 2016 · 710 Reads · 170 Citations · Journal of Computational Physics

This paper reviews recent developments in Arbitrary Lagrangian-Eulerian (ALE) methods for modeling high-speed compressible multimaterial flows in complex geometry on general polygonal meshes. We consider only the indirect ALE approach, which consists of three key stages: a Lagrangian stage, in which the solution and the computational mesh are updated; a rezoning stage, in which the nodes of the computational mesh are moved to improve grid quality; and a remapping stage, in which the Lagrangian solution is transferred to the rezoned mesh.
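The three stages of the indirect ALE cycle can be sketched in one dimension. In this minimal illustration the Lagrangian motion is prescribed rather than produced by a hydro solve, and the remap is first-order (piecewise-constant), chosen for brevity; the one property the sketch preserves exactly is the one that matters, conservation.

```python
import math

def remap_mass(nodes_src, mass_src, nodes_dst):
    """Conservative remap: each destination cell gathers mass from the
    source cells it overlaps, in proportion to the overlap length."""
    rho = [m / (nodes_src[j + 1] - nodes_src[j]) for j, m in enumerate(mass_src)]
    mass_dst = [0.0] * (len(nodes_dst) - 1)
    for i in range(len(nodes_dst) - 1):
        for j in range(len(nodes_src) - 1):
            lo = max(nodes_dst[i], nodes_src[j])
            hi = min(nodes_dst[i + 1], nodes_src[j + 1])
            if hi > lo:
                mass_dst[i] += rho[j] * (hi - lo)
    return mass_dst

n = 20
nodes = [i / n for i in range(n + 1)]
mass = [(nodes[j + 1] - nodes[j]) * (1.0 + math.sin(math.pi * nodes[j]))
        for j in range(n)]

# (1) Lagrangian stage: nodes ride the (prescribed) flow; cell masses ride along.
nodes_lag = [x + 0.02 * math.sin(2 * math.pi * x) for x in nodes]
# (2) Rezoning stage: restore mesh quality (here, return to a uniform mesh).
nodes_new = [i / n for i in range(n + 1)]
# (3) Remapping stage: transfer the Lagrangian solution to the rezoned mesh.
mass_new = remap_mass(nodes_lag, mass, nodes_new)

print(sum(mass), sum(mass_new))  # total mass is conserved by the remap
```

Production remap stages use higher-order reconstructions with limiters, but they share this overlap-based, conservative structure.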


Verification, Validation, and Uncertainty Quantification for Coarse Grained Simulation

June 2016 · 25 Reads · 4 Citations

Small-scale turbulent flow dynamics is traditionally viewed as universal and as enslaved to that of the larger scales. In coarse grained simulation (CGS), large energy-containing structures are resolved, smaller structures are spatially filtered out, and unresolved subgrid scale (SGS) effects are modeled. Coarse Grained Simulation and Turbulent Mixing reviews our understanding of CGS. Beginning with an introduction to the fundamental theory, the discussion then moves to the crucial challenges of predictability. Next, it addresses verification and validation, the primary means of assessing the accuracy and reliability of numerical simulation. The final part reports on progress in addressing difficult non-equilibrium applications of timely current interest involving variable-density turbulent mixing. The book will be of fundamental interest to graduate students, research scientists, and professionals involved in the design and analysis of complex turbulent flows.


Citations (69)


... However, a comparable methodology on how to approach verification in modeling and simulation can be observed: Conducting code verification first, embracing software quality assurance and numerical algorithm verification, followed by solution verification, focusing on the estimation of the numerical accuracy of discrete solutions compared to their mathematical model; cf. (Roy 2005; Rider 2019). ...

Reference:

Towards continuous simulation credibility assessment
The Foundations of Verification in Modeling and Simulation
  • Citing Chapter
  • April 2019

... This has included approximately 70 projects spanning programming models, software performance analysis tools, math libraries, data handling (massive I/O), and visualization. In addition, the ECP has supported the development of 24 diverse Application Development (AD) projects spanning a breadth of science and engineering applications (Alexander et al., 2020). The intent of the ECP is to have these applications, each taking advantage of the software technology developments, fully prepared to exploit the new exascale platforms coming online. ...

Exascale applications: Skin in the game
Philosophical Transactions A

... They were computed by simulating an expansion nozzle flow [144] and employing a mT model. Their correct prediction is fundamental to a good description of the shock wave/boundary layer interaction near the body [162,163]. Dealing with an oxygen mixture, chemistry is activated at a lower temperature and a mT approach might not be suitable to describe the non-equilibrium [152]. On the other hand, a StS approach represents a much more accurate alternative to predict the flow conditions at the outlet of the nozzle, namely the free stream conditions for the double-cone flow, in the presence of strong non-equilibrium phenomena. ...

Validation Assessment of Hypersonic Double-Cone Flow Simulations using Uncertainty Quantification, Sensitivity Analysis, and Validation Metrics
  • Citing Conference Paper
  • January 2019

... Such techniques have benefited the aerospace, automotive and biomedical industries. The challenges are numerical and physical, and several reviews have been written on specific industrial applications or areas of physics [29][30][31][32]. However, despite significant advances in the field, there is still a need to improve efficiency and accuracy. ...

Verification, Validation, and Uncertainty Quantification for Coarse Grained Simulation
  • Citing Chapter
  • June 2016

... The effectiveness of a simulation relies heavily on the quality of the model. At present, the evaluation of model quality mainly uses two metrics, namely fidelity [1][2][3] and credibility [4][5][6]. However, the two metrics mainly focus on the static status or performance of a model at a specific stage or condition, and cannot directly reflect the changes of the model in its use and management process. ...

Uncertainty Quantification’s Role in Modeling and Simulation Planning, and Credibility Assessment Through the Predictive Capability Maturity Model
  • Citing Chapter
  • June 2017

... In particular, the extrapolation of the simulation model and its uncertainty to real plant conditions that are not totally available in experiments is a difficult task. Some specific methods, such as the predictive capability maturity model (Rider et al., 2015), may be helpful, but are not yet mature and practical enough to be applied to nuclear power plant safety analyses using system thermal hydraulic codes. ...

Uncertainty Quantification’s Role in Modeling and Simulation Planning, and Credibility Assessment Through the Predictive Capability Maturity Model
  • Citing Chapter
  • January 2015

... Given the advantages and limitations inherent to both Lagrangian and Eulerian methods, Arbitrary Lagrangian Eulerian (ALE) based techniques have been developed to harness the strengths of each approach. The fundamental principle of the ALE formulation [6][7][8][9][10][11][12][13][14][15][16][17] involves utilising a referential (fixed) domain for motion description. This referential domain is different from the material domain (used in Lagrangian descriptions) and the spatial domain (used in Eulerian descriptions). ...

Arbitrary Lagrangian-Eulerian methods for modeling high-speed compressible multimaterial flows
  • Citing Article
  • July 2016

Journal of Computational Physics

... As presented by multiple publications [22,23] and demonstrated in ref. [3], the second-generation wavelets are capable of handling such irregularities. Wavelets have demonstrated their unique ability to identify features in time and space, among others, by revealing small scale properties of significant importance that have not been otherwise possible, such as the continuous wavelet methodology described by [25] and successfully demonstrated by [11,20]. More specifically, first-generation wavelets have been used for vorticity analysis in grid-based methods of fluid modeling where the sample spacing and the number of samples can be ensured to be regular and dyadic by the generation of the grid. ...

The gas curtain experimental technique and analysis methodologies

... The development of SGS models has been a subject of intense interest for decades, in particular in the 1990s [123,140]. It turns out, however, that the truncation error introduced by some numerical schemes is similar in form and magnitude to conventional SGS models [88]. Stable numerical schemes in fact often achieve stability by introducing truncation errors or stabilization terms that replicate the effect of the subgrid scales on the resolved scales [100,101]; which corresponds to dissipation in under-resolved turbulence simulations [82,189]. ...

A rationale for implicit LES
  • Citing Article
  • January 2007