A FLEXIBLE SOFTWARE APPROACH TO SIMULATE TWO-SCALE COUPLED PROBLEMS

The 8th European Congress on Computational Methods in Applied Sciences and Engineering
ECCOMAS Congress 2022
5–9 June 2022, Oslo, Norway

ISHAAN DESAI¹, CARINA BRINGEDAL² AND BENJAMIN UEKERMANN¹

¹ Institute for Parallel and Distributed Systems (IPVS)
University of Stuttgart
Universitätsstraße 38, 70569 Stuttgart
e-mail: {ishaan.desai,benjamin.uekermann}@ipvs.uni-stuttgart.de

² Institute for Modelling Hydraulic and Environmental Systems (IWS)
University of Stuttgart
Pfaffenwaldring 61, 70569 Stuttgart
e-mail: carina.bringedal@iws.uni-stuttgart.de
Key words: multiphysics, multiscale, research software, preCICE, porous media
Abstract. Many multiscale simulation problems require a many-to-one coupling between different scales. For such coupled problems, researchers oftentimes focus on the coupling methodology, but largely ignore software engineering and high-performance computing aspects. This can lead to inefficient use of hardware resources on the one hand, but also to inefficient use of human resources, as solutions to typical technical coupling problems are constantly reinvented.
This work proposes a flexible and application-agnostic software framework to couple independent simulation codes in a many-to-one fashion. To this end, we introduce a prototype of a new lightweight software component called the Micro Manager, which allows us to reuse the coupling library preCICE for two-scale coupled problems. We demonstrate the applicability of the framework with a two-scale coupled heat conduction problem.
1 INTRODUCTION
For many multiscale scenarios in simulation technology, the physics on different scales can
be viewed as separate problems that interact with each other. The simulation software for such
scenarios involves solving the physics on the individual scales and devising some form of coupling
between the scales. This work presents a software framework to couple simulations at two scales
in an application-agnostic way. Avoiding reinventing coupling strategies for each application is the main motivation for developing such a framework. Two-scale simulations encompass a broad range of techniques and approaches, and this work targets a specific class of two-scale simulations.
Most two-scale phenomena have a well-defined scale separation. The two scales are commonly
referred to as macro and micro scales. A two-scale multiphysics problem can be seen as a
macro-micro coupled problem. There are several application areas where such macro-micro
coupled simulations have already been done, for example, porous media [1, 2], dual-phase steel
simulation [3], computational mechanics [4] and biomechanics [5]. In each of these applications,
the coupling methodologies are mostly developed from scratch. These methodologies typically
involve efficient data transfer and communication between the scales, different coupling schemes,
and technical solutions for how to combine different programming languages. This work builds
on the functionality of the coupling library preCICE [6] to develop a software framework that
can facilitate application-agnostic macro-micro coupling.
The development of flexible macro-micro coupling software has been discussed previously from different perspectives. Groen et al. (2019) [7] state that the field of generic multiscale coupling software is still maturing. Even though software packages such as MUSCLE3 [8], MUI [9], or AMUSE [10] exist, they are all tailored to particular multiscale computing patterns. The macro-micro coupling we are addressing in this work falls into the heterogeneous multiscale computing pattern [11]. For this pattern, high-performance computing (HPC) software is still rare [11]. However, application-tailored solutions for massively parallel simulations already exist, for example Klawonn et al. (2020) [3] or Di Natale et al. (2019) [12].
The proposed software framework has two parts: the preCICE coupling library and a newly developed so-called Micro Manager¹. Partitioned black-box coupling between two or more simulation codes can already be done with preCICE. The Micro Manager controls a set of micro-scale simulations and couples them to a macro-scale simulation using preCICE. This way, we can reuse the sophisticated and efficient coupling solutions of preCICE.
The macro-micro coupling framework is explained in a step-by-step manner. Section 2 introduces the software components used in this work. Section 3 describes a two-scale heat conduction example problem, which is developed to showcase and test the applicability of the software framework. The macro-scale problem solves the heat equation using the conductivity and material amounts from micro-scale problems connected to each point of the macro problem. Section 4 shows results of the simulation runs of the model problem.
2 SOFTWARE
There are two software components in this work, namely preCICE and a newly developed
software package called the Micro Manager. The coupling between one macro simulation and
several micro simulations using preCICE and the Micro Manager is shown in Figure 1. This
section presents both the used and the developed software packages.
2.1 preCICE coupling library
preCICE facilitates coupling between two or more simulation codes to perform multiphysics
simulations. The coupling is done in a minimally-invasive black-box fashion. Through its application programming interface (API), preCICE steers the coupled codes and handles the coupling
numerics. preCICE does fully distributed point-to-point communication and offers functionality
such as data mapping methods and coupling schemes [6]. Typical examples of coupled scenarios
are fluid-structure interaction and conjugate heat transfer, but preCICE is not restricted to these
applications. preCICE is an open-source project² with extensive documentation³ and tutorial cases⁴.

¹ https://github.com/precice/micro-manager
² https://github.com/precice
³ https://precice.org/docs.html
⁴ https://github.com/precice/tutorials
Figure 1: A macro simulation is coupled to the Micro Manager which in turn controls a set of micro
simulations. The coupling is done via preCICE. The active and inactive micro simulations indicate that
the Micro Manager can run them adaptively.
2.2 Micro Manager
The Micro Manager controls all micro simulations and couples them to a macro simulation through preCICE. To this end, the micro code needs to be converted to a callable library with a specific API, shown in Listing 1 for the micro problem code of the model problem explained in Section 3. The functions initialize(...), solve(...), save_checkpoint(), and reload_checkpoint() are part of the Micro Manager API, which means that the functions need to have these particular names and signatures so that the Micro Manager can call them, pass them data, and use the returned data. More details about the API can be found in the Micro Manager README⁵.
The Micro Manager itself couples to the macro simulation using preCICE. The data transfer between the Micro Manager and each individual micro simulation is done in-memory. We present a first version of the Micro Manager, in which all micro simulations are run in each time step. The Micro Manager is configured with a JSON (JavaScript Object Notation) file. An example configuration is shown in Listing 2.
The configuration file has two components, coupling_params and simulation_params. In coupling_params, config_file_name is the name of the preCICE XML configuration file, macro_mesh_name is the name of the macro mesh as stated in the preCICE configuration file, and read_data_names and write_data_names are dictionaries with the names of the data as keys and the strings scalar or vector as values, depending on whether the data is scalar or vector. In simulation_params, the user needs to set the entity macro_domain_bounds, which holds the minimum and maximum limits of the macro domain along all axes.
⁵ https://github.com/precice/micro-manager#readme
class MicroSimulation:
    def __init__(self):
        self._dims = 2
        self._porosity = None
        self._conductivity = None
        self._checkpoint = None

    def initialize(self):
        self._porosity = 0
        self._conductivity = []

    def solve(self, temperature, dt):
        # Evolve the phase field and upscale porosity and conductivity
        self._porosity = solve_allen_cahn(temperature, dt)
        self._conductivity = solve_heat_eqn(self._porosity)

        return {"porosity": self._porosity,
                "conductivity_i": self._conductivity[0],
                "conductivity_j": self._conductivity[1]}

    def save_checkpoint(self):
        print("Saving state of micro problem")
        self._checkpoint = self._porosity

    def reload_checkpoint(self):
        print("Reverting to old state of micro problem")
        self._porosity = self._checkpoint

Listing 1: Example of micro simulation code as a callable Python class
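To illustrate the API contract of Listing 1, the following is a minimal sketch of how a manager-like driver could instantiate and call such a class. It is not the actual Micro Manager implementation, which additionally handles the preCICE communication and coupling logic; the module name micro_heat and the input values are assumptions.

from micro_heat import MicroSimulation  # assumed module containing Listing 1

# One micro simulation object per macro point (two points for illustration)
micro_sims = [MicroSimulation() for _ in range(2)]
for sim in micro_sims:
    sim.initialize()

dt = 0.01
macro_temperatures = [0.5, 0.6]  # temperatures read from the macro side

for sim, u in zip(micro_sims, macro_temperatures):
    sim.save_checkpoint()  # store the state before an implicit iteration
    output = sim.solve(u, dt)
    print(output["porosity"], output["conductivity_i"], output["conductivity_j"])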
{
    "micro_file_name": "micro_heat",
    "coupling_params": {
        "config_file_name": "precice-config.xml",
        "macro_mesh_name": "macro-mesh",
        "read_data_names": {"temperature": "scalar"},
        "write_data_names": {"porosity": "scalar", "conductivity_i": "vector",
                             "conductivity_j": "vector"}
    },
    "simulation_params": {
        "macro_domain_bounds": [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
    }
}

Listing 2: JSON configuration file of the Micro Manager
For a 2D simulation, the format is [xmin, xmax, ymin, ymax], and correspondingly, for a 3D simulation, it is [xmin, xmax, ymin, ymax, zmin, zmax]. Using these bounds, the Micro Manager extracts the coordinates of the points on the macro domain from preCICE.
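As a minimal sketch of how such a configuration could be parsed (the file name micro-manager-config.json is an assumption, not the actual Micro Manager code), the domain dimension follows from the number of bound values:

import json

# Read a Micro Manager configuration like the one in Listing 2
with open("micro-manager-config.json") as f:  # assumed file name
    config = json.load(f)

coupling = config["coupling_params"]
read_names = coupling["read_data_names"]     # e.g. {"temperature": "scalar"}
write_names = coupling["write_data_names"]   # data written back to the macro side

bounds = config["simulation_params"]["macro_domain_bounds"]
dims = len(bounds) // 2                      # 4 values -> 2D, 6 values -> 3D
print(f"{dims}D macro domain; reads {list(read_names)}, writes {list(write_names)}")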
Future versions of the Micro Manager will be capable of adaptively running micro simulations. Adaptive means classifying the micro simulations as active and inactive, running only the active ones, and using their output to generate the full output field of all micro simulations. Various strategies for selecting the adaptivity criteria have been proposed [13, 14] and will be implemented in the Micro Manager. The Micro Manager is already parallelized with the Message Passing Interface (MPI) for Python [15], and it is foreseen that the manager will have dynamic load-balancing strategies when adaptivity is used in parallelized scenarios.
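As one possible illustration of such a classification (a hypothetical similarity-based criterion, not necessarily the strategy of [13, 14]), micro simulations whose macro inputs are close to that of an already-active simulation could be marked inactive and reuse the output of the closest active one:

import numpy as np

# Hypothetical similarity-based adaptivity: a micro simulation is inactive
# if its macro input lies within `tol` of an active simulation's input.
def classify(macro_inputs, tol=1e-3):
    active, assigned_to = [], []
    for i, value in enumerate(macro_inputs):
        distances = [abs(value - macro_inputs[j]) for j in active]
        if active and min(distances) < tol:
            assigned_to.append(active[int(np.argmin(distances))])
        else:
            active.append(i)       # no similar simulation yet: solve this one
            assigned_to.append(i)  # active simulations provide their own output
    return active, assigned_to

temperatures = [0.50, 0.50001, 0.62, 0.6200004, 0.75]
active, assigned_to = classify(temperatures)
print(active)       # [0, 2, 4]: simulations that are actually solved
print(assigned_to)  # [0, 0, 2, 2, 4]: provider of each simulation's output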
3 MODEL PROBLEM
The new macro-micro software framework is tested using a two-scale heat conduction example, motivated by Bastidas et al. (2021) [13]. We consider a macro domain with an underlying micro structure of two materials. The material properties are known, but the micro structure evolves in an a priori unknown way. The existence of such an evolving micro structure means that the constitutive properties necessary on the macro scale cannot be directly estimated or taken from literature. At each computation point of the macro problem, a micro problem is solved and the constituent quantities are upscaled. Such a macro-micro problem setting is possible for a system with clear scale separation. That means, if the micro length scale is l and the macro length scale is L, we assume that l/L ≪ 1. The macro domain is denoted by X and the micro domain is denoted by Y.
3.1 Problem on macro scale
On the macro scale, the heat equation is solved on a two-dimensional rectangular domain, see Figure 2. Let K be the potentially anisotropic thermal conductivity tensor, ρ the molar density, and c the specific heat. We know a priori that a micro structure with two materials g (grain) and s (solid) exists, and hence the macro problem needs to be suitably modified to incorporate this. We need to use the density and specific heat of both materials. We define a ratio Φ(x), which is the amount of material s relative to the total material in a micro domain, and which we henceforth call porosity. We use Φ to write the heat equation as

$$\partial_t \left( \Phi \rho_s c_s u + (1 - \Phi) \rho_g c_g u \right) = \nabla \cdot \left( K \nabla u \right) \quad \text{in } X, \qquad (1)$$

where u is the temperature to be solved for. The quantities K and Φ will be acquired by solving micro problems.
3.2 Problem on micro scale
We assume that a micro structure exists in the macro domain, as shown in Figure 2. We
choose the domain of the micro simulation as a unit square as it is only a representative simula-
tion and not an actual zoomed-in domain of the macro scale. The micro domain is made up of a
material having conductivity kgwhich is embedded in another material having conductivity ks.
For simplicity reasons, the material gis chosen to have a circular shape. The micro simulation
has periodic boundary conditions on all boundaries. The grain geometry is mathematically rep-
resented by introducing a phase-field variable φwhich takes the value 0 in the grain domain and
the value of 1 outside it. Using this phase-field variable, the heat equation in the micro domain
Figure 2: Macro domain X with boundary conditions and a representative micro domain Y with a heterogeneous grain structure.
Using this phase-field variable, the heat equation in the micro domain can be written as

$$\partial_t \left( \rho_s c_s \phi u + \rho_g c_g (1 - \phi) u \right) = \nabla \cdot \left( (k_s \phi + k_g (1 - \phi)) \nabla u \right) \quad \text{in } Y = [0,1]^2. \qquad (2)$$
This equation simplifies to the equation of the corresponding material depending on whether the value of φ is closer to 0 or 1. The phase-field approach is used to model the heat conduction through the two material domains. The macro problem is already quasi-non-dimensional, and more details on the formulation of the equations can be found in Bringedal et al. (2020) [1]. We solve the following equation:

$$\nabla \cdot \left( (k_g (1 - \phi) + k_s \phi)(\mathbf{e}_j + \nabla \psi_j) \right) = 0. \qquad (3)$$
Eq. (3) is known as the cell problem for $\psi_j$, and it appears from two-scale homogenization of Eq. (2). For more details on homogenization of such problems, we refer to Bringedal et al. (2020) [1]. To guarantee uniqueness, we constrain the weights in the following way:

$$\int_Y \psi_j \, \mathrm{d}y = 0. \qquad (4)$$
The components of the effective upscaled conductivity matrix K are calculated from the weights as

$$K_{ij} = \int_Y (k_g (1 - \phi) + k_s \phi)(\delta_{ij} + \partial_{y_i} \psi_j) \, \mathrm{d}y. \qquad (5)$$
The initial micro structure is set using an analytical representation of the phase field φ,

$$\phi(y_1, y_2) = \frac{1}{1 + \exp\left( -\frac{4}{\lambda} \left( \sqrt{(y_1 - y_{0,1})^2 + (y_2 - y_{0,2})^2} - r_0 \right) \right)}, \qquad (6)$$

where $(y_1, y_2)$ are the micro coordinates, $(y_{0,1}, y_{0,2})$ is the center of the grain, $r_0$ is the initial radius of the grain, and λ > 0 is related to the width of the transition layer between the two
materials. To calculate the evolution of the phase field, we use the Allen-Cahn equation with a reaction rate f related to the macro-scale temperature in the following way:

$$\lambda^2 \partial_t \phi + \gamma P'(\phi) = \gamma \lambda^2 \nabla^2 \phi - 4 \lambda \phi (1 - \phi) f(u), \qquad (7)$$

where $P(\phi) = 8\phi^2(1-\phi)^2$ is the double-well potential, γ is the diffusion coefficient, and f(u) is a reaction rate, which is formulated as

$$f(u) = k_u \left( \frac{u^2}{u_{eq}^2} - 1 \right), \qquad (8)$$

where $k_u$ is a constant deciding the speed of the grain expansion or contraction and $u_{eq}$ is a chosen equilibrium temperature. This reaction rate term is artificially constructed to couple the micro problem to a macro entity, which in this case is the macro temperature u. The porosity Φ is calculated from the phase field in the following way:

$$\Phi = \int_Y \phi \, \mathrm{d}y. \qquad (9)$$
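To make Eqs. (6) and (9) concrete, the following is a minimal numpy sketch, not the Nutils implementation used in this work; the grid resolution and the value of λ are assumed. It evaluates the initial phase field on a uniform grid over Y and approximates the porosity by midpoint quadrature:

import numpy as np

# Evaluate Eq. (6) on an n-by-n midpoint grid over Y = [0, 1]^2
n = 200                              # grid resolution (assumed)
h = 1.0 / n
y1, y2 = np.meshgrid((np.arange(n) + 0.5) * h, (np.arange(n) + 0.5) * h)

y0, r0, lam = (0.5, 0.5), 0.25, 0.1  # grain center and radius; lambda assumed
dist = np.sqrt((y1 - y0[0]) ** 2 + (y2 - y0[1]) ** 2)
phi = 1.0 / (1.0 + np.exp(-4.0 / lam * (dist - r0)))  # phi = 0 inside the grain

# Eq. (9): porosity as the integral of phi over Y (midpoint rule)
porosity = phi.sum() * h * h
print(porosity)  # close to 1 - pi * r0^2 = 0.804 for a thin transition layer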
3.3 Solving the macro-micro problem
Both the macro and micro problems are solved using the finite element library Nutils [16]. The macro and micro codes are publicly available⁶. Initially, the macro field u is set to 0.5. Boundary conditions for the macro domain are as shown in Figure 2. There is a point Dirichlet boundary condition at the lower left corner, and otherwise the macro domain has zero-flux boundary conditions. All micro simulations have an initial grain of material g of radius $r_0 = 0.25$. The material conductivity values are chosen as $k_g = 0$ and $k_s = 1.0$. For simplicity, ρ and c on both scales are chosen as 1.0. On the micro scale, γ = 0.01 and $u_{eq} = 0.5$.
Figure 3: Grain micro structure represented by a phase field in the micro problem. Adaptive mesh refinement is used to accurately resolve the phase field at the diffuse interface layer. The hanging nodes seen here are handled in Nutils using a hierarchical basis.
The phase field over the material transition layer needs to be resolved accurately; the transition layer should have a width of at least three mesh points. The grain is represented by a thin transition layer, which means that the mesh needs to be very fine.

⁶ Coupled heat conduction code: https://github.com/IshaanDesai/coupled-heat-conduction
Figure 4: Macro temperature u (background color) and grains (overlay circles, scaled by size) at time t = 0.25. As expected, the lower temperature in the lower left corner results in smaller grain sizes.
To reduce the computational cost of each micro problem, adaptive mesh refinement is used. The mesh is refined where 0.1 < φ < 0.9, recursively up to two levels. A refining and coarsening algorithm similar to Bastidas et al. (2021) [13] is employed to accurately resolve an evolving grain, as shown in Figure 3.
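As a generic sketch of the marking step of such a refinement criterion (not the hierarchical refinement mechanism of Nutils itself), cells whose phase-field values lie in the diffuse-interface band are selected for refinement:

import numpy as np

# Mark cells for refinement where the cell-averaged phase field lies in
# the band 0.1 < phi < 0.9 (the band bounds used in Section 3.3).
def cells_to_refine(phi_cells, lower=0.1, upper=0.9):
    phi_cells = np.asarray(phi_cells)
    return np.flatnonzero((phi_cells > lower) & (phi_cells < upper))

phi_cells = [0.0, 0.05, 0.4, 0.85, 0.97, 1.0]  # example cell-averaged values
print(cells_to_refine(phi_cells))              # -> [2 3]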
4 RESULTS
The macro problem is run on a single processor and the Micro Manager is run on 48 processors using MPI parallelization. Using 48 processors to run the Micro Manager leads to approximately 2 or 3 micro simulations per processor. The simulation is run until t = 0.25 with a time step of 0.01. A serial implicit coupling scheme is used along with an iterative quasi-Newton acceleration scheme. We choose this acceleration scheme as it gave efficient and stable results.
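As a minimal sketch of such a static distribution (the total number of macro points is an assumed value, and the actual partitioning scheme of the Micro Manager may differ), macro points could be assigned evenly to the MPI ranks at startup:

from mpi4py import MPI
import numpy as np

# Statically assign a contiguous chunk of macro points to each MPI rank
comm = MPI.COMM_WORLD
n_macro_points = 120  # assumed; with 48 ranks this gives 2 or 3 points per rank
local_ids = np.array_split(np.arange(n_macro_points), comm.size)[comm.rank]
print(f"Rank {comm.rank} controls micro simulations {list(local_ids)}")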
From Figure 4, we see that as the value of u decreases, the grain in the micro problem contracts. This process is similar to the dissolution process considered by Bastidas et al. (2021) [13]. The grain is chosen to be of a lower conductivity than the material around it; hence, a shrinking grain leads to a corresponding increase in the upscaled effective conductivity. Figure 5 shows that the porosity on the macro scale increases as the grain contracts. The results are preliminary and have not been validated against a benchmark case.
5 CONCLUSIONS AND OUTLOOK
We presented a software framework for generic two-scale coupled problems. The main focus is to show a preliminary working version of the Micro Manager with a two-scale coupled problem. The model problem is tightly coupled. To get a stable simulation, we used an implicit coupling scheme from preCICE. preCICE offers several variants of acceleration schemes, which were used to tune the implicit coupling to obtain a stable and accurate solution. The results show that the Micro Manager works in parallel and is capable of bi-directional implicit coupling.
Figure 5: Effective conductivity K and porosity Φ on the macro scale at time t = 0.25.
Despite the heterogeneity of the model problem, we observe that several micro simulations are similar to each other. In the future, having adaptive initialization of micro problems would greatly reduce the number of micro problems which need to be solved in each time step [13]. The Micro Manager partitions the macro domain at the start of the simulation; hence, adaptivity will lead to a load imbalance at some point. The Micro Manager will need dynamic load balancing to be a scalable software. Figure 4 shows that, due to the variation of grain sizes in space and time, there is a considerable variation in the mesh size of the micro problems. This further highlights the need for a dynamic load-balancing capability in the Micro Manager. To showcase the application-agnostic nature of our approach, we plan to apply it also to other two-scale coupled problems, such as porous media with flow, human muscle models, and human liver models.
6 ACKNOWLEDGMENTS
We are funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy – EXC 2075 – 390740016. We acknowledge the support by the Stuttgart Center for Simulation Science (SimTech).
REFERENCES
[1] C. Bringedal, L. von Wolff, I. S. Pop, Phase field modeling of precipitation and dissolution
processes in porous media: Upscaling and numerical experiments, Multiscale Modeling &
Simulation 18 (2) (2020) 1076–1112. doi:10.1137/19M1239003.
[2] C. Bringedal, I. Berre, I. S. Pop, F. A. Radu, Upscaling of non-isothermal reactive porous media flow with changing porosity, Transport in Porous Media 114 (2) (2016) 371–393. doi:10.1007/s11242-015-0530-9.
[3] A. Klawonn, M. Lanser, M. Uran, O. Rheinbach, S. Köhler, J. Schröder, L. Scheunemann, D. Brands, D. Balzani, A. R. Gandhi, G. Wellein, M. Wittmann, O. Schenk, R. Janalík, EXASTEEL: Towards a Virtual Laboratory for the Multiscale Simulation of Dual-Phase Steel Using High-Performance Computing, 2020, pp. 351–404. doi:10.1007/978-3-030-47956-5_13.
[4] F. Fritzen, M. Fernández, F. Larsson, On-the-fly adaptivity for nonlinear twoscale simulations using artificial neural networks and reduced order modeling, Frontiers in Materials 6 (2019). doi:10.3389/fmats.2019.00075.
[5] L. Lambers, M. Suditsch, A. Wagner, T. Ricken, A multiscale and multiphase model of
function-perfusion growth processes in the human liver, PAMM 20 (1) (2021) e202000290.
doi:10.1002/pamm.202000290.
[6] G. Chourdakis, K. Davis, B. Rodenberg, M. Schulte, F. Simonis, B. Uekermann, et al.,
preCICE v2: A sustainable and user-friendly coupling library [version 1; peer review: 2
approved], Open Research Europe 2 (51) (2022). doi:10.12688/openreseurope.14445.1.
[7] D. Groen, J. Knap, P. Neumann, D. Suleimenova, L. Veen, K. Leiter, Mastering the scales:
a survey on the benefits of multiscale computing software, Philosophical Transactions of
the Royal Society A: Mathematical, Physical and Engineering Sciences 377 (2142) (2019)
20180147. doi:10.1098/rsta.2018.0147.
[8] L. E. Veen, A. G. Hoekstra, Easing multiscale model design and coupling with MUSCLE 3, in: V. V. Krzhizhanovskaya, G. Závodszky, M. H. Lees, J. J. Dongarra, P. M. A. Sloot, S. Brissos, J. Teixeira (Eds.), Computational Science – ICCS 2020, Springer International Publishing, Cham, 2020, pp. 425–438. doi:10.1007/978-3-030-50433-5_33.
[9] Y.-H. Tang, S. Kudo, X. Bian, Z. Li, G. E. Karniadakis, Multiscale universal interface: A
concurrent framework for coupling heterogeneous solvers, Journal of Computational Physics
297 (2015) 13–31. doi:10.1016/j.jcp.2015.05.004.
[10] F. I. Pelupessy, A. van Elteren, N. de Vries, S. L. W. McMillan, N. Drost, S. F. Portegies Zwart, The astrophysical multipurpose software environment, A&A 557 (2013) A84. doi:10.1051/0004-6361/201321252.
[11] S. Alowayyed, D. Groen, P. V. Coveney, A. G. Hoekstra, Multiscale computing in the exascale era, Journal of Computational Science 22 (2017) 15–25. doi:10.1016/j.jocs.2017.07.004.
[12] F. Di Natale, H. Bhatia, et al., A massively parallel infrastructure for adaptive multiscale simulations: Modeling RAS initiation pathway for cancer, in: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, SC '19, Association for Computing Machinery, New York, NY, USA, 2019. doi:10.1145/3295500.3356197.
[13] M. Bastidas, C. Bringedal, I. S. Pop, A two-scale iterative scheme for a phase-field model
for precipitation and dissolution in porous media, Applied Mathematics and Computation
396 (2021) 125933. doi:10.1016/j.amc.2020.125933.
[14] M. Redeker, C. Eck, A fast and accurate adaptive solution strategy for two-scale models
with continuous inter-scale dependencies, Journal of Computational Physics 240 (2013)
268–283. doi:10.1016/j.jcp.2012.12.025.
[15] L. Dalcin, Y.-L. L. Fang, mpi4py: Status update after 12 years of development, Computing
in Science and Engineering 23 (4) (2021) 47–54. doi:10.1109/MCSE.2021.3083216.
[16] G. van Zwieten, J. van Zwieten, W. Hoitinga, Nutils v7.0 (Jan. 2022). doi:10.5281/zenodo.6006701.