SoftwareX 9 (2019) 68–72
EOF-Library: Open-source Elmer FEM and OpenFOAM coupler for
electromagnetics and fluid dynamics
Juris Vencels a,∗, Peter Råback b, Vadims Geža a

a Laboratory for Mathematical Modelling of Environmental and Technological Processes, University of Latvia, Riga, Latvia
b CSC - IT Center for Science Ltd, Espoo, Finland

∗ Corresponding author. E-mail address: vencels@eof-library.com (J. Vencels). URL: https://eof-library.com/ (J. Vencels).
Article info
Article history:
Received 28 September 2018
Received in revised form 8 January 2019
Accepted 9 January 2019
Keywords:
Elmer
FEM
OpenFOAM
FVM
CFD
MPI
Abstract
EOF-Library is software that couples the Elmer and OpenFOAM simulation packages. It enables efficient internal field interpolation and communication between the finite element and finite volume frameworks. The coupling of the two packages is based on the Message Passing Interface, which results in low latency, high data bandwidth and parallel scalability. Potential applications include magnetohydrodynamics, convective cooling of electrical devices, industrial plasma physics and microwave heating. In this work we introduce the software and perform interpolation accuracy and parallel scaling tests by sending a known scalar distribution between the two codes.
© 2019 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license
(http://creativecommons.org/licenses/by/4.0/).
Code metadata

Current code version: 0.1.0
Permanent link to code/repository used for this code version: https://github.com/ElsevierSoftwareX/SOFTX_2018_185
Legal code licence: GPL v3
Code versioning system used: git
Software code languages, tools, and services used: OpenFOAM.org (C++14), Elmer (Fortran 2008), MPI, CMake
Compilation requirements, operating environments & dependencies: Linux or Docker (with Linux image)
Link to developer documentation/manual: https://eof-library.com/learning-zone
Support email for questions: vencels@eof-library.com
1. Motivation and significance

Most multiphysics problems on complex geometries are solved using the finite element method (FEM) or the finite volume method (FVM). The availability of high-quality FEM and FVM solvers, combined with pre- and post-processing tools, makes them very popular. In principle, one could model multiphysics problems within a single framework using either FEM or FVM. In practice, FEM dominates in structural mechanics and electromagnetics, while FVM is preferred for computational fluid dynamics (CFD). FEM offers the freedom to choose the order of the basis functions,
and for electromagnetics, FEM can make use of curl-conforming edge elements, which allow the tangential component of the vector potential to be continuous while the normal component may jump. On the other hand, FVM is easier to implement for CFD. In advection-dominated transient fluid flows, especially highly turbulent ones, FVM is more robust and is therefore used more frequently than FEM.
The initial application of EOF-Library was magnetohydrodynamics (MHD) with free-surface modelling [1] in various industrial applications. MHD problems involve coupled electromagnetics and fluid dynamics. Commercial simulation packages have limited MHD modelling capabilities for metallurgy; however, there has been significant progress in MHD modelling using file-based coupling of the commercial Ansys software [2]. The file-based approach runs into performance issues due to the file exchange overhead, while closed-source software is in some cases slowing
the development of new physical models. We chose instead to couple two open-source packages and to use the Message Passing Interface (MPI) for two-way coupling. Among the many FEM software packages, Elmer [3] was chosen for electromagnetics and OpenFOAM [4] for fluid dynamics.
We chose Elmer due to its parallelization structure, which is well suited for MPI coupling, its implemented 2D and 3D electromagnetic solvers, its modularity and its strong user base. It is a multiphysics FEM package mainly developed by CSC—IT Center for Science in Finland, written in modern Fortran and licensed under LGPL. The main application areas of this software include structural mechanics, acoustics and electromagnetics [3]; its patched version, Elmer/Ice [5], is used for ice sheet modelling of glaciers. Unlike many other FEM packages, it comes with fully parallelized matrix assembly, built-in iterative solvers and preconditioners. Elmer has a Navier–Stokes solver for fluid dynamics, which has been used for modelling laminar flows such as blood flow in arteries [6]. We found, however, that it is not well suited for the highly turbulent MHD flows on complex geometries encountered in industrial applications.
For fluid dynamics, OpenFOAM was a clear choice because of its available models for multiphase and highly turbulent flows. It is the most popular open-source FVM software for fluid dynamics, with RANS and LES turbulence models, the volume of fluid (VOF) method for modelling free-surface flows [7] and a solver for rarefied gas flows using the direct simulation Monte Carlo (DSMC) method [8]. It also has electrostatic, magnetic and MHD solvers, but these are only suitable for single-region simulations on simple geometries. OpenFOAM has been used for MHD modelling, for example of liquid metal batteries [9], and for industry-focused problems such as arc welding [10], but in all cases the applications were limited to direct currents or stationary magnetic fields. Many industrial applications involve alternating currents and magnetic fields that induce eddy currents in electrical conductors. Complex vector-potential notation is preferred for time-harmonic eddy currents. Recently, it was shown that 3D electromagnetic eddy-current problems can be solved using foam-extend software [11], an extended community version of OpenFOAM.
The idea of coupling FEM and FVM software is not new. The majority of such coupling solutions focus on fluid–structure interaction (FSI), where fluid flow causes deformation of solids, which in turn affects the fluid flow [12]. Applications related to FSI are found in medicine (blood flow in elastic arteries and heart valves), aerospace (wing deformation, aerodynamics and flutter), mechanics (impact on structures) and acoustics. Two representative couplers among the many available solutions are the open-source library preCICE [13], which is being actively developed, and the proprietary coupler MpCCI [14]. Both couplers have options for MPI-based data communication; however, they are primarily focused on bidirectional surface coupling. In this context we want to emphasize that EOF-Library, with its current focus on electromagnetics and highly efficient volumetric data coupling, belongs to another, less common type of coupler.
2. Software description
The EOF-Library is a collection of Elmer and OpenFOAM libraries that are dynamically linked at runtime. The coupling is flexible and allows any number of internal volumetric scalar and vector fields to be sent in either or both directions. It provides a framework for automated interpolation and communication, and uses the native Elmer and OpenFOAM interpolation routines.
The user is responsible for the implementation of physics and
therefore needs to set up Elmer and OpenFOAM cases with all
required initial and boundary conditions. The user is free to choose
any Elmer and OpenFOAM solvers.
2.1. Software architecture
The EOF-Library is written in modern Fortran and C++ following the style of Elmer and OpenFOAM. Its core consists of roughly 1500 lines of code and extensively reuses functionality from the underlying simulation packages. The coupler uses MPI and a peer-to-peer strategy for data communication between any two processes.

There are three major OpenFOAM development branches: ESI-OpenCFD (OpenFOAM.com), the Foundation version maintained with CFD Direct (OpenFOAM.org) and foam-extend (foam-extend.org). EOF-Library is supported for the Foundation (OpenFOAM.org) version, but with minimal code modifications it can be linked to all of them.
Peer-to-peer communication allows the MPI processes of one software package to communicate directly with the MPI processes of another, without any central entity (server). The communication pattern between processes is established during the simulation initialization phase. Since Elmer and OpenFOAM both use MPI for their internal parallelization, using the same communication standard is the best choice in terms of convenience, parallel scaling and performance. On the downside, it requires that the same MPI implementation be used for all involved software.
Multiple programme, multiple data (MPMD) launch mode allows multiple independent MPI-enabled programmes to be run simultaneously while sharing the same parallelization layer. For example, to run two programmes simultaneously, execute the following command in the terminal:
$ mpirun -n [nCores1] [program1] : -n [nCores2] [program2]
where [nCores1] and [nCores2] are the number of processes
used for programmes [program1] and [program2], respectively.
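As a concrete illustration, a coupled run placing four MPI ranks on each side could look like this (ElmerSolver_mpi is Elmer's parallel executable; coupledFoam is a hypothetical name standing in for whichever EOF-enabled OpenFOAM solver the case uses):

$ mpirun -n 4 ElmerSolver_mpi : -n 4 coupledFoam -parallel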
When two programmes are started in MPMD mode, they share the same global MPI_COMM_WORLD communicator. MPI communicator splitting allows safe separation of the Elmer, OpenFOAM and coupler parallelization: the global communicator MPI_COMM_WORLD is used for coupling purposes, while OpenFOAM uses MPI_COMM_FOAM and Elmer uses ELMER_COMM_WORLD for their internal parallelization. In this way both packages can run in parallel without disturbing each other while maintaining the ability to communicate through the common communicator.
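A minimal sketch of this splitting pattern in MPI follows (an illustration of the technique; the colour convention and variable names are ours, not the EOF-Library source):

// commsplit_sketch.cpp -- MPMD communicator split, illustrative only.
#include <mpi.h>
#include <cstdio>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int worldRank;
    MPI_Comm_rank(MPI_COMM_WORLD, &worldRank);

    // Each executable would pick a fixed colour; as a stand-in, this sketch
    // assigns colour 0 ("Elmer") to even ranks and 1 ("OpenFOAM") to odd ranks.
    const int colour = worldRank % 2;

    // Every programme obtains its own communicator for internal
    // parallelization, while MPI_COMM_WORLD stays available for coupling.
    MPI_Comm localComm;
    MPI_Comm_split(MPI_COMM_WORLD, colour, worldRank, &localComm);

    int localRank;
    MPI_Comm_rank(localComm, &localRank);
    std::printf("world rank %d -> colour %d, local rank %d\n",
                worldRank, colour, localRank);

    MPI_Comm_free(&localComm);
    MPI_Finalize();
    return 0;
}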
The EOF-Library uses an explicit coupling scheme. Elmer and OpenFOAM perform consecutive computations; therefore only half of the requested MPI processes are computing at any given time, while the remaining half is idle. In our coupler implementation, we force idle MPI processes to make short sleep calls to the system while the other software is busy. This prevents performance degradation due to overly frequent MPI_Test calls. To achieve maximum performance, the user should over-subscribe (request more MPI ranks than there are physical CPU cores) by running both Elmer and OpenFOAM on the maximum number of physical cores: on a node with N physical cores, 2N MPI processes should be requested.
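The idle-waiting logic can be sketched as follows (a generic pattern for a pending non-blocking receive; not the actual EOF-Library routine):

// politewait_sketch.cpp -- sleep between MPI_Test calls instead of spinning.
#include <mpi.h>
#include <chrono>
#include <thread>

// Blocks until the given request completes, yielding the core for ~1 ms
// between polls so an idle rank does not consume a full CPU core.
void politeWait(MPI_Request *request) {
    int done = 0;
    while (!done) {
        MPI_Test(request, &done, MPI_STATUS_IGNORE);
        if (!done)
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}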
2.2. Software functionality
A schematic for the initialization process and runtime is shown
in Fig. 1.
For the OpenFOAM-to-Elmer (O2E) coupling, Elmer sends point coordinates to OpenFOAM. Depending on the selected option, these coordinates belong to element centres (-elem), integration points (-ip) or discontinuous Galerkin (-dg) points. Note that direct interpolation to the finite element nodes is not ideal, since the data must then be interpolated to the Gaussian quadrature points when used in the finite element assembly. Interpolation to nodes would also cause issues at the boundaries: some nodes could lie outside the OpenFOAM computational domain, and interpolation of discontinuous data would become problematic.
Fig. 1. Schematic of the processes taking place during initialization and runtime. The colour scheme corresponds to Elmer (blue), EOF-Library (green) and OpenFOAM (orange). "E2O" is Elmer-to-OpenFOAM coupling and "O2E" is OpenFOAM-to-Elmer coupling.

OpenFOAM searches for the coordinates of the points provided by Elmer within its mesh and constructs arrays of point–cell pairs. Then OpenFOAM informs Elmer about the search results and establishes peer-to-peer connections between individual MPI processes. During runtime, OpenFOAM interpolates values using interpolation algorithms specified by the user, such as "cell" or "cellPoint". The cell interpolation relies on a simple nearest-cell method. The cellPoint method is more accurate and relies on breaking cells into tetrahedra using the cell centre points; the algorithm then finds the tetrahedron that encloses the point and uses inverse distance weighting to perform the interpolation.

Typically, numerical integration in FEM is carried out using Gaussian quadratures, which allow accurate integration of polynomials of known degree. Therefore, the ideal interpolation for a finite element assembly is via the Gaussian integration points (-ip). As an example, for a linear tetrahedron Elmer requires one Gaussian integration point, whereas for quadratic hexahedra it needs 3³ = 27 integration points. For smooth data, interpolation to the Gaussian quadratures may hence become computationally excessive, and a more economical alternative is interpolation to the cell centres (-elem). Elmer also has a third option (-dg), which involves interpolation to the corner nodes of the finite element. It uses a heuristic where the interpolated element is shrunk by a factor 1/√3 during the interpolation; on the 1D reference element [−1, 1], scaling the corner nodes ±1 by 1/√3 places them exactly at the two-point Gauss quadrature nodes ±1/√3. For many elements this choice coincides with the 2nd-order Gaussian quadrature and may often provide a good compromise between speed and accuracy.

For the Elmer-to-OpenFOAM (E2O) coupling, OpenFOAM sends all its cell centre coordinates to Elmer. Elmer searches for these coordinates within its mesh and builds a sparse interpolation matrix consisting of weights for every point–node pair. Finally, all MPI processes running Elmer inform all MPI processes running OpenFOAM of the search results and establish peer-to-peer contacts. During runtime, Elmer applies the stored interpolation matrix to the variables that need to be communicated. Compared to initialization, this is a fast operation.

The coupler provides the means for sending status signals from OpenFOAM to Elmer. In its current implementation OpenFOAM can request Elmer to recompute fields or to terminate the simulation, but other functions, such as support for dynamic meshes, can rely on this mutual exchange of information. Another candidate for improvement is the coupling scheme: the coupler uses an explicitly staggered scheme, which, in theory, can cause numerical instabilities in strongly coupled, highly non-linear cases.
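As an illustration of the E2O runtime step, the sketch below applies a precomputed sparse interpolation matrix, stored here in compressed sparse row (CSR) form, to a nodal field (the data layout and names are our assumption, not EOF-Library's internal representation):

// interp_sketch.cpp -- applying a precomputed interpolation matrix.
#include <vector>

// CSR-style storage: row p lists the (node index, weight) pairs of the
// Elmer nodes that contribute to receiving point p.
struct InterpolationMatrix {
    std::vector<int>    rowStart; // size nPoints + 1
    std::vector<int>    nodeIdx;  // contributing node indices
    std::vector<double> weight;   // interpolation weights
};

// Interpolates a nodal field to the receiving points
// (e.g. OpenFOAM cell centres); a plain sparse matrix-vector product.
std::vector<double> applyInterpolation(const InterpolationMatrix &M,
                                       const std::vector<double> &nodal) {
    const int nPoints = static_cast<int>(M.rowStart.size()) - 1;
    std::vector<double> result(nPoints, 0.0);
    for (int p = 0; p < nPoints; ++p)
        for (int k = M.rowStart[p]; k < M.rowStart[p + 1]; ++k)
            result[p] += M.weight[k] * nodal[M.nodeIdx[k]];
    return result;
}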
3. Illustrative examples

The EOF-Library has been used in a 3D simulation of levitating liquid metal in an alternating electromagnetic field [15] and in a simulation of surface waves generated in liquid metal by a low-frequency electromagnetic field [16]; illustrations of both problems are provided in Fig. 2. In both cases, Elmer was used to solve the time-harmonic electromagnetic problem, and OpenFOAM was used for two-phase modelling of the liquid metal using the VOF method.

Fig. 2. Simulation of 3D MHD with a free surface for electromagnetic levitation melting (left) and surface wave generation in liquid metal using a low-frequency electromagnetic field (right). Images are generated from unpublished 3D results.

In this paper we focus on synthetic benchmark tests measuring the precision of the interpolation algorithms, parallel scaling and performance.
3.1. Interpolation test

A unit cube geometry was used with three tetrahedral meshes of different maximum edge sizes: 0.1 (4.7k tets), 0.05 (32k tets) and 0.025 (245k tets). To make the Elmer and OpenFOAM meshes different while keeping the same number of tetrahedra, we rotated one mesh around the x-axis; the effect of the rotation is shown in Fig. 3. Elmer and OpenFOAM initialize the scalar field 2 sin[2πx] cos[2πy], illustrated in Fig. 4, and send it to each other. The received field is compared against the reference distribution and the relative error is computed as an L2 norm (squared value).
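Spelled out, and reading "L2 norm (squared value)" as the standard discrete relative measure (the paper does not print the formula, so this is our interpretation), the error for a received field f against the reference f^ref is

\epsilon^2 = \frac{\sum_i \left(f_i - f_i^{\mathrm{ref}}\right)^2}{\sum_i \left(f_i^{\mathrm{ref}}\right)^2}

where the sum runs over the receiving points.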
For the O2E interpolation, we tested two OpenFOAM methods, cell and cellPoint, and three Elmer field types: elemental (-elem) with values stored in element centres, integration points (-ip) and discontinuous Galerkin nodes (-dg). From the interpolation results in Table 1, we see that the cellPoint algorithm performs better than the cell algorithm. The relative decrease in the error is slightly faster than that of the cell size, suggesting that in both cases the error is linearly bounded. The best result was obtained with the integration point option combined with OpenFOAM's cellPoint algorithm; an interpolation error of about 1.5% is tolerable for many engineering applications.

Table 1
Relative error in % for OpenFOAM → Elmer interpolation using the OpenFOAM cell (left, 0th order) and cellPoint (right, 1st order) methods, and different Elmer field types (elem = elemental, dg = discontinuous Galerkin, ip = integration points).

            cell (0th order)          cellPoint (1st order)
tet size    elem    dg      ip        elem    dg      ip
0.1         16      15      13        14      11      11
0.05        6.7     6.6     5.8       5.2     4.0     4.0
0.025       3.0     3.0     2.5       1.9     1.5     1.5
For the E2O interpolation, we tested different combinations of mesh refinement. From the interpolation results in Table 2, we see that the error is quadratically proportional to the element size. The error is almost identical across all OpenFOAM meshes; this result is expected, because OpenFOAM does not introduce any interpolation error in this particular test. Comparing with the results in Table 1, we conclude that for the same meshes and distributions the E2O interpolation is more precise than O2E.

Table 2
Relative error in % for Elmer → OpenFOAM interpolation for different Elmer and OpenFOAM mesh refinements. OpenFOAM mesh sizes change horizontally and Elmer mesh sizes change vertically.

Elmer \ OpenFOAM tet size    0.1     0.05    0.025
0.1                          8.0     7.7     7.8
0.05                         2.3     2.3     2.2
0.025                        0.59    0.58    0.57

Fig. 3. The two meshes have the same number of elements, but one is rotated by 90 degrees around the x-axis to produce two different layouts. Magenta and cyan colours represent the elements of the two meshes.

Fig. 4. Unit cube with the scalar field 2 sin[2πx] cos[2πy] used for the interpolation tests.

3.2. Strong scaling test
For our scaling test, we used the same unit cube geometry as before and a much finer tetrahedral mesh. The Elmer and OpenFOAM meshes were different, but each contained 13,108,874 elements/cells.

The scaling test was performed on the Taito supercomputer at CSC—IT Center for Science. Each computing node had two 8-core Intel Xeon E5-2670 (Sandy Bridge) 2.6 GHz CPUs and 64 GB of memory; nodes were connected via a high-bandwidth, low-latency InfiniBand FDR interconnect.
The results are shown in Table 3. We started with 2 physical cores and went up to 256 cores (16 nodes); we start with 2 cores because OpenFOAM does not support runs on fewer than 2 MPI ranks. For tests on up to 64 cores, no more than 30 GB of memory was used. The wall time includes the initialization of the Elmer, OpenFOAM and EOF-Library routines, and 10 scalar field transfers in both directions (E2O and O2E). The parallel scaling efficiency for wall time ("Wall eff") gives an overall estimate of coupler efficiency without an applied simulation load. The "Runtime" columns show the combined interpolation and communication times for the "O2E" and "E2O" routines; for long runs with a simulation load these values have more impact on the total wall time. For realistic engineering simulations, typically run on 16–64 cores, EOF-Library takes about a second or less. This time is usually insignificant compared to the solution times of the underlying non-linear physical equations.

Table 3
Strong scaling test results for the EOF-Library coupling routines. "Cores" is the number of physical CPU cores used (the number of MPI ranks is double that), "Mem, GB" is the maximum memory usage ("*" = estimated), "Wall, s" is the total runtime in seconds and "Wall eff" is the scaling efficiency for wall time. The coupling results are "O2E, s" (OpenFOAM-to-Elmer) and "E2O, s" (Elmer-to-OpenFOAM), where the combined time for the interpolation and communication routines was measured.

                                       Initialization       Runtime
Cores   Mem, GB   Wall, s   Wall eff   O2E, s   E2O, s     O2E, s   E2O, s
2       22.4      11791     100%       289      215        21       0.82
4       25.9      5837      101%       175      96         10       0.36
16      28.5      1500      98%        36       29         2.3      0.13
64      29.4*     420       88%        12       9.3        0.65     0.1
256     77.7*     173       53%        4.4      4.9        0.22     0.14
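For reference, the "Wall eff" values in Table 3 are consistent with the usual strong-scaling efficiency relative to the 2-core baseline (our inference; the paper does not state the formula explicitly):

\mathrm{eff}(n) = \frac{2\,T_{\mathrm{wall}}(2)}{n\,T_{\mathrm{wall}}(n)}, \qquad \text{e.g.} \quad \mathrm{eff}(256) = \frac{2 \times 11791}{256 \times 173} \approx 53\%.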
4. Impact and conclusions
There are existing applications of EOF-Library in MHD (levitation, wave generation and casting) and industrial plasma physics (vacuum arc PVD coating, magnetron sputtering). The coupler can also be easily extended to other applications in which volumetric fields are shared between Elmer and OpenFOAM. Other potential applications are convective cooling of electrical devices and microwave heating.
In this work we carried out interpolation precision tests and found that for the Elmer-to-OpenFOAM direction the error decreases quadratically with element size, while for the OpenFOAM-to-Elmer direction it is linearly proportional to the cell size. The scaling test on up to 256 cores (16 nodes) shows that EOF-Library has a negligible performance overhead. It uses a peer-to-peer MPI-based coupling strategy and makes calls to Elmer and OpenFOAM routines to achieve high performance and scalability. In most realistic cases, the interpolation and field communication combined take less than a second.

Future development will focus on coupling schemes, allowing temporal synchronization of the coupled simulation software, and on adding coupling for surface fields.
Acknowledgements
This work was funded by the European Regional Development Fund under the contract "Refinement of Metallurgical Grade Silicon Using Smart Refinement Technologies" (No. 1.1.1.1/16/A/097). Juris Vencels' visit to CSC—IT Center for Science and access to its supercomputers were supported by the HPC-Europa3 project (Horizon 2020, No. 730897). The authors would like to acknowledge the HPC support provided for this research by Sami Ilvonen.
References
[1] Fautrelle Y, Perrier D, Etay J. ISIJ Int 2003;43(6):801–6.
[2] Spitans S, Baake E, Jakovics A, Franz H. Large scale electromagnetic levitation
melting of metals. Int J Appl Electromagn Mech 2017;53(S1):S61–6.
[3] Keränen J, Pippuri J, Malinen M, Ruokolainen J, Råback P, Lyly M, Tammi K.
Efficient parallel 3-D computation of electrical machines with Elmer. IEEE
Trans Magn 2015;51(3):1–4.
[4] Weller HG, Tabor G, Jasak H, Fureby C. A tensorial approach to computational continuum mechanics using object-oriented techniques. Comput Phys
1998;12(6):620–31.
[5] Todd J, Christoffersen P, Zwinger T, Råback P, Chauché N, Benn D, Luckman A, Ryan J, Toberg N, Slater D, et al. A full-Stokes 3-D calving model applied to a large Greenlandic glacier. J Geophys Res Earth Surf 2018;123(3):410–32.
[6] Järvinen E, Råback P, Lyly M, Salenius J-P. A method for partitioned
fluid–structure interaction computation of flow in arteries. Med Eng Phys
2008;30(7):917–23.
[7] Deshpande SS, Anumolu L, Trujillo MF. Evaluating the performance of the
two-phase flow solver interFoam. Comput Sci Discov 2012;5(1):014016.
[8] Scanlon T, Roohi E, White C, Darbandi M, Reese J. An open source, parallel
DSMC code for rarefied gas flows in arbitrary geometries. Comput & Fluids
2010;39(10):2078–89.
[9] Stefani F, Galindo V, Kasprzyk C, Landgraf S, Seilmayer M, Starace M, Weber N,
Weier T. Magnetohydrodynamic effects in liquid metal batteries. IOP conference series: Materials science and engineering, vol. 143, IOP Publishing; 2016,
p. 012024.
[10] Choquet I, Shirvan AJ, Nilsson H. On the choice of electromagnetic model
for short high-intensity arcs, applied to welding. J Phys D: Appl Phys
2012;45(20):205203.
[11] Beckstein P, Galindo V, Vukčević V. Efficient solution of 3D electromagnetic
eddy-current problems within the finite volume framework of OpenFOAM. J
Comput Phys 2017;344:623–46.
[12] Nikbay M, Öncü L, Aysan A. Multidisciplinary code coupling for analysis and
optimization of aeroelastic systems. J Aircr 2009;46(6):1938–44.
[13] Bungartz H-J, Lindner F, Gatzhammer B, Mehl M, Scheufele K, Shukaev A, Uekermann B. preCICE–a fully parallel library for multi-physics surface coupling.
Comput & Fluids 2016;141:250–8.
[14] Joppich W, Kürschner M. MpCCI - a tool for the simulation of coupled applications. Concurr Comput: Pract Exper 2006;18(2):183–92.
[15] Vencels J, Jakovics A, Geza V. Simulation of 3D MHD with Free Surface Using
Open-Source EOF-library. Magnetohydrodynamics 2017;53(4):643–52.
[16] Geža V, Venčels J, Zāģeris Ģ, Pavlovs S. Numerical modelling of surface waves generated by low frequency electromagnetic field for silicon refinement process. IOP conference series: Materials science and engineering, vol. 355, IOP Publishing; 2018, p. 012020.