Software:PreCICE

From HandWiki
preCICE
Logo: a round cutout of six coloured streamlines of laminar flow around a cylinder (two black, one blue, one orange, two black).
Developer(s): University of Stuttgart, Technical University of Munich, and the preCICE community
Initial release: June 1, 2010[1]
Written in: C++
Operating system: Linux, macOS, Windows[2], FreeBSD[3]
Predecessor: FSI*ce[4]
Available in: English
Type: simulation software, multiphysics simulation, multiscale simulation
License: LGPL-3.0-or-later


preCICE is a coupling library for partitioned multi-physics simulations, including fluid-structure interaction and conjugate heat transfer. preCICE is not specific to particular applications or tools; instead, it couples independent, existing codes, each capable of simulating a subpart of the complete physics involved in a simulation. It offers convenient, robust, and efficient methods for transient equation coupling, communication, and data mapping.

To create a coupled simulation, a user modifies existing simulation codes to add calls to the preCICE API (or uses one of the provided adapters), provides a preCICE configuration file, and starts each code normally, e.g., in a separate terminal. preCICE follows a library approach: it does not introduce any central component and does not require modifications to the calling code in order to use a different coupling or interpolation method, keeping the integration minimally invasive[5].

preCICE is free software, developed publicly on GitHub and with mainly public funding. It follows open science and the FAIR principles, demonstrated via several open access publications[5][6][7].

Overview of the preCICE coupling library (as of 2024), including the main concepts, features, and examples of codes already coupled.

History

Early years

The foundations for the work leading to preCICE come from projects funded between 2003 and 2009 by the German Research Foundation within the Research Group FOR493. The name "preCICE" (precise code interaction coupling environment) first appears in the literature in 2010[1]. preCICE is a direct successor of FSI*ce (stylized FSI❄ce), developed at the Technical University of Munich, which mainly targeted fluid-structure interaction simulations[4].

preCICE v1

In May 2015, the development of preCICE was moved to its own organization on GitHub, which now includes repositories for the core library and several further components of the project. The first stable version of the core library was released in November 2017, following semantic versioning (v1.0.0). At that time, the documentation of the project was hosted in a GitHub Wiki.

The state of preCICE v1.0.0 is largely as described in what the preCICE literature guide[7] calls the "v1 reference paper", published in 2016[8] together with collaborators from the University of Stuttgart. The paper describes the core library, whose main features are a variety of coupling schemes (explicit and implicit, with Aitken underrelaxation and Anderson and Broyden quasi-Newton acceleration algorithms), data mapping methods (nearest-neighbor, nearest-projection, radial basis functions), and communication methods (TCP/IP sockets, MPI ports). The paper also includes a list of coupled codes developed by the authors or collaborators, as well as FSI benchmarks demonstrating numerical accuracy and performance scalability up to 16,384 processes on SuperMUC. During that time, the development of preCICE was partially funded by the German Priority Programme 1648: SPPEXA - Software for Exascale Computing via the ExaFSA project[9].

The v1.x release cycle saw releases until v1.6.1, in September 2019.

preCICE v2

During the v1.x release cycle, and driven primarily by a German Research Foundation project specifically intended for "research data and software" (project number 391150578), the preCICE project saw development in several directions: extensive refactoring of the code and a full migration of the build system from SCons to CMake; new and additional unit, integration, and system tests; a large expansion of the available documentation; the development of several new adapters[10]; and several community-building measures[11]. Several of these changes are connected to the acceptance of preCICE into the extreme-scale scientific software development kit (xSDK)[12].

This development led to preCICE v2 (v2.0.0) and later to a new reference paper[5] describing the state of the software and its ecosystem at that time. This paper is currently the default citation recommendation in the literature guide[7].

Since constructing a coupled simulation typically involves components beyond the core library (language bindings and adapters), the source code of selected components, together with the application cases, was published as a separate data publication[13]. This bundle is called a preCICE distribution and has seen semi-regular releases since, following a calendar-based versioning scheme.

The v2.x release cycle saw releases until v2.5.1, in January 2024.

preCICE v3

preCICE v3.0.0 was released in February 2024. It was soon followed by v3.1.0 and then v3.1.1, which is included in the preCICE Distribution v2404.0[6].

Notable changes of v3 include simplifications in the API and configuration, multirate and higher-order time stepping[14], and faster RBF mapping based on a partition of unity approach.

During the v3 release cycle, the project expanded from targeting mainly surface coupling to also targeting volume coupling (overlapping domains, see domain decomposition methods) via the more efficient mapping methods, geometric multiscale mapping, support for system codes implementing the Functional Mock-up Interface[15], and multiscale simulations[16].

API

The native API of preCICE is written in C++. Language bindings for C and Fortran are compiled into the preCICE library itself. Further language bindings are available externally.

Language bindings (repository, license, package; usage shown for v3):

  • C++ (repository on GitHub, LGPL-3.0-or-later, packaged via GitHub Releases):
    #include <precice/precice.hpp>
    precice::Participant p(…);
  • C (compiled into the preCICE library):
    #include <precice/preciceC.h>
    preciceC_createParticipant(…);
  • Fortran (compiled into the preCICE library):
    CALL precicef_create(…)
  • Fortran Module (repository on GitHub, LGPL-3.0-or-later):
    use precice
    CALL precicef_create(…)
  • Python (repository on GitHub, LGPL-3.0-or-later, packaged on PyPI):
    import precice
    p = precice.Participant(…)
  • Rust (repository on GitHub, LGPL-3.0-or-later, packaged on crates.io):
    use precice
    let mut participant = precice::Participant::new(…);
  • Julia (repository on GitHub, LGPL-3.0-or-later):
    using PreCICE
    PreCICE.createParticipant(…)
  • Matlab (repository on GitHub, LGPL-3.0-or-later):
    p = precice.Participant(…)

Configuration

The individual coupled codes (coupling participants) share a common XML-based configuration file, which specifies at runtime the data and coupling meshes, the communication, the coupling scheme, the acceleration method, and more. Using a different coupling scheme does not require any changes to the code calling preCICE, but only to the configuration file[8].
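As a sketch of what such a configuration can look like, consider the following condensed, hypothetical example for a fluid-structure interaction case with two participants, "Fluid" and "Solid". The mesh and data names match the Python example below; the element and attribute names follow the style of the preCICE v3 configuration, but the exact schema and sensible mapping constraints should be checked against the official preCICE configuration reference:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<precice-configuration>
  <!-- Coupling data: vector fields exchanged between the participants -->
  <data:vector name="Force" />
  <data:vector name="Displacement" />

  <!-- Coupling meshes; each participant provides its own -->
  <mesh name="Fluid-Mesh" dimensions="2">
    <use-data name="Force" />
    <use-data name="Displacement" />
  </mesh>
  <mesh name="Solid-Mesh" dimensions="2">
    <use-data name="Force" />
    <use-data name="Displacement" />
  </mesh>

  <participant name="Fluid">
    <provide-mesh name="Fluid-Mesh" />
    <write-data name="Force" mesh="Fluid-Mesh" />
    <read-data name="Displacement" mesh="Fluid-Mesh" />
  </participant>

  <participant name="Solid">
    <provide-mesh name="Solid-Mesh" />
    <receive-mesh name="Fluid-Mesh" from="Fluid" />
    <write-data name="Displacement" mesh="Solid-Mesh" />
    <read-data name="Force" mesh="Solid-Mesh" />
    <!-- Data mapping between the two coupling meshes -->
    <mapping:nearest-neighbor direction="read" from="Fluid-Mesh" to="Solid-Mesh" constraint="conservative" />
    <mapping:nearest-neighbor direction="write" from="Solid-Mesh" to="Fluid-Mesh" constraint="consistent" />
  </participant>

  <!-- Communication between the two codes, here via TCP/IP sockets -->
  <m2n:sockets acceptor="Fluid" connector="Solid" />

  <!-- Implicit coupling with Aitken underrelaxation -->
  <coupling-scheme:serial-implicit>
    <participants first="Fluid" second="Solid" />
    <max-time value="1.0" />
    <time-window-size value="0.01" />
    <max-iterations value="30" />
    <exchange data="Force" mesh="Fluid-Mesh" from="Fluid" to="Solid" />
    <exchange data="Displacement" mesh="Fluid-Mesh" from="Solid" to="Fluid" />
    <relative-convergence-measure data="Displacement" mesh="Fluid-Mesh" limit="1e-4" />
    <acceleration:aitken>
      <data name="Displacement" mesh="Fluid-Mesh" />
      <initial-relaxation value="0.5" />
    </acceleration:aitken>
  </coupling-scheme:serial-implicit>
</precice-configuration>
```

Switching, for example, from nearest-neighbor to an RBF mapping, or from Aitken underrelaxation to a quasi-Newton acceleration, would only change this file, not the solver code.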

The coupled codes often group the preCICE API calls into an adapter code. This adapter is typically configured by a separate configuration file (with a format appropriate for the respective simulation code), which specifies the exact region of the simulation domain to be coupled, as well as the exact data exchanged[5].

Example

An adapted fluid solver written in Python using preCICE v3 (ported from the v2 reference paper[5]). While this example is inspired by fluid-structure interaction (exchanging forces and displacements), the model, the solution fields, and the exchanged fields can be arbitrary.

import precice

participant = precice.Participant("Fluid", "../precice-config.xml", 0, 1)

positions = ... # define coupling mesh, 2D array with shape (number of vertices, dimension of physical space)
vertex_ids = participant.set_mesh_vertices("Fluid-Mesh", positions)

participant.initialize()

t = 0 # time

u = initialize_solution() # returns initial solution

while participant.is_coupling_ongoing(): # main time loop

    if participant.requires_writing_checkpoint():
        u_checkpoint = u

    solver_dt = compute_adaptive_time_step_size()
    precice_dt = participant.get_max_time_step_size()
    dt = min(precice_dt, solver_dt) # actual time step size

    # returns 2D array with shape (n, dim)
    displacements = participant.read_data("Fluid-Mesh", "Displacement", vertex_ids, dt)
    
    u = solve_time_step(dt, u, displacements) # returns new solution
    
    # returns 2D array with shape (n, dim)
    forces = compute_forces(u) 
    participant.write_data("Fluid-Mesh", "Force", vertex_ids, forces)
    
    participant.advance(dt)

    if participant.requires_reading_checkpoint():
        u = u_checkpoint
    else: # continue to next time step 
        t = t + dt    

participant.finalize()

Coupled codes

While preCICE is a software library with an API that programmers can use to couple their own codes, integrations with several simulation codes exist, making preCICE more accessible to end users who are not primarily programmers (such as applied mathematicians, mechanical engineers, or climate scientists).

In the terminology used by preCICE, these integrations with simulation codes are called adapters[10]; they can be maintained by the preCICE developers or by third parties. A non-exhaustive list of adapters is available on the preCICE website[17].

The v2 reference paper[5] also cites works that have coupled CAMRAD II, DLR TAU, DUST, DuMuX, Rhoxyz, Ateles, XDEM, and FLEXI. Further known coupled codes include MBDyn, OpenFAST, LS-DYNA[18], and G+Smo.

Applications

Academic publications by the developers and by independent research groups have demonstrated preCICE for several applications (see pointers to literature in the v2 paper [5]), while further examples are listed on the website of the project.

  • Mechanical and civil engineering
    • Aeroacoustics: See, e.g., the ExaFSA HPC project (Germany, Netherlands, Japan)[9].
    • Aerodynamics: See, e.g., work by the TU Delft on inflatable kites (Netherlands)[19].
    • Aerodynamic heating: See, e.g., a paper on hypersonic aerothermal simulations (USA)[20].
    • Explosions: See, e.g., work by the National University of Defense Technology (China)[21].
    • Urban wind modeling: See, e.g., work by the University of Manchester (UK)[22].
    • Manufacturing processes: See, e.g., work by the Austrian Institute of Technology (Austria)[18].
  • Marine engineering
    • See, e.g., work by the University of Split (Croatia)[23].
  • Bioengineering
    • Hemodynamics - heart valves: See, e.g., work by the University of Stellenbosch (South Africa)[24].
    • Hemodynamics - aorta: See, e.g., work by the UPC (Spain) and the University of Stuttgart (Germany)[25].
    • Fish locomotion: See, e.g., work by the University of Strathclyde (UK) and collaborators (China)[26].
    • Muscle-tendon systems: See, e.g., work by the University of Stuttgart[27] (Germany).
  • Nuclear fission and fusion reactors
    • Thermohydraulics: See, e.g., work by GRS (in collaboration with the Technical University of Munich and the preCICE developers) on coupled reactor thermohydraulics[28] (Germany).
  • Geophysics
    • Free-flow and porous-media coupling: See, e.g., work by the University of Stuttgart (Germany)[29].
    • Geothermal energy: See, e.g., work on the thermal use of groundwater in Munich (Germany)[30].
  • Further examples
    • Plasma wind tunnels: See, e.g., work by the University of Illinois Urbana-Champaign (USA)[31].

Community

The preCICE community meets in annual preCICE workshops and sessions in further conferences.

User support is provided by the developers and other community members in the preCICE forum (based on Discourse) on a voluntary basis. A support program is also available; it uses public discussions in the forum (with priority) as a primary means of communication.

Further channels exist (an announcements mailing list and Matrix channels), but the forum is the most active.

The community and the community-building measures are detailed in the v2 reference paper[5] and in further blog posts[11].

References

  1. 1.0 1.1 Gatzhammer, Bernhard; Mehl, Miriam; Neckel, Tobias (June 2010). "A coupling environment for partitioned multiphysics simulations applied to fluid-structure interaction scenarios". Procedia Computer Science (Elsevier) 1 (1): 681–689. doi:10.1016/j.procs.2010.04.073. 
  2. "MSYS2 Packages - Package: mingw-w64-x86_64-precice". https://packages.msys2.org/packages/mingw-w64-x86_64-precice. 
  3. "FreeBSD Git repositories - root/science/precice/Makefile". https://cgit.freebsd.org/ports/tree/science/precice/Makefile. 
  4. 4.0 4.1 "Software Developments - Chair of Scientific Computing". Technical University of Munich. https://www.cs.cit.tum.de/en/sccs/research/software-developments/. 
  5. 5.0 5.1 5.2 5.3 5.4 5.5 5.6 5.7 Chourdakis G.; Davis K.; Rodenberg B.; Schulte M.; Simonis F.; Uekermann B.; Abrams G.; Bungartz HJ. et al. (2022). "preCICE v2: A sustainable and user-friendly coupling library [version 2; peer review: 2 approved]". Open Research Europe 2 (51): 51. doi:10.12688/openreseurope.14445.2. PMID 37645328. 
  6. 6.0 6.1 Chen, Jun; Chourdakis, Gerasimos; Desai, Ishaan; Homs-Pons, Carme; Rodenberg, Benjamin; Schneider, David; Simonis, Frédéric; Uekermann, Benjamin et al. (2024). "preCICE Distribution Version v2404.0". DaRUS. doi:10.18419/darus-4167. 
  7. 7.0 7.1 7.2 "preCICE website - literature guide". https://precice.org/fundamentals-literature-guide.html. 
  8. 8.0 8.1 Hans-Joachim Bungartz; Florian Lindner; Bernhard Gatzhammer; Miriam Mehl; Klaudius Scheufele; Alexander Shukaev; Benjamin Uekermann (2016). "preCICE – A fully parallel library for multi-physics surface coupling". Computers & Fluids 141: 250–258. doi:10.1016/j.compfluid.2016.04.003. ISSN 0045-7930. https://www.sciencedirect.com/science/article/pii/S0045793016300974. 
  9. 9.0 9.1 Lindner, Florian; Totounferoush, Amin; Mehl, Miriam; Uekermann, Benjamin; Pour, Neda Ebrahimi; Krupp, Verena; Roller, Sabine; Reimann, Thorsten et al. (2020). "ExaFSA: Parallel Fluid-Structure-Acoustic Simulation". in Hans-Joachim Bungartz, Severin Reiz, Benjamin Uekermann, Philipp Neumann, Wolfgang E. Nagel. Software for Exascale Computing - SPPEXA 2016-2019. 136. Cham: Springer International Publishing. pp. 271–300. doi:10.1007/978-3-030-47956-5_10. ISBN 978-3-030-47955-8. 
  10. 10.0 10.1 Uekermann, Benjamin; Bungartz, Hans-Joachim; Cheung Yau, Lucia; Chourdakis, Gerasimos; Rusch, Alexander (October 2017). "Official preCICE Adapters for Standard Open-Source Solvers". Proceedings of the 7th GACM Colloquium on Computational Mechanics for Young Scientists from Academia. doi:10.18419/opus-9334. https://www.gacm2017.uni-stuttgart.de/registration/Upload/ExtendedAbstracts/ExtendedAbstract_0138.pdf. Retrieved 5 November 2024. 
  11. 11.0 11.1 Uekermann, Benjamin (September 2020). "How did preCICE get popular?". Zenodo. doi:10.5281/zenodo.12795484. 
  12. "GitHub - xsdk-project - xSDK Community Policy Compatibility for preCICE". https://github.com/xsdk-project/xsdk-policy-compatibility/blob/master/precice-policy-compatibility.md. 
  13. Chourdakis, Gerasimos; Davis, Kyle; Rodenberg, Benjamin; Schulte, Miriam; Simonis, Frédéric; Uekermann, Benjamin; Abrams, Georg; Bungartz, Hans-Joachim et al. (2021). "preCICE Distribution Version v2104.0". DaRUS. doi:10.18419/darus-2125. 
  14. Rüth, Benjamin; Uekermann, Benjamin; Mehl, Miriam; Birken, Philipp; Monge, Azahar; Bungartz, Hans-Joachim (2020). "Quasi-Newton Waveform Iteration for Partitioned Surface-Coupled Multi-Physics Applications". International Journal for Numerical Methods in Engineering 122 (19): 5236–5257. doi:10.1002/nme.6443. 
  15. Willeke, Leonard; Schneider, David; Uekermann, Benjamin (2023). "A preCICE-FMI Runner to Couple FMUs to PDE-Based Simulations". in Müller, Dirk; Monti, Antonello; Benigni, Andrea. Linköping Electronic Conference Proceedings. 
  16. Desai, Ishaan; Scheurer, Erik; Bringedal, Carina; Uekermann, Benjamin (2023). "Micro Manager: a Python package for adaptive and flexible two-scale coupling". Journal of Open Source Software (The Open Journal) 8 (91): 5842. doi:10.21105/joss.05842. 
  17. 17.0 17.1 "preCICE website - Overview of adapters". https://precice.org/adapters-overview.html. 
  18. 18.0 18.1 Scheiblhofer, Stefan; Jäger, Stephan; Horr, Amir M. (2019). "Coupling FEM and CFD solvers for continuous casting process simulation using precice". COUPLED VIII: Proceedings of the VIII International Conference on Computational Methods for Coupled Problems in Science and Engineering. CIMNE. pp. 23–32. ISBN 978-84-949194-5-9. http://hdl.handle.net/2117/189920. 
  19. Folkersma, Mikko; Schmehl, Roland; Viré, Axelle (2020). "Steady-state aeroelasticity of a ram-air wing for airborne wind energy applications". Journal of Physics: Conference Series (IOP Publishing) 1618 (3): 032018. doi:10.1088/1742-6596/1618/3/032018. 
  20. Signorelli, Joseph M.; Higgins, Ian R.; Maszkiewicz, Samuel A.; Laurence, Stuart; Bodony, Daniel J. (2024-07-29). "Hypersonic Aerothermal Computations of a Sharp Fin Interaction". AIAA AVIATION FORUM AND ASCEND 2024. Las Vegas, Nevada: American Institute of Aeronautics and Astronautics. doi:10.2514/6.2024-3548. ISBN 978-1-62410-716-0. https://arc.aiaa.org/doi/10.2514/6.2024-3548. Retrieved 2024-10-02. 
  21. Zhang, Sen; Guo, Xiao-Wei; Li, Chao; Liu, Yi; Zhao, Ran; Yang, Canqun (2020). "Numerical Study of Fluid-Structure Interaction Dynamics under High-explosive Detonation on Massively Parallel Computers". pp. 525–531. doi:10.1109/HPCC-SmartCity-DSS50907.2020.00065. 
  22. Revell, A; Afgan, I; Ali, A E A; Camps Santasmasas, M; Craft, T; de Rosis, A; Holgate, J; Laurence, D et al. (2020). "Coupled Hybrid RANS-LES Research at The University of Manchester". ERCOFTAC Bulletin (European Research Community on Flow, Turbulence And Combustion) 120: 67. https://hal.science/hal-02476649. Retrieved 2024-11-21. 
  23. Andrun, Martina; Bašić, Josip; Blagojević, Branko; Klarin, Branko (2020). "Simulating hydroelastic slamming by coupled Lagrangian-FDM and FEM". IOS Press. pp. 135–142. doi:10.3233/PMST200036. 
  24. Davis, Kyle (2018). Numerical and experimental investigation of the hemodynamics of an artificial heart valve (Thesis). University of Stellenbosch. Retrieved 2024-11-21.
  25. Naseri, Alireza; Totounferoush, Amin; González, Ignacio; Mehl, Miriam; Pérez-Segarra, Carlos David (2020). "A scalable framework for the partitioned solution of fluid–structure interaction problems". Computational Mechanics (Springer) 66 (2): 471–489. doi:10.1007/s00466-020-01860-y. 
  26. Luo, Yang; Xiao, Qing; Shi, Guangyu; Wen, Li; Chen, Daoyi; Pan, Guang (2020). "A fluid–structure interaction solver for the study on a passively deformed fish fin with non-uniformly distributed stiffness". Journal of Fluids and Structures (Elsevier) 92: 102778. doi:10.1016/j.jfluidstructs.2019.102778. https://strathprints.strath.ac.uk/70276/1/Luo_etal_JFS_2019_A_fluid_structure_interaction_solver_for_the_study_on_a_passively_deformed_fish_fin.pdf. 
  27. Maier, Benjamin; Schneider, David; Schulte, Miriam; Uekermann, Benjamin (2023). "Bridging scales with volume coupling - Scalable simulations of muscle contraction and electromyography". in Nagel, Wolfgang E.; Kröner, Dietmar H.; Resch, Michael M.. High Performance Computing in Science and Engineering '21. Cham: Springer International Publishing. pp. 185–199. doi:10.1007/978-3-031-17937-2_11. ISBN 978-3-031-17937-2. 
  28. Herb, Joachim; Weyermann, Fabian (2024-04-25). "Implementation of the preCICE coupling interface for AC2/ATHLET". Kerntechnik 89 (2): 185–201. doi:10.1515/kern-2023-0119. https://www.degruyter.com/document/doi/10.1515/kern-2023-0119/html. Retrieved 2024-11-12. 
  29. Jaust, Alexander; Weishaupt, Kilian; Mehl, Miriam; Flemisch, Bernd (2020). "Partitioned Coupling Schemes for Free-Flow and Porous-Media Applications with Sharp Interfaces". in Klöfkorn, Robert; Keilegavlen, Eirik; Radu, Florin A. et al.. Springer International Publishing. pp. 605–613. doi:10.1007/978-3-030-43651-3_57. ISBN 978-3-030-43651-3. 
  30. Böttcher, Fabian; Davis, Kyle; Halilovic, Smajil; Odersky, Leonhard; Pauw, Viktoria; Schramm, Thilo; Zosseder, Kai (2021). Optimising the thermal use of groundwater for a decentralized heating and cooling supply in the city of Munich, Germany (Report). doi:10.5194/egusphere-egu21-14929. 
  31. Munafò, Alessandro; Chiodi, Robert; Kumar, Sanjeev; Maout, Vincent Le; Stephani, Kelly A; Panerai, Francesco; Bodony, Daniel J; Panesi, Marco (2022). "A Multi-Physics Modeling Framework for Inductively Coupled Plasma Wind Tunnels". AIAA SCITECH 2022 Forum. pp. 21. doi:10.2514/6.2022-1011. https://arc.aiaa.org/doi/10.2514/6.2022-1011.