iPIC3D: implicit Particle-in-Cell code for Space Weather Applications (KTH)

Space Weather is the study of processes that originate on the Sun and propagate through the Solar System, affecting people and technology in space and on Earth. KTH has implemented iPIC3D, a massively parallel Particle-in-Cell code, as a C++ program using MPI and OpenMP.

iPIC3D simulates the interaction of the solar wind and solar storms with the Earth's magnetosphere and with spacecraft; as the AllScale pilot application, it focuses on the plasma particles interacting with the Earth's magnetic field. Some highly energetic particles are trapped within the Van Allen radiation belts, while others escape the confinement of the Earth's magnetic field. This leads to large load imbalances, since most of the particles are concentrated close to the Earth (the violet cloud in the figure), while few particles are located in other regions of space (the barely visible particles outside the violet cloud in the figure).
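
The core of any Particle-in-Cell step is advancing each particle's velocity under the Lorentz force from the electric and magnetic fields. As an illustrative sketch only, here is the standard explicit Boris rotation scheme; iPIC3D itself uses an implicit mover coupled to an implicit field solve, which this example does not attempt to reproduce:

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a[1]*b[2] - a[2]*b[1],
             a[2]*b[0] - a[0]*b[2],
             a[0]*b[1] - a[1]*b[0] };
}

// Advance a particle velocity v by one time step dt under fields E and B,
// with charge-to-mass ratio qom, using the Boris scheme: half electric
// kick, magnetic rotation, half electric kick. The rotation step exactly
// preserves the speed, which is why the scheme is so widely used.
Vec3 borisPush(Vec3 v, const Vec3& E, const Vec3& B, double qom, double dt) {
    const double h = 0.5 * qom * dt;
    const Vec3 vminus = { v[0] + h*E[0], v[1] + h*E[1], v[2] + h*E[2] };
    const Vec3 t = { h*B[0], h*B[1], h*B[2] };
    const double t2 = t[0]*t[0] + t[1]*t[1] + t[2]*t[2];
    const Vec3 s = { 2*t[0]/(1+t2), 2*t[1]/(1+t2), 2*t[2]/(1+t2) };
    const Vec3 vxt = cross(vminus, t);
    const Vec3 vprime = { vminus[0]+vxt[0], vminus[1]+vxt[1], vminus[2]+vxt[2] };
    const Vec3 vxs = cross(vprime, s);
    const Vec3 vplus = { vminus[0]+vxs[0], vminus[1]+vxs[1], vminus[2]+vxs[2] };
    return { vplus[0] + h*E[0], vplus[1] + h*E[1], vplus[2] + h*E[2] };
}
```

Because the cost of this step is paid once per particle, regions where particles cluster (such as the radiation belts) dominate the work, which is the origin of the load imbalance described above.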

The current implementation of iPIC3D suffers from considerable load imbalance. In AllScale, the numerical scheme of iPIC3D is redesigned to exploit nested recursive parallelism, in order to benefit from the dynamic load balancing and resilience functionality supported by the AllScale Environment.
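
The idea behind nested recursive parallelism can be sketched as follows: the particle workload is split recursively until the pieces are small enough to process directly, and a runtime such as AllScale's can then execute the recursive halves as independent tasks, stealing work from subtrees where particles cluster. The names, cutoff, and sequential recursion below are illustrative only, not the AllScale API:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: recursively split a particle range and apply `body`
// to each particle. In the AllScale setting, the two recursive calls would
// become parallel tasks, giving the runtime fine-grained units to balance.
template <typename Fn>
void forEachParticle(std::vector<double>& particles,
                     std::size_t lo, std::size_t hi,
                     std::size_t cutoff, Fn body) {
    if (hi - lo <= cutoff) {
        // Base case: the range is small enough to process sequentially.
        for (std::size_t i = lo; i < hi; ++i) body(particles[i]);
        return;
    }
    const std::size_t mid = lo + (hi - lo) / 2;
    forEachParticle(particles, lo, mid, cutoff, body);  // left half
    forEachParticle(particles, mid, hi, cutoff, body);  // right half
}
```

Compared with a flat, evenly sized domain decomposition, such a recursive splitting exposes many small tasks, so densely populated regions are automatically subdivided into more units of work than sparse ones.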


The redesign of the iPIC3D algorithm will not only allow us to leverage nested recursive parallelism based on the AllScale API, but will also reduce the total amount of global communication and synchronization by 80%, which will eventually lead to much better scaling behavior for iPIC3D on massively parallel computing systems. Furthermore, the AllScale Environment will enable a reduction of either energy or resource consumption by at least 25%, while incurring an execution-time penalty of no more than 10%. Finally, the AllScale version of iPIC3D will allow us to perform a PIC simulation of the formation of the Earth's radiation belts on a grid of 5,000 x 5,000 x 5,000 cells with at least 10,000 particles per cell, requiring 10²²-10²³ floating-point operations, i.e., a roughly 27-hour simulation on an exascale supercomputer.


This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 671603.

Contact Details

General Coordinator

Thomas Fahringer

Scientific Coordinator

Herbert Jordan