Research article
Open access

FDPS: a novel framework for developing high-performance particle simulation codes for distributed-memory systems

Published: 15 November 2015

Abstract

We have developed FDPS (Framework for Developing Particle Simulator), which enables researchers and programmers to develop high-performance particle simulation codes easily. The basic idea of FDPS is to separate the code for complex parallelization, including domain decomposition, redistribution of particles, and inter-node exchange of particle information for interaction calculation, from the actual interaction calculation and orbital integration. FDPS provides the former, and the user writes the latter. Thus, a user can implement, for example, a high-performance N-body code in only about 120 lines. In this paper, we present the structure and implementation of FDPS and describe its performance on two sample applications: a gravitational N-body simulation and a Smoothed Particle Hydrodynamics (SPH) simulation. Both codes show very good parallel efficiency and scalability on the K computer. FDPS lets researchers concentrate on the implementation of physics and mathematical schemes, rather than on the development and performance tuning of their parallel codes.
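To make the separation of concerns concrete, the following is a minimal, self-contained C++ sketch of the programming model the abstract describes. It is not the actual FDPS API: the names (Particle, gravity_kernel, calc_force_all) are illustrative only, and the framework side, which in FDPS handles domain decomposition, particle exchange, and tree-based interaction-list construction across MPI processes, is reduced here to a plain O(N^2) loop on a single node so that the example compiles and runs on its own. The user-side code amounts to the particle data type and the pairwise interaction kernel.

// Conceptual sketch (not the actual FDPS API): the user supplies only the
// particle type and a pairwise interaction kernel; the framework owns the
// parallelization and the traversal that calls the kernel. The "framework"
// side below is a simple direct-summation loop to keep the example runnable.
#include <cmath>
#include <cstdio>
#include <vector>

// ---- user-side code: particle type and interaction kernel -----------------
struct Particle {
    double mass;
    double pos[3];
    double acc[3];
};

// Pairwise gravity with Plummer softening eps2; accumulates acceleration on p_i.
void gravity_kernel(Particle& p_i, const Particle& p_j, double eps2) {
    double dx[3], r2 = eps2;
    for (int k = 0; k < 3; ++k) {
        dx[k] = p_j.pos[k] - p_i.pos[k];
        r2 += dx[k] * dx[k];
    }
    double r_inv  = 1.0 / std::sqrt(r2);
    double factor = p_j.mass * r_inv * r_inv * r_inv;
    for (int k = 0; k < 3; ++k) p_i.acc[k] += factor * dx[k];
}

// ---- stand-in for the framework side ---------------------------------------
// In a parallel framework, domain decomposition, particle exchange, and tree
// traversal would live here; the user never has to modify this part.
void calc_force_all(std::vector<Particle>& ptcl, double eps2) {
    for (auto& p : ptcl) p.acc[0] = p.acc[1] = p.acc[2] = 0.0;
    for (std::size_t i = 0; i < ptcl.size(); ++i)
        for (std::size_t j = 0; j < ptcl.size(); ++j)
            if (i != j) gravity_kernel(ptcl[i], ptcl[j], eps2);
}

int main() {
    std::vector<Particle> ptcl = {
        {1.0, {0.0, 0.0, 0.0}, {0.0, 0.0, 0.0}},
        {1.0, {1.0, 0.0, 0.0}, {0.0, 0.0, 0.0}},
    };
    calc_force_all(ptcl, 1.0e-6);
    std::printf("acc of particle 0: %g %g %g\n",
                ptcl[0].acc[0], ptcl[0].acc[1], ptcl[0].acc[2]);
    return 0;
}

In the real framework the same user-written kernel would be handed to a distributed tree-force routine, so switching from the toy loop above to a scalable parallel run changes the framework call, not the physics code.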



Published In

WOLFHPC '15: Proceedings of the 5th International Workshop on Domain-Specific Languages and High-Level Frameworks for High Performance Computing
November 2015
72 pages
ISBN:9781450340168
DOI:10.1145/2830018
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 15 November 2015


Author Tags

  1. algorithm
  2. framework for implementing parallel codes
  3. high-performance computing
  4. particle simulation

Qualifiers

  • Research-article

Conference

SC15

Acceptance Rates

WOLFHPC '15 Paper Acceptance Rate 9 of 13 submissions, 69%;
Overall Acceptance Rate 13 of 19 submissions, 68%


Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months)99
  • Downloads (Last 6 weeks)19
Reflects downloads up to 02 Oct 2024



Cited By

  • (2022) Numerical Simulation Study of Debris Particles Movement Characteristics by Smoothed Particle Hydrodynamics. Journal of Disaster Research, 17(2):237-245. DOI: 10.20965/jdr.2022.p0237. Online publication date: 1-Feb-2022.
  • (2020) Method. In: Development of a Numerical Simulation Method for Rocky Body Impacts and Theoretical Analysis of Asteroidal Shapes, pp. 19-60. DOI: 10.1007/978-981-15-3722-6_2. Online publication date: 18-Mar-2020.
  • (2019) Implementation of SPH and DEM for a PEZY-SC Heterogeneous Many-Core System. In: Computational and Experimental Simulations in Engineering, pp. 709-715. DOI: 10.1007/978-3-030-27053-7_60. Online publication date: 17-Nov-2019.
  • (2019) The Performance Prediction and Improvement of SPH with the Interaction-List-Sharing Method on PEZY-SCs. In: Computational Science – ICCS 2019, pp. 476-482. DOI: 10.1007/978-3-030-22750-0_40. Online publication date: 8-Jun-2019.
  • (2017) Does Explosive Nuclear Burning Occur in Tidal Disruption Events of White Dwarfs by Intermediate-mass Black Holes? The Astrophysical Journal, 839(2):81. DOI: 10.3847/1538-4357/aa697d. Online publication date: 18-Apr-2017.
