
Edinburgh Parallel Computing Centre

Coordinates: 55°55′18″N 3°10′26″W / 55.9217°N 3.1740°W / 55.9217; -3.1740
From Wikipedia, the free encyclopedia

EPCC
The Bayes Centre on Potterrow, Edinburgh
Established: 1991
Field of research: High performance computing, Hardware acceleration, Computational science and engineering
Director: Mark Parsons
Technical Director: Alan Simpson
Director of High Performance Computing: Paul Clark
Chairman: Arthur Trew
Staff: 90[1]
Students: 40
Address: The Bayes Centre, 47 Potterrow
Location: Edinburgh, United Kingdom, EH8 9BT
TOP500 rank: National: 1;[2] World: 22
Affiliations: Globus Alliance, Software Sustainability Institute, BonFIRE
Operating agency: University of Edinburgh
Website: www.epcc.ed.ac.uk

EPCC, formerly the Edinburgh Parallel Computing Centre, is a supercomputing centre based at the University of Edinburgh. Since its foundation in 1990, its stated mission has been to accelerate the effective exploitation of novel computing throughout industry, academia and commerce.

The University has supported high performance computing (HPC) services since 1982. As of 2013, through EPCC, it supports the UK's national high-end computing system, ARCHER (Advanced Research Computing High End Resource), and the UK Research Data Facility (UK-RDF).

Overview

EPCC's activities include: consultation and software development for industry and academia; research into high-performance computing; hosting advanced computing facilities and supporting their users; training and education.

The Centre offers two Master's programmes: the MSc in High-Performance Computing and the MSc in High-Performance Computing with Data Science.[3]

It is a member of the Globus Alliance and, through its involvement with the OGSA-DAI project, it works with the Open Grid Forum DAIS-WG.

Around half of EPCC's annual turnover comes from collaborative projects with industry and commerce. In addition to privately funded projects with businesses, EPCC receives funding from Scottish Enterprise, the Engineering and Physical Sciences Research Council and the European Commission.

History

EPCC was established in 1990, following on from the earlier Edinburgh Concurrent Supercomputer Project, and was chaired by Jeffrey Collins from 1991.[4] From 2002 to 2016 EPCC was part of the University's School of Physics & Astronomy, becoming an independent Centre of Excellence within the University's College of Science and Engineering in August 2016.[5]

It was extensively involved in all aspects of Grid computing, including developing Grid middleware and architecture tools to facilitate the uptake of e-Science, developing business applications, and collaborating in scientific applications and demonstration projects.

The Centre was a founder member of the UK's National e-Science Centre (NeSC), the hub of Grid and e-Science activity in the UK. EPCC and NeSC were both partners in OMII-UK, which offers consultancy and products to the UK e-Science community. EPCC was also a founder partner of the Numerical Algorithms and Intelligent Software Centre (NAIS).

EPCC has hosted a variety of supercomputers over the years, including several Meiko Computing Surfaces, a Thinking Machines CM-200 Connection Machine, and a number of Cray systems including a Cray T3D and T3E. In October 2023 it was selected as the preferred site of the first UK exascale computer.[6]

High-performance computing facilities

EPCC manages a collection of HPC systems, including ARCHER2 (the UK's national high-end computing service) and a variety of smaller HPC systems. These systems are all available for industry use on a pay-per-use basis.

Current systems hosted by EPCC include:

  • ARCHER2: As of 2021, the ARCHER2 facility is based around an HPE Cray EX supercomputer that provides the central computational resource, with an estimated peak performance of 28 petaFLOPS. ARCHER2 runs the HPE Cray Linux Environment, which is based on SUSE Linux Enterprise Server 15.[7]
  • Blue Gene/Q: As of 2013, this system consists of 6144 compute nodes housed in 6 frames. Each node comprises a 16-core PowerPC A2 processor with 16 GB of memory, giving a total of 98,304 cores and a peak performance of 1.26 petaFLOPS. It is part of the Distributed Research utilising Advanced Computing (DiRAC) consortium.
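
The headline figures in the Blue Gene/Q entry above follow from the per-node specification. The sketch below reproduces that arithmetic; the 1.6 GHz clock and the 8 double-precision floating-point operations per core per cycle are assumptions about the PowerPC A2 that are not stated in the text.

    # Back-of-the-envelope check of the Blue Gene/Q figures quoted above (Python).
    # Assumed, not stated above: 1.6 GHz clock, 8 DP FLOPs per core per cycle.
    nodes = 6144
    cores_per_node = 16
    mem_per_node_gb = 16
    clock_hz = 1.6e9              # assumed PowerPC A2 clock speed
    flops_per_core_cycle = 8      # assumed double-precision FMA width

    total_cores = nodes * cores_per_node                                  # 98,304 cores
    total_mem_tb = nodes * mem_per_node_gb / 1024                         # 96 TB of memory
    peak_pflops = total_cores * clock_hz * flops_per_core_cycle / 1e15    # ~1.26 petaFLOPS
    print(total_cores, total_mem_tb, round(peak_pflops, 2))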

Recent systems hosted by EPCC include:

  • ARCHER: From 2014 to 2020, EPCC hosted the ARCHER facility. ARCHER was a Cray XC30 supercomputer. It was supported by a number of additional components, including high-performance parallel filesystems, pre- and post-processing facilities, external login nodes, and the UK-RDF, a large, resilient, long-term data facility. ARCHER ran the Cray Linux Environment (CLE), a Linux distribution based on SUSE Linux Enterprise Server (SLES).[8] ARCHER was to be replaced in early 2020, but the replacement was delayed because the system was being used for research on the COVID-19 pandemic. During May 2020 it was taken offline as a result of a security incident.[9] The ARCHER service ended on 27 January 2021.[9]
  • HECToR: The 2010 system (Phase 2b, XT6) was the first production Cray XT6 24-core system in the world. It was contained in 20 cabinets and comprised a total of 464 compute blades. Each blade contained four compute nodes, each with two 12-core AMD Opteron 2.1 GHz Magny Cours processors, amounting to a total of 44,544 cores. Each 12-core socket was coupled with a Cray SeaStar2 routing and communications chip; this was upgraded in late 2010 to the Cray Gemini interconnect. Each 12-core processor shared 16 GB of memory, giving a system total of 59.4 TB. The theoretical peak performance of the Phase 2b system was over 360 TFLOPS (a worked tally of these figures appears after this list). HECToR was decommissioned in 2014.
  • HPCx: Launched in 2002, when it was ranked the ninth-fastest system in the world, HPCx was an IBM eServer p5 575 cluster located at Daresbury Laboratory. It latterly operated as a complementary capability computing service, hosting workloads that could not easily be accommodated on the HECToR system. EPCC supported the HPCx and HECToR systems on behalf of the UK research councils, making them available to UK academia and industry.
  • Blue Gene: Launched in 2005, EPCC's Blue Gene/L was the first Blue Gene system available outside the United States. EPCC operated this 2,048-core service for the University of Edinburgh.
  • QCDOC: One of the world's most powerful systems dedicated to the numerical investigation of quantum chromodynamics, which describes the interactions between quarks and gluons. It was developed in collaboration with a consortium of UK lattice physicists (UKQCD), Columbia University (New York), the RIKEN BNL Research Center at Brookhaven National Laboratory, and IBM.
  • Maxwell: Maxwell was an innovative, award-winning FPGA-based supercomputer built by the FPGA High Performance Computing Alliance (FHPCA). It comprised 32 blades housed in an IBM BladeCenter, each blade containing one Xeon processor and two FPGAs. The FPGAs were linked by a fast communication subsystem that connected the 64 FPGAs in an 8×8 toroidal mesh, while the processors were connected to each other via a PCI bus.
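
The HECToR Phase 2b figures above can likewise be cross-checked from the blade configuration. In the sketch below, the core count and memory total come directly from the text; the peak-performance estimate additionally assumes 4 double-precision floating-point operations per core per cycle for the 2.1 GHz Magny Cours processors, which the text does not state.

    # Worked tally of the HECToR Phase 2b (Cray XT6) figures quoted above (Python).
    blades = 464
    nodes_per_blade = 4
    sockets_per_node = 2
    cores_per_socket = 12
    mem_per_socket_gb = 16
    clock_hz = 2.1e9
    flops_per_core_cycle = 4      # assumed for AMD Opteron Magny Cours

    total_cores = blades * nodes_per_blade * sockets_per_node * cores_per_socket           # 44,544 cores
    total_mem_tb = blades * nodes_per_blade * sockets_per_node * mem_per_socket_gb / 1000  # ~59.4 TB
    peak_tflops = total_cores * clock_hz * flops_per_core_cycle / 1e12                     # ~374 TFLOPS, i.e. over 360
    print(total_cores, round(total_mem_tb, 1), round(peak_tflops))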

See also

  • DEISA: Distributed European Infrastructure for Supercomputing Applications.

References

  1. ^ "About | EPCC at the University of Edinburgh". Archived from the original on 14 April 2013. Retrieved 12 February 2009.
  2. ^ "TOP500 List - November 2021 | TOP500". www.top500.org.
  3. ^ "About the MSc | EPCC at the University of Edinburgh". Archived from the original on 9 March 2009. Retrieved 12 February 2009.
  4. ^ "Jeffrey Collins Obituary, University of Edinburgh". Archived from the original on 31 October 2016.
  5. ^ "EPCC History". Archived from the original on 30 March 2013. Retrieved 12 February 2009.
  6. ^ "Game-changing exascale computer planned for Edinburgh". gov.uk (Press release). Department for Science, Innovation and Technology. 9 October 2023. Retrieved 1 January 2024.
  7. ^ "ARCHER2 Hardware & Software". Archived from the original on 31 March 2020. Retrieved 12 May 2021.
  8. ^ "ARCHER » Software". Archived from the original on 22 December 2019. Retrieved 9 December 2019.
  9. ^ a b Griffin, Andrew (15 May 2020). "Supercomputer researching coronavirus taken offline after 'security incident'". The Independent. Archived from the original on 19 May 2020. Retrieved 15 May 2020.