
Exascale Computing In the SKA


Multicore World 2013

February 18, 2013
Transcript

  1. The SKA will be the world’s largest and most powerful radio telescope (actually four types of telescope arrays). Estimated cost €1.5-6.0 billion; construction 2016-2024, with precursor arrays (ASKAP, MeerKAT, MWA) already under way. It aims to address five fundamental science questions about the universe. It is an extreme mega-science and exascale computing project: the Large Hadron Collider is expected to produce about 15 petabytes of data per year, while the SKA will produce about 40 petabytes per hour. Ten member countries: Australia, Canada, China, Germany, Italy, Netherlands, New Zealand (through MBIE), South Africa, Sweden, UK, with India as an associate member.
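To put the quoted data rates in perspective, a back-of-the-envelope comparison (using only the rough figures from the slide, not precise specifications) might look like:

```python
# Compare the slide's rough data-rate estimates:
# LHC ~15 PB/year vs SKA ~40 PB/hour.

HOURS_PER_YEAR = 24 * 365

lhc_pb_per_year = 15          # LHC: ~15 PB of data per year
ska_pb_per_hour = 40          # SKA: ~40 PB per hour

ska_pb_per_year = ska_pb_per_hour * HOURS_PER_YEAR
ratio = ska_pb_per_year / lhc_pb_per_year

print(f"SKA output: {ska_pb_per_year:,} PB/year")                     # 350,400 PB/year
print(f"SKA produces ~{ratio:,.0f}x more data per year than the LHC")  # ~23,360x
```

Even ignoring duty cycles and data-reduction stages, the SKA's raw rate is roughly four orders of magnitude beyond the LHC's.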
  2. Since Multicore World 2012: dual-site decision for stage 1, with two processing systems. The RfP is to be released by the end of March 2013, with preconstruction starting October 2013. Work package consortia: CSP Consortium led by NRC Canada and MDA Corporation; SDP Consortium led by the University of Cambridge; MGR Consortium led by NCRA India.
  3. Pioneers of NZ’s involvement in the SKA: Geoff Austin (Auckland), Ian Axford (SKANZ), Nicolás Erdödy (Open Parallel), Sergei Gulyaev (IRASR), Marilyn Head (MED), John Hearnshaw (Canterbury), John Houlker (NZTE), Melanie Johnston-Hollitt (Victoria), Jonathan Kings (MBIE), Robin McNeill (Venture Southland), Dougal Watt (IBM NZ). NZ had a joint bid with Australia to host the SKA until the dual-site decision in 2012. The December 2012 Expression of Interest round drew: the NZ SKA Open Consortium (across computing aspects), the Victoria Link group (MWA, radio source detection), and six individual organisations (software, bearings, dish fabrication, project management, communications).
  4. 23 academics from five institutions:
     • Institute for Radio Astronomy and Space Research – Radio Astronomy
     • AUT University – Computing and Mathematical Sciences; Electrical and Electronic Engineering
     • University of Auckland – Computer Science; Electrical and Computer Engineering
     • Massey University – Computer Engineering; Computer Science
     • University of Otago – Computer Science
  5. Compucon: integration of open-standard computer components and performance optimization.
     ICTI: consultancy, infrastructure, and datacentre services, operating at the coal face of ICT.
     i-lign: enterprise collaboration for the modern business, spanning cloud, social, mobile, and data.
     IRL/Callaghan Innovation: CRI providing research, development, and commercialisation services.
     GreenButton: platform for transitioning applications to the cloud; management and delivery of cloud services.
     NZTE: NZ’s international business development agency; strategy and assistance for NZ business performance.
     Open Parallel: parallel programming, multicore technology, and software system infrastructure.
  6. Ulterior motives: bring together a critical mass of high-technology expertise within NZ. Leverage NZ’s investment in SKA membership to participate in a mega-science project, particularly its computing, including leading parts of it. Increase NZ expertise, international profile, linkages, and exposure to emerging technologies. Attract, develop, and retain more STEM graduates. Many SKA spin-offs, particularly around HPC, parallelization, multicore, big data, and cloud computing. A collective and open New Zealand Inc approach.
  7. Tentative agreements with the CSP/SDP/MGR consortia leads for preconstruction:
     • System Engineering – Modelling (lead); Architecture (participate)
     • Software – Software Development Environment (lead); Common Software (lead or participate)
     • Hardware – Computing Platform (lead or participate)
     • HPC/Parallelization – Correlator, beamformer, non-image processing (approx. 30%); Science data processing pipelines (approx. 30%)
     NZ is filling SKA expertise gaps, particularly in HPC/parallelization. Firm agreements are to be reached in April/May 2013.
  8. The largest current radio telescopes (LOFAR, ASKAP) require about 100 teraFLOP/s and use roughly 10,000 cores. The SKA is estimated to require about 100 petaFLOP/s (for comparison, Cray Titan peaks at 17.59 petaFLOP/s). Most legacy radio astronomy code is sequential and needs complete algorithm redesign, improved software engineering, and massive parallelism. There is a push for a “single digital backend”, with more done in software instead of hardware/ASIC/FPGA for greater reconfigurability and flexibility. Power requirements are a major concern: early estimates are 20-60 MW for computing (NZ’s installed generating capacity is about 10 GW). Correlator options known to be under consideration include:
     • GPU correlators (e.g. Tesla via CUDA)
     • Many-integrated-core architectures (e.g. Xeon Phi)
     • IBM-ASTRON DOME project (e.g. PowerPC A2, among others)
     • A new low-power “digital platform” being investigated for the CSP
     The imaging pipeline in the SDP is a major performance concern.
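The compute and power estimates above imply a demanding energy-efficiency target. A rough sketch, using only the slide’s early estimates (20-60 MW budget, ~100 PFLOP/s, Titan at 17.59 PFLOP/s):

```python
# Rough energy-efficiency targets implied by the slide's early estimates.
# All figures are the deck's approximations, not final SKA specifications.

target_flops = 100e15            # ~100 PFLOP/s estimated for the SKA
titan_flops = 17.59e15           # Cray Titan peak, for scale

power_low_w, power_high_w = 20e6, 60e6   # 20-60 MW early computing estimate

# Required efficiency in GFLOP/s per watt at each end of the power budget.
eff_generous_budget = target_flops / power_high_w / 1e9  # 60 MW allowed
eff_tight_budget = target_flops / power_low_w / 1e9      # only 20 MW allowed

print(f"SKA is ~{target_flops / titan_flops:.1f}x Titan's peak")            # ~5.7x
print(f"Required: {eff_generous_budget:.2f}-{eff_tight_budget:.1f} GFLOP/s/W")  # 1.67-5.0

# Fraction of NZ's ~10 GW installed capacity at the high end of the budget:
print(f"60 MW is {60e6 / 10e9:.1%} of NZ installed capacity")               # 0.6%
```

Even the generous end of the budget demands well over 1 GFLOP/s per watt sustained across the whole system, which is why low-power platforms (Xeon Phi, DOME, GPUs) dominate the options list.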