HPC Midlands - Update for Bull eXtreme Computing User Group 2012 meeting

Posted on 22-May-2015



Martin Hamilton, Centre Manager | @martin_hamilton | hpc-midlands.ac.uk

HPC Midlands: Bull User Group for eXtreme Computing, 2012

Tildesley Report – see http://goo.gl/qw2DW

e-Infrastructure funding – http://goo.gl/SKgdG

New High Performance Computing facility for academia and industry

Jointly operated by Loughborough University and University of Leicester

£1m funding from EPSRC/BIS e-Infrastructure programme

Building on relationships with existing industrial partners and software providers

Opportunity to “operationalize” HPC spending

What is HPC Midlands?

About Loughborough University

Number One for Student Experience

“Research That Matters”

• £1m EPSRC investment + institutional contributions
• 3,000 core Bull supercomputer (48 Teraflops)
• 11 x chassis (18 blades per chassis)
• 188 compute node blades, each with
　– 2 x 2.0GHz (8 core) Sandy Bridge CPUs
• 15TB RAM
　– 140 blades with 64GB RAM (4GB/core)
　– 48 blades with 128GB RAM (8GB/core)
• 60TB Lustre storage
• QDR Infiniband interconnect

Hera: The HPC Midlands Cluster
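The headline figures in the spec above follow from the per-blade breakdown; a quick sanity check (all numbers taken from the slide, rounding noted in comments):

```python
# Sanity check of the Hera cluster figures quoted on the slide.

blades = 188          # compute node blades
cpus_per_blade = 2    # 2 x Sandy Bridge CPUs per blade
cores_per_cpu = 8     # 8 cores per CPU

total_cores = blades * cpus_per_blade * cores_per_cpu
print(total_cores)    # 3008 -- quoted as "3,000 core"

# RAM: 140 blades at 64GB plus 48 blades at 128GB
ram_gb = 140 * 64 + 48 * 128
print(ram_gb)         # 15104 GB -- quoted as "15TB RAM"
```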

The Delivery!

8th December e-Infrastructure call received

5th January HPC Midlands proposal submitted

19th January Funding awarded

23rd January Tender issued against RM721 framework

24th January Mechanical and Electrical work commissioned

30th January Tenders received and scored, contract awarded to Bull

9th February Work begins to prepare the HPC Midlands site

24th February Work begins on the plumbing / chilled water supply

1st March Space being prepared for new chiller unit

9th March 63A circuits now mostly in place

15th March Raised floor installed

21st March Dedicated chiller unit arrives and craned into place

26th March Delivery of HPC Midlands hardware

HPC Midlands Timeline

Mechanical and Electrical Work (FM)


Some Assembly Required…

Inside a Sandy Bridge compute node

• Cutting edge hardware innovations: Nehalem / Sandy Bridge / Ivy Bridge
　– Increase in cores/threads per socket, I/O bandwidth etc.
　– >2bn transistors per CPU!

Many Integrated Core (MIC)

Leading player in HPC compilers (Intel Cluster Studio)

Intel will contribute training on parallel programming and their HPC tools

HPC Midlands Partners: Intel

Market leading HPC software (CFX, FLUENT etc.)

But significant capital investment required for licenses:
　– Locks out SMEs and spinoffs
　– Inflexible in today’s challenging climate

New model: “Pay As You Go” access to the ANSYS suite of products

HPC Midlands Partners: ANSYS

E.ON New Build and Technology, Software & Modelling

Sample use cases:

– Precipitator ductwork
– Gas turbine blade lifetime
– Gas turbine enclosure safety
– Wind farm resource assessment
– Steam flow in power plants

HPC Midlands Partners: E.ON


• JANET upgrades planned for e-Infrastructure
　– £26m capital investment
　– Potential for e.g. new primary connections / dedicated bandwidth for key research centres / instruments

What would people like from JANET? Can we develop a model for JANET interconnects with industrial partners?
　– JANET as _peer_ not backhaul
　– Build on existing partnerships
　– Move beyond Sneakernet scenario

e-Infrastructure and JANET(UK)
