Researchers from across the University of Exeter are benefitting from a new High Performance Computing (HPC) machine, called Isca. Existing departmental HPC resources within Life Sciences and Physics were reaching the end of their life, so, using institutional funding and a large grant from the Medical Research Council, the University acquired a new central HPC resource to support researchers across numerous disciplines University-wide.
The new system is already contributing to research across the University: modelling the formation of stars and galaxies, running Computational Fluid Dynamics (CFD) simulations in Engineering to understand how flooding affects bridges, and analysing genetic traits in diabetes in the Medical School using data from the UK Biobank. The HPC resource is now used by more than 200 researchers across 30+ active research projects in the Life Sciences, Engineering, Mathematics, Astrophysics, and Computing departments.
As part of the original tender, the University asked for options to house the new HPC machine temporarily whilst work on a new data hall was completed. OCF, a high performance compute, storage and data analytics integrator, proposed a unique solution: housing the new machine in a Rapid Deployment Data Centre (RDDC) container from Stulz Technology Integration Limited (formerly TSI UK).
Nicknamed The Pod, the RDDC is a dedicated containerised solution for HPC, custom fabricated for the University by Stulz Technology Integration, specialists in data centre design and build. OCF designed, integrated and configured the HPC machine and delivered the entire system, in its container, to the University in July 2016.
“This was phase one of the new supercomputer, located on campus in the specialised container, where the machine ran for the first twelve months,” commented David Barker, Technical Architect at the University of Exeter. “We tested and used the system while it was housed in the temporary location to give us an understanding of what we used a lot of; this informed phase two of the project which was to expand the system with the help of OCF and move it to its final location in the new data centre on campus.”
In addition, OCF and Lenovo jointly designed the system to support the differing needs of the life sciences and physics researchers, which required virtualised and traditional environments respectively. The new 6,000-core system comprises Lenovo NeXtScale servers with a number of NVIDIA GPU nodes, Intel Xeon Phi processors and additional high-memory compute resources to cater to these needs.
Lenovo’s NeXtScale servers are connected through Mellanox EDR InfiniBand to three GS7K parallel file system appliances from DDN Storage, providing just under one petabyte of capacity. OCF’s own Open Source HPC Software Stack, based on xCAT, runs on the system along with RDO OpenStack, NICE DCV and Adaptive Computing Moab.
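For a sense of how researchers interact with a Moab-managed cluster like this, here is a minimal sketch of submitting a GPU job. The queue name, module name, resource syntax and job details are illustrative assumptions, not Isca's actual configuration; msub is Moab's standard submission command.

```python
import subprocess

# Minimal sketch of a batch job for a Moab-managed cluster.
# Queue name, module name and resource line are assumptions for illustration.
job_script = """#!/bin/bash
#MSUB -N cfd-flood-model          # job name (hypothetical example)
#MSUB -l nodes=1:ppn=16:gpus=1    # one node, 16 cores, one GPU (syntax assumed)
#MSUB -l walltime=04:00:00        # four-hour run limit
#MSUB -q gpu                      # assumed name of the GPU queue

module load openfoam              # assumed module name
mpirun -np 16 simpleFoam -parallel
"""

# Pipe the script to msub on stdin and print the job ID Moab returns.
result = subprocess.run(["msub"], input=job_script, capture_output=True, text=True)
print(result.stdout.strip())
```

In practice most users would keep the script in a file and run msub directly from the shell; the point is that heterogeneous resources such as GPU nodes are requested declaratively through the scheduler rather than assigned by hand.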
“As well as having the standard nodes, we also have various pieces of specialist kit which includes NVIDIA GPU nodes, Intel Xeon Phi nodes and OpenStack cloud nodes as well,” commented David. “We wanted to ensure that the new system caters for as wide a variety of research projects as possible, so the system reflects the diversity of the applications and requirements our users have.”
The impact on research has been significant, with researchers seeing a 2-3x speed-up compared to the previous departmental clusters.
“We’ve seen in the last few years a real growth in interest in High Performance Computing from life sciences, particularly with the availability of new high-fidelity genome sequencers, which have heavy compute requirements, and that demand will keep going up,” comments David Acreman, Senior Research Fellow at the University of Exeter. “Isca is proving to be an incredibly valuable service to the whole university and is now proving indispensable to our research groups.”