Inside the UK’s first collaborative data centre

When CIOs look for high performance and speed, along with low latency and low overall costs, in IT infrastructure these days, they are presented with the prospect of cloud computing.

In fact, if you really want the highest possible performance with the lowest possible costs, it is public cloud solutions pitched by the likes of Microsoft, Google and Amazon that offer the best value for money.

However, the security and privacy concerns that come with offloading business-critical data onto shared infrastructure have held back mass adoption among larger organisations to date.

King’s College London, one of the UK’s top universities and research institutions, identified an alternative to the cloud route that could generate similar results.

Like those of many large organisations with legacy systems, the university’s data centre facilities were inefficient, outdated and an unsuitable environment for future technology development. The result was an ad hoc evolution of technology and infrastructure that was not conducive to its research aspirations.

This was compounded by the increased need for computational solutions, such as high-performance computing (HPC) clusters that handle and analyse large amounts of data at high speed, to undertake larger medical research programmes.

That, normally, is when a CIO turns to the cloud. But Nick Leake, CIO at King’s College London, had other ideas. ‘In talking to some of my counterparts in other London institutions, I found that they were in similar situations and had a need for external data centre space,’ he says. ‘So we decided to work together to procure a data centre space for ourselves.’

All for one

This meeting of minds involved five other institutions: University College London, The Sanger Institute, The Francis Crick Institute, The London School of Economics & Political Science and Queen Mary University of London.

Together, they wanted to establish the UK’s first collaborative research data centre in order to improve their infrastructure capabilities and help maintain the UK as a global research powerhouse.

Leake recalls reading about the idea of UK universities sharing data centre facilities over ten years ago in a report that suggested they could save around £1 billion if they worked together in their procurement, but nothing had materialised.

However, the founding members were confident that it would succeed if they opted for a very simple hosting contract, rather than adding complexity by asking for managed services.

The first, and perhaps most vital, part of the plan required getting Jisc on board. Jisc is a non-departmental public body that coordinates ICT among all further and higher education institutions in the UK, and provides Janet, the country’s research and education network.

Shared data goals

Together, these institutions had a common interest in providing shared facilities for researchers to collaborate, increasing energy efficiency and reducing cost. Jisc and its institutional partners knew that a shared data centre would provide a platform to facilitate greater collaboration between universities and other research institutions.

‘We went through this procurement process with Jisc, which brought some fantastic understanding and knowledge of EU procurement rules,’ says Leake. ‘Being in the public sector, we had to make sure that we complied, so they kept us on the straight and narrow as far as that was concerned.

‘From our initial assessment we were comfortable for the location of the data centre to be anywhere within the main body of England, so, effectively, east of the Welsh border – because some of the organisations, like NHS England, had concerns about historic data in England – and as far north as Manchester.’

Eventually, a data centre in Slough, owned and operated by Infinity SDC, was selected to house the shared education and research facility for the collaborating group of partners.

In July 2014, the Janet network connected the data centre to its backbone, meaning that any other UK university, NHS academic science centre or research institution could also take advantage of the facility if they wished to.

Moving to the new facility has allowed King’s to plan, purchase and combine HPC technology to increase computing power, and the team has been able to develop a clear HPC strategy that delivers better facilities to researchers.

King’s currently has 22 racks in the Infinity data centre, most of which house the HPC clusters that enable the data-intensive projects its researchers carry out.

When equipment in its on-premise data centres gets to the end of its life, King’s is now replacing it with new equipment in Infinity – as well as moving some equipment between the two.

‘I’m sure we’ll always have some things on-premise, but the logic being what it is, from my own point of view, it’s a question of why wouldn’t you put it into Infinity?’ says Leake. ‘That’s the way I put it in the discussions I’ve had with my colleagues, many of whom are very keen to go there and benefit from things like the collaboration capability, the flexibility and security, and the fact that the equipment is better housed and therefore likely to last longer.

‘But occasionally we have people who need persuading, because they’ve not seen the future and they’re convinced that it won’t work. Therefore, they have to be persuaded to evaluate this and check that it does work for them. Ultimately, if it doesn’t – and there are cases that we are aware of, most particularly when you’ve got large things like MRI scanners and some of the imaging systems we have that generate huge amounts of data – we will need to have areas on campus to summarise, aggregate and compress that data before we send it off to Infinity. Then, once it’s in Infinity, we can get access to the full computational power of what we’ve got there.’
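
The workflow Leake outlines, reducing the data on campus and running the heavy computation at Infinity, is simple to sketch. The Python snippet below is a minimal illustration only, not King’s actual pipeline: the block-averaging and gzip steps are hypothetical stand-ins for whatever summarisation, aggregation and compression the imaging teams really use.

```python
import gzip
import json

import numpy as np


def summarise_scan(volume: np.ndarray, block: int = 4) -> dict:
    """Shrink a raw 3D scan to per-block means plus global statistics.

    Hypothetical aggregation step: a real pipeline would use
    domain-specific tooling, but the shape of the work is the same.
    """
    # Trim each dimension so it divides evenly into blocks.
    x, y, z = (dim - dim % block for dim in volume.shape)
    trimmed = volume[:x, :y, :z]
    # Average every block**3 cell, shrinking the volume ~block**3-fold.
    block_means = trimmed.reshape(
        x // block, block, y // block, block, z // block, block
    ).mean(axis=(1, 3, 5))
    return {
        "block_means": block_means.tolist(),
        "global_mean": float(volume.mean()),
        "global_std": float(volume.std()),
    }


def compress_for_transfer(summary: dict) -> bytes:
    """Serialise and gzip the aggregate before it leaves the campus network."""
    return gzip.compress(json.dumps(summary).encode("utf-8"))


# A fake 64x64x64 scan shrinks dramatically before it is sent off site.
scan = np.random.rand(64, 64, 64).astype(np.float32)
payload = compress_for_transfer(summarise_scan(scan))
print(f"raw: {scan.nbytes} bytes, transfer payload: {len(payload)} bytes")
```

The design keeps the bulky raw output close to the scanner and sends only the compact aggregate over the network, which is the trade-off Leake describes for MRI and imaging workloads.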

While Janet was an ultra-high-speed, low-latency network that encouraged collaboration between institutions, it still ran into problems on the more intensive HPC tasks.

But by connecting that network directly into the Infinity data centre, where Jisc has invested significantly in network connections, those using the service are seeing better performance than ever.

Leake is yet to see the metrics but is expecting the latency between King’s College’s main London campuses and the Infinity data centre to be lower than that between and within the campuses.

And while he doesn’t identify the solution as cloud computing – more data-centre-as-a-service – he does expect to land in that territory when they start to cross-connect equipment between institutions and move to multi-tenancy.

‘Also, for us, the HPC clusters in Infinity are already connected to major external cloud providers, so we can burst capacity into Amazon or Microsoft from the facility because there are some computational problems that even 22 HPC racks can’t solve,’ he says. ‘If we can slice the problem up small enough, we can distribute that across large cloud providers and then get the answers back more quickly. I’m not sure that has really been used yet, but it’s part of the provision.’
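
The cloud-bursting approach Leake sketches, slicing a problem and fanning the slices out, can be illustrated in a few lines. The Python below is a hypothetical sketch in which a local thread pool stands in for the cloud providers; in the real arrangement, each slice would be submitted to an HPC node or to capacity on Amazon or Microsoft.

```python
from concurrent.futures import ThreadPoolExecutor


def solve_slice(slice_of_work: list) -> int:
    """Stand-in for the real computation; here, a trivial sum of squares."""
    return sum(x * x for x in slice_of_work)


def distribute(problem: list, workers: int = 8) -> int:
    """Slice the problem, fan the slices out, and merge the answers.

    In the scenario described above, each slice would be shipped to an
    HPC node or a cloud instance rather than to a local thread.
    """
    size = max(1, len(problem) // workers)
    slices = [problem[i:i + size] for i in range(0, len(problem), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(solve_slice, slices))


print(distribute(list(range(100_000))))
```

The merge step here assumes the partial answers can simply be summed; a real problem needs a reduction appropriate to the computation, but the slice, distribute and recombine shape is the same.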

Trial by fire

But latency and speed aren’t the only benefits of the new data centre services, as was demonstrated last month when a large fire blazed for around 48 hours on London’s Kingsway, taking out all of the power supplies to King’s College’s Strand campus.

Had all of the college’s data centre facilities still been hosted on-premise, the backup generators would not have been able to support the infrastructure, meaning that equipment would have had to be shut down.

However, because the college had moved so much of its load to Slough, the generators were able to support what was left and keep everything going on the campus.

‘So having it in a secure facility with all the levels of resilience and so on has been great. Many other organisations are joining in, and that in itself is part of the vision of encouraging collaboration,’ says Leake. ‘It’s more than just a hosting facility – it’s a facility that engenders collaboration and cooperation between different institutions and their researchers.’
