Supercomputers, cloud and the Internet of Things: the evolution of the weather man

For many, weather forecasting conjures up an image of a smiley broadcaster pointing to an animated screen, telling you to bring your umbrella to work tomorrow.

But the journey of that information being delivered to you is long and sophisticated, involving vast amounts of data and complex scientific observations. In the UK, that responsibility falls to the Met Office, a trading fund of the Department for Business, Innovation and Skills.

Headquartered in Exeter and employing around 1,900 people, the Met Office is recognised as one of the world’s most accurate forecasters. Every day, it uses more than 10 million weather observations and an advanced atmospheric model to deliver 3,000 tailored forecasts and briefings to government, businesses, armed forces and the general public.

>See also: 5 real-life applications of supercomputers that you never knew about

As you can imagine, the infrastructure required to do this is quite hefty. The Met Office’s weather forecasts, in particular, rely on domain infrastructure that has been around for a long time.

‘Before you can start to predict anything, the first thing that you really need to do is collect the state of the system that you are trying to predict, so for us that’s a massive observations programme,’ says Charles Ewen, CIO at the Met Office. ‘So there is a heap of infrastructure that has been there for a very long time, which is a global collaboration for the interchange of hundreds of millions of observations every day.’

This activity takes place under the guiding structures of a United Nations agency called the World Meteorological Organization (WMO), and involves 191 member organisations that operate a whole series of observation platforms around the world.

But that’s not all. Another very substantial infrastructure is required to support the UK’s participation in a European geostationary and polar-orbiting satellite programme.

‘Largely that is domain specific and quite complex,’ Ewen adds. ‘It’s regularly updated but has been there for a very long time. Its job is to make sure that the observations are ushered around in the timescales that they need to with the right degree of resilience so that we can begin the job of forecasting the weather.’


The main functionality behind weather prediction and climate research, however, involves supercomputers that are engineered to meet the challenges of today’s most demanding high performance computing (HPC) users.

In October last year, the Met Office awarded US supercomputer company Cray a $128 million contract to provide it with multiple supercomputers and storage systems. 

Consisting of three phases, the major deliveries are expected between 2015 and 2017. In their final configurations, the supercomputers will have 13 times more power than the Met Office’s current systems, while the next-generation storage solution will include more than 20 petabytes of storage capacity, running at speeds of more than 1.5 terabytes per second of bandwidth.
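To put those figures in perspective, a back-of-the-envelope calculation using the quoted peak numbers (real sustained throughput will be lower) shows how long it would take just to stream the whole archive once:

```python
# Back-of-the-envelope: time to read the entire storage archive at the
# quoted peak bandwidth. Figures are the ones reported in the article,
# taken at face value for illustration only.

storage_bytes = 20 * 10**15            # 20 petabytes
bandwidth_bytes_per_s = 1.5 * 10**12   # 1.5 terabytes per second

seconds = storage_bytes / bandwidth_bytes_per_s
print(f"{seconds:.0f} s, roughly {seconds / 3600:.1f} hours")
```

Even at full tilt, a single pass over the store takes the better part of four hours, which hints at why moving compute to the data matters at this scale.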

According to Ewen, the realisation of the socio-economic benefits that are part of the case for the new supercomputers sits firmly at the top of his CIO agenda.

‘At the end of the day, the Met Office’s job is about saving lives, protecting property, enhancing wellbeing and supporting economic growth,’ he says. ‘In that mission, this new investment is a genuine sea change, and therefore our passion is about making sure we realise those socio-economic benefits.

‘We will not do that in a linear way. Most of those benefits will not be accrued by us and therefore our need to move from where we are to being an enabling platform is absolutely paramount.’

>See also: Met Office looks to enhance weather services by publishing more open data

Becoming such a platform involves Ewen utilising cloud computing to evolve how the Met Office positions itself.

While he hasn’t deliberately sought a cloud strategy, a lot of the requirements the Met Office has accumulated – like instant provisioning and instant scaling – are typical of that environment.

Further to this, the volume of data that it deals with – both inbound and outbound – necessitates very high-bandwidth connectivity to the internet.

So being intrinsically part of the cloud is the important thing for Ewen, rather than explicitly moving towards it. But will the Met Office continue to do everything itself based in Exeter in close physical proximity to the supercomputers?

‘Some things, yes,’ he says. ‘Despite it being quite an old concept, data gravity is something that people are increasingly talking about, which is the point at which data becomes so massive that it makes more sense to bring other people’s problems to the data as opposed to taking the data to the problems.’

As such, Ewen is looking for a platform that enables that to happen in a safe, secure and appropriate manner.

‘Ultimately, we need to provide the environment where other people can run their IT on our data – so not us taking their data and giving them an output, but the ability to remotely and programmatically do that work and get the answer in a safe and secure fashion.’

Open and transparent

While a long way off, this is the end game for the Met Office: a platform for reuse. The more it develops its science in both weather and climate services, the more value organisations in the private sector will be able to extract.

Therefore, from a technology standpoint, the Met Office is shaping itself to look more like a platform that enables innovation in the weather and climate space, as opposed to being the entity that makes it happen.

The vision complements the organisation’s commitment to making more information available electronically as open data, in order to boost the creation of new services.

In the same month that the Met Office announced its new supercomputer contract, it also became the first UK trading fund to join the Open Data Institute’s (ODI) membership programme as a partner.

While it was already publishing significant amounts of open data via its DataPoint API service – including five-day forecasts, real-time observations and regularly-updated forecasts for mountain weather, national parks and UK regions – it will be working with the ODI to both improve this data and expand its range of open data. 
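For developers, DataPoint is exposed as a keyed REST API returning JSON or XML per forecast site. As a rough sketch – the base URL and query parameters below follow DataPoint’s documented URL scheme, but treat the exact values, and the example site ID, as assumptions to check against the current documentation:

```python
# Sketch of building a Met Office DataPoint request URL for a
# three-hourly, site-specific forecast in JSON. The endpoint pattern
# and parameter names are assumptions based on DataPoint's documented
# scheme; the site ID below is purely illustrative.
from urllib.parse import urlencode

BASE = "http://datapoint.metoffice.gov.uk/public/data"

def forecast_url(site_id: str, api_key: str, res: str = "3hourly") -> str:
    """Return the URL for a JSON site-specific forecast."""
    query = urlencode({"res": res, "key": api_key})
    return f"{BASE}/val/wxfcs/all/json/{site_id}?{query}"

# Fetching the result is then a plain HTTP GET, e.g. via urllib.request.
print(forecast_url("310069", "YOUR-API-KEY"))
```

The registration key gates usage rather than payment for this open tier, which is what makes the data practical to build third-party services on.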

The Met Office is only partially funded by public money so needs to charge for some of its data in order to cover its running costs as a trading fund, but has a responsibility under the Public Records Act to record the history of the UK’s weather.

‘Things to look at are not necessarily hugely challenging, but things to reuse are increasingly challenging,’ says Ewen. ‘That’s all driven, fundamentally, by Moore’s Law and an ever-bigger supercomputer, which creates and generates ever-more-vast series of data.’

The Met Office is at the cutting edge of policy and technology development aimed at understanding the implications of ever-growing data sizes. This challenge may currently be almost unique to the Met Office, but in a decade or two other organisations will face the same data challenges.

So understanding how to release the value of data in more sophisticated ways than simply pumping the data out like a fire hose remains an important piece of work for the Met Office.

>See also: Gartner's Internet of Things predictions

One platform that the Met Office has been running for several years now – in the form of an R&D project – embodies its embrace of the Internet of Things.

The Weather Observations Website (WOW) is a crowd-sourced platform for the collection of observations, and has been hugely ambitious both in the scale of its data gathering and in its need to cope with a wide range of different data sources.

‘It’s entirely cloud-based, and happens to be based on the Google Enterprise API, because it gives me the ability to control scaling costs more predictably than traditional platforms,’ says Ewen. ‘Increasingly, that platform is about machines and expert systems being able to go to the definitive voice of understanding what state the atmosphere is going to be in and be able to use that information intelligently.

‘So from a technology point of view, recognising the Internet of Things allows you to understand more about the world in a physical and environmental sense – it’s a really important thing. That’s part of the vision for the platform.’

This ongoing work – particularly around data – is something that other CIOs should keep a close eye on. The Met Office is clearly advanced in its understanding of both how to handle huge quantities of data and how to turn it into business value.

Not least, its progress with the next generation of supercomputers is likely to form an important reference point in the years to come.


Ben Rossi

Ben was Vitesse Media's editorial director, leading content creation and editorial strategy across all Vitesse products, including its market-leading B2B and consumer magazines, websites, research and...
