How big data is taming the growing cost of the UK’s nuclear decommissioning liabilities

Once hailed as an energy source ‘too cheap to meter’, the cost of cleaning up the UK’s civil nuclear liabilities continues to escalate. In 2013, the Guardian reported that the total cost of the clean-up programme could exceed £100 billion. In February 2014, the UK’s Public Accounts Committee called for efficiency gains at the Sellafield site, where clean-up costs are expected to top £70 billion.

Across the estate of the Nuclear Decommissioning Authority (NDA), the public body charged with cleaning up 17 of Britain’s civil nuclear sites, the challenges are complex and frequently unique. A recurring problem the decommissioning programme faces is access to information, or rather the lack of it.  

Ironically, but perhaps not unusually for an industry driven by scientific and engineering innovation, the challenge of managing decades’ worth of data stored in disparate formats on a plethora of different systems has become something of a bête noire.

This was less of an issue while knowledgeable staff involved in site build and operation were on hand, but as the industry grapples with an ageing workforce and loss of critical and undocumented knowledge, the problem is becoming acute.

This, of course, matters to us all, not least because the cost of nuclear decommissioning is now a significant annual multi-billion-pound item on the balance sheet of UK plc, but also because of the global imperative to eliminate any risk that the sites themselves might pose to the environment. Safe and cost-effective decommissioning programmes depend on good information for decision-making, and while larger initiatives such as the NDA’s Knowledge Management Programme and National Nuclear Archive stand out, the industry is also looking outside for solutions to the diverse range of challenges it faces.


Sellafield, Europe’s largest nuclear site, has utilised big data concepts, with the help of Informed Solutions, to make sense of its huge volume of land remediation data. More than 60 years of activity at the site have understandably created public anxieties about ground and groundwater contamination, in response to which longstanding monitoring and remediation programmes have been in place.

These in turn have generated a wealth of data spanning more than 40 years and representing a £15 million-plus investment. Unlocking that data investment without embarking on a costly and complex exercise to standardise its quality, consistency and completeness meant drawing on the experience and practices of high-risk industries as diverse as oil & gas exploration and defence, where big data techniques are more established and which share a common need to make sense of high-volume, mission-critical and diverse data cost-effectively and confidently.

This approach has allowed substantial quantities of historic land and groundwater quality data to be recovered from archives, long-lost systems and even the supply chain, which can all now be integrated and analysed alongside current data acquisition programmes.
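The article does not describe the specific tools or schemas involved, but the general pattern it points to — harmonising heterogeneous historical and current monitoring records into a single analysable view at the point of ingestion, rather than re-engineering every source system — might look something like the following Python/pandas sketch. All borehole identifiers, column names, units and figures here are invented purely for illustration.

```python
# Illustrative sketch only: the source systems, schemas and values below are
# hypothetical, not a description of Sellafield's actual data.
import pandas as pd

# A legacy archive export: dates as strings, concentrations in mg/l,
# borehole IDs in an older naming convention.
legacy = pd.DataFrame({
    "BH_REF": ["BH-012", "BH-047"],
    "SAMPLE_DATE": ["12/03/1987", "05/11/1992"],
    "TRITIUM_MG_L": [0.8, 1.3],
})

# A current monitoring feed: ISO dates, concentrations in µg/l,
# a newer borehole naming convention.
current = pd.DataFrame({
    "borehole_id": ["SL-BH-012", "SL-BH-047"],
    "sampled_at": ["2014-06-01", "2014-06-15"],
    "tritium_ug_l": [620.0, 910.0],
})

# Harmonise each source into one common schema at read time,
# rather than re-engineering the source systems themselves.
def from_legacy(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "borehole_id": "SL-" + df["BH_REF"],
        "sampled_at": pd.to_datetime(df["SAMPLE_DATE"], dayfirst=True),
        "tritium_ug_l": df["TRITIUM_MG_L"] * 1000.0,  # mg/l -> µg/l
        "source": "legacy_archive",
    })

def from_current(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["sampled_at"] = pd.to_datetime(out["sampled_at"])
    out["source"] = "current_programme"
    return out

# A single integrated view spanning both eras, ready for trend analysis.
combined = pd.concat([from_legacy(legacy), from_current(current)], ignore_index=True)
print(combined.sort_values("sampled_at"))
```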

Big data is now delivering the big picture to better understand risks and liabilities, and further improve Sellafield’s remediation strategy. In so doing, it has helped tame the UK’s nuclear decommissioning costs.

 

Sourced from Justin Hassel, Informed Solutions

