How can businesses manage a large data centre estate effectively?

As the data centre industry expands at a rapid pace, demand for smart data centre solutions continues to heat up, putting operators under increasing pressure to deliver high-performance data facilities.

Thanks to the rapid pace of technological change, businesses are feeling the pressure to increase their performance, agility and adaptability in order to stay in lockstep with customers’ and consumers’ constantly evolving appetite for digital solutions.

Data centres underpin this digital landscape and have become a critical growth engine of the global economy.

Rapidly increasing data traffic is driving demand for data storage and processing, and the data centre industry continues to expand. According to recent research, the UK’s data centre industry alone is worth about £73.3 billion a year, and investment is expected to increase by 11% by 2025.

Data centres on the rise

In order to accommodate the enormous amount of data we generate every day, companies are operating a growing number of data centres spread across the globe. Furthermore, as the IoT industry develops, demand for edge data centres is rising sharply.

>See also: The importance of smartening up a legacy data centre

Edge data centres are increasingly important because businesses and data centre operators need to support the IoT and the edge applications that investment in technologies such as 5G enables. The collective demand for connectivity, bandwidth and low latency will also continue to grow aggressively as smart homes, offices, driverless cars and artificial intelligence become the new norm across society.

Industry challenges

Managing a big facility can be challenging, but overseeing large networks of data centres dispersed all over the world, while at the same time ensuring their profitability and the delivery of high-quality services, is a mammoth task – one many data centre operators are just starting to wrap their heads around.

To make things even more difficult, in many cases site and technical knowledge is siloed within teams, making rapid diagnosis, continuous optimisation and upgrades time consuming and expensive, which prevents valuable economies of scale from being realised.

Furthermore, many data centres, even newly built sites, are still using outdated building management systems (BMS) and energy management systems from the 1990s that were never designed to keep up with newer data centre designs and operational control strategies.

So how can data centre operators gain visibility, and ultimately control, across a large number of data centre sites of any type and age in order to increase efficiency, productivity and ROI while keeping uptime as priority number one?

>See also: Addressing the incredible complexity of the modern data centre

An increasing number of organisations are adopting smart analytics solutions and machine learning in conjunction with data from data centre infrastructure management (DCIM) tools to improve the management, forecasting and construction design of their facilities.

And while these solutions are very useful for improving a data centre’s environmental impact, reducing energy costs and maximising performance, they often rely on raw data that is not always accurate or reliable. On average, only 60-65% of raw data can be relied upon without cleaning, validation, normalisation and labelling prior to being used for any sort of decision-making or informative analysis.
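For illustration only, the minimal Python sketch below shows the kind of cleaning, validation and labelling step this implies. The column names, plausibility limits and schema are assumptions made for the example, not a description of any particular vendor’s tool:

```python
import pandas as pd

# Hypothetical plausibility limits for a few common telemetry fields; real
# limits would come from equipment specifications and site knowledge.
PLAUSIBLE_RANGES = {
    "supply_temp_c": (10.0, 45.0),
    "it_load_kw": (0.0, 5000.0),
    "pue": (1.0, 3.0),
}

def clean_readings(raw: pd.DataFrame) -> pd.DataFrame:
    """De-duplicate raw sensor readings, blank out implausible values and
    label each row so downstream analysis knows which records are valid."""
    # Assumed schema: one row per (site, sensor_id, timestamp) reading.
    df = raw.drop_duplicates(subset=["site", "sensor_id", "timestamp"]).copy()
    checked = [col for col in PLAUSIBLE_RANGES if col in df.columns]
    for col in checked:
        low, high = PLAUSIBLE_RANGES[col]
        # Treat out-of-range readings as missing rather than trusting them.
        df.loc[(df[col] < low) | (df[col] > high), col] = float("nan")
    df["valid"] = df[checked].notna().all(axis=1)
    return df
```

In practice, the share of rows flagged as valid gives a simple measure of how much of a raw feed can actually be trusted before any further analysis or billing is built on top of it.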

As the IoT evolves and the number of sensors and data collection points skyrockets, the situation will become even more challenging, as these huge quantities of unstructured data will need to be automatically cleaned and processed in order for any real value to be extracted.

Only by using data cleaning tools and techniques specifically designed to deal with this unstructured data can data centre organisations be sure that the data they base their strategic decisions on, and potentially even bill their customers on, is accurate and validated.

At the same time, to ensure visibility and stronger performance across a large number of facility sites (of any type and age), data centre operators need to find cloud-based solutions that combine predictive modelling with big data analytics, as this combination of technologies has been independently proven to validate data to an accuracy of greater than 98%.

>See also: The rise of data centres in the modern media landscape

These technologies enable both on-site and remote teams to access consistent, reliable analytics, diagnostics and optimisation data. Equipped with these capabilities, teams can ensure that:

• Site specific knowledge is captured and ‘digitised’ within the predictive model
• BMS, EPMS and DCIM data is presented in a single, coherent interface, regardless of source
• Detailed knowledge of set-points, control intents and target ranges is encoded in, or calculated by, the platform, and data is constantly tested against this knowledge
• Site data is constantly evaluated against the digital model (a simple sketch of this check follows the list)
• Sites operate exactly to target, de-risking events and making deviations more easily visible
• Remote experts can rapidly gain context and diagnose for, or with, the site team
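As a rough sketch of what constantly evaluating site data against the digital model might look like in code, the Python example below compares live readings with model-derived targets and tolerances and flags any deviations. The class, metric names and threshold values are illustrative assumptions, not any specific platform’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Expectation:
    """Expected value and tolerance for one metric, taken from the site model."""
    metric: str
    target: float
    tolerance: float  # acceptable deviation before the metric is flagged

def evaluate_site(readings: dict[str, float], expectations: list[Expectation]) -> list[str]:
    """Compare live telemetry against the model's targets and report deviations."""
    deviations = []
    for exp in expectations:
        actual = readings.get(exp.metric)
        if actual is None:
            deviations.append(f"{exp.metric}: no data received")
        elif abs(actual - exp.target) > exp.tolerance:
            deviations.append(
                f"{exp.metric}: measured {actual:.2f}, expected "
                f"{exp.target:.2f} +/- {exp.tolerance:.2f}"
            )
    return deviations

# Example: a chilled-water supply temperature drifting outside its target range
# is surfaced before a traditional high-temperature alarm would trigger.
model_targets = [Expectation("chilled_water_supply_c", target=14.0, tolerance=1.0)]
print(evaluate_site({"chilled_water_supply_c": 16.2}, model_targets))
```

Running this kind of check continuously, across every site against its own model, is what makes deviations visible well before they escalate into alarms.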

>See also: Challenges and opportunities facing data centres in the enterprise today

By capturing site-specific and technical knowledge within a digital model for each site, allowing the combined teams to manage their facilities proactively, data centre operators are able to:

• Reduce management cost per site
• Lower operational risk by constantly guiding site teams back toward optimal operation and identifying issues before they are visible to traditional alarm/alerting systems
• Drive and measure global consistency
• Rapidly identify the underlying causes of operational performance issues
• Optimise energy and cost performance effectively

In conclusion, data centre operators that want to gain visibility over a large network of data centres need to make sure they have a reliable and uniform analysis system across the sites they manage – one that not only collects raw data but also cleans, verifies and analyses it, and makes accurate modelling predictions based on real-world data.

 

Sourced by Zahl Limbuwala, co-founder & CEO, Romonet
