Bringing web-scale IT to the corporate data centre

Despite almost universal adoption of server virtualisation, data centres everywhere continue to rely on complex, inflexible and expensive compute and storage platforms to support business-critical applications. In its favour, this approach is tried and tested and known to work.

Against it, this approach severely limits scalability, discourages change and handicaps companies seeking to exploit new IT developments and ways of working.

Recognising this problem, the big names in cloud computing long ago abandoned proprietary solutions in favour of their own, infinitely more scalable and flexible, software-defined infrastructures, hosted on cheap, easy-to-deploy commodity hardware.

In the process, they also revamped their IT processes and operations, as well as their organisations, to achieve the agility that business demands. This new architectural approach to buying, deploying and managing infrastructure is called web-scale IT.


It was certainly a huge gamble and a major investment even given the deep pockets of the companies concerned, but the benefits have been enormous. By doing everything possible in software, the likes of Amazon, Facebook and Google have been able to build IT platforms able to deliver on-demand computing and storage to millions of users worldwide.

We are all consumers of these web-scale platforms and we all know how effective and scalable they can be. More than that, this kind of web-scale approach is now gaining traction amongst smaller enterprises keen to benefit from the predictable scalability and business agility it confers as well as a much lower TCO.

Smaller companies typically lack the financial and technical resources of the big cloud companies or are unwilling to take the risk of disrupting their IT environments.

For these companies, turnkey enterprise solutions that use web-scale technologies deliver benefits that fit not just their needs, but their technical resources and budgets, eliminating the need for a complete overhaul.

Web-scale, now

The principles and architectures underlying web-scale IT infrastructure are well understood. The hardware used in web-scale systems is essentially unremarkable and readily available already.

Moreover, it doesn’t have to involve the use of large multi-processor servers, complex and expensive SANs or anything which might be viewed as proprietary. Just basic x86 servers combined in large numbers to build massively scalable computing arrays.

The concept is very simple. Intelligent software pools and aggregates the compute power of individual x86 servers to create the abstraction of a compute fabric.

When more computing power is required, there’s no need to re-architect the infrastructure, replace or upgrade existing servers. Just buy more of the same and add extra compute nodes to the mix.
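The scale-out idea described above can be sketched in a few lines. This is a purely illustrative model, not any vendor's actual API: the `Node` and `ComputeFabric` names are hypothetical, and the point is simply that capacity grows by adding identical nodes to a pool rather than by re-architecting anything.

```python
# Hypothetical sketch of a scale-out compute fabric. All names here are
# illustrative; no real product API is being modelled.

class Node:
    def __init__(self, name, cpus, ram_gb):
        self.name, self.cpus, self.ram_gb = name, cpus, ram_gb

class ComputeFabric:
    def __init__(self):
        self.nodes = []

    def add_node(self, node):
        # Scaling out is just adding another identical node to the pool;
        # existing nodes are neither replaced nor upgraded.
        self.nodes.append(node)

    @property
    def total_cpus(self):
        return sum(n.cpus for n in self.nodes)

    @property
    def total_ram_gb(self):
        return sum(n.ram_gb for n in self.nodes)

fabric = ComputeFabric()
for i in range(3):
    fabric.add_node(Node(f"node-{i}", cpus=16, ram_gb=128))
print(fabric.total_cpus, fabric.total_ram_gb)  # 48 384

# Need more computing power? Buy more of the same and add it to the mix.
fabric.add_node(Node("node-3", cpus=16, ram_gb=128))
print(fabric.total_cpus, fabric.total_ram_gb)  # 64 512
```

The aggregate capacity visible to applications is simply the sum over the pool, which is why growth in small, predictable increments works.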


Moreover, if one node fails, another can simply take its place while the faulty node is either fixed, ditched or replaced.

Similarly, software pools and aggregates the drives within the x86 servers into a single logical pool of shared software-defined storage that can be scaled seamlessly in small increments when needed.
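The two behaviours just described, pooling local drives into one logical store and tolerating a node failure, can be sketched together. Again this is a toy model with made-up names, not a real software-defined storage layer: the pool's capacity is just the sum of the drives on healthy nodes, so a failed node drops out while the rest carry on.

```python
# Illustrative sketch of a software-defined storage pool built from the
# local drives of commodity nodes. Class and method names are hypothetical.

class StorageNode:
    def __init__(self, name, drive_sizes_tb):
        self.name = name
        self.drive_sizes_tb = drive_sizes_tb
        self.healthy = True

class StoragePool:
    def __init__(self, nodes):
        self.nodes = list(nodes)

    @property
    def capacity_tb(self):
        # Only healthy nodes contribute to the single logical pool.
        return sum(sum(n.drive_sizes_tb) for n in self.nodes if n.healthy)

    def fail_node(self, name):
        # A faulty node is simply marked out of service; it can later be
        # fixed, ditched or replaced without re-architecting the pool.
        for n in self.nodes:
            if n.name == name:
                n.healthy = False

pool = StoragePool([StorageNode(f"node-{i}", [2, 2, 2, 2]) for i in range(4)])
print(pool.capacity_tb)  # 32

pool.fail_node("node-2")
print(pool.capacity_tb)  # 24 -- the pool shrinks, service continues
```

In a real system the software would also re-replicate the failed node's data across the survivors; the sketch shows only the capacity-pooling behaviour.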

When it comes to storage, the big cloud companies have mostly binned the traditional SAN (Storage Area Network) which is seen as overly complicated and a potential bottleneck when it comes to massively scalable computing platforms.

SANs also require specialist management and expertise to bridge the gap between the LUNs used to provision storage and the virtual disks used by applications that consume them.

The end result is a massively parallel distributed system that is resilient enough to support always-on operation and can be scaled predictably without limits. Extensive automation and rich analytics eliminate the need for manual, error-prone management, lowering costs and enabling agility.

From concept to reality

Despite being a relatively new idea, web-scale IT is already at something of a tipping point, with Gartner predicting that, by 2017, it will be deployed by around half of all enterprise data centres, up from under 10 per cent in 2013.

A range of approaches is available to companies looking to bring web-scale architectures into their data centres, with varying levels of customisability, risk and skill requirements.

One approach is to do as the web companies do, and handcraft a custom IT environment from the ground up using web-scale technologies. Companies building IT environments this way need to change not only their infrastructure architectures and technologies, but also their operational processes, organisational structures and skill sets.

Done right, this offers the full potential of web-scale IT to the enterprise. But it also entails the highest level of change and risk for traditional businesses.

Enterprises that use technology at the core of their business or as a source of competitive advantage can use this model.

A second approach is to use hardened, enterprise-packaged offerings of open-source web-scale tools to build agile, scalable environments for a section of the IT environment while still using traditional IT for the rest.

An example of this approach is the emergence of big data groups within large companies that use tools like Hadoop and NoSQL databases to capture, process and analyse large volumes of data to generate actionable insights.
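To make the Hadoop-style processing model concrete, here is a toy map/reduce word count in plain Python. No Hadoop APIs are used and the input data is invented; this shows the pattern that such platforms distribute across many nodes, not the platform itself.

```python
# Toy map/reduce-style word count, illustrating the processing pattern
# that Hadoop parallelises across a cluster. Data is invented.

from collections import Counter
from itertools import chain

records = [
    "web scale it",
    "scale out not up",
    "web scale storage",
]

# Map phase: emit a (word, 1) pair for every word in every record.
mapped = chain.from_iterable(((w, 1) for w in r.split()) for r in records)

# Reduce phase: sum the counts per word.
counts = Counter()
for word, n in mapped:
    counts[word] += n

print(counts["scale"], counts["web"])  # 3 2
```

On a real cluster the map tasks run on the nodes holding the data and the reduce step merges partial counts, but the logic is the same.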

This approach allows companies to use web-scale IT selectively where it makes sense, while still keeping their systems of record on traditional infrastructure. While this approach makes web-scale more accessible and acceptable to mainstream enterprises, it still requires hiring for specialised skillsets to handle these radically different environments.

Companies need to invest in learning and managing web-scale technologies that are very different from what they are used to. The cost and complexity are often prohibitive for traditional enterprises.


Lastly, there is the approach that allows enterprises to embrace web-scale principles and architectures without revamping their IT environments or learning new skills. A few companies are attempting to bring the essence of web-scale principles and architectures to enterprise customers as turnkey solutions.

Vendors such as Nutanix, for example, have developed solutions based on commodity compute, storage and networking hardware, delivered together in a low-cost, appliance-like converged infrastructure format.

Pre-configured with the necessary software-defined storage layer and able to integrate with multiple hypervisor platforms, these web-scale solutions reflect the approach taken by the large cloud companies in that they don’t rely on the use of legacy technologies.

Neither do they require specialist management tools or expertise. By distributing compute, storage and networking resources across a multi-node network, they make it possible to bring web-scale into the data centre and deliver the proven benefits of this approach to companies of any size for any workload.


Sourced from Suda Srinivasan, Nutanix

