Hyperconverged infrastructure: what’s all the hype?

A cornerstone of today’s digital business is the constant growth of data. We have seen the mobile revolution, and we are now witnessing the IoT revolution. IT leaders must consider, more than ever before, how best to store, manage and use this data to drive their organisation’s strategic goals. Is hyperconverged infrastructure the answer?

IT has traditionally run on a three-tier architecture consisting of compute, storage and networking. However, this model cannot respond quickly enough to the proliferation of new applications.

This lack of agility is why many organisations have considered the public cloud. For many, though, the cloud comes with a fear of losing control of their infrastructure, unpredictable costs and security concerns.

Beyond the cloud, new technologies like hyperconverged infrastructure are promising improved efficiency, scaling and management breakthroughs.

>See also: The promise of storage and IT infrastructure in 2018

Despite being in its infancy, hyperconverged infrastructure is getting a lot of attention from IT leaders. According to research from IDC, full-year sales of hyperconverged systems surpassed $3.7 billion in 2017, up 64.3% from 2016.

What is hyperconverged infrastructure?

Hyperconvergence is an approach to IT infrastructure that various vendors promise is more flexible and simpler to manage than legacy solutions.

Hyperconverged infrastructure (HCI) differs from converged infrastructure: the latter simply combines storage, networking and server technology, along with a software stack, into a single framework.

HCI builds on this model by reducing the data centre footprint: it combines storage, networking, compute and backup into a single unit, controlled by a piece of software that virtualises each of these components. This fully software-defined infrastructure essentially puts all the components of a data centre in a single box.

Hyperconverged systems are modular and can be scaled out by adding additional nodes. Hence the promise of low prices per unit and the claim that they are easy to roll out and manage.

>See also: What is the right storage software needed for DevOps to be a success?

Is it right for your organisation?

For small to medium-sized organisations, HCI has many advantages, including a simplified network design. Take, for example, a growing start-up that needs to roll out infrastructure to keep up but lacks funds and expertise. With HCI it can scale one node at a time, which is cost effective and arguably simpler than traditional systems, as adding a node is a case of “plug in and play.”

HCI’s flexibility and scalability also has benefits for larger organisations that want to set up new branches or remote offices, for example.

However, while this modular approach is cutting edge, it has also received criticism, mainly because it can result in wasted resources, especially when the economies of scale are considered.

Under HCI, the only way to add more compute is to buy another node, and that node will also contain storage and network resources the organisation may not need. There is also a fear that at some point the economies of scale will become untenable, which is why traditional storage systems have been sold on a scale-up model.

This model saves money for smaller enterprises, but for larger enterprises that will need to buy many HCI nodes, cost efficiency will not be a credible selling point.
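The trade-off can be illustrated with a short sketch. All figures here are hypothetical; the point is only that when each node bundles compute and storage in a fixed ratio, scaling for compute alone strands storage capacity you never asked for.

```python
# Illustrative sketch (hypothetical node sizes): each HCI node bundles
# compute and storage together, so a compute-driven scale-out buys
# storage whether or not the workload needs it.

NODE_CORES = 32       # compute per node (assumed)
NODE_STORAGE_TB = 50  # storage per node (assumed)

def nodes_needed(required_cores: int) -> int:
    """Nodes required to satisfy a compute target (ceiling division)."""
    return -(-required_cores // NODE_CORES)

def stranded_storage(required_cores: int, required_storage_tb: int) -> int:
    """Storage bought but unused when scaling is driven by compute."""
    n = nodes_needed(required_cores)
    return max(0, n * NODE_STORAGE_TB - required_storage_tb)

# A compute-heavy workload: 320 cores but only 100 TB of storage.
print(nodes_needed(320))           # 10 nodes
print(stranded_storage(320, 100))  # 400 TB of storage sits idle
```

At small scale the overhang is negligible; at hundreds of nodes it is exactly the wasted resource the critics point to.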

>See also: Building a smart storage strategy

However, there are various HCI systems out there, and the market is constantly growing. Each system is different and has its own pros and cons. If you are thinking about adopting HCI, make sure it is the right one for your organisation. Some systems, for example, support only a single hypervisor; if you want to avoid vendor lock-in, avoid these models.

Data-efficiency under HCI

Arguably, a scale-out design is beneficial when it comes to looking after your data: spreading data across different nodes in a data centre, or even across different data centres, can provide resilience.
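A minimal sketch of that idea, assuming a simple round-robin placement policy (the node names and block IDs are hypothetical, and real HCI software uses more sophisticated placement):

```python
# Hypothetical sketch: place replicas of each data block on distinct
# nodes so that losing any single node still leaves a copy readable.

def place_replicas(blocks, nodes, replicas=2):
    """Round-robin each block's replicas across distinct nodes.

    Assumes replicas <= len(nodes), so the copies of a block
    always land on different nodes.
    """
    placement = {}
    for i, block in enumerate(blocks):
        placement[block] = [nodes[(i + r) % len(nodes)] for r in range(replicas)]
    return placement

nodes = ["node-1", "node-2", "node-3"]
layout = place_replicas(["blk-a", "blk-b", "blk-c"], nodes)
# Each block ends up with copies on two different nodes, so any
# single node failure leaves every block available.
```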

In terms of data recovery, the software-defined system at the heart of HCI means that public cloud storage can be utilised. On top of this, many vendors provide a variety of dedicated data-recovery solutions.

Andrew Ross

As a reporter with Information Age, Andrew Ross writes articles for technology leaders; helping them manage business critical issues both for today and in the future