Three things essential to the future of edge computing

Of all the developments set to reshape IT infrastructure, one of the most disruptive will likely be the widespread adoption of edge computing. The edge model sees organisations bring computing and storage resources out of the data centre and closer to the locations where data is generated and needed. By 2022, there will be an estimated 55 billion edge devices on the market; by 2025, this is expected to grow to 150 billion.

With the growing amount of data that businesses and enterprises hold in their cloud systems, along with the heavily data-intensive workflows that AI and 5G bring, many businesses are under increased pressure to move towards the edge model. As with most successful digital transformation initiatives, however, a successful edge transition has certain prerequisites.

There are three prerequisites organisations must pay particular attention to if they are to avoid stumbling in their edge journey: standards built on open technologies, use of the hybrid cloud, and a commitment to scaling up from the outset.

1. Standardisation on open technologies

At its core, edge computing relies on geographically disparate pieces of equipment being able to talk to one another seamlessly. This could be compute or storage nodes communicating among themselves, or those nodes talking to the sensors and machinery that collect, or act on, an edge network's data. Whatever the mix, edge infrastructure depends on all of these components being able to interact reliably.

Geographic separation also tends to bring a diversity of equipment. Whether because of supplier availability or adaptations to the local area, the most efficient edge infrastructure is one that can accommodate a variety of technologies. In practice, the marketplace pressure to accommodate this diversity is often unavoidable for larger operators of edge networks, especially those that wish to avoid lock-in with a particular vendor.

To make a diverse and disparate edge network viable, organisations need to adopt open technologies. Standardising on open source software and hardware, and on the open interfaces and data formats through which they interact, is ultimately the only way to guarantee that every component in a diverse and distributed edge network can work with its counterparts.
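
To make the idea concrete, here is a minimal Python sketch of that interoperability pattern: readings from two different vendors' sensors are normalised into one open, documented JSON format that any node on the network can consume. The vendor names, field names and schema below are hypothetical illustrations, not a specific product's API.

```python
import json
from datetime import datetime, timezone

# Hypothetical vendor payloads: the same physical reading, reported differently.
vendor_a_payload = {"temp_f": 71.6, "dev": "A-1001", "ts": "2021-06-01T12:00:00Z"}
vendor_b_payload = {"temperature_c": 22.0, "device_id": "B-7", "time": 1622548800}


def normalise_vendor_a(payload: dict) -> dict:
    """Map vendor A's proprietary fields onto the shared open schema."""
    return {
        "device": payload["dev"],
        "metric": "temperature_c",
        "value": round((payload["temp_f"] - 32) * 5 / 9, 2),
        "timestamp": payload["ts"],
    }


def normalise_vendor_b(payload: dict) -> dict:
    """Map vendor B's proprietary fields onto the shared open schema."""
    return {
        "device": payload["device_id"],
        "metric": "temperature_c",
        "value": payload["temperature_c"],
        "timestamp": datetime.fromtimestamp(payload["time"], tz=timezone.utc).isoformat(),
    }


# Any consumer on the edge network only ever sees the common schema,
# regardless of which vendor's equipment produced the reading.
for reading in (normalise_vendor_a(vendor_a_payload), normalise_vendor_b(vendor_b_payload)):
    print(json.dumps(reading))
```

The point of the sketch is that the adapters live at the boundary: once a common, openly specified format exists, new vendors can be added without touching any of the components that consume the data.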

2. Using the hybrid cloud

In practice, a sufficiently scaled edge network is going to be a fusion of many different workloads operating in concert with one another. Among other things, you can expect your edge infrastructure to be running virtual machines, containers, and bare-metal nodes hosting network functions. With particularly data-intensive workloads such as AI demanding microservice architectures, edge computing needs to reconcile these complex tasks with more traditional and routine workloads.

This is where the hybrid cloud becomes essential to the edge computing paradigm. A hybrid cloud deployment creates a common foundation for an edge system, which in turn allows teams to manage thousands of networked devices just as they would a centralised server. A hybrid cloud architecture's inherent diversity also helps organisations avoid the spectre of vendor lock-in.
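
As a hedged illustration of that management model, the Python sketch below describes a fleet declaratively, with one desired state and a single reconciliation loop that nudges every node towards it, whether the fleet has three nodes or three thousand. The node names, settings and "apply" step are hypothetical placeholders, not a real management plane's API.

```python
# Sketch of declarative, fleet-wide management: one desired state, one loop.
# Node names, settings and the apply step are hypothetical placeholders.
desired_state = {
    "container_runtime": "cri-o",
    "log_forwarding": "enabled",
    "workloads": ["telemetry-agent", "anomaly-detector"],
}

# In a real deployment this inventory would come from a management plane,
# not a hard-coded dictionary.
fleet = {
    "edge-site-01": {
        "container_runtime": "cri-o",
        "log_forwarding": "disabled",
        "workloads": ["telemetry-agent"],
    },
    "edge-site-02": dict(desired_state),  # already compliant
}


def diff(current: dict, desired: dict) -> dict:
    """Return the settings on a node that differ from the desired state."""
    return {key: value for key, value in desired.items() if current.get(key) != value}


def reconcile(fleet: dict, desired: dict) -> None:
    """Bring every node in the fleet towards the same desired state."""
    for node, current in fleet.items():
        changes = diff(current, desired)
        if not changes:
            print(f"{node}: in sync")
            continue
        current.update(changes)  # stand-in for actually applying configuration
        print(f"{node}: applied {sorted(changes)}")


reconcile(fleet, desired_state)
```

The design choice worth noting is that operators describe the outcome once, rather than issuing per-device commands; that is what makes a distributed fleet feel like a single, centrally managed system.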

3. Attention to scale

One of the main strengths of edge computing is its ability to scale, both geographically and in terms of the workloads it handles. Adopting open standards and hybrid cloud infrastructure is integral to enabling the edge to scale, so that it can accommodate new workloads and systems without a hitch.

Alongside the fundamentals of open technologies and the hybrid cloud, however, organisations need to ensure that their edge infrastructure is built with the intent to scale. This means that architectures and resources should be structured and planned to accommodate new technologies, and that teams are able to recognise, address and mitigate the inevitable challenges that arise when scaling a network up. One example of where this approach pays dividends is security planning: working out the structure of your permissions system ahead of time is always going to be far easier than replacing an ad-hoc structure that is no longer fit for purpose.
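
A minimal sketch of what planning that permissions structure up front can look like is shown below: roles and the actions they allow are defined as data, so a new site or a new operator slots into the existing structure rather than accumulating ad-hoc, per-device rules. All role, action and user names here are hypothetical.

```python
# Role-based access sketch: permissions are planned as data up front,
# so adding a new site or operator never requires ad-hoc, per-device rules.
# All role, action and user names here are hypothetical.
ROLES = {
    "site-operator": {"read_telemetry", "restart_workload"},
    "fleet-admin": {"read_telemetry", "restart_workload", "deploy_workload", "manage_users"},
    "auditor": {"read_telemetry"},
}

ASSIGNMENTS = {
    "alice": "fleet-admin",
    "bob": "site-operator",
}


def is_allowed(user: str, action: str) -> bool:
    """Check a request against the planned role structure."""
    role = ASSIGNMENTS.get(user)
    return role is not None and action in ROLES.get(role, set())


print(is_allowed("bob", "restart_workload"))  # True
print(is_allowed("bob", "deploy_workload"))   # False: not part of the site-operator role
```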

The value-add of the edge is set to be game-changing for many organisations, enabling next-generation technologies and applications to deliver huge performance and societal benefits. By adopting open technologies, embracing the hybrid cloud, and planning to scale from the outset, the edge computing future can deliver on these promises, while also improving quality of life for your team by making the edge resilient and scalable.

Written by Martin Percival, solutions architect at Red Hat
