Why edge computing is forcing us to rethink software architectures

The early 21st century saw a remarkable shift in computing. The rapid and dramatic move to the cloud was precipitated by IaaS (infrastructure-as-a-service) offerings such as AWS providing effectively unlimited access to computing hardware. This first generation brought two key advantages, both centred on democratising a capability that had traditionally carried a high bar to adoption.

The first was the ability to use the familiar operating systems and development tools of everyday users, now at cloud scale. Skills and techniques developed and honed on personal computers running operating systems such as Linux and Windows were readily applicable to the cloud. Neither the mainframe skills of old nor current HPC (high-performance computing) skills were required, or even useful. ‘Throwing hardware at the problem’ became the common strategy.

The second was the inexpensiveness of the ‘commodity hardware’ that characterised the first generation of cloud. Hardware became far cheaper than programmers, and paying by the hour for what you use, rather than upfront, was a particularly attractive proposition. Eliminating the upfront capital expenditure on expensive computer systems or data centre capacity opened up technology innovation at a whole new level, for corporates and startups alike. The original Google was built on a data centre rack in the founders’ garage office. That was no longer the price of admission.

Ever smaller teams could now build, test, release, and iterate on applications that would traditionally have been prohibitively complicated. The smallness of these teams meant that lean use of people and time was valued far above the correctness, robustness, and efficiency of the software. “Cheap hardware, expensive programmers,” became the dictum. It is therefore no accident that the prevailing approach was to add layers on top of an existing stack, patch issues at a higher level of abstraction, and paper over problems rather than taking a rigorous approach to solving them at the root. An old aphorism, humorously called the “fundamental theorem of software engineering,” rings true here: “All problems in computer science can be solved by another level of abstraction… except for the problem of too many layers of abstraction.”


The perspective on cloud hardware has since shifted. The current generation of cloud focuses on expensive, high-performance hardware rather than cheap, commoditised systems. For one, cloud hardware and data centre architectures are morphing into something resembling an HPC system or supercomputer. Networking has followed the same route: technologies such as InfiniBand EDR and photonics are paving the way for ever greater bandwidth and tighter latencies between servers, while backbones and virtual networks have improved the bandwidth between geographically distant cloud data centres.

The other shift currently under way is in the layout of the platforms themselves. The cloud is morphing and merging into edge computing environments, with data centres deployed in a far more decentralised and distributed fashion. Traditionally, an entire continent might be served by a handful of cloud data centres; edge computing moves these resources much closer to the end-user, virtually to every city or major town. The edge data centres of every major cloud provider are now integrated into its backbone, providing a sophisticated, geographically dispersed grid.
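A back-of-envelope calculation (my illustration, with assumed distances) shows why this proximity matters: light in optical fibre travels at roughly 200,000 km/s, so physical distance sets a hard floor on round-trip latency before any processing happens.

```python
# Best-case propagation delay over optical fibre.
# Assumes ~200,000 km/s (vacuum light speed divided by fibre's ~1.5
# refractive index); real networks add routing and queueing on top.
FIBRE_SPEED_KM_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Best-case fibre round-trip time in milliseconds."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

# A continental data centre ~3,000 km away vs an edge site ~50 km away
# (illustrative distances, not measurements).
print(round_trip_ms(3000))  # 30.0 ms
print(round_trip_ms(50))    # 0.5 ms
```

Two orders of magnitude in the latency floor is the gap that real-time applications feel, and it can only be closed by moving the compute, not by faster software alone.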

Despite this fundamental progress on the hardware side, the software problems stand out more starkly than ever. The operating systems, programming tools and frameworks used to build on the cloud are mutations of what worked in the first generation of cloud, and before cloud computing altogether. Applications such as video conferencing and streaming, real-time virtual events, 3D gaming, AR/VR, and swarms of IoT devices demand tremendous computing power, bandwidth and low latencies to keep pace. The Covid-19 pandemic, and the resulting need to move much more of life online, has accelerated the issue. The hardware is available, but the software platforms and applications are buckling under the pressure.

Tech companies are scrambling to invest in their online collaboration platforms, video conferencing systems, and virtual environments. But the real expense is the opportunity cost of not being able to innovate quickly enough for this new world. Pandemic aside, the innovation enabled by cloud and edge computing infrastructure is being held back by the software used to develop applications and scale them to user demand.

Written by Dr. Rashid Mansoor, CTO of UK supercomputing startup, Hadean
