Digital transformation is radically reshaping the way organisations across the globe do business. Empowered by DevOps practices, IT teams are helping to drive down costs, enhance agility and usher in a new era of innovation-fuelled growth. But what drives DevOps? Increasingly, the answer is containers, viewed by many as a major evolution in cloud computing because they provide scalability and flexibility where developers need them most. Yet for the enterprise architects tasked with maintaining IT infrastructure, the “dream” of containers can very quickly turn into a nightmare.
Container sprawl and interoperability issues with legacy technology, including centralised databases, threaten to undermine the DevOps project – and with it, the digital transformation efforts now so vital to business growth.
The beauty of containers
Containers could be described as the modern building blocks of cloud computing. Like virtual machines (VMs), they provide a neat, self-contained package in which developers can run their applications, libraries and other dependencies. In so doing, containers offer a consistent, predictable environment isolated from other applications. However, they’re more lightweight and have lower associated overheads than virtual machines, enabling them to be deployed quickly and easily at large scale across private, public and hybrid cloud environments.
It’s no surprise, then, that containers have garnered such positive press in recent years. The ability to set up test environments quickly and easily, and to scale them up to full production if necessary, is a tantalising prospect for developers. It’s claimed that over 80% of IT teams used containers in 2018, up from just over half (58%) the year before. Google alone says it starts over two billion containers each week.
From dream to nightmare
However, the rapid adoption of containers is exposing a growing rift in IT architecture between stateless application workloads running in container environments and stateful application workloads running on more traditional infrastructure. As container orchestration tools such as Kubernetes have given organisations greater control over their container environments, businesses have begun to see the benefits of stateless applications — from enabling an online-first approach to services, to easier scalability and redeployment, and the ability to connect multiple applications into services using APIs.
Yet having taken full advantage of containers, organisations now face the opposite challenge from their legacy IT. Quite simply, architecture built for stateful applications cannot match the flexibility, agility and rapid evolution that is now possible. For instance, stateful applications will often exist in silos, with their own independent network, policies and infrastructure – meaning it is much harder to scale them without directly adding to that infrastructure, or to connect them with other applications using APIs. As a result, architects face an all-too-common nightmare of running without actually moving: despite the investment and energy put into building and improving legacy applications and their databases, the potential of stateless applications keeps accelerating over the horizon.
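The stateful/stateless distinction above can be sketched in code. This is a minimal, hypothetical illustration (the class names are invented for this example, and a plain dictionary stands in for a real external database): a stateful service keeps its data in process memory, so two container replicas drift apart, while a stateless service externalises its data, so any replica can handle any request.

```python
# Hypothetical sketch: why stateless services scale across container replicas.

class StatefulCounter:
    """Keeps state in process memory: cloning the container
    forks the data, so replicas diverge."""
    def __init__(self):
        self.hits = {}

    def handle(self, user):
        self.hits[user] = self.hits.get(user, 0) + 1
        return self.hits[user]


class StatelessCounter:
    """Holds no state of its own: every request reads and writes a
    shared external store, so any replica gives the same answer."""
    def __init__(self, store):
        self.store = store  # stand-in for a centralised database or cache

    def handle(self, user):
        self.store[user] = self.store.get(user, 0) + 1
        return self.store[user]


if __name__ == "__main__":
    shared_store = {}  # stand-in for an external database
    # Two "replicas" of the stateless service agree on the count...
    a, b = StatelessCounter(shared_store), StatelessCounter(shared_store)
    a.handle("alice")
    print(b.handle("alice"))  # -> 2
    # ...while two stateful replicas each start from scratch.
    c, d = StatefulCounter(), StatefulCounter()
    c.handle("alice")
    print(d.handle("alice"))  # -> 1
```

In the stateless version, an orchestrator such as Kubernetes can freely add, remove or restart replicas, because none of them owns the data; that is precisely the property a siloed, stateful legacy application lacks.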
It’s clear that architects need to bridge this gap – as the longer they leave it, the wider and harder to cross it will become. The task will have to be delicate. The new lightweight approach containers allow is at odds with the traditional, monolithic approach of legacy databases and infrastructure. At the same time, simply replacing a legacy database with a more modern alternative is not an easy answer. That database will doubtless support applications that are absolutely critical to the business, and there is no guarantee that a more modern NoSQL database will automatically support containers.
Orchestrating DevOps success
The good news is that there is light at the end of the tunnel. Modern databases are being designed to operate seamlessly with new container orchestration tools like Kubernetes — allowing architects to more easily manage how containers connect with centralised databases in the cloud. With these tools to hand, architects can finally take a holistic approach to IT infrastructure, ensuring each component works well together.
The challenge for architects will be understanding which of their applications need to be moved from stateful to stateless quickly, to keep pace with the evolution of containers, and which can stay in their legacy environment because they are at no risk of becoming obsolete. For instance, finance and payment functions, whose prime concern is performing the exact same action consistently, quickly and transparently, could remain on their legacy database, while anything that affects the customer or end-user experience should be modernised so that it can evolve at the same rate as customer demands. Over time, almost all of the applications in a business will be built on containers. If they can manage this evolution, architects can ensure that containers remain both a DevOps dream and an architect’s best friend.
IT architects have an increasingly challenging role in the organisation: they are not only tasked with keeping the lights on, but also with providing the right environment to drive innovation-fuelled success. Containers are just the latest advance in technology to test their ability to keep pace with DevOps teams, and there will surely be more ahead. To keep adding value to the business, architects must continue to evaluate ways to integrate existing and emerging technologies.