Out of the box: a peek at the future of containerisation in enterprise

Since launching in 2013, open source cloud containerisation engine Docker has seen explosive growth. The concept of containerisation is not new – it’s been around for many years as a solution to the problem of how to get software to run reliably when moved from one computing environment to another.

But Docker has breathed new life into the idea by simplifying the process for the average developer and system administrator, giving them a standard interface and easy-to-use tools to quickly assemble composite, enterprise-scale, business-critical applications.

It may seem like just another approach to virtualisation, but unlike virtual machines (VMs), Docker does not require a full OS to be created. It can be thought of as ‘OS virtualisation’ to a VM’s ‘hardware virtualisation’.

While both approaches allow you to abstract the workload from the underlying hardware, with virtualisation admins usually wall apps off from each other by putting one app per virtual machine.

Containerisation technology like Docker takes a lighter-weight approach, deploying each application in its own container on the ‘bare metal’ of the server, with all containers sharing the single host OS. Because you no longer need to spin up a VM for every app, containers can be quicker to create and launch.
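As a rough illustration of that difference – a minimal sketch in which nginx simply stands in for any containerised application – launching an app in its own container on a shared host kernel is a single command, with no guest OS to boot:

```bash
# Pull a small web server image and run it in its own container.
# The container shares the host's Linux kernel; no guest OS is booted.
docker pull nginx:latest
docker run -d --name web1 -p 8080:80 nginx:latest

# A second instance of the same app is just another container,
# not another virtual machine.
docker run -d --name web2 -p 8081:80 nginx:latest
```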

As Forrester analyst Randy Heffner explained in his report Nine Questions to Ask About Docker, containers can enable a company to pack a lot more applications into a single physical server than a VM can.

> See also: Why the future of storage is software-defined

What this means in practice is that you can put two to three times as many applications on a single server with containers as you can with VMs.

‘To begin considering whether, when and how to add Docker to one’s technology estate, start with what is known about the benefits of OS-level containers for application delivery,’ says Heffner. ‘With traditional server virtualisation, developers would typically package our sample application into five separate VMs – each with a full copy of the Linux OS. To run and test the application, developers would start each separate VM and initiate specific application components on each.’

With Docker, however, developers would configure and run five Docker containers but have far more flexibility in how they are run: all five containers could run on a single copy of Linux. And for production deployment, the five could run on one copy of Linux or be split across multiple copies, according to scalability demands.
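As a sketch of that arrangement – the service names and `example/...` images below are hypothetical stand-ins for the report’s five components – all five containers can be started on one Linux host and connected over a shared Docker network:

```bash
# One Linux host, five containers -- no separate VM per component.
docker network create appnet

docker run -d --name db      --net appnet postgres:9.5
docker run -d --name cache   --net appnet redis:3
docker run -d --name backend --net appnet example/backend:1.0
docker run -d --name api     --net appnet example/api:1.0
docker run -d --name web     --net appnet -p 80:80 example/web:1.0
```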

‘Containers start much more quickly than traditional VMs,’ says Heffner. ‘This helps delivery teams maintain momentum as they move through incremental build, test and deployment cycles with Agile development and continuous delivery.’

With Forrester’s sample application, developers could configure alternative versions of each of the five components in different ways for different stages of testing, and then quickly switch between them by stopping and starting containers.
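A hedged sketch of that switching, with hypothetical image tags: alternative builds of a component are prepared as separate containers, and moving between test configurations is simply a matter of stopping one and starting the other.

```bash
# Two builds of the same component, prepared as separate containers.
docker create --name api-v1 example/api:1.0
docker create --name api-v2 example/api:2.0-rc1

# Test against version 1, then switch to the release candidate
# without rebuilding or re-provisioning anything.
docker start api-v1
docker stop  api-v1
docker start api-v2
```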

‘When deploying to production, DevOps teams could more quickly roll out changes to Docker-enabled VMs that are already running, and more quickly roll back changes that don’t work properly.’
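In plain Docker terms – again with hypothetical image names – a roll-out and roll-back can be as simple as replacing a container with one built from the previous image tag, since the old image is still on the host:

```bash
# Roll out: replace the running container with the new image.
docker stop web && docker rm web
docker run -d --name web -p 80:80 example/web:1.1

# Roll back: the previous image is still present on the host,
# so reverting is just starting a container from the old tag.
docker stop web && docker rm web
docker run -d --name web -p 80:80 example/web:1.0
```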

Don’t be caught out

The containerisation approach automates the packaging, shipping and deployment of apps, making them lightweight, portable and self-sufficient, with a finer degree of granularity and better resource utilisation on the host platform. For all these reasons, many are wondering whether it could make virtualisation obsolete.

Goldman Sachs recently announced an ambition to containerise 90% of its workloads. But despite all the hype, there is still a lack of meaningful use cases to drive business value. Businesses are finding limitations in a number of areas, especially around incurring unnecessary costs with cloud computing.

As Oscar Wahlberg, director of product management at software-defined storage specialist Nexenta, explains, ‘Container environments are incredibly easy to construct and scale up when extra capacity is needed, but if they are not then downsized or removed when they are no longer required, companies will continue to pay for the cloud usage – something that cloud service providers love.’
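On the Docker side, reclaiming that unused capacity is straightforward; the harder part is remembering to do it. A minimal sketch, with hypothetical container and image names:

```bash
# Stop and remove containers that are no longer needed...
docker stop worker-3 worker-4
docker rm   worker-3 worker-4

# ...and remove their images so they stop consuming storage.
docker rmi example/worker:1.0

# Any cloud VMs that were hosting them still need to be
# de-provisioned separately, or the bill keeps running.
```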

> See also: Containerisation: the winning strategy for BYOD mobility

And another big problem, which often gets overlooked in the excitement about containers, is security.

According to Black Duck Software, which helps organisations to secure and manage open source software, containers often bundle applications with a lot of software and files you may not know about or want in your production environment. As adoption of containers grows, says the firm, so does the security risk of potential open source vulnerabilities being hidden inside them.

‘If organisations don’t keep software stacks and application portfolios free of known, exploitable versions of open source code, any vulnerabilities present in the open source components of a container application can jeopardise its security,’ notes Randy Kilmon, vice president of engineering at Black Duck Software.

‘Visibility and control are critical for container security.’

What’s needed, Kilmon advises, is a three-step process of informed open-source code selection, continual vigilance by users and integrators of open source code, and ongoing code maintenance over the full life cycle of the containerised application.

‘In other words, it’s all about knowing your container’s code,’ he says, ‘especially any open source code you’re using, because you can’t manage what you can’t see.’
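A simple starting point for that visibility – assuming a Debian- or Ubuntu-based image, and with a hypothetical image name – is to list exactly what the container ships with:

```bash
# List the OS packages baked into the image -- often far more
# than the application itself needs.
docker run --rm example/app:1.0 dpkg -l

# Show how the image was assembled, layer by layer.
docker history example/app:1.0
```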

But Ben Wootton, VP of technology for Europe at DevOps specialist Sendachi, argues that there is a lot of unfounded fear around the issue.

Docker has spent the past few years focusing on the security of its containers and their life cycle.

‘All major Linux isolation APIs are now supported, and Docker incorporates a signing and verification workflow to ensure that containers are not tampered with on their path to production,’ he says.
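The signing and verification workflow Wootton refers to is exposed through Docker Content Trust. A minimal sketch, with a hypothetical repository name: enable it in the environment, after which pushes are signed and unsigned images are refused on pull.

```bash
# Enable Docker Content Trust for this shell session.
export DOCKER_CONTENT_TRUST=1

# Pushing now signs the image; signing keys are created on first use.
docker push example/web:1.0

# Pulling verifies the signature and refuses unsigned images.
docker pull example/web:1.0
```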

‘When instantiating your containers, there are various security settings that engineers should use to further lock down the access of their containers, and other practices to secure access to the daemon and container file system, but I believe that, fundamentally, Docker is a secure platform by default.’
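A hedged example of the kind of lock-down Wootton describes, using standard docker run options and a hypothetical image name:

```bash
# Root filesystem read-only, all Linux capabilities dropped except the
# one the app needs, running as a non-root user, with resource limits.
docker run -d --name web \
  --read-only \
  --cap-drop ALL \
  --cap-add NET_BIND_SERVICE \
  --user 1000:1000 \
  --memory 256m \
  --pids-limit 100 \
  -p 8080:80 \
  example/web:1.0
```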

The year of containers

As businesses realise the potential benefits of containerisation, legacy vendors tend to ‘add’ container technology to their existing solutions, meaning they may not be using the technology to its full extent.

‘This is just them attempting to jump on the bandwagon, and often means that they don’t make use of the technology’s full capabilities,’ argues Wahlberg. As such, he expects to see start-ups leading the way in developing the technology.

‘Younger firms are able to build their entire solution with container technology at the forefront of their thinking, sidestepping the incompatibility issues that inevitably arise when trying to bolt innovative tech on to legacy set-ups.’

Sendachi’s Wootton believes that containerisation will soon hit the Microsoft platform as Windows containers emerge from technical preview. ‘This will be incredibly powerful as, for the first time, we will be able to deploy and manage our Linux and Windows applications through the same consistent API.

‘Beyond that, it will also be the year of containers in the enterprise, as the enterprise moves forward from development and test into production.’

‘Finally,’ he says, ‘I am excited to see what happens with Docker now they have realised their ambition of a fully integrated Docker Datacentre stack that brings all of the tools from the development tool chain through into production.’

> See also: The difference between ‘cloud’ and ‘virtualisation’ – why cloud is the biggest misnomer in business

Wootton argues that, within the next two years, containerisation will overtake virtualisation, as it provides the same isolation and resiliency benefits as a virtual machine but in a much more lightweight fashion, while skipping a whole layer of tooling and the associated licence and management costs of virtual machines.

‘After this two- or three-year window, VMs will continue to live on for many years, as they are so widely deployed as an approach,’ he says. ‘Containers are undoubtedly the future, though.’

In the interim, however, the two can work together. Of course, not all software is containerised today, so it makes sense to provision virtual machines and then deploy containers onto them, or alongside them, during this transitional state.
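One common pattern for that interim – sketched here with a hypothetical machine name – is to use docker-machine to provision the VM and then treat it as an ordinary Docker host:

```bash
# Provision a VM that will act as a Docker host.
docker-machine create --driver virtualbox docker-host-1

# Point the local Docker client at that VM...
eval "$(docker-machine env docker-host-1)"

# ...and deploy containers onto it as usual, alongside whatever
# non-containerised workloads run in other VMs.
docker run -d --name web -p 80:80 example/web:1.0
```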

But in order to embrace the powerful building block of container technology in their organisations, business leaders will need to start out on a journey of understanding how and where best to use and deploy it.

Chloe Green
