Many organisations want to run workloads across multiple clouds, through multiple vendors, to achieve innovation and flexibility in the digital world. Interoperability can make this a reality.
The Interop Challenge, set by IBM in April this year, sought to establish that competitors, working together, could achieve interoperability on OpenStack, the open-source cloud platform.
In total, 18 OpenStack cloud vendors successfully completed the community Interop Challenge issued by IBM Cloud. ‘We were only expecting 4 or 5 when we issued the challenge,’ said Don Rippert, general manager for IBM cloud strategy, business development and technology, during his keynote yesterday morning.
What is interoperability?
Simply put, interoperability is the ability to deploy and successfully run the same workload across multiple OpenStack clouds, from multiple vendors, simultaneously.
IBM issued the challenge after suggestions that the vendors supporting OpenStack would not collaborate to achieve interoperability.
>See also: Cloud data management: data protection
Those sceptics were proved wrong yesterday in Barcelona, as 18 cloud vendors, including IBM, AT&T, Canonical, Cisco, Deutsche Telekom, DreamHost, Fujitsu, Huawei, Hewlett Packard Enterprise (HPE), Intel, Linaro, Mirantis, OpenStack Innovation Centre (OSIC), OVH, Rackspace, Red Hat, Suse and VMware, rose to the challenge.
‘Nobody has doubted the innovation and integration capabilities within the OpenStack projects,’ said Rippert, ‘however some doubted whether the vendors supporting OpenStack would work together to achieve interoperability.’
‘Today with this significant milestone, we are proving to the world that cross-vendor OpenStack interoperability is a reality.’
The Interop Challenge involved deploying and executing an enterprise workload with automated deployment tools, demonstrating OpenStack’s capabilities as a cloud infrastructure that supports enterprise applications.
Indeed, in keeping with the open-source platform’s ethos, the collaborative effort saw usual competitors, in front of a packed hall, take a single application and run it across multiple OpenStack environments.
This is the first time competitors have collaborated to make enterprise cloud computing environments interoperable: something of a watershed moment, and certainly a defining one.
What’s the big deal?
Cloud interoperability means customers are protected from vendor lock-in and able to easily uproot workloads and move them to new cloud providers if they choose.
Basically, it gives users and customers the option to use private (on-premise) clouds and public clouds interchangeably, providing both flexibility and efficiency.
One of the main drivers behind open-source adoption is the fast rate of innovation and development that the platform champions.
Seamless interoperation, in turn, enables consumers to better leverage hybrid OpenStack cloud environments to drive innovation and choose whatever combination suits their requirements best.
It is very much an individual journey, with no one size fitting all.
‘What customers want from open source projects is innovation, integration and interoperability,’ said Rippert.
‘When it comes to OpenStack, our hope is that this demonstration of working interoperability will reduce customer fears of vendor lock-in.’
>See also: Cloud security – pie in the sky?
The success of the challenge confirms customers are able to source different interrelated components of their cloud solutions from different vendors of their choice to create the combined environment that best meets their needs.
‘The next step of making technology useful and successful to humans in open-source is interoperability,’ Dr Angel Diaz, vice president of cloud architecture and technology at IBM told Information Age.
The customer is the real winner
The customer is the real winner, stated Rippert during his speech, because customers had been missing out. Through the competitive collaboration of the Interop Challenge, they no longer are.
This competition is beneficial for vendors as well, claimed Rippert: ‘A rising tide gives vendors more surface area to compete in.’
Customers expect increased innovation, better integration and interoperability, and the OpenStack platform offers all three.
Materna’s senior vice president of IT factory, Uwe Scariot, said it ‘allows us to use the same functionalities and same APIs anywhere, across data centres and clouds’.
‘It provides freedom of choice by increasing the availability of skills. It increases the availability of their footprint across the world, across everybody’s data centre,’ said Diaz.
‘The world becomes their data centre.’
Interoperability allows workloads to be deployed automatically across multiple private and public clouds, and according to Rippert, it is key to empowering enterprises of all sizes in any open-source project.
The immediate future is in the multi-cloud
‘Multi-cloud is the reality today, hybrid is on the horizon,’ speculated Bobby Patrick, chief marketing officer at Hewlett Packard Enterprise (HPE).
The immediate prospect facing organisations is building a multi-cloud capability.
‘It’s a choice of consistency,’ Diaz told Information Age.
‘Our clients want to choose where they run an application or workload: on-premise (a private cloud close to their data centre, for high-speed transactions, for example) or on the public cloud, perhaps for geographical reach.’
‘They want to have that assurance that they can move that workload where they want. Not just within a single vendor, but across vendors too.’
The success of the Interop Challenge further demonstrates that a free flowing movement between private (on-premise) clouds and public clouds for OpenStack users is achievable.
Ultimately, this interchangeable capability between different types of cloud provides better value to organisations that utilise this open-source technology.