Can organisations be sure that their rapid journey into the cloud will continue to be trouble-free? The question is worth asking because in the space of little more than a decade cloud use has gone from utility computing to the current wide array of business models.
Now practically every application or solution claims to be “cloudy” to some extent, to the point that the term “cloud-washing” has entered the vocabulary.
The truth is that almost every business is utilising a cloud-based solution in one form or another, with much of the migration having been almost by stealth.
At its core, cloud promises greater flexibility through resource-elasticity to enable consumption to match the demands of each business operating on it.
The range of options has ballooned. Cloud storage comes in all kinds of packages based on price, performance or capacity and enables businesses to have access to their data and applications no matter where in the world they are.
Archive or backup data has expanded hugely, for example, as organisations migrate from tape-based alternatives. Equally, cloud-based solutions have quickly evolved to increase productivity and collaborative working through rapid data-sharing.
The cloud now requires more care
It is not all rosy, however. This proliferation of solutions can present a risk for each business, as unstructured data spreads into the cloud almost unnoticed, without the business being able to exercise any of the controls it would normally take for granted.
In addition, businesses migrating to the cloud often find it difficult to obtain concrete guarantees or commitments on storage performance, being faced instead with relative terms such as “good”, “better” or “best” to define performance levels.
There can be little in the way of quantified performance commitments, particularly where cloud is used for Disaster Recovery services, as providers often over-commit storage performance resources by factors of 10:1 or more.
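To see why a 10:1 over-commitment matters, consider a back-of-the-envelope sketch. Every figure below is a hypothetical assumption for the sake of the arithmetic, not drawn from any real provider:

```python
# Illustrative sketch: how storage-performance oversubscription erodes
# per-client throughput during a shared incident such as a regional DR event.
# All figures are hypothetical assumptions.

physical_iops = 100_000        # total IOPS the provider's storage can deliver
oversubscription = 10          # provider sells 10x what the hardware supports
sold_iops = physical_iops * oversubscription

clients = 50                   # hypothetical tenants, each sold an equal share
promised_per_client = sold_iops // clients          # what each client was sold
worst_case_per_client = physical_iops // clients    # what each gets if all contend

print(f"Promised per client:   {promised_per_client} IOPS")   # 20000
print(f"Worst case per client: {worst_case_per_client} IOPS") # 2000
# With 10:1 oversubscription, each client receives a tenth of what was sold
# if every tenant invokes DR at once.
```

The gap between the promised and worst-case figures is exactly the oversubscription ratio, which is why opacity about that ratio matters to the client.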
>See also: Enterprise cloud storage: usage and trends
They operate on the dubious basis that the chances of being called on to provide the full range of resources are slim and that, in any case, clients are unlikely to contend for them simultaneously. The problem with this approach is opacity: it is often neither understood by the client nor declared by the provider.
The changing cloud atlas
It is worth reminding ourselves of the terminology here. Infrastructure-as-a-Service (IaaS) enables organisations to utilise cloud-based resources for their compute, storage and networking demands.
Platform-as-a-Service (PaaS) builds on this concept to provide an environment in which applications can be developed and deployed to a platform standard, removing some of the infrastructure management that IaaS requires.
Software-as-a-Service (SaaS) is the delivery method offered by many software vendors so that applications come package-wrapped with all the infrastructure and services needed to operate them – you simply pay as you use the applications.
As applications evolve and are replaced, SaaS has become the de facto standard, with many core applications having shifted to the cloud piecemeal rather than as part of a strategy.
On top of this, sub-services have developed, of which Disaster Recovery-as-a-Service (DRaaS) is one, along with Storage-as-a-Service.
It should also be remembered that today’s cloud makes massive use of virtualisation (typically referred to as “software-defined” components), which boosts service portability and reduces reliance on specific pieces of underlying hardware. This brings greater efficiency and, with increased scalability, enhanced economies of scale.
Successfully navigating the pitfalls hidden in the cloud
So how do businesses choose one of these services over another, ensuring they get the best of the cloud for their needs while avoiding unnecessary pitfalls?
It is true that the giants of the industry, the hyperscale solutions such as AWS, Azure and Google, for example, offer a very well-developed, self-service “hands off” model with little interaction with the end-client. As the name implies the main benefit of these solutions is scale and attractive headline costs.
The drawback can be that there is still considerable effort required by the client (either directly or with a managed service) to supplement the cloud with the skills and knowledge to manage it. Most hyperscale solutions also come in set incremental bundles, so clients can find that they outgrow the initial solution only to find significant jumps in cost as they scale up.
At the other end of the scale, however, the “hands-on” cloud provider wraps the cloud resources in proactive managed services for the customer, steering their passage through cloud transformation.
While both approaches can work for businesses, as indeed can a combination, if an organisation lacks the in-house skills to operate a hands-off model, the promised agility of the cloud will simply not be delivered.
Every organisation can enjoy the cloud’s flexibility
Although a company’s size, rather than its age, will have the greater influence on its current cloud use, businesses moving at ever-greater velocity all naturally find that cloud’s flexibility better meets the requirements of modern, agile go-to-market strategies. Each should be able to find the cloud model that lets it meet demand when required, without building in largely unused and wasteful resource headroom.
Start-ups for instance are more likely to be consuming a higher percentage of cloud-based solutions than a multi-decade enterprise, purely because there are no legacy applications to consider.
For all of them, however, flexibility is certainly one of the headline benefits of the cloud. Start-ups find that the cost of provisioning their own infrastructure could cripple them before they have even started, making the pay-as-you-grow model the best fit, as it aligns costs with the company’s success. For more established companies, cloud can help smooth out lumpy capital spend, often providing greater cost transparency while allowing them to focus on their core business.
Overcoming fear in the cloud – security and cost especially
Inevitably, when an organisation contemplates greater cloud-use, all kinds of fears arise. Typically these concern the complexity of migration for legacy applications, the perceived security concerns and the costs.
Given the rapid increase in the volume and sophistication of cyber threats, security in the cloud should be a major concern for all businesses. Yet security is a discipline, so cloud is neither more nor less secure than self-owned or operated infrastructure.
It is often the case, too, that double standards apply, with organisations making greater demands of a potential cloud provider’s security provisions than they apply to their own infrastructure.
Conversely, many organisations have embraced DevOps environments that leverage cloud for increased agility, but have little or no visibility of the corporate risk they are carrying. The fact is, much of their cloud consumption is not even identifiable. It is in these circumstances that the adage “you cannot control what you cannot see” has never been truer.
In truth, complacency and indiscipline are just as much the common risks for cloud security, as they are for other deployment models.
Security requires partnership
To address their security concerns, businesses must work in partnership with their cloud-providers to understand the end-to-end security profile they need to adopt.
Traditionally this was simply about building walls to prevent issues occurring in the first place; while these are still absolutely necessary, a more holistic approach to security is now required.
New technologies are now available that detect behavioural and trend anomalies, and do not assume that a business could never be breached. Instead they take the more realistic position of building to prevent breaches while accounting for the possibility that prevention could fail.
This is an area of security where significant advances have been made to spot deviations from normal behaviour, even if there is no substitute for expertise and experience. In the cloud, a provider can provide access to these kinds of niche skills to multiple clients through scalable security services.
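The core idea behind spotting deviations from normal behaviour can be sketched very simply. The example below is a minimal, illustrative z-score test on hypothetical login data, not a representation of any provider’s actual tooling:

```python
# Minimal sketch of behaviour-based anomaly detection: flag activity that
# deviates sharply from an established baseline rather than assuming the
# perimeter will never be breached. Data and threshold are illustrative.
from statistics import mean, stdev

def flag_anomalies(baseline, observed, threshold=3.0):
    """Return indices of observations more than `threshold` standard
    deviations from the baseline mean (a simple z-score test)."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, x in enumerate(observed)
            if abs(x - mu) > threshold * sigma]

# Hypothetical daily counts of failed logins for one account.
normal_week = [3, 5, 4, 6, 5, 4, 5]
this_week = [4, 5, 250, 6, 5, 4, 5]   # day 2 shows a burst of failures

print(flag_anomalies(normal_week, this_week))  # → [2]
```

Real products layer far more sophisticated models on top, but the principle is the same: learn what normal looks like, then alert on what is not.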
The fear of cost-exposure in the cloud is legitimate. One of the problems is the difficulty of achieving an accurate comparison when current costs are dispersed into multiple budgets and the cloud alternative is rolled up into one figure. How do you compare what appear to be cloud apples with on-premises or private lemons?
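One way to make that comparison meaningful is to roll the dispersed on-premises lines into a single annual figure before setting it against the cloud quote. The sketch below uses entirely hypothetical placeholder numbers to illustrate the aggregation step:

```python
# Sketch of the like-for-like comparison problem: on-premises costs are
# scattered across budgets, while cloud arrives as one rolled-up figure.
# Every number here is a hypothetical placeholder.

onprem_budget_lines = {
    "hardware_depreciation": 120_000,        # annualised capital spend
    "datacentre_power_cooling": 30_000,
    "software_licences": 45_000,
    "staff_time_on_infrastructure": 85_000,  # often forgotten in comparisons
}
cloud_quote_annual = 240_000  # the provider's single rolled-up figure

onprem_total = sum(onprem_budget_lines.values())
print(f"On-premises (all budgets): £{onprem_total:,}")   # £280,000
print(f"Cloud quote:               £{cloud_quote_annual:,}")
# Only once the dispersed lines are aggregated is the comparison meaningful:
# in this made-up example the cloud quote is £40,000 less per year.
```

The point is not the numbers themselves but that, until the scattered budget lines are gathered, the apples-to-lemons comparison cannot even begin.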
The huge importance of new regulation
Contractual difficulties can complicate these matters too. A law firm in the UK, for example, may find that it cannot use a US-owned cloud-provider if its business is likely to involve legal action against the US Government.
This is actually more common than one might think. Unfortunately, changing regulation, particularly in the shape of the European Union’s General Data Protection Regulation, is set to further increase the complexity of deciding where organisations should store data.
The EU GDPR is something all businesses have to take into account before it comes into force on 25 May 2018, because it carries significantly increased responsibilities around the holding of personally identifiable information.
One of the cornerstones of its 99 articles is the concept of “data privacy by design”, with potentially enormous penalties for those who do not comply, including fines of up to four per cent of global annual turnover as well as having to offer financial redress to those affected.
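The headline penalty ceiling in Article 83(5) of the regulation is up to €20 million or four per cent of total worldwide annual turnover, whichever is higher. The sketch below encodes that formula; the turnover figures are hypothetical:

```python
# Sketch of the GDPR's headline penalty ceiling (Article 83(5)): up to
# EUR 20 million or 4% of total worldwide annual turnover, whichever is
# higher. The turnover figures used below are hypothetical.

def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR Article 83(5) fine, in euros."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A mid-sized firm: the flat EUR 20m ceiling dominates.
print(max_gdpr_fine(100_000_000))    # → 20000000
# A large enterprise: the 4% rule dominates (EUR 80m on EUR 2bn turnover).
print(max_gdpr_fine(2_000_000_000))  # → 80000000.0
```

Either way, the exposure dwarfs most IT budgets, which is why “data privacy by design” cannot be an afterthought.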
SaaS solutions that are resident in the hyperscale cloud could potentially be breaching the data transfer requirements of this new regulation, suddenly landing companies in significant trouble with the regulation’s supervisory authorities.
Insist on best-practice standards
There are, nonetheless, a number of standards that apply to the cloud, and which organisations can insist upon even though they are not specific to it: ISO 27001, PCI DSS and Sarbanes-Oxley (SOX) compliance, for example.
There are also those specifically related to cloud, such as CSA STAR. Nonetheless, the most significant change for all businesses and providers is the GDPR and its wide-ranging requirements around data privacy.
Any organisation utilising a cloud provider needs to ensure there is a legal contract defining the obligations of the Data Controller and Data Processor relationship under the new regulation.
The speed of cloud adoption and expansion means many organisations enjoying the benefits of the cloud do not fully understand how much of its resources they are consuming, both from SaaS solutions and from their gradual accumulation of IT and DevOps initiatives. In the age of the GDPR this is a reckless position to be in.
>See also: 3 considerations for a smooth cloud adoption
The importance of regulation will be one of the most significant factors as the cloud industry develops: it covers any business holding personal information (including those operating their own infrastructure) and is becoming part of overall corporate responsibility.
As more and more tech companies embrace subscription-style services based in the cloud, the need to act in compliance with regulation has never been more urgent. The new regulatory regimes demand that organisations have far better understanding and supervision of their cloud footprint (and indeed their private infrastructures and data sets). Failure to achieve this higher visibility is to risk significant regulatory repercussions with devastating financial and reputational consequences.
The cloud is indispensable to any ambitious business now and with expert guidance and attention its immense advantages are there for the taking. Yet too many organisations still have a poor comprehension of the changed regulatory landscape and of their own cloud-operations. This urgently needs to change.
Sourced by Adam Ryan, GDPR Practice Lead at Calligo