Which workloads are appropriate for the public cloud, and which are not?
IT professionals are constantly faced with tough decisions as to where to host their applications, but different workloads require different considerations.
If you’ve been trying to ignore the public cloud, it’s time to stop. According to Gartner, public cloud spending is up 32% in 2015 and is expected to reach $16.5 billion this year. Clearly, public cloud is not just for startups anymore.
Maybe you have yet to take the plunge because of concerns over security or data gravity. Or maybe you’re worried about spending spiraling out of control if workloads are not contained inside the walls of your data center.
Whatever your excuse might be, you may feel more comfortable making the move after examining which kinds of workloads are appropriate for public cloud, and which ones make more sense to keep within your own building.
Workload demand and data sensitivity
There are multi-billion dollar businesses that run their entire workload on Amazon Web Services. However, those are not the businesses under discussion here, so if you work for Netflix, Airbnb or some other born-on-the-cloud company, stop reading.
Instead, this is an examination of what a more conservative, classically Fortune 500 IT department should consider running on public cloud to give themselves better flexibility and cost savings. There are two axes to consider here for companies with that perspective: Workload Demand and Data Sensitivity.
Some workloads run all the time, while others are more sporadic. Because public cloud enables anyone to rent a virtual machine (VM) by the hour, only pay for the hours needed, and then shut down the VM, more variable workloads are attractive candidates for public cloud.
On the flip side, if you rented that VM by the hour, 24 hours a day, seven days a week, there comes a point when it likely becomes less expensive for you to run that VM on your own internal hardware.
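That break-even point can be sketched with back-of-the-envelope arithmetic. The hourly rate, server price, and amortization window below are illustrative assumptions, not quotes from any provider:

```python
# Break-even sketch: at what utilization does renting an hourly cloud VM
# cost more than owning equivalent hardware? All figures are assumptions.

CLOUD_RATE_PER_HOUR = 0.12   # assumed on-demand VM rate, dollars
SERVER_COST = 3000.0         # assumed purchase price of a comparable server
AMORTIZATION_YEARS = 3       # typical depreciation window
HOURS_PER_YEAR = 24 * 365

def cloud_cost(hours_per_year: float, years: int = AMORTIZATION_YEARS) -> float:
    """Total rental cost: you pay only for the hours actually used."""
    return CLOUD_RATE_PER_HOUR * hours_per_year * years

def on_prem_cost() -> float:
    """Owned hardware costs the same whether it is busy or idle."""
    return SERVER_COST

# A sporadic workload (4 hours/day) clearly favors renting...
sporadic = cloud_cost(4 * 365)      # $525.60 over three years
# ...while a 24x7 workload can exceed the cost of owning the box.
always_on = cloud_cost(HOURS_PER_YEAR)   # $3,153.60 over three years
```

Note this sketch ignores power, cooling, and staff costs on the private side, all of which would shift the break-even point further in the cloud's favor; the direction of the comparison is what matters.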
Data sensitivity is a little less concrete. Some would argue that data in the public cloud can be more secure than in a private data center, for physical security reasons alone.
A provider like Google or Microsoft can afford the overhead of armed guards, super redundant networking and power, and other security fail-safes that a single IT department cannot. Software-as-a-Service (SaaS) providers like Salesforce.com and Workday take that a step further for CRM and HR markets, respectively, adding their own data security specialties.
Despite these arguments, some typically older, more conservative companies are simply not comfortable having certain mixes of financial, customer, or employee data off premises, and there is no convincing them otherwise.
Plotting workload demand and data sensitivity as the main axes on a two-dimensional model of which workloads should run where, we can produce the following gradient:
Public data running in highly variable workloads is a no-brainer for public cloud placement. Core private data running in workloads that are constant over time, however, is better suited to stay in house. With that model established, let’s explore specific workload types.
Mapping specific workloads
The most obvious candidate for public cloud is a customer-facing marketing website. The data it contains is already public-facing, and the demand placed on it varies greatly depending upon advertisements, product reviews, and any number of other marketing activities.
To run a marketing website on private infrastructure puts great strain on the capacity planning process.
Order too little hardware and the website crashes under high demand, which can lead to lost revenue opportunity. Order too much hardware and it sits idle, eating up capital expense budget.
Utilising public cloud enables an IT team to pay for exactly how much capacity it needs exactly when it needs it, all of which can be automated using auto-scaling approaches.
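The core of an auto-scaling approach is a simple feedback loop: add capacity under load, release it when demand drops. A minimal sketch of such a threshold policy follows; the thresholds and VM limits are illustrative assumptions:

```python
# Minimal auto-scaling policy sketch: grow the VM fleet under heavy load,
# shrink it when mostly idle, so you pay only for capacity in actual use.
# The CPU thresholds and fleet limits here are illustrative assumptions.

def desired_vm_count(current_vms: int, avg_cpu_percent: float,
                     min_vms: int = 2, max_vms: int = 20) -> int:
    """Return the VM count a simple threshold policy would target."""
    if avg_cpu_percent > 70:        # scale out under heavy load
        target = current_vms + 1
    elif avg_cpu_percent < 30:      # scale in when mostly idle
        target = current_vms - 1
    else:
        target = current_vms        # within the comfort band, hold steady
    # Clamp to the fleet limits so we never scale to zero or run away.
    return max(min_vms, min(max_vms, target))
```

In practice an IT team would delegate this loop to the provider's own auto-scaling service rather than hand-rolling it; the sketch only illustrates the pay-for-what-you-use mechanic the article describes.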
After the marketing website, the next-lowest-hanging fruit is dev/test workloads and customer analytics, both of which can take advantage of sanitized data that doesn’t expose customers’ Personally Identifiable Information (PII).
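Sanitizing that data before it leaves the building can be as simple as masking the identifying fields while keeping the ones analytics actually needs. A sketch, with hypothetical field names:

```python
# Sketch: mask direct identifiers in a customer record before copying it
# to cloud-hosted dev/test systems. Field names are hypothetical.
import hashlib

PII_FIELDS = {"name", "email", "ssn", "phone"}

def sanitize(record: dict) -> dict:
    """Replace PII values with a one-way hash; pass other fields through."""
    clean = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            # Hashing (rather than deleting) preserves join keys across
            # tables without exposing the underlying value.
            clean[key] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            clean[key] = value
    return clean

customer = {"name": "Jane Doe", "email": "jane@example.com",
            "segment": "enterprise", "lifetime_value": 125000}
masked = sanitize(customer)
```

Real anonymization pipelines go further (tokenization services, differential privacy), but even this level of masking removes the most obvious objection to moving dev/test data off premises.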
Development of custom applications is common in Fortune 500 environments, but even in Continuous Integration/Continuous Delivery scenarios, development and test resources are not needed 24x7, meaning they map better to public cloud’s pay-as-you-go model than to the capital expense model of an internal data center.
By contrast, certain financial operations like batch jobs that cull through sales and inventory databases on an hourly basis have sensitive data and take about the same amount of computing power every time they run.
That consistency, unlike the marketing website scenario, makes it easy to capacity plan for and amortize cost more precisely with capital expense budget, all while the sensitive data resides inside a corporate firewall.
Grey areas vary greatly by vertical and company culture, and may be supplanted by the SaaS solutions mentioned earlier. For example, a company running PeopleSoft to manage its HR needs would likely not be comfortable running that software on a public cloud.
Adopting a SaaS alternative such as Workday for that function, however, is a little more appealing: not only are costs driven down by a per-seat model, but the SaaS provider can add security strategies that a local IT team may not have bandwidth for.
Similar arguments can be made for email (Exchange vs. Gmail), CRM (Sugar CRM vs. Salesforce.com), and many other common workload types.
What should you do?
As with most things, your mileage will vary depending upon a huge variety of factors. Undoubtedly, public cloud has become a powerful force that every Enterprise IT department should consider.
Every company must discover the parts of the grey area they are comfortable with based on their own needs, but hopefully this at least provides a structure within which you can have the conversation.
Sourced from Pete Johnson, CliQr Technologies