How workload analysis is changing the storage purchasing process

An astonishing 66% of enterprise storage engineers and architects don't know the I/O profiles of their production workloads

When I talk about future storage system planning, I often ask people to think about planning for a party. You know roughly how many people are coming, you work out how much food they’re likely to eat and then you buy a little extra. Because running out of food is a big no-no in the party stakes.

What tends to happen though, is that you don’t run out of food. In fact, as the last guests leave, you realise you have enough food to throw the party again tomorrow night. But all you’ll actually be throwing is money out of the window.

Working out how much and what type of storage you need in your datacentre, in terms of performance and capacity, is very similar. Until recently, storage planning – especially around performance – has relied on guesswork: looking at the volume of data you’re producing now, predicting how fast it will grow in the future, and adding a bit of extra headroom for good luck. Running out of storage or IOPS is not an option.
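That guesswork approach boils down to simple arithmetic. A minimal sketch, with entirely hypothetical figures and growth rates, of how a traditional forecast is made:

```python
# A sketch of the traditional "guesswork" approach: take current usage,
# assume a flat annual growth rate, and tack on safety headroom.
# All figures here are hypothetical, not from any real datacentre.

def naive_capacity_forecast(current_tb: float,
                            annual_growth: float,
                            years: int,
                            headroom: float = 0.25) -> float:
    """Projected capacity = compounded growth plus a safety margin."""
    projected = current_tb * (1 + annual_growth) ** years
    return projected * (1 + headroom)

# e.g. 100 TB today, guessing 30% yearly growth over a 3-year refresh cycle
print(round(naive_capacity_forecast(100, 0.30, 3), 1))  # → 274.6
```

The headroom figure is the "extra party food": if the growth guess was high, all of it is wasted spend.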

It’s worked this way for years, largely because it’s been so difficult to understand and predict changes in application workload behaviour, and then to translate how those workload changes affect storage performance.


In fact, a Gatepoint Research survey last year found that an astonishing 66% of enterprise storage engineers and architects didn’t know the I/O profiles of their production workloads.

It’s not surprising – there hasn’t been a simple, accurate way to measure them – or to use that information to ensure new storage arrays will cope in the future. And although there are many reasons why it’s valuable to regularly have that information – when replacing outdated legacy arrays, or in the case of datacentre consolidation projects – it’s more important now than ever before.

That’s because, according to the same Gatepoint Research study, almost half of all respondents will look into new technologies for their next storage purchase. That’s a game changer.

If it’s tricky to predict whether traditional storage has the capabilities to run your key applications, object storage, cloud and virtualised storage environments take that forecasting challenge to another level.

Software-defined storage (SDS), such as Ceph or OpenStack Swift, is the most disruptive to storage performance predictability. It’s like not knowing how many people are coming to your party and still trying to put on a good spread.

The storage industry has recognised this isn’t the best way to operate, and that’s why I anticipate we’re going to see a rise in the number of different tools to help storage managers find the right solution for their own unique application workloads.

These workload analysis tools will characterise application workload profiles, not just in terms of KPIs like latency, throughput or IOPS over time, but also key I/O metrics like read/write ratios, data/metadata command mixes and random/sequential ratios. They’ll also highlight how compressible and deduplicable the data content is.
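To make the metrics above concrete, here is a minimal sketch of what such a tool computes from an I/O trace. The record format, field names and thresholds are assumptions for illustration, not any particular product’s implementation:

```python
# Hypothetical sketch: derive read/write ratio, random/sequential mix and
# common block sizes from a simplified I/O trace.
from collections import Counter

def profile_trace(records):
    """records: iterable of (op, offset, size) tuples, op in {'R', 'W'}."""
    reads = writes = sequential = 0
    sizes = Counter()
    prev_end = None
    for op, offset, size in records:
        if op == 'R':
            reads += 1
        else:
            writes += 1
        # Count an I/O as "sequential" if it starts where the last one ended.
        if prev_end is not None and offset == prev_end:
            sequential += 1
        prev_end = offset + size
        sizes[size] += 1
    total = reads + writes
    return {
        'read_pct': 100.0 * reads / total,
        'sequential_pct': 100.0 * sequential / total,
        'common_block_sizes': sizes.most_common(3),
    }

# A tiny made-up trace: three 4 KiB reads and one 8 KiB write
trace = [('R', 0, 4096), ('R', 4096, 4096), ('W', 65536, 8192), ('R', 8192, 4096)]
print(profile_trace(trace))
```

A real tool would add latency and metadata-command breakdowns, plus content sampling for compressibility and deduplication estimates, but the principle is the same: measure the profile rather than guess it.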

The move towards workload analysis could completely change the way storage managers plan for and provision storage. And that’s a good thing: by having the ability to plan ahead and make sound purchasing decisions, those storage engineering and architecture teams will have much more control over their storage budgets, with a positive knock-on effect for the rest of the IT budget.


For vendors, it’s not going to be good enough to provide extra performance headroom and hope for the best. Already we’re starting to see how they are responding: by bringing in workload analysis and performance validation companies during the proof-of-concept stage to make sure the solution they’re offering is really up to the job.

It’ll mean they have to be even more transparent in terms of predicted performance and that can only be good news – for the storage managers who are buying solutions and for the industry, which I think will thrive in this new streamlined environment.

It’s an exciting time for storage – although long overdue – and for its managers, who can cut costs by buying just what they need, and have no leftover party food going to waste.
