IPA is about to change the way you deploy your IT infrastructure


In a crowded room in Berlin this summer, Gartner analysts unveiled research identifying the next key trends in IT investment: business intelligence, infrastructure and cloud technologies.

This kind of insight is priceless for vendors. It shapes the way infrastructure products are developed, and drives technology forward.

But just because a vendor builds its products with its customers in mind doesn’t mean those products will work for every customer.

This is where infrastructure performance analytics, or IPA, comes into its own – and it’s changing the way major IT purchasing and deployment decisions are made.

That’s because, until recently, it has been very difficult to know which product will work best for a customer once it’s out of the lab and handling real application workloads.


It’s surprising, really, that some of the most important and expensive purchases any organisation makes have been based on guesswork and, often, blind faith in vendors’ assertions.

IPA is the crystal ball everyone wishes they had when it’s time to make a major IT purchasing decision.

Take storage, for example: by putting arrays through their paces with actual application workloads, customers gain true insight into how those solutions will perform in real-life situations, and can make a properly informed choice.
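To make the idea concrete, here is a minimal, purely illustrative Python sketch of workload replay: it takes a captured list of read/write operations and times each one against a file on the candidate array. The trace records and the /mnt/candidate_array/testfile path are assumptions for the example, not any vendor’s actual tooling or API, and the POSIX I/O calls assume a Unix-like host.

```python
# Illustrative sketch only -- not a real product API.
# Replays a hypothetical captured I/O trace (op, offset, size) against a
# file on the candidate array and records per-operation latency in ms.
import os
import time

TRACE = [  # toy stand-in for a real captured application workload
    ("read", 0, 4096),
    ("write", 8192, 4096),
    ("read", 1048576, 65536),
]

def replay(path, trace):
    """Replay (op, offset, size) records and return per-op latencies in milliseconds."""
    latencies = []
    fd = os.open(path, os.O_RDWR | os.O_CREAT)
    try:
        for op, offset, size in trace:
            start = time.perf_counter()
            if op == "read":
                os.pread(fd, size, offset)
            else:
                os.pwrite(fd, b"\0" * size, offset)
            latencies.append((time.perf_counter() - start) * 1000)
    finally:
        os.close(fd)
    return latencies

print(replay("/mnt/candidate_array/testfile", TRACE))  # hypothetical mount point
```

In practice the trace would come from the production application rather than a hard-coded list, but the principle is the same: measure the candidate hardware against the I/O pattern it will actually face.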

Back in Berlin, what Gartner highlighted at its annual IT infrastructure and operations management summit wasn’t a huge surprise.

But it did emphasise where CIOs will be investing their budgets over the next few years at least.

These budgets are substantial, and of all the IT purchasing decisions that are made, storage accounts for a very large share.


Gartner’s rival IDC predicts that big data storage revenue alone will grow at a CAGR of 20.4% through 2019, and that big data software deployments will reach $3.51 billion in 2019.

Storage is a big-ticket item – and no wonder. Without a reliable, high-performance data storage solution, Gartner’s top three priorities would be much harder to focus on.

For example, business intelligence relies on real-time access to data, which simply isn’t achievable if there are bottlenecks or other glitches in the storage infrastructure.

And what vendors promise in terms of performance and availability doesn’t always materialise when the kit is put to work with real data and real workloads.

As you can imagine, that often means the implementation process is far from easy – the vendor’s team spends unnecessary time fixing glitches that could have been ironed out back in the lab, and the customer experiences extra stress at an already high-pressure time.


To address this, IPA is becoming a secret weapon in any good vendor’s or reseller’s arsenal.

A core group of vendors has already started to adopt this kind of testing: running workload analysis in a pre-production or staging environment to make sure the technology works for customers’ unique workload requirements.
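As a rough illustration of what that pre-production analysis might look like, the sketch below compares tail latency from two hypothetical staging runs against an assumed service-level target. The array names, the latency measurements and the SLA value are all invented for the example; the point is simply that the comparison is made on the customer’s own workload numbers rather than on a datasheet.

```python
# Illustrative sketch only -- arrays, measurements and SLA are assumptions.
# Given per-operation latencies (ms) from a staging run on each candidate
# array, compare tail latency (p99) against the application's target.
import statistics

def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples."""
    ordered = sorted(samples)
    rank = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[rank]

results = {
    "array_a": [0.8, 0.9, 1.1, 1.0, 4.2, 0.7, 1.3],  # hypothetical staging data
    "array_b": [1.4, 1.5, 1.6, 1.5, 1.7, 1.6, 1.8],
}
SLA_P99_MS = 2.0  # assumed application requirement

for name, latencies in results.items():
    p99 = percentile(latencies, 99)
    verdict = "meets" if p99 <= SLA_P99_MS else "misses"
    print(f"{name}: mean={statistics.mean(latencies):.2f}ms p99={p99:.2f}ms -> {verdict} SLA")
```

A run like this surfaces the kind of tail-latency spikes that an average-throughput figure on a spec sheet hides, which is exactly the sort of bottleneck IPA is meant to catch before implementation.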

And the channel is starting to test storage arrays with real workloads before making recommendations to its customers, too.

It only makes sense, then, that this trend will extend beyond storage to other crucial infrastructure purchases.

It’s simple: if vendors and resellers employ IPA for storage systems but not for other technologies like BYOD, IoT or even smart machines, their customers’ infrastructure still won’t be functioning at its optimum level.

It’s highly likely that customers won’t get to see a lot of the behind-the-scenes action when it comes to IPA.

But a growth in uptake will change the way that customers make purchasing decisions.

They will have all the information they need to make a fully informed choice and, because testing should highlight any likely bottlenecks, these will have been dealt with before implementation, making the whole process run much more smoothly.


How will this change IT decision-making and reshape the infrastructure landscape?

Well, there’s an obvious knock-on effect for the supply chain. In the channel, resellers with the extra insight IPA provides will demand much more from their vendors to make sure their customers’ requirements are fully met.

And that extra pressure on vendors will mean that a one-size-fits-all approach won’t cut it anymore.

So we’ll start to see vendors working much more closely with the channel and with customers to find out what the end user really needs.

Instead of using guesswork, or shoehorning in a product that’s not quite what’s required, vendors will tailor their products to fit the customer, rather than expecting the customer to fit with what’s provided.

Armed with a much better understanding of customers’ priorities, vendors will still find summits like Gartner’s this summer a useful part of the development process, but those events shouldn’t hold any surprises for them.

They will already know what customers need and be well on the way to developing it for them.

They say the customer is king – I think we’re about to see what that really means.


Sourced by Len Rosenthal, CMO at Virtual Instruments
