15 for ’15: the top 15 storage predictions for 2015

Tiering is clearly showing its limitations as a stop-gap on the way to all-flash primary storage – Dave Wright, CEO, SolidFire


Flexibility will be the biggest issue facing storage in the coming year – Sean Horne, CTO and senior director of enterprise and mid-range storage, EMC:

The biggest questions IT decision makers will be asking over the coming year are: how do I deploy a platform that can deal with abrupt changes in the business landscape, whether that means scaling to meet large storage demands or delivering performance for next-generation workloads? How do we deliver this flexibility at an affordable cost, without pushing the organisation to take uncomfortable risks? And how do we do so with the responsiveness required?

Organisations both scale up and scale out, and therefore, whilst storage needs to move with this change, it will be important to not let this disrupt the whole IT ecosystem. This will result in an increase in investments in developing hybrid cloud to give organisations the flexibility to direct workloads where they need to go, as they are needed.


Security and compliance will continue to dictate decisions around companies' hybrid cloud setups – Sean Horne, CTO and senior director of enterprise and mid-range storage, EMC:

In my opinion, there are four types of control over organisational data that are needed: privacy, trust, compliance and security. For example, data centres have huge compliance requirements they need to adhere to, but the privacy of data – how it is stored and who has access to it – is, it can be argued, an emotional and subjective decision between the company and its customers.

Understandably, many businesses are not comfortable with their private data sitting in a public cloud, so a degree of flexibility is needed to allow businesses to capitalise on the economies of public cloud without incurring undue risk.

> See also: Cloud-horizon: the emerging world of on-demand computing 

Policy-based lifecycle management will be the answer to spiralling storage growth – Radek Dymacz, Head of R&D, Databarracks:

The key to reducing backup costs is good management and not applying blanket policies to all data. It’s about having the right retention and archive policies in place for the right data.

I think too many organisations struggle with data management because they regard ‘deletion’ as a scary word. No one really takes responsibility for corporate data or even knows who the ultimate owner is, so deletion is regarded as someone else’s job. As software becomes more integrated, we’ll have real-time, 360-degree visibility – storage decisions will be based on evidence, and so 'deletion anxiety' will be less of an issue.
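To make the idea concrete, a minimal sketch of policy-based lifecycle management might look like the following. The data classes, retention periods and function names here are hypothetical illustrations, not any vendor's product:

```python
from datetime import datetime, timedelta

# Hypothetical retention policies: how long data stays online, then what happens.
POLICIES = {
    "financial-records": {"retain_days": 2555, "then": "archive"},  # roughly 7 years
    "email":             {"retain_days": 365,  "then": "archive"},
    "temp-exports":      {"retain_days": 30,   "then": "delete"},
}

def lifecycle_action(data_class, last_modified, now=None):
    """Return 'keep', 'archive' or 'delete' for an item under its class policy."""
    now = now or datetime.utcnow()
    policy = POLICIES[data_class]
    if now - last_modified <= timedelta(days=policy["retain_days"]):
        return "keep"
    return policy["then"]
```

The point of the sketch is that the decision is driven by the data's class and age rather than by a blanket rule, which is what removes the human 'deletion anxiety' from the loop.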


WAN optimisation will be the key to ensuring optimal data delivery – Everett Dolgner, director of replication product management, Silver Peak:

All the bandwidth in the world will not matter if packets are being dropped or delivered out of order due to congestion, as is often the case in MPLS and Internet connections.

To overcome these challenges and ensure optimal data delivery, organisations must establish a fully equipped network that will cope with the increased flow of traffic cloud storage initiatives bring. Failing to do so will result in an environment plagued by issues that will only lead to performance and business benefits being compromised.

Optimising the WAN can reduce over 90 percent of the traffic across the network and is key in providing the scalability needed to support all current and emerging applications.
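One way WAN optimisers achieve reductions of this order is deduplication: blocks the far end has already seen are replaced by short references. The following is a simplified illustration of that principle, not Silver Peak's implementation:

```python
import hashlib

def dedup_transfer(data, block_size=4096, seen=None):
    """Split data into fixed-size blocks and 'send' only blocks the far
    side hasn't seen, identified by hash; duplicates cost a short reference."""
    seen = seen if seen is not None else set()
    sent_bytes = 0
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in seen:
            seen.add(digest)
            sent_bytes += len(block)   # full block crosses the WAN
        else:
            sent_bytes += 32           # only a hash reference is sent
    return sent_bytes
```

On repetitive traffic – exactly what backup and replication streams tend to be – the bytes actually crossing the link shrink dramatically, which is where the 90-percent figure comes from.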


We will see an accelerated move to software-defined storage – Nigel Edwards, vice president, EMEA sales and channel marketing, HGST:

Thanks to commercially-supported open-source initiatives such as GlusterFS, Inktank Ceph Enterprise, and SwiftStack for OpenStack Object Storage, we can expect to see software-defined storage systems cross from cloud into more mainstream enterprise data centers across multiple deployment options.

We can also expect to see a rise in startup-developed software-defined storage offerings as more data centres recognise the benefits of this approach. With commercial support for open storage software, traditional IT will be able to use the same approaches once limited to the largest operators.


IT teams will be forced to invest in education in different ways as they're faced with the increasing complexities of virtualisation – Patrick Hubbard, head geek, SolarWinds:

End users these days can't make buying decisions without considerable education, and vendors aren't necessarily forthcoming about educating them in a practical manner.

With ever-smaller IT teams managing ever more complex solutions, implementation is becoming ever more challenging. Software-defined networking is driving things like portability and containerisation across OpenStack, AWS and VMware, and each of these brings with it an opportunity for things to go wrong if there isn't additional expertise.

Companies' reaction too often is to call in an outside contractor, who doesn't necessarily stop to educate the team that's going to maintain the solution after they're gone, and doesn't help the IT team actually move forward and maintain that level of skill in something they've purchased.

Those organisations that send staff off for refresher courses are more likely to be successful with new technology, so vendors will have to move into cross-disciplinary education over the next few years.

> See also: 6 things to consider when choosing a flash storage solution


Software will be the biggest investment in storage management – Andy Dean, development manager, HPC:

Physical hardware costs have been coming down for a long time. Plus, storage capacity per device is increasing – 6TB hard disk drives are already available and 10TB tapes aren't far off – meaning less physical hardware is necessary (and, again, less expenditure). Therefore, I think the biggest expense item will be software.

Going forward, we’re going to need a more intelligent software stack to manage the huge quantity of data that people are storing. If a customer has 6PB of data, for example, they will not want to leave that on an unsupported platform.


Storage and other IT resources will come together as the trend for convergence continues – Sean Horne, CTO and senior director of enterprise and mid-range storage, EMC:

Storage managers will need to think about application requirements not just in terms of pure capacity and latency, but in terms of how they interact with the other IT components.

As organisations progress on their journey to a hybrid cloud, these converged infrastructure experts have the opportunity to become strategic supporters of the business; enabling agility and innovation by delivering resource when it is needed, and advising on what’s possible within their new infrastructure context.


Organisations will increasingly embrace technology that enables them to deliver public cloud as-a-service – Ian Finlay, VP, Abiquo:

Most organisations are already operating a hybrid IT estate in one form or another, and we can expect this trend to continue over the next 12 months. However, as it becomes easier to acquire multiple cloud services, visibility and control over data, and who has access to it, will continue to decrease – posing security and governance risks to enterprises. As a result, we can expect to see organisations embracing technology that enables them to deliver public cloud as-a-service. This provides flexibility to internal customers, while maintaining the control that IT is tasked with delivering to the business.

We can also expect to see cloud service providers (CSPs) continue to add public cloud providers to their service portfolios, in order to strengthen customer relationships and add value to the fairly generic public cloud offerings.


The smart approach to flash adoption will be in hybrid arrays – Robin Kuepers, EMEA storage marketing director, Dell:

IT leaders are attracted to the hybrid-flash storage route as it offers the best of both worlds: high-performance flash for the most frequently accessed and most demanding applications, and low-cost bulk storage for ageing, or colder, data.

While some workloads call for race car speeds at all times, more often than not, an organisation has needs for supporting both high performance applications and less accessed data, which doesn’t require expensive storage. This is why most organisations can benefit from a single SAN that handles both ends of the spectrum at the same time.

Hybrid arrays can support SSDs and hard disk drives (HDDs) to offer this combination of high performance and lower overall cost. With the hybrid array approach, the SSD layer provides the fast performance processing and the HDDs retain all the older, colder storage that organisations need or want to retain but don’t access as often.
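The placement logic described above can be sketched very simply. The threshold, block IDs and function names below are hypothetical illustrations of the hot/cold split, not any array vendor's algorithm:

```python
def place_tier(access_count, ssd_threshold=10):
    """Choose a tier for a block: hot blocks go to flash, cold blocks to HDD."""
    return "ssd" if access_count >= ssd_threshold else "hdd"

def rebalance(block_stats, ssd_threshold=10):
    """Map each block ID to its target tier given recent access counts,
    as a hybrid array's background tiering job might."""
    return {block_id: place_tier(count, ssd_threshold)
            for block_id, count in block_stats.items()}
```

Real arrays use far more sophisticated heuristics (recency as well as frequency, promotion/demotion hysteresis), but the economics are the same: only the working set needs to sit on expensive flash.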


Tiering may not continue to be the most commercially sound decision to be taken by CIOs – Dave Wright, CEO, SolidFire:

Flash is clearly having a huge impact on the storage space, offering 10x the performance of disk at a fraction of the cost. That trend will only continue and increase over time, until disk is completely relegated to cold storage. Tiering, on the other hand, is clearly showing its limitations as a stop-gap on the way to all-flash primary storage.

Customers who initially embraced tiering are now dealing with the negative ramifications, including inconsistent performance and a need to add an ever-increasing amount of flash to the flash tier to maintain performance.

> See also: Software-defined storage gains traction

Commoditisation will start to positively affect every area of storage – Patrick Hubbard, head geek, SolarWinds:

It's interesting with virtualisation that there's a sort of complexity and diversity funnel. At one end you have the endpoints – workstations, BYOD devices, and Internet of Things or smart connected devices – but in the centre you have this fairly well-constrained data centre, with a number of technologies that integrate well together and are fairly easy to manage, yet still sit on top of storage that is highly vendor-specific.

From controllers all the way down to how storage is implemented, there's a lot of variation from one vendor to another. So one area where standardisation in data centre operations is finally pushing some long-overdue commoditisation is the way that storage is snapped into the compute and application delivery frameworks, especially for hybrid cloud.


The debate around the value of Openstack will continue – Ian Finlay, Vice President at Abiquo:

From a storage perspective, OpenStack provides a useful abstraction layer. Yet, in many ways it is still immature and is considered more of a toolkit than a solution. That said, it will be worth keeping an eye on the technology over the next 12 months, as it may prove to be a valuable solution for specific use cases.


People will become more aware of the different options for archiving and backup – Paul Rylett, systems engineer, Netgear:

As the volume of data stored continues to increase, organisations will increasingly need a simple and effective way of both archiving and backing up data. Businesses often don’t have the time or resources to dedicate to complicated backup and recovery processes. So we can expect to see more businesses embrace next-generation storage technology that can take frequent incremental snapshots and generate full backups instantly.
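The mechanism behind 'instant full backups' is that each snapshot records only what changed, and a full image can be synthesised by replaying increments over a base copy. A toy sketch of that idea, with made-up names and a dict standing in for a file system:

```python
def take_snapshot(current, previous):
    """Record an incremental snapshot: only the entries that changed
    since the previous state, plus any deletions."""
    changed = {k: v for k, v in current.items() if previous.get(k) != v}
    deleted = [k for k in previous if k not in current]
    return {"changed": changed, "deleted": deleted}

def synthesize_full(base, increments):
    """Roll increments forward over a base image to produce a full
    backup without rereading the source data."""
    state = dict(base)
    for inc in increments:
        state.update(inc["changed"])
        for k in inc["deleted"]:
            state.pop(k, None)
    return state
```

Because each snapshot is small, they can be taken frequently, and a 'full' restore point is just bookkeeping rather than a fresh copy of everything.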


Storage will move from an operational necessity to a strategic enabler of the business – Sean Horne, CTO and senior director of enterprise and mid-range storage at EMC:

Once you achieve the software-defined nirvana, with tiered storage easily and dynamically allocated to applications as they need it based on specific, policy-driven requirements, with full transparency and back-billing capability, a new world order has arrived. The storage and IT teams have become strategic enablers of the business.


