Access guaranteed

Back in the late 1990s, the first wave of the Internet alerted many businesses to the reality of operating around the clock, where critical systems’ downtime needed to be measured in minutes per year. Since then, the notion of what constitutes ‘high availability’ has been refined even further: the oft-touted ‘five-nines’ uptime (systems up and running 99.999% of the time) is now seen by some as insufficiently ambitious, and the new gold standard is absolute zero downtime.
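
To put those figures in context, the arithmetic is simple enough to sketch. The short Python snippet below (the availability levels chosen are purely illustrative) converts an availability percentage into the annual downtime it actually permits.

```python
# Annual downtime budget implied by a given availability level.
# The availability figures below are illustrative, not vendor quotes.
MINUTES_PER_YEAR = 365.25 * 24 * 60

def downtime_minutes_per_year(availability_pct: float) -> float:
    """Return the minutes per year a system may be down at this availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% available -> {downtime_minutes_per_year(pct):.1f} minutes of downtime per year")
# 99.999% ('five nines') allows only about 5.3 minutes of downtime per year.
```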

The major proponents of this thinking are found not just in the world of ecommerce or banking, but in the modish world of Web 2.0, where online pace-setters such as Facebook and MySpace store enormous quantities of data in the form of pictures, audio files and video, all of which users expect will be accessible all of the time.

A new generation of user is emerging – one that expects access to their data to be guaranteed as a matter of course. Downtime is unthinkable; so too is the idea that a user may be served up the wrong data – as the owners of photo-sharing website Flickr recently found out to their cost when a software upgrade resulted in users’ being presented with other people’s photos. (Attuned to the new levels of user expectation, Flickr responded with a rapid fix and gushing apologies.)

But can the experiences of Web 2.0 hold any lessons for the more prosaic world of enterprise IT? Mike Workman, CEO of storage systems company Pillar Data Systems, certainly thinks so: the new expectations of information availability “are a very real-world thing”, he says.

Web 2.0 companies may be extreme examples, he says, but they are coping with problems, such as unpredictable network demands and maintaining application performance at times of peak demand, which are recognisable within the enterprise.

Others have the same perception, predicting a reaction. “This is going to transform how enterprises build their application infrastructure,” suggests Mark Lewis, chief development officer at enterprise storage titan EMC.

One organisation that demonstrates this is parcel delivery giant TNT Express. “We’re a transport company,” says Mike Harding, infrastructure development manager at TNT Express UK, “we’re not trying to push the boundaries of technology.”

And yet the company is an exemplar of the new enterprise realities. The business is characterised by short bursts of “very intense activity”: as one of its cargo planes lands, the packages must be unpacked, logged and sorted; customers also expect to be able to track their parcels wherever they are in the world. The key criteria for the storage environment that supports this are availability and scalability, explains Harding.

Thinking thin

Achieving such scalability is no mean feat. Many businesses have hitherto struggled with scalability, notes Andrew Manners, UK head of storage at systems giant Hewlett-Packard. Storage has been assigned on a theoretical assumption about the maximum capacity a given application may need – physical storage is allocated to a logical volume when the volume is created, which in effect means businesses have been “over-provisioning up-front”.

The concept of ‘thin provisioning’ is going to “transform the economics around storage”, argues Craig Nunes, executive vice president of marketing at 3Par, a storage company that is pioneering the idea. He is dismissive of the “chubby” provisioning approach favoured by several of 3Par’s more traditional rivals.

Users gain real benefits from thin provisioning not only when storage capacity is allocated as it is needed, but when that allocation is done automatically – something he claims other companies have yet to master. “We believe that freeing up administrators’ time is an essential element of reducing costs,” he says.
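The distinction is easy to sketch in code. The following is a purely illustrative model, not any vendor’s implementation: a ‘fat’ volume claims its full physical capacity the moment it is created, while a thin volume draws physical extents from a shared pool only as data is actually written.

```python
# Illustrative model of fat vs thin provisioning (not any vendor's actual code).

class StoragePool:
    def __init__(self, physical_gb: int):
        self.physical_gb = physical_gb
        self.allocated_gb = 0

    def allocate(self, gb: int) -> None:
        if self.allocated_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted - time to add physical disks")
        self.allocated_gb += gb

class FatVolume:
    """Physical capacity is claimed up-front, when the volume is created."""
    def __init__(self, pool: StoragePool, logical_gb: int):
        pool.allocate(logical_gb)
        self.logical_gb = logical_gb

class ThinVolume:
    """Physical capacity is drawn from the pool only as data is written."""
    def __init__(self, pool: StoragePool, logical_gb: int):
        self.pool = pool
        self.logical_gb = logical_gb
        self.used_gb = 0

    def write(self, gb: int) -> None:
        if self.used_gb + gb > self.logical_gb:
            raise RuntimeError("logical volume full")
        self.pool.allocate(gb)   # physical extents allocated on demand
        self.used_gb += gb

pool = StoragePool(physical_gb=1000)
vols = [ThinVolume(pool, logical_gb=500) for _ in range(4)]  # 2 TB promised...
vols[0].write(120)
print(pool.allocated_gb)  # ...but only 120 GB of physical capacity consumed so far
```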

The enthusiasm for thin provisioning is justified, notes Stanley Zaffos, an analyst with IT advisory group Gartner: its impact will be “profound”, he acknowledges. But because the technology relies on systems that support virtualisation of the back-end storage disks, it “has yet to be retrofitted into older storage systems.”

Nevertheless, the pressing need to rein in storage costs may eventually persuade CIOs to invest in state-of-the-art storage systems capable of thin provisioning. The savings certainly look impressive. According to Zaffos’s calculations, as well as providing a cost-effective method of scaling storage resources, thin provisioning should also improve utilisation rates, reducing the number of physical disks required to support a given workload. That, in turn, also reduces power and cooling costs.
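
The arithmetic behind that claim can be sketched quickly; the figures below are illustrative assumptions for the sake of the example, not Zaffos’s own numbers.

```python
# Illustrative only: how higher utilisation reduces disk count and power draw.
workload_tb = 100        # data the business actually needs to store (assumed)
disk_tb = 0.5            # capacity per drive (assumed)
watts_per_disk = 12      # power draw per drive, incl. cooling overhead (assumed)

def disks_needed(utilisation: float) -> int:
    return round(workload_tb / (disk_tb * utilisation))

for util in (0.3, 0.7):  # over-provisioned vs thin-provisioned utilisation (assumed)
    n = disks_needed(util)
    print(f"{util:.0%} utilisation: {n} disks, ~{n * watts_per_disk} W")
```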

As recently as two years ago, power and cooling did not appear on the storage radar, but today they are uppermost concerns for all CIOs, notes EMC’s Lewis. By some estimates, 50% of data centres could have insufficient power and cooling capability within the next 12 months. This has also convinced many IT executives to look at data centre consolidation and virtualisation as techniques for achieving efficiencies, says Lewis. And the ramifications of this trend will be dramatic.

Information overload

EMC is predicting that the benefits of data centre consolidation will compel enterprises to build so-called ‘information power plants’, where the vast majority of data – both structured and unstructured – will be centrally stored. This aggregation of data is the only cost-effective way to shield the business from the impact of rampant data growth, explains Lewis, but it comes replete with its own set of challenges.

If the overwhelming majority of enterprise data is stored in one place, the pressure to deliver an always-on service increases. Likewise, by building such a richly stocked data store, CIOs will have to impose commensurate access controls. If information needs to be available across an enterprise – and most probably across an ‘extended’ enterprise that includes partners and suppliers – information assets need to be secure at every point, says Alan Laing, storage technology specialist at systems software maker CA.

The concept of end-to-end information asset security illustrates how the storage world is changing. An old-world view of storage would have a storage array provisioned for a given application; the introduction of network-attached storage, which allowed storage capacity to be pooled among many applications, first changed that model. But now even the way applications are being built is changing: a service-oriented approach to enterprise applications suggests that a single application may in reality be composed from smaller chunks of several unrelated applications. These composite applications may even be running within virtual environments, rather than on dedicated hardware. The notion of having chunks of data directly associated with specific applications is therefore beginning to look untenable.

In this new world, however, if business leaders are to secure those assets, they will need to understand a great deal more about individual data assets: which business processes may call upon the data, and which users have legitimate cause to access it, says Lewis. In effect, storage management systems and applications will need some method of understanding how information assets tie in with an end-to-end business process – most probably through the use of metadata.
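
What that metadata might look like is still an open question, but a speculative sketch, with hypothetical field names, gives a flavour of how an information asset could be tagged with the business processes that use it and the roles entitled to see it.

```python
# Hypothetical metadata record tying an information asset to business context.
# Field names are illustrative, not any product's schema.
from dataclasses import dataclass, field

@dataclass
class AssetMetadata:
    asset_id: str
    classification: str                      # e.g. "public", "internal", "restricted"
    business_processes: list[str] = field(default_factory=list)
    entitled_roles: list[str] = field(default_factory=list)

    def may_access(self, role: str) -> bool:
        return self.classification == "public" or role in self.entitled_roles

# Illustrative asset, loosely modelled on the parcel-tracking example above.
consignment_scans = AssetMetadata(
    asset_id="consignment-scans/2007-06",
    classification="restricted",
    business_processes=["parcel-tracking", "customer-billing"],
    entitled_roles=["depot-operations", "customer-service"],
)
print(consignment_scans.may_access("customer-service"))  # True
print(consignment_scans.may_access("marketing"))         # False
```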

What is exciting about web services is that they give businesses a far more flexible and responsive method of automating business processes. But at the same time, they demand that enterprises become far more “information-centric”, says Lewis.

In fact, he concludes, that is the real lesson today’s business leaders can draw from the rash of Web 2.0 companies: it is their ability to deliver customisable, extensible services – whether it be embedding Google Maps into parcel delivery applications or adding weather forecasts to fishing boat hire websites – using metadata to enrich the quality of the information provided, that is lighting the way for today’s enterprise leaders. “We’re moving to an information-centric world,” says Lewis; it’s time the enterprise caught up.

 

The dawn of the ‘information power plant’

Q&A with Joe Tucci, CEO of EMC

As CEO and chairman of EMC, Joe Tucci has an unparalleled vantage point on the seismic changes underway in the storage market. In order to stay ahead of the curve, Tucci has implemented significant changes at the enterprise storage market leader, transforming the company from a high-end array vendor to what he calls an ‘information infrastructure company’, which has pushed beyond its storage hardware roots into areas of technology that include security, document management and virtualisation software. 

Tucci’s vision is of a world that is on the cusp of revolutionary changes to the way information is dealt with. The exponential growth of unstructured data that businesses are required to store will inevitably reshape the storage landscape, making the task of managing data more complex. Soon, predicts Tucci, 85% of all data will be stored in “information power plants”.

Information Age caught up with Tucci at a recent EMC event, and asked him to explain the thinking behind his vision for data, and the ramifications this will have on storage in the enterprise.

Information Age (IA): You have overseen some major changes at EMC in the last few years. Can you explain the reasons behind these changes?

Joe Tucci (JT): I think it’s really important as a business to understand where you want to play. Clearly, a few years ago we were in data – I wanted to stay on the infrastructure side but there’s real business benefit you can drive from data. From data you beget knowledge, and from knowledge you beget business benefit. So instead of building the world’s most innovative storage – which of course we still want to do – we needed to look at what else can we do since we have that data and the means to understand that data? I wanted us to be an information company, and as we’ve done that it has unleashed tremendous opportunities for us and our customers.

IA: How does that change in focus, from data to information, manifest itself in how your customers behave?

JT: For any business out there, the bottom line is that you are generating more data. If you look at the rate of data growth you can predict that by 2010 there will be about 988 billion gigabytes of electronic data being stored globally. If you look at numbers like that, you can see we need to be smarter at how we deal with those volumes.

So I wasn’t comfortable with us staying at the data layer. I think that if we do our job correctly at the information infrastructure layer, then it makes it easier to overlay knowledge that can help support the business. So if we can help you understand what information you’re storing, then you can set better policies around it. Using automated data management you can then save money by moving data with lower business-value on to cheaper storage.
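
The kind of policy Tucci describes can be sketched in a few lines; the tier names and age thresholds below are illustrative assumptions, not EMC product defaults.

```python
# Illustrative tiering policy: move low-business-value data to cheaper storage.
# Thresholds and tier names are assumptions, not EMC product behaviour.
from datetime import datetime, timedelta

def choose_tier(last_accessed: datetime, business_value: str, now: datetime) -> str:
    age = now - last_accessed
    if business_value == "high" or age < timedelta(days=30):
        return "tier-1 (fast, expensive disk)"
    if age < timedelta(days=365):
        return "tier-2 (capacity disk)"
    return "tier-3 (archive)"

now = datetime(2007, 6, 1)
print(choose_tier(datetime(2007, 5, 20), "low", now))   # tier-1: accessed recently
print(choose_tier(datetime(2006, 9, 1), "low", now))    # tier-2: months old
print(choose_tier(datetime(2004, 1, 1), "low", now))    # tier-3: archive
```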

IA: If we are to accept that the volumes of enterprise data will continue to grow, does that inevitably mean that companies will spend more of their IT budget on storage?

JT: I think if you look at how quickly data volumes are growing and compare it to how quickly the storage industry is growing – analysts say about 7% a year – I don’t think it sounds like a terrible deal. Sure, businesses are creating more data, but we are helping them to manage that data more effectively.

For example, by using data de-dupe [data de-duplication programs] businesses can make sure that they are not spending money unnecessarily. Our customers have been able to dramatically cut the amount of storage capacity they require by ensuring that they are not storing multiple versions of the same data.
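
The principle behind de-duplication is simple enough to sketch: store each unique block of content once, keyed by a hash of its contents, and keep only references for repeat copies. The snippet below is a minimal illustration, not a description of any EMC product.

```python
# Minimal content-hash de-duplication sketch (illustrative, not a product implementation).
import hashlib

class DedupeStore:
    def __init__(self):
        self.blocks = {}          # hash -> block contents (stored once)
        self.files = {}           # filename -> list of block hashes

    def put(self, name: str, data: bytes, block_size: int = 4096) -> None:
        hashes = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)   # duplicate blocks stored only once
            hashes.append(digest)
        self.files[name] = hashes

    def physical_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())

store = DedupeStore()
attachment = b"quarterly report" * 1000
store.put("alice/report.doc", attachment)
store.put("bob/report-copy.doc", attachment)        # same content, second logical copy
print(store.physical_bytes(), "bytes stored for two logical copies")
```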

IA: When talking about the value of information, EMC has stressed the importance of restricting access to that information. Nevertheless, businesses have yet to embrace the concept of encrypting data at rest. Why is this?

JT: There are lots of reasons why: historically, if you want to encrypt data then you are looking at using extra hardware or taking a performance hit. But we’re working on this – products are probably not less than two years away, but we’re looking at ways to make sure encryption becomes a no-cost item on an array.

That way our customers can choose to encrypt their data at the application level, the database, on switches – wherever they choose. I see [another] role [for us] as providing centralised key management, because when you have that level of encryption, you also need a way to manage the encryption keys.
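
The pattern Tucci outlines, encrypting data wherever it lives while holding the keys centrally, can be sketched as follows. The example uses the open-source Python ‘cryptography’ library, and the key-manager class and its methods are hypothetical, not EMC APIs.

```python
# Illustrative pattern: data encrypted at the application layer,
# keys held by a central key manager. Uses the open-source 'cryptography' library;
# the CentralKeyManager class and its methods are hypothetical, not EMC APIs.
from cryptography.fernet import Fernet

class CentralKeyManager:
    """Hypothetical central key store: one place to create, fetch and retire keys."""
    def __init__(self):
        self._keys = {}

    def create_key(self, key_id: str) -> None:
        self._keys[key_id] = Fernet.generate_key()

    def get(self, key_id: str) -> Fernet:
        return Fernet(self._keys[key_id])

km = CentralKeyManager()
km.create_key("customer-records-2007")

# The application encrypts before the data ever reaches the array...
ciphertext = km.get("customer-records-2007").encrypt(b"card ending 4421")

# ...and any authorised component can decrypt, provided the key manager allows it.
plaintext = km.get("customer-records-2007").decrypt(ciphertext)
print(plaintext)
```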
