The cost of data

Ask the representatives of almost any software or hardware vendor why they think organisations’ storage infrastructures are in such disarray and they will proclaim in unison: because they haven’t tackled ballooning end-user storage demand by implementing a consolidation programme or a storage area network (SAN).

But ask Gartner analyst Josh Krischer and the answer is very different: “In many cases it’s just poor storage management,” he says, “particularly in Unix and Windows environments. To compensate, organisations that are unable to manage their storage in an effective way have simply thrown more storage at the problem.”

Key storage challenges for IT management

  • Managing growth in demand for storage at a time of stagnating or shrinking budgets

  • Determining the real cost of installed storage

  • Managing an increasingly heterogeneous storage environment

  • Improving total cost of ownership, perhaps through consolidation

  • Choosing the right strategy for storage consolidation

  • Deploying a storage area network

  • Drawing up a workable business continuity plan

    Source: Gartner

    Krischer’s comments may seem harsh to many users. Lawrence Harvey, for example, director of technology support at credit reference agency Experian, has seen such an explosion in demand for Experian’s services that the company’s storage requirements are doubling every year. “We were experiencing 100% annual growth in server-based storage. Each time we introduced a new server, we needed to install new storage,” says Harvey.

    Rather than rushing to implement a SAN or any other heavily touted panacea, Krischer believes organisations first need to conduct a methodical audit of their existing resources.

    Next, the organisation needs to find out how much each element identified in that audit is really costing it. Over and above the easily assessed costs of hardware, software and maintenance contracts, that figure must take into account apparently hidden costs.

    Those hidden costs can be substantial. For every pound an organisation spends on hardware, it will spend an average of four pounds on the less visible costs associated with running that hardware. In some organisations, that figure can be as high as ten pounds, says Krischer.
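
    To make that arithmetic concrete, the short sketch below applies Krischer’s rule of thumb to a hypothetical hardware budget. Only the multipliers come from the figures above; the £250,000 budget and the function itself are illustrative assumptions.

        # Illustrative sketch of the hidden-cost rule of thumb quoted above:
        # every £1 of hardware spend attracts roughly £4 of less visible
        # running costs, and up to £10 in poorly managed environments.
        # The £250,000 budget is a hypothetical example, not a figure
        # from the article.

        def total_storage_cost(hardware_spend_gbp, hidden_cost_multiplier=4.0):
            """Estimated total cost: hardware plus hidden running costs."""
            return hardware_spend_gbp * (1 + hidden_cost_multiplier)

        hardware = 250_000
        print(total_storage_cost(hardware))        # average case: 1,250,000
        print(total_storage_cost(hardware, 10.0))  # worst case:   2,750,000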

    Complex equation

    “Most companies do not know how much storage is costing them,” he says. “They look at the hardware price, the software price and some of them look at maintenance costs, but they really don’t look at the full storage management costs.”

    For example, says Krischer, few companies take into consideration the effect of storage-related application downtime – both in terms of the manpower required to deal with it and the effect on the company’s productivity.

    This might include something as simple as the cost of taking the application off-line in order to upgrade the disk capacity of the departmental Windows or Linux server on which it runs. Not only will that require the server to be taken down, but it will also involve backing up the existing drive and restoring the data when the new disk drives are in place.

    The hidden costs of storage

  • Storage management costs for: capacity planning and allocation; configuration changes; performance evaluations; back-up and restore; data protection; staff training.

  • Cost of downtime (both planned and unplanned)

  • Cost of data recovery

  • Costs of business damage

    Source: Gartner

    At a more granular level, Gartner’s costing models also include the time spent by purchasing and legal departments on what can often be multi-million pound procurements.

    The staffing costs of storage are frequently underestimated at several levels, says Claus Egge, research director for European storage systems at industry watcher IDC. “There is a misconception in the industry that most companies have dedicated storage managers,” says Egge. IT directors typically feel that their staff are more efficiently engaged in implementing new technology than in managing existing technology. As a result, numerous IT staff – even in large organisations – find themselves doubling as storage managers in all but name, says Egge.

    Other factors should also be part of the total storage cost calculation. Electricity consumption, for example, can make a surprisingly big difference to running costs, says Bob Plumridge, European director of software product management at Hitachi Data Systems (HDS). He cites an HDS study that compared the running costs of an HDS Lightning 9980v with those of an equivalent competitor: because the rival consumed twice as much electricity, the study put the difference in annual energy costs at $12,589 (in HDS’ favour, of course).

    The study also highlighted another often-overlooked element: floor space. It suggested a difference of some $43,011 in annual floor-space costs between the HDS product and a much bulkier immediate rival.
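
    That kind of comparison can be reproduced at a back-of-the-envelope level. The sketch below estimates the annual energy bill of two arrays from their power draw; the draws and the electricity tariff are hypothetical assumptions, not figures from the HDS study.

        # Back-of-the-envelope annual energy cost for a storage array.
        # Power draws and tariff below are hypothetical illustrations,
        # not the figures from the HDS study cited above.

        HOURS_PER_YEAR = 24 * 365

        def annual_energy_cost_usd(power_draw_kw, tariff_usd_per_kwh=0.10):
            """Cost of running an array around the clock for a year."""
            return power_draw_kw * HOURS_PER_YEAR * tariff_usd_per_kwh

        array_a = annual_energy_cost_usd(7.0)    # ~$6,132 per year
        array_b = annual_energy_cost_usd(14.0)   # twice the draw, twice the bill
        print(round(array_b - array_a))          # difference of roughly $6,132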

    Job on the line

    This auditing and costing process can cut both ways, says Krischer. He was called in when the IT director of a company in Switzerland was accused by his board of mismanaging the company’s storage infrastructure after its storage capacity grew by more than 100% in a year.

    But the Gartner analysis found that the company was, in fact, running its storage systems more efficiently than the industry average – saving the IT director his job and the company a costly storage re-engineering that would have brought few benefits.

    Whatever the outcome, once the auditing and costing process is complete, an organisation can start to examine how its storage infrastructure could be managed more efficiently. For example, the exercise gives a picture of how much data is being kept permanently online and how much of that could be shifted to cheaper disk systems.

    Furthermore, business decisions can be made about back-up and business continuity. For example: does certain data really need to be duplicated for redundancy and mission-critical protection? Should certain data be backed up to tape instead of a disk system? And how frequently do back-ups really need to be made?

    One of the core tools that many organisations are using to help them answer such questions – as well as help with ongoing management – is a storage resource management (SRM) software package.

    The efficiency imperative

    “Many CIOs are trying to make their infrastructure look like that of an Internet service provider (ISP) or application service provider (ASP) because that’s who they are going to be measured against in the future,” says Tikiri Wandaraqula, a server and storage consultant at IBM.

    Quite simply, as the Internet enables more and more applications and services to be delivered by third-party providers, IT managers will be under pressure to present their department’s costs more transparently and demonstrate that they offer similar or better value.

    “In the past, there was no means of measuring the effectiveness of an IT department. Now, there are companies that will provide these services at a unit cost,” says Wandaraqula.

    At the same time, analysts, such as IDC’s European storage systems research director Claus Egge, are recommending a radical change in the way that storage is costed and paid for. “I’m promoting things like invoicing or charging back the costs of storage to the departments that are actually consuming the capacity,” says Egge.

    In this way, not only will it become clearer how much each department’s storage demands actually cost, but IT managers will also find it easier to win support for future storage consolidation and efficiency plans, since their implementation will save departmental heads money (a simple chargeback calculation is sketched below).

    However, doing so is not easy, warns Jason Phippen, head of products and solutions marketing at storage software supplier Veritas. “The cost of working out costs and mapping them down to a line of business is a big challenge,” he says, and one that may not be addressable in the near term.
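
    Picking up Egge’s chargeback suggestion, the sketch below shows one minimal way a storage team might turn per-department capacity figures into an internal invoice. The department names, the per-gigabyte rate and the function are illustrative assumptions, not anything prescribed by IDC or Veritas.

        # Minimal internal chargeback sketch: bill each department for the
        # storage capacity it consumes. All figures are hypothetical.

        CAPACITY_USED_GB = {
            "marketing": 1_200,
            "finance": 800,
            "engineering": 4_500,
        }

        RATE_GBP_PER_GB_PER_MONTH = 0.05  # assumed fully loaded rate

        def monthly_invoice(capacity_gb):
            """Charge-back amount for one department for one month."""
            return capacity_gb * RATE_GBP_PER_GB_PER_MONTH

        for dept, used_gb in CAPACITY_USED_GB.items():
            print(f"{dept}: £{monthly_invoice(used_gb):.2f}")
        # marketing: £60.00, finance: £40.00, engineering: £225.00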

     

     

    Such a package auto-discovers the various elements of an organisation’s storage infrastructure and then examines each device for such information as the type of data stored, when it was created and when it was last accessed.

    As a result, policies can be set on data storage that free up substantial amounts of disk space, for example, by shifting data that has not been accessed for several months to a cheaper device, such as a tape back-up library.
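
    As an illustration of that kind of age-based policy, the sketch below walks a directory tree and flags files that have not been accessed for a given number of days as candidates for migration to cheaper storage. The 180-day threshold and the scan path are hypothetical assumptions, and commercial SRM products apply such rules automatically rather than through a script like this.

        # Illustrative age-based tiering policy: flag files not accessed for
        # N days as candidates for migration to cheaper storage (e.g. tape).
        # The threshold and scan path are hypothetical.
        import os
        import time

        AGE_THRESHOLD_DAYS = 180
        SCAN_ROOT = "/data/departmental_share"   # hypothetical path

        def migration_candidates(root, threshold_days=AGE_THRESHOLD_DAYS):
            cutoff = time.time() - threshold_days * 86_400
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        if os.stat(path).st_atime < cutoff:   # last access time
                            yield path
                    except OSError:
                        continue   # file vanished or unreadable; skip it

        for candidate in migration_candidates(SCAN_ROOT):
            print("migrate:", candidate)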

    Equally interestingly, SRM can also help organisations draw up and implement enterprise-wide usage policies for staff. Specifically, it gives storage managers the power not only to set a rule that outlaws the storage of music, photo and video files on the corporate network, but also to search such files out and automatically erase them.

    According to Karen Dutch, vice president of product management at storage software supplier Fujitsu Softek, one of the company’s customers was able to free up 30% of its disk capacity in one week simply by banning unauthorised MP3, JPEG and other multimedia files and telling staff that it was installing storage resource management software to police the policy.
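
    A very simple version of that kind of policing can be sketched as follows: scan a share for common multimedia extensions and report how much space they occupy before anyone decides what to delete. The extension list and path are assumptions for illustration; commercial SRM tools handle this discovery and enforcement themselves.

        # Illustrative policy check: find unauthorised multimedia files on a
        # share and report the space they occupy. Reporting only; deletion
        # would be a separate, deliberate step. Path and extensions are
        # hypothetical.
        import os

        BANNED_EXTENSIONS = {".mp3", ".jpg", ".jpeg", ".mp4", ".avi", ".wav"}
        SCAN_ROOT = "/corp/file_share"   # hypothetical path

        def banned_files(root):
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    if os.path.splitext(name)[1].lower() in BANNED_EXTENSIONS:
                        yield os.path.join(dirpath, name)

        total_bytes = 0
        for path in banned_files(SCAN_ROOT):
            try:
                total_bytes += os.path.getsize(path)
            except OSError:
                pass
        print(f"Reclaimable space: {total_bytes / 1024**3:.1f} GB")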

    Automatic action

    However, while the implementation of SRM can provide a significant one-off boost, it is subject to the law of diminishing returns. “You get a big payback upfront, but you cannot reclaim a similar percentage each month because you’ve done away with all the MP3 files and all that kind of nonsense,” says Egge.

    Furthermore, in order to make the most of storage resource management software, organisations need to automate it, so that as little time as possible is spent managing it.

    “Long-term, SRM is no good unless it gets automated,” says Egge. Indeed, many users complain that the automation aspect of many SRM packages still leaves a lot to be desired. In a major organisation, getting the most from such a tool normally requires a sufficiently clued-up storage manager who understands the storage needs of the company’s various applications and can tune it accordingly.

    Users, such as Paul Rogers, the head of IT production at investment bank Tokyo-Mitsubishi International, have found that implementing SRM software has helped buy some time before having to embark on a bigger and more expensive project such as storage consolidation.

    Given that the cost of storage per megabyte continues to fall by some 45% every year, according to IDC, the delay meant that Rogers was able to get more for his money when the time came to consolidate.
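
    To see what that 45% annual decline is worth, the sketch below compounds it over a hypothetical delay; only the decline rate comes from the IDC figure above, while the starting price is an arbitrary assumption.

        # Effect of a 45% annual fall in cost per megabyte on a delayed purchase.
        # The £0.02/MB starting price is a hypothetical illustration.

        ANNUAL_DECLINE = 0.45

        def price_after(years, price_today):
            """Projected price per MB after waiting the given number of years."""
            return price_today * (1 - ANNUAL_DECLINE) ** years

        today = 0.02                      # £ per MB, assumed
        print(price_after(1, today))      # ~£0.011 per MB after one year
        print(price_after(2, today))      # ~£0.006 per MB after two years:
                                          # roughly 3.3x the capacity per pound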

    Consolidation push

    Storage consolidation involves moving data from multiple servers and storage devices to fewer, centrally located storage arrays. A number of servers then share these arrays, which are either network-attached storage (NAS) devices or grouped in a storage area network (SAN) – or both.

    The main benefits for companies embarking down this route are simplified storage management, wider access to data for end users and lower overall running costs – savings which, despite heavy upfront costs, can be substantial.

    Storage consolidation is typically triggered by a programme of server consolidation. “If you start off with the servers as priority number one, then it would not make any sense if you didn’t think about how you could then consolidate your storage,” says Egge.

    He also suggests that server and storage consolidation is nothing new, but a regular activity of a well-run IT department. “Consolidation is not a project, it’s a process. You should continually return to the issue of consolidation,” he says.

    Tikiri Wandaraqula, a server and storage consultant at IBM, goes further. He believes that consolidation should involve three parallel projects, covering the network as well as servers and storage.

    Such projects should always start with a thorough inventory of the IT infrastructure, he counsels. Companies should also undertake an analysis of the network, because if there are any performance bottlenecks on the network, a programme of storage consolidation may only make it worse by concentrating servers and their associated storage devices in fewer locations.

    “You cannot consolidate if you do not have high-bandwidth communications,” says Wandaraqula. This is particularly true if storage is being consolidated on network-attached storage devices. “You are still using the same network, so network bandwidth can be badly affected,” he adds.
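
    Wandaraqula’s warning can be tested with some very rough arithmetic before consolidation work starts, by comparing the peak storage traffic of the servers being consolidated with the capacity of the link they would share. Every number in the sketch below is a hypothetical assumption.

        # Rough check of whether a network link can absorb consolidated storage
        # traffic. All figures are hypothetical illustrations.

        LINK_CAPACITY_GBPS = 1.0          # assumed shared uplink
        SERVERS_CONSOLIDATED = 12
        PEAK_IO_PER_SERVER_MBPS = 120     # assumed peak storage traffic per server

        peak_demand_gbps = SERVERS_CONSOLIDATED * PEAK_IO_PER_SERVER_MBPS / 1000
        utilisation = peak_demand_gbps / LINK_CAPACITY_GBPS

        print(f"Peak demand: {peak_demand_gbps:.2f} Gbps "
              f"({utilisation:.0%} of the link)")
        # Peak demand: 1.44 Gbps (144% of the link) – consolidating onto shared
        # storage over this link would saturate it, so the network comes first.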

    As a result, consolidation may not initially be cheap. Wandaraqula suggests that organisations consider different financing options, such as leasing, as a means of spreading the cost. But in many cases, customers adopt a cautious approach. “Initially, they will just do file and print or just database consolidation – whatever is giving them the most pain,” says Wandaraqula. “When that is successful, then they will move on to other parts of their infrastructure.”

    The key benefit of consolidation should be the reduction of ongoing running costs and improved ease of management. “After the initial cost bump, the majority of the savings will be on the management side. There will also be an upside in terms of [data] availability,” says Wandaraqula.

    Studies by HDS of some of its major customers, including a US commercial airline and a major European bank, suggest a payback from consolidation projects of between two and two and a half years. But IDC’s Egge has seen organisations that have achieved a payback within a year.

    There are a number of war stories from organisations that have failed to reap much benefit from their costly storage consolidation programmes. Often, believes Fujitsu Softek’s Karen Dutch, that is because IT departments fail to change their processes accordingly.

    “Customers would implement a SAN and a year later we would go back and capacity utilisation would still be running at 30%. When we investigated, we found they were still doing everything as if it was still direct-attached,” says Dutch. The message: consolidated or networked storage architectures need to be costed carefully but, just as importantly, they need to be well implemented and well managed.
