It is time for companies to “get smart” about the way they purchase and implement data storage, says Randy Chalfant, chief technologist of storage systems supplier StorageTek. “Everywhere I go, I hear CIOs complaining about high storage costs and troublesome storage management tasks. The simple fact is, they’re not doing it right,” he says.
Chalfant is an outspoken proponent of information lifecycle management or ‘ILM’ — a term used to describe the application of technologies and processes to the management of information across its complete lifecycle, from creation and use to archiving and disposal, in a way that recognises that the value of data to an organisation changes over time. His is not a lone voice. The term ILM is used not only by StorageTek, but by major rivals including IBM, Hewlett-Packard and EMC — although Chalfant claims that StorageTek originally coined it.
Despite the marketing hype surrounding ILM, there is a solid business case for its adoption, resting on three pertinent pressures.
First, the volume of data under management has reached such levels that storing, tracking and retrieving information has become daunting. Recent research from IT analyst company the Meta Group indicates net annual storage growth will average between 20% and 25% for enterprise (monolithic) storage, 50% to 55% for midrange (modular) storage, and between 80% and 85% for low-cost, capacity-based (SATA/ATA) storage, yielding aggregate storage growth of 45% per year. Compounded over five years, those rates leave many times more storage capacity on the floor to manage: more than sixfold at the aggregate rate, and over twentyfold at the fastest-growing, low-cost tier. “Without enhanced storage processes, management, and automation, effective utilisation of storage assets will remain a major data centre issue,” says Carl Greiner, a Meta Group analyst.
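The compounding behind those projections is easy to verify with a few lines of arithmetic. The sketch below uses the midpoints of the Meta Group ranges quoted above; the tier labels and rates are taken from the article, not from any ILM product.

```python
def five_year_multiplier(annual_growth: float, years: int = 5) -> float:
    """Capacity multiplier after compounding a constant annual growth rate."""
    return (1 + annual_growth) ** years

# Midpoints of the Meta Group ranges cited above.
rates = {
    "enterprise (monolithic)": 0.225,
    "midrange (modular)": 0.525,
    "low-cost (SATA/ATA)": 0.825,
    "aggregate": 0.45,
}

for label, rate in rates.items():
    print(f"{label}: x{five_year_multiplier(rate):.1f} in five years")
```

At the 45% aggregate rate this yields roughly a 6.4-fold increase over five years; the 82.5% low-cost rate compounds to about 20-fold.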
Second is the demand for high availability — users want rapid access to data of varying ages, with no threat of it disappearing forever. Third is the increased pressure on organisations to meet regulatory requirements and ensure better corporate governance. That means holding more data for longer, compounding the storage headache.
The typical response to these pressures, says Chalfant, is a knee-jerk reaction based on maintaining the status quo: storage managers buy more disk, continue to back up all data in the same manner and manage swelling data volumes through established, labour-intensive processes. That is, perhaps, understandable: the cost per gigabyte of physically storing data is lower than it has ever been and continues to fall. However, it is not a sustainable approach, says Chalfant.
ILM works on the principle that enterprise storage options vary widely in performance and cost. It exploits that diversity by matching data to different tiers of the storage hierarchy according to its business value: as data ages and its value to the organisation declines, it migrates to cheaper, slower media. With an appropriate ILM process in place, more data can be managed without proportional growth in budget. The end result: better management of data at a lower overall cost per managed gigabyte.
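That tiering principle can be sketched as a toy placement policy. The tier names, age thresholds and indicative costs below are invented for illustration, and age since last access stands in for "business value"; real ILM software classifies data on much richer criteria.

```python
from datetime import datetime, timedelta

# Hypothetical tiers, ordered fastest/most expensive first.
# (maximum data age, tier name, illustrative cost per GB)
TIERS = [
    (timedelta(days=30), "high-performance disk", 10.00),
    (timedelta(days=365), "midrange/SATA disk", 2.00),
    (timedelta.max, "tape archive", 0.25),
]

def assign_tier(last_accessed: datetime, now: datetime) -> str:
    """Place data on the cheapest tier its age (a proxy for value) allows."""
    age = now - last_accessed
    for max_age, tier, _cost in TIERS:
        if age <= max_age:
            return tier
    return TIERS[-1][1]
```

The point of the sketch is the cost structure: recently used data earns its place on fast disk, while year-old data sits on media costing a fraction as much per gigabyte, which is how ILM keeps cost per managed gigabyte from growing in step with volume.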
ILM also promotes better alignment between IT and the business as a whole, says Greiner of the Meta Group. “ILM pervades the organisation by combining content management with storage infrastructure to bridge the gap between technological functionality and business requirements,” he says.
In an age when information delivery is increasingly viewed as a service to the business, better service levels will be an attractive prospect to many CIOs.