There are many benefits that virtualisation technology can bring to the business: for many, the goal of increasing the utilisation rates of the server estate beyond the industry average of 20% is reason enough. Few IT executives, however, would look to virtualisation as a means of minimising the disruption caused by absent-minded employees leaving laptops in taxis or pubs. But that is just the kind of lateral thinking that occurred to Ken Kaban, technical services manager for the EMEA region at Australian drinks company Foster’s.
Foster’s has embraced virtualisation across the board – for its servers, its networks and its storage. It is also more forgiving than most when it comes to lost laptops; indeed its salespeople argue that such losses are to be expected in the line of work. “We’ve used virtualisation to back up laptops: we take remote snapshots and can bring them up in a virtual environment within an hour,” says Kaban.
Foster’s is hardly alone in the vanguard of virtualisation adoption. Ambitious plans at financial services giant Deutsche Bank will see “40% of the production environment virtualised” by the end of 2008, according to Stuart Haskins, chief technology officer at the German bank’s architecture and engineering group.
But it is not only at high-profile corporations that virtualisation is taking hold. It has also been enthusiastically adopted by smaller organisations such as the Association of Teachers and Lecturers (ATL), which was running out of space for servers in its Trafalgar Square headquarters and has used virtualisation to drive up utilisation rates. “It has allowed us to spend less time ‘feeding and watering’ the infrastructure, and concentrate on delivering applications,” says Ann Raimondo, head of IT at ATL.
Such is the positive sentiment surrounding virtualisation that some of its shortcomings are overlooked. For an industry that has frequently over-hyped new advances, that enthusiasm is understandable, notes Bob Sibley, senior infrastructure strategy consultant at banking titan HBOS. “There are some potential pitfalls when dealing with virtualisation, but for us, the upside is too dramatic to ignore,” he says.
While most technologists have embraced the idea that virtualising infrastructure can help the IT organisation become more responsive, it is clear that some issues, chiefly ones regarding resource management, remain unsolved.
For adherents of ITIL, the IT best practice framework, virtualisation poses some as-yet insoluble problems. Many organisations will use a configuration management database (CMDB) as the basis for tracking IT assets, but as Chris Swan, director of IT research and development at Credit Suisse, explains, that is difficult when physical assets are virtualised. “We’re still working on how we can put together a dynamic CMDB that works in a virtual environment,” he adds.
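The difficulty Swan describes is that virtual machines are created, migrated and destroyed far faster than manually registered asset records can keep up. One conceivable approach is to have the CMDB reconcile itself against a live hypervisor inventory rather than rely on static registration. The sketch below is a minimal illustration of that idea only, under invented names; it is not Credit Suisse’s implementation, and `reconcile` and its inventory feed are hypothetical.

```python
# Minimal sketch of a self-reconciling CMDB for a virtual estate.
# All names and data are illustrative, not a real product's API.
from dataclasses import dataclass

@dataclass
class ConfigItem:
    vm_name: str
    host: str    # physical host the VM currently runs on
    status: str  # "running" or "retired"

def reconcile(cmdb: dict, inventory: list) -> dict:
    """Update CMDB records from one hypervisor inventory snapshot.

    New VMs are added, migrated VMs have their host updated, and
    vanished VMs are marked retired rather than deleted, so an
    audit trail survives the churn of a virtual environment.
    """
    seen = set()
    for vm_name, host in inventory:
        seen.add(vm_name)
        if vm_name in cmdb:
            cmdb[vm_name].host = host
            cmdb[vm_name].status = "running"
        else:
            cmdb[vm_name] = ConfigItem(vm_name, host, "running")
    for name, ci in cmdb.items():
        if name not in seen:
            ci.status = "retired"
    return cmdb

cmdb = {}
reconcile(cmdb, [("web01", "esx-a"), ("db01", "esx-b")])
reconcile(cmdb, [("web01", "esx-b")])  # web01 migrated; db01 gone
```

The key design choice is that nothing is ever deleted: a configuration item that disappears from the inventory is retired, preserving the history that ITIL-style change and audit processes depend on.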
Other leaders in virtualisation deployment, meanwhile, are being taxed by the issue of software licences. Calculating a fair software licensing structure is a thorny enough issue in the physical world, says Stuart Tarrant, principal consultant at software vendor Symantec; in the virtual world, it is an order of magnitude harder. “What’s missing at this point is a toolset that can properly validate usage against a licence,” says Tarrant.
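The toolset Tarrant describes does not yet exist, but the underlying check is straightforward to state: given a log of when software instances start and stop across the virtual estate, find the peak number running concurrently and compare it with the entitlement. The sketch below illustrates that calculation with invented event data; it is not Symantec’s tooling.

```python
# Sketch: validating usage against a licence in a virtual estate,
# where instances appear and disappear quickly. Events and the
# entitlement figure are invented for illustration.

def peak_concurrent(events):
    """events: (timestamp, delta) pairs, where delta is +1 for an
    instance starting and -1 for one stopping. Returns the peak
    number of instances running at the same time."""
    running = peak = 0
    for _, delta in sorted(events):
        running += delta
        peak = max(peak, running)
    return peak

# Three VMs spin up the software before any of them shut down:
events = [(1, +1), (2, +1), (3, +1), (4, -1), (5, -1), (6, -1)]
licensed_seats = 2
usage = peak_concurrent(events)
compliant = usage <= licensed_seats
```

Even this toy version shows why the virtual case is harder: the answer depends on a complete, trustworthy event stream from every host, which is exactly what short-lived virtual machines make difficult to collect.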
It would be wrong to categorise software licensing as solely a virtualisation problem, says Ian Pratt, professor at Cambridge University Computer Laboratory. Changes in the way software is being deployed and the development of multi-core processors are forcing a rethink of historical licensing models based on a ‘per-user’ or ‘per-processor’ concept. The whole IT world is re-examining ways of licensing software to more accurately ensure “we only pay for what we use”, says Pratt.
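Pratt’s point about multi-core processors can be made concrete with a little arithmetic: the same server yields very different licence counts depending on whether a ‘per-processor’ vendor counts sockets or cores. The figures below are illustrative only, not any vendor’s actual terms.

```python
# Sketch: how multi-core hardware changes a 'per-processor'
# licence count depending on the counting unit. Illustrative only.

def licences_needed(sockets, cores_per_socket, unit):
    if unit == "socket":
        return sockets
    if unit == "core":
        return sockets * cores_per_socket
    raise ValueError(f"unknown licensing unit: {unit}")

# A two-socket, quad-core server:
by_socket = licences_needed(2, 4, "socket")  # 2 licences
by_core = licences_needed(2, 4, "core")      # 8 licences
```

A fourfold difference on identical hardware is why, as Pratt argues, the industry is re-examining licensing models: neither count necessarily reflects what the customer actually uses.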
Virtualisation may still carry some risk, says HBOS’s Sibley. “If, like us, there was a real need to do something now, you take virtualisation warts and all.”