The virtual desktop

Local government does not have a strong reputation for pioneering bleeding-edge IT architectures. Historically, budgets have been tight and projects often conservative in scope. But those kinds of limitations can often breed creativity – as Dundee City Council found when it took a gamble on advanced virtualisation technology to do away with desktop PCs.

By virtualising the PC estate in the data centre, Dundee is saving £150,000 a year in IT support, PC upgrades, server consolidation and licence fees, while cutting management overheads and enabling investment in the sort of equipment usually out of local councils’ reach.

Virtualisation – creating a software abstraction of underlying hardware resources – has been proving its value in the data centre since the mainframe era, enabling organisations to partition their servers and storage systems as multiple, logical units. Now, recent advances have made the prospect of virtualising the whole desktop estate – the notion of taking the complete functionality of desktop PCs and hosting it in the data centre – a reality.

While many large organisations are already making some use of virtualisation to pool the resources of commodity x86 servers or storage units, the adoption of desktop virtualisation is only getting underway. In most organisations it has so far been confined to helpdesk staff or those needing to run legacy applications on a newer PC with an incompatible operating system (OS). That seems certain to change.

Dundee uses application virtualisation technology, one of the most mature implementations on the desktop (see box below, PC virtualisation: The technologies). Programs run in a small local virtualisation layer without being installed on the operating system. A step up from traditional terminal services, it avoids conflicts, allows simpler centralised management and remote access, and requires less rewriting of applications.

Consolidation of departmental IT platforms into a corporate server farm was already well underway in Dundee as part of the e-government and freedom of information initiatives, using Citrix technology. But with 300 varied applications to push out to PCs, the project had hit a roadblock.

“Preparing the applications for virtualisation was a significant investment in terms of time, but we saw the benefits once they were rolled out.”

Ged Bell, Dundee City Council

“Around 40% of our desktops were thin client but we couldn’t increase that penetration because of the disparate applications running across the organisation,” says Ged Bell, IT implementation manager at Dundee City Council. “It was too risky to run them in a traditional Citrix environment.”

Using SoftGrid, an application virtualisation system from specialist vendor Softricity, 80% of the council’s 4,000 desktops are now managed centrally and deployed as thin clients. “SoftGrid allowed all the applications to run in a protected shell so they wouldn’t be able to impact others running on the servers,” says Bell.

Almost all of the applications can now run on thin clients (PCs stripped down to their bare essentials), with up to 60 active users per server. Bell admits there was “significant investment of time at the start” to prepare applications for virtualisation, but “we saw the benefits once the applications were rolled out”. These include simplified patching, better licence management and a 30% cut in helpdesk and support costs (on top of the 35% already saved from the initial move to thin clients).
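On the council’s own figures, the scale of the consolidation is easy to sketch: 80% of 4,000 desktops served at up to 60 active users per server, with support savings arriving in two rounds. The back-of-the-envelope calculation below is illustrative only; in particular, it assumes every thin-client user is active at once and that the 30% cut applies to the already-reduced cost base.

```python
import math

desktops = 4000
thin_client_share = 0.80      # 80% of the estate now deployed as thin clients
users_per_server = 60         # up to 60 active users per server

thin_clients = desktops * thin_client_share
servers_needed = math.ceil(thin_clients / users_per_server)
print(f"Thin-client desktops: {thin_clients:.0f}")
print(f"Servers needed if all users were active at once: {servers_needed}")

# Support costs: 35% saved in the initial move to thin clients,
# then a further 30% cut from application virtualisation.
remaining = (1 - 0.35) * (1 - 0.30)
print(f"Support cost remaining vs. the original baseline: {remaining:.0%}")
```

In practice concurrency is lower than 100%, so the real server count sits somewhere below that ceiling; the point is that a few dozen servers stand in for thousands of managed desktops.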

The result: Consolidation has freed up money for Dundee to increase levels of service in online payments and web access to council services. The council has also built a second computer room with full disaster recovery (DR) capabilities for the first time, with an infrastructure of IBM blade and zSeries servers and a storage area network. “These are enterprise-class servers, you don’t see them too often in local government,” says Bell, “but because we have taken out a lot of the cost at the edge on the desktop, we are able to invest in platforms that make colleagues [in other organisations] say, ‘How the hell did you do that?’”

But Dundee is not the only council venturing into virtualisation. Hampshire County Council consolidated a departmental architecture, allowing the central management of 4,500 thin clients and 3,500 Dell desktops across 400 sites, using Citrix Presentation Server, which applies virtualisation at the server level rather than on the desktop. The £40 million project is showing returns of £4 million a year, and the total cost of ownership of Hampshire’s desktops is a third lower than the national average for local government.

The John C Lincoln (JCL) network of hospitals and clinics in Phoenix, Arizona, has also employed virtualisation to aid a centralisation initiative. ClearCube blade PCs free up space in locations such as operating theatres, reduce downtime and aid compliance with US data security legislation.

With that system in place, virtualisation can now increase the utilisation of each blade. ClearCube’s Grid Centre software dynamically provisions the right blade power according to each user’s needs as they connect, and can ‘hot-swap’ users within minutes if a blade fails. Nurses doing their rounds no longer access a PC in each room but share one and a half blades, accessed from thin-client terminals.
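ClearCube does not publish how Grid Centre makes these decisions, but the behaviour described (matching a connecting user to a suitably powerful blade and re-homing that user when a blade fails) can be sketched in a few lines. Everything below, from the class names and capacity units to the failover policy, is a hypothetical illustration rather than ClearCube’s actual software.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Blade:
    blade_id: str
    capacity: int                      # arbitrary "power" units for this sketch
    healthy: bool = True
    assigned_to: Optional[str] = None

class BladePool:
    """Toy model of dynamic provisioning with hot-swap on blade failure."""

    def __init__(self, blades):
        self.blades = {b.blade_id: b for b in blades}

    def connect(self, user: str, needed: int) -> Blade:
        # Pick the smallest healthy, unassigned blade that meets the user's needs.
        candidates = [b for b in self.blades.values()
                      if b.healthy and b.assigned_to is None and b.capacity >= needed]
        if not candidates:
            raise RuntimeError("no suitable blade available")
        blade = min(candidates, key=lambda b: b.capacity)
        blade.assigned_to = user
        return blade

    def fail(self, blade_id: str) -> Optional[Blade]:
        # Mark the blade unhealthy and move its user elsewhere ("hot-swap").
        blade = self.blades[blade_id]
        blade.healthy = False
        user, blade.assigned_to = blade.assigned_to, None
        return self.connect(user, needed=1) if user else None

pool = BladePool([Blade("b1", 2), Blade("b2", 4), Blade("b3", 4)])
first = pool.connect("ward-terminal-3", needed=3)   # lands on a 4-unit blade
replacement = pool.fail(first.blade_id)             # user re-homed within the pool
print(first.blade_id, "->", replacement.blade_id)
```

The real product layers management, monitoring and session brokering on top of this kind of allocation logic; the sketch only captures the basic idea of treating blades as a shared, reassignable pool.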

Third wave

However, virtualisation’s proponents claim the technology can reach even further to solve one of the fundamental problems of enterprise IT: the sprawling, under-utilised, labour-intensive and expensive PC estate. “First there was the mainframe era, then PCs came along and IT lost control,” says Ken Knotts, senior technologist for ClearCube. “Now there’s a third wave bringing it back into the data centre, where IT gets back control but provides the same user experience.”

Many early adopters of such PC virtualisation have been driven by the need to improve desktop security. Financial services company Prudential UK used VMware’s ESX Server product and the Windows Remote Desktop Protocol (RDP) to provide full virtual XP machines, kitted out with up to 100 applications, to 850 outsourced service centre employees in Mumbai.

The software – and the data – resides in Prudential UK’s offices in Reading, where 60 HP ProLiant servers were divided into virtual machines. The company did not have to rewrite any applications, and it avoided the cost and space overheads of buying 850 new PCs. Crucially for data protection, no customer information is left on Mumbai workers’ machines. Other technologies could have achieved similar results, but virtualisation offers greater flexibility if the outsourced applications ever change. Prudential’s remote workers can also access virtual images of their desktops using VMware technology.

However, the licensing complications created by virtualisation did cause Prudential some trouble. Microsoft has a highly opaque licence policy for Windows running on virtual machines, only recently agreeing to provide support for them. In Prudential’s case, the company had to negotiate a custom agreement for its deployment.

Slice and dice

As well as cutting up one large server into many small desktop images, virtualisation can be used at the desktop level to help secure the most sensitive parts of individual machines. Baker Hill, a subsidiary of credit information giant Experian, uses VMware’s ACE desktop virtualisation technology to isolate partitions containing personal financial data, and the applications that manipulate it, on its company laptops. It is faster and more efficient, given the laptop’s constrained resources, to encrypt just this set of data rather than the entire PC’s files, and the data is safe even if the laptop is stolen or lost.
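Baker Hill’s mechanism is VMware ACE’s isolated partition, but the trade-off it exploits, encrypting only the sensitive data set rather than the whole disk, can be illustrated more generally. The sketch below uses a hypothetical directory layout and the Python cryptography package purely as an analogy; it is not how the VMware product works.

```python
# Illustrative only: encrypt just the directory holding sensitive financial
# records, leaving the rest of the laptop's files untouched.
# Requires: pip install cryptography
from pathlib import Path
from cryptography.fernet import Fernet

SENSITIVE_DIR = Path("customer_financials")   # hypothetical path for this sketch

key = Fernet.generate_key()   # in practice, kept off the laptop or in a hardware token
cipher = Fernet(key)

for path in SENSITIVE_DIR.rglob("*"):
    if path.is_file():
        path.write_bytes(cipher.encrypt(path.read_bytes()))

# Encrypting one directory of records is far cheaper than full-disk encryption
# on a resource-constrained laptop, and a thief without the key gets nothing
# useful from the protected data.
```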

“Running enterprise desktops in a VMware environment has become incredibly popular with customers recently, especially for use of offshore infrastructure,” says Raghu Raghuram, director of product management at VMware. “The desktop is more secure, more manageable and delivers a better end user experience because these services never go down. This is a very powerful paradigm shift for the enterprise desktop.”

The trend will be accelerated further by new hardware with built-in virtualisation, making virtual machines more stable and easier to install than a purely software-based system. In April, Intel launched its new business-focused brand, vPro. The management and security capabilities of the platform, designed to appeal to IT managers, are largely a product of virtualisation: security software can be isolated in a slim virtual machine, known as an appliance, which is purportedly invulnerable to tampering by either attackers or users. A similar feature is already available in some Lenovo PCs that use chips equipped with Intel’s Virtualisation Technology (VT), and chip rival AMD will soon release its equivalent.

But today, IT decision-makers tempted to investigate virtualisation face the issue of how the different implementations of the technology – hardware, operating system or application; desktop PC, blade or server – all work together.

“The complexity of managing the solution that removes the complexity of something else is something we often overlook,” says Brian Gammage, a desktop specialist at analyst house Gartner.

“If you take the average organisation and divide the number of employees by 10, that’s the number of applications they have. If you start applying server-based computing to a few of those and stream a few others, it’s all getting a bit complex to manage. We need a more unified approach,” he says.

In April 2006, VMware tried to do just that with the Virtual Desktop Infrastructure Alliance, a loose grouping of around 20 hardware, software and services vendors who will test and integrate hosted desktop systems, all with VMware at their core. These include application virtualisation vendors Altiris and Softricity, blade manufacturers HP, IBM and ClearCube, and thin client pioneers Citrix, Sun and Wyse. Yet the company most closely associated with the desktop PC, Microsoft, has yet to sign up.

Thanks to Intel and AMD’s innovations, Gammage says 75% of PCs will have hardware virtualisation support by 2010. “Organisations should start planning for it now. Don’t see future versions of the PC operating system as big masses – it’s going to break down to be more modular,” he says, referring to appliances like Intel’s security functions.

As this suggests, virtualisation leaves Microsoft, and the PC industry at large, in an uneasy position. It can make OS migration easier, because it makes legacy applications easier to run. This could equally boost adoption of Windows Vista or rivals like Linux. While Microsoft is developing its own hypervisor, the technology which monitors (and so essentially controls) virtual machines, an open source challenger, Xen, is already gaining momentum – with backing from Intel and AMD, and a start-up, XenSource, launching in the summer to offer commercial support.

Xen could break PC equipment manufacturers’ dependence on Microsoft’s release schedule for their own sales cycles. An open-source hypervisor could even mean PCs no longer require an operating system, suggests Gammage. But conversely, PC manufacturers could suffer from virtualisation, as it reduces the need for hardware upgrades and product differentiation. “The battle for control of the new PC platform is in the bit where the hardware meets the software,” says Gammage. “People need to be aware they are in a battleground.”

PC virtualisation: The technologies

Hosted PC virtualisation

This is the most common kind of PC virtualisation today. Products such as VMware’s GSX Server and Workstation, and Microsoft’s Virtual Server and Virtual PC Express, enable legacy application support or hide IT management applications from the end user. A host operating system is installed on the hardware as normal, and above this a virtual machine manager can run one or more guest OSs. These guests can be isolated from changes to the underlying hardware, reducing incompatibility issues. However, most applications suffer a performance penalty of around 25% as a result of running on a virtual machine.

Hypervisors or para-virtualisation

A hypervisor is a virtual machine monitoring layer which sits between the PC’s hardware and the OS (or OSs). Virtual machines can be installed directly on the hypervisor, lowering the performance burden to less than 10%. The hypervisor layer is much lighter than a full hosted virtual machine stack, making it faster to load, as well as more stable and secure. Gartner analysts tip hypervisor adoption to overtake the hosted model, as it adds less complexity for managers. The open source hypervisor Xen is gaining early ground in this area, but Microsoft is expected to add similar capabilities to Windows by 2008. However, analysts warn that because of their proximity to the hardware, hypervisors could be a single point of vulnerability for PCs if they are not hardened against attack.
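The practical difference between the two overhead figures quoted, roughly 25% for hosted virtual machines against under 10% on a hypervisor, is easiest to see as consolidation arithmetic. The per-server capacity below is an invented round number used only for illustration.

```python
# How many desktop sessions fit on one server under each model?
native_capacity = 100   # sessions a server could run with no virtualisation overhead (illustrative)

hosted = native_capacity * (1 - 0.25)       # ~25% penalty for hosted virtual machines
hypervisor = native_capacity * (1 - 0.10)   # <10% penalty running directly on a hypervisor

print(f"Hosted model:     {hosted:.0f} sessions per server")
print(f"Hypervisor model: {hypervisor:.0f} sessions per server")
# The hypervisor recovers roughly a fifth more capacity from the same hardware.
```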

Hardware virtualisation

The decision of chip giants AMD and Intel to build virtualisation into their processors is one of the principal drivers bringing virtualisation to the desktop, as it enables software, such as hypervisors or security tools, to run below the operating system for the first time. Both Intel’s vPro platform and AMD’s Pacifica chips will be available before the end of 2006, making software-enabled virtual machines more efficient, flexible and reliable.

Application virtualisation

Also known as application sandboxing or streaming, this approach runs centrally delivered applications in an isolated software environment to avoid conflicts with other applications. Sandboxed applications do not require installation, configuration or testing on the underlying system. The application must be repackaged before its first deployment, but management tasks such as patching can then be done centrally and without fear of breaking other applications. Although only a minimal amount of code is sent down to run on the client machine, most applications still require a ‘full-fat’ PC, so thin clients cannot be used. Vendors in this area include Ardence, Softricity and Altiris. Citrix, with its Presentation Server product, makes more use of its terminal services heritage, allowing users to remotely access virtualised applications on a central server. This requires a network connection, and because some work is needed to tailor applications to a shared server environment, not all are compatible.
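The isolation described here, each application seeing its own private settings and scratch space so that nothing is installed on, or can conflict with, the underlying system, can be mimicked crudely with per-application redirection. The sketch below is a loose analogy in Python, not how SoftGrid or Presentation Server is actually built; the APP_CONFIG_DIR variable and the paths are invented for the example.

```python
import os
import subprocess
import tempfile
from pathlib import Path

def run_sandboxed(executable: str, app_name: str) -> int:
    """Launch an application with private profile and temp directories so its
    settings and scratch files never touch the shared system locations."""
    sandbox = Path(tempfile.mkdtemp(prefix=f"{app_name}-sandbox-"))
    (sandbox / "profile").mkdir()
    (sandbox / "tmp").mkdir()

    env = os.environ.copy()
    env["HOME"] = str(sandbox / "profile")              # redirect user-profile writes
    env["TMPDIR"] = str(sandbox / "tmp")                # redirect temporary files
    env["APP_CONFIG_DIR"] = str(sandbox / "profile")    # hypothetical app-specific variable

    # The packaged application runs against the sandbox instead of the real OS;
    # deleting the sandbox directory afterwards removes every trace of it.
    return subprocess.call([executable], env=env)

# Two versions of the same tool could now run side by side without fighting
# over shared configuration files, e.g.:
# run_sandboxed("/opt/reporting-v1/report", "reporting-v1")
# run_sandboxed("/opt/reporting-v2/report", "reporting-v2")
```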

Further reading in Information Age

A virtual certainty? – Editor’s letter, May 2006

Microsoft embraces virtual change – October 2005

Pete Swabey
