A redundant device

For most of the last 20 years, CIOs have been locked into a battle for control of enterprise IT infrastructure. While they have made progress on many fronts, control over the realm of the desktop has remained elusive – until now.

In the data centre, consolidation and virtualisation strategies have reduced the cost and complexity of server and storage operations; automated systems management technologies are emerging that promise to revolutionise IT's ability to dynamically match processing capacity to demand.

Meanwhile, in the software arena, service-oriented approaches to integration and development are having a similar impact on IT's ability to build and bend applications to exactly match changing business goals. Today, indeed, CIOs are closer than they have ever been to creating the kind of flexible and predictable enterprise infrastructure that can be constantly tuned, and re-tuned, to the strategic goals of business. All that remains is to tame the corporate desktop. Here, ‘tame' is not too strong a word.

The wild frontier

Before the emergence of the IBM PC in 1981, IT professionals had a relatively easy life. Business computing was a scarce resource centred on complex mainframe installations which, while they could be difficult to manage, were at least firmly under the control of the IT staff. In those days, enterprise computing might not have been cheap, responsive, or even particularly productive, but it was secure, and it was predictable.

Since the arrival of the PC on the corporate desktop all that has changed. IT professionals are now charged with marshalling what amounts to a wild frontier on the periphery of their IT domain. "Personal" computing has enabled information technology to escape from the data centre, and colonise every corner of the enterprise. In the process, users, once entirely dependent on IT for all their computing needs, are managing their own data, writing their own applications and even specifying and procuring their own systems.

The PC has provided companies with a relatively inexpensive, open platform for delivering IT services to users, which has boosted personal productivity and fuelled unprecedented levels of application and process innovation.

For IT professionals, however, the PC has always been an object of suspicion. Initially, misgivings focused on the PC's lack of technical sophistication. Early versions of MS-DOS and Windows compared poorly with the "industrial strength" operating systems commonly used in the data centre, and the machines themselves were prone to system failure.

This is no longer such a cause for concern. "The PC has matured," says Keith Turnbull, VP of development at thin-client software vendor Citrix. "It is now a great platform. It is robust, and it is a de facto standard." Unfortunately, this maturity has brought problems of its own. The PC is now an indispensable part of the business infrastructure, and "it gives people the freedom to do whatever they want to do," says Turnbull. "But sometimes you can have too much freedom. Untrammelled freedom leads to anarchy."

Not many CIOs are likely to concede that their IT estate has descended into anarchy. Nevertheless, a growing number are starting to notice a widening disparity between their ability to effectively manage infrastructure systems, such as servers and storage arrays, and their ability to achieve similar control over their desktop systems.

Organisations that have spent the past several years consolidating and virtualising server and storage systems, for instance, can already point to improvements in administrator productivity of 50% or more. And, according to IT industry analysts Forrester Research, further improvements in data centre productivity – as much as 100% by 2008 – are likely to accrue from the next wave of investment in automated systems management technology.

The picture on the desktop is very different, says Simon Yates, a principal analyst with Forrester. Although PC technology has not stood still over the past 25 years, the industry has persisted in seeing it as a single-user device. The PC has been slow to acquire features that allow it to be easily embraced by the systems management and virtualisation regimes that are doing so much to streamline and reduce the cost of data centre operations.

Furthermore, a laissez-faire attitude to PC procurement has seen ownership costs spiral. Low unit prices have meant PCs are treated as commodity devices; businesses have not taken a strategic view of PC purchasing, says Yates. This approach has resulted in added complexity.

Companies find themselves struggling to manage applications, enforce version control, apply patches and maintain machines built from different components. "Every time different hardware is installed, supporting different applications, a different version of the operating system, a new, separate desktop image is produced, which needs a new set of tools to support it," says Yates.
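
The multiplication Yates describes is easy to see in miniature. The sketch below is purely illustrative – the hardware builds, operating system versions and application bundles are invented for the example, not drawn from any real estate – but it shows how even modest variation compounds into a large image library:

```python
# Illustrative only: invented hardware builds, OS versions and application
# bundles, used to show how desktop image counts multiply.
from itertools import product

hardware_builds = ["vendor-A desktop", "vendor-B desktop", "vendor-C laptop"]
os_versions = ["Windows 2000 SP4", "Windows XP SP1", "Windows XP SP2"]
app_bundles = ["finance", "sales", "engineering"]

# Every hardware/OS/application combination is, in effect, its own image.
images = list(product(hardware_builds, os_versions, app_bundles))
print(f"{len(hardware_builds)} builds x {len(os_versions)} OS versions x "
      f"{len(app_bundles)} bundles = {len(images)} distinct images to maintain")
```

Each of those combinations is a separate build to test, patch and support.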

Many companies are now discovering this the hard way. According to research company Gartner, while the average cost of a typical corporate desktop is now just $800, the lifetime cost of each machine is likely to be nearer $4,000. IDC agrees, and predicts that, with IT support skills in short supply, the total cost of desktop ownership will continue to rise.

The end of the PC?

The prospect of desktop costs spiralling out of control, and wiping out hard-won savings made in other areas of the corporate IT infrastructure, is an alarming one, and it has sparked a reappraisal of desktop policies. Some of the most visible evidence of this is the decision made by many companies to lengthen their corporate PC refresh cycles, and to belatedly impose some centrally controlled standards on the desktop.

Both these strategies have shortcomings. The lengthening PC life-cycle, which Gartner believes has stretched from three years to five, can actually be counterproductive. Although it saves on procurement costs, it results in an ageing PC estate that becomes more expensive to support as parts fail and software inconsistencies go uncorrected.

Imposing order on the desktop by standardising on a single corporate desktop image (a consistent set of components, applications and operating system versions) is potentially more productive, but difficult to achieve retrospectively. End users, moreover, are notoriously reluctant to cede control of what they see as a 'personal' device.
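
In practice, standardisation comes down to detecting where individual machines have drifted from the approved build. The fragment below is a minimal sketch of that idea, with a hypothetical 'golden image' and invented version strings; real desktop management suites do this with far more sophistication:

```python
# Minimal sketch of the "single corporate desktop image" idea: compare a
# machine's reported configuration against a standard build and flag drift.
# All names and version strings here are hypothetical.
GOLDEN_IMAGE = {
    "os": "Windows XP SP2",
    "office_suite": "11.0",
    "antivirus": "7.1",
}

def find_drift(machine_config: dict) -> dict:
    """Return the settings where a machine deviates from the standard image,
    mapped to (actual, expected) pairs."""
    return {
        key: (machine_config.get(key), expected)
        for key, expected in GOLDEN_IMAGE.items()
        if machine_config.get(key) != expected
    }

# A machine that was patched ad hoc rather than re-imaged:
print(find_drift({"os": "Windows XP SP1",
                  "office_suite": "11.0",
                  "antivirus": "6.8"}))
```

Retrofitting such a regime onto thousands of machines that have already diverged is exactly the difficulty described above.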

So, what is to be done? How can organisations achieve the same control and efficiencies on the desktop that they now have in the data centre, without compromising the desktop's ability to be tailored to the individual user? A growing number of companies are coming to the conclusion that the answer lies in dispensing with PCs altogether.

Essentially, the cost and complexity of the modern desktop is encouraging some organisations to dust off an old idea: network computing. A decade ago, Oracle's Larry Ellison and Sun Microsystems' Scott McNealy loudly predicted that network computing would quickly kill off the PC (and, with it, they hoped, the waxing software market dominance of Microsoft).

They were wrong (on both counts). In the mid-1990s, network computing technology was still in its infancy, and it was essentially a 'one-size-fits-all' solution. It placed heavy constraints on end users, and only relatively simple applications suited the recentralised approach it offered, or could cope with the limited network bandwidth then available.

Since then, a lot has changed. Network bandwidth is now plentiful and, although the demands applications place on it have intensified, network computing has matured to the extent that its proponents claim there are very few applications that cannot be delivered effectively. Indeed, Citrix's Turnbull claims the company's MetaFrame thin-client hosting suite can now remove 95% of applications from the desktop without compromising the user experience. "Only applications with a very heavy demand for video still really require a PC desktop," he says.

Rather than simply providing a software shell in which to run desktop applications on a server, the latest iterations of Citrix's MetaFrame suite provide a set of centralised management capabilities, such as security, access control and online support, without locking users into a narrow set of applications.

The architecture is operating system agnostic in both the server and the client domains, and can be applied to conventional desktop environments. This is an opportunity to apply new controls to legacy PC populations without abandoning them entirely. However, the most significant impact of MetaFrame has been to provide a ready-to-use desktop management infrastructure that has encouraged the development of a variety of new approaches to network computing.

As Citrix's products have percolated into the networks of major companies, terminal-based system vendors such as Wyse Technology and Neoware Systems have followed closely on their heels. The modern generation of "thin-client" terminal systems such as Wyse's are cheaper than both their predecessors and the average corporate PC (costing under $200 per terminal) and, thanks to their own and Citrix's software advances, have fewer of the disadvantages.

However, the appeal of thin-client systems does not depend on their being cheaper to buy than a PC. Instead, it lies in the opportunity for companies to remove valuable software and data assets from the desktop and relocate them to the data centre, where they are not only more secure, but also easier and cheaper to manage and maintain.

According to Wolfgang Staehle, Wyse's European president, companies which have taken the plunge and moved their PC estate to a thin-client environment have reduced their entire infrastructure costs (not just their PC costs) by as much as 35%.

Similar claims are made for other network computing architectures. The most recent of these, the blade PC, offers the same opportunity as thin clients to re-house desktop applications and data in the data centre, but goes one step further – implementing entire PCs as blades housed in a server rack.

Raj Shah, chief marketing officer for PC blade pioneer ClearCube, makes less grandiose claims for his company's technology than Staehle. Even so, ClearCube customers, says Shah, have reduced their PC management costs by as much as 40% in some cases. And the blade approach has other advantages that are more difficult to achieve in the thin-client world.

By implementing a PC as a remote physical device in its own right, as opposed to a logical image hosted on a server, Shah argues that ClearCube's approach does a better job of retaining the individuality that some users demand. Each PC blade, for instance, can be configured with the same physical characteristics as an individual PC – exactly emulating a user's required configuration. When logging on to a PC blade server, users can even specify what grade of PC performance they need depending on their workload at that time – upgrading to a powerful workstation one day, or simply accessing a stock PC blade on days when they are doing routine word processing.
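
That login-time matching of workload to hardware can be pictured as a simple brokering step. The sketch below is purely illustrative – the tiers, specifications and selection logic are invented for the example and do not describe ClearCube's actual software:

```python
# Hedged sketch of the blade-allocation idea: at login, a broker matches the
# user's declared workload to a blade tier. All tiers and specs are invented.
BLADE_TIERS = {
    "standard": {"cpu_ghz": 2.0, "ram_gb": 1},    # routine word processing
    "power": {"cpu_ghz": 3.2, "ram_gb": 2},       # heavier desktop work
    "workstation": {"cpu_ghz": 3.6, "ram_gb": 4}, # CAD, analytics, video
}

def allocate_blade(workload: str) -> str:
    """Pick the lowest tier that satisfies the declared workload;
    unknown workloads fall back to a stock blade."""
    demands = {
        "word_processing": "standard",
        "development": "power",
        "cad": "workstation",
    }
    return demands.get(workload, "standard")

# A user declaring a CAD-heavy day is routed to a workstation-class blade:
tier = allocate_blade("cad")
print(f"Session routed to a '{tier}' blade: {BLADE_TIERS[tier]}")
```

The point of the design is that the performance decision is made per session in the data centre, rather than being fixed at procurement time for each desk.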

Solutions such as blade PCs are still in their infancy, but analysts such as IDC are already predicting a high-growth future for them, although they will have to grow very fast to catch thin clients. Having marked time through most of the 1990s, thin-client sales in Europe grew 75% in the past three years, whilst PC sales have relied on the home market and demand for notebook systems to scrape into double-digit growth. This year, some analysts believe desktop PC sales may actually decline, whilst IDC expects thin-client sales to grow another 17%.

So is it time, at last, to write the obituary for the corporate desktop PC? Perhaps not quite yet. Alternative desktop system sales may be streaking ahead of PC growth, but in volume terms, terminal-oriented systems still have a long way to go to catch up. Still, there is no reason to expect they won't and, in a 2003 report, Gartner predicted that as early as 2008 the conventional PC would be a standard feature on fewer than 50% of corporate desktops.

[Chart: Worldwide blade client shipments, 2004-2009]