Over the past five years, the client-server model has been in steady decline. As business applications have been rewritten to deliver functionality from servers to browser interfaces, the amount of application logic executed on the PC has been cut to a minimum – and in some cases to zero.
Aside from allowing access to applications from any browser, the appeal in terms of systems administration has been obvious: with no business applications code on the PC, client-server's biggest headache – the installation and maintenance of code on individual clients – goes away. However, the pendulum is swinging back. Analysts, including David Smith at Gartner, maintain that two major influences – web services and peer-to-peer (P2P) technologies – will lead to a resurgence of the ‘fat client', or as its supporters prefer to call it, the ‘rich client'. The difference this time round: the deployment and maintenance burden will be minimal.
Smith's argument is based on his belief that HTML web-browsers have begun to show their limitations as application interfaces. First, web-based applications are unavailable when the user is disconnected from the Internet. Second, loading the browser with code has performance implications: web pages download more slowly with each extra function they perform. Moreover, quality of service depends on the underlying network infrastructure, something that is often outside IT's control.
Underpinning the issue is the fact that HTML is a page-description markup language rather than a programming language, and an inefficient medium for expressing application logic. As users demand access to ever more application functionality through their web browsers, this becomes a real problem for IT departments. "There is only so much HTML you can write before you end up re-writing the entire program," explains Evan Stein, director of European application development at credit rating company Standard & Poor's.
To solve these problems, Smith says IT developers will increasingly store chunks of business logic and data, such as role-specific settings for various applications, on the client platform. To supplement that, client devices will also dynamically draw functionality from servers in the form of web services.
Although this architecture has an increased reliance on client-side software, the traditional problems associated with client software, such as installation and maintenance, could be avoided, says Smith. Web services and peer-to-peer software both enable administrators to maintain central control of distributed desktops. The web services-based components in such a setup would be drawn ‘on-the-fly' from servers, while P2P provides mechanisms for ‘synchronising' the objects held on different desktops.
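The hybrid model Smith describes can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual API: the class names, the cached `tax_rate` setting, and the injected `fetch_remote` callable (standing in for a real web-service call) are all hypothetical. The point is simply that role-specific data lives on the client, the live service is preferred when a connection exists, and the local copy answers when it does not.

```python
class RichClient:
    """Sketch of a 'rich client': role-specific settings are cached
    locally, while fresher values are drawn from a server-side web
    service whenever the device is online."""

    def __init__(self, fetch_remote):
        # fetch_remote stands in for a real web-service call (e.g. over
        # HTTP); it is injected here so the sketch stays self-contained.
        self.fetch_remote = fetch_remote
        # Role-specific setting held on the client platform.
        self.local_cache = {"tax_rate": 0.175}

    def get_setting(self, key):
        try:
            value = self.fetch_remote(key)   # prefer the live web service...
            self.local_cache[key] = value    # ...and refresh the local copy
        except ConnectionError:
            value = self.local_cache[key]    # disconnected: fall back to cache
        return value


def offline(key):
    # Simulates a client with no network connection.
    raise ConnectionError("no network")


client = RichClient(offline)
print(client.get_setting("tax_rate"))  # 0.175 -- still works while offline
```

The design choice worth noting is that the cache is not merely a performance optimisation, as in a browser, but the thing that keeps the application usable offline, which is precisely the limitation of pure HTML interfaces cited above.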
Indeed, this hybrid of client-server, peer-to-peer and web services has already been designed into Microsoft's new .Net framework and its Windows XP operating system. For example, in Windows XP, the client software is not a fixed version of the package, per se, but a recent cache of the latest code being managed on the organisation's server, explains Peter Bell, business strategy manager for Microsoft's .Net developer group. This means that updates to the client software can be done centrally at the server and synchronised when the client device is online. It is, Bell argues, "a swing back to the edge of the network".
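The caching model Bell describes, in which the client holds a recent copy of code managed centrally and catches up whenever it comes online, can be sketched as a simple version-numbered synchronisation. All names here are illustrative, not Microsoft's implementation: the point is that the server applies updates once, centrally, and each client pulls only the components that changed since the version it last cached.

```python
class ManagedServer:
    """Central store of the 'latest code': each update bumps the version
    and records which components changed at that version."""

    def __init__(self):
        self.version = 0
        self._updates = {}  # version -> {component name: code}

    def publish(self, components):
        # An administrator updates components once, at the server.
        self.version += 1
        self._updates[self.version] = components

    def components_since(self, version):
        # Merge every update newer than the client's cached version.
        merged = {}
        for v in sorted(self._updates):
            if v > version:
                merged.update(self._updates[v])
        return merged


class ManagedClient:
    """Client-side cache: not a fixed install, but a copy of whatever
    the server held the last time the device was online."""

    def __init__(self):
        self.version = 0
        self.components = {}

    def synchronise(self, server):
        if server.version > self.version:
            self.components.update(server.components_since(self.version))
            self.version = server.version


server = ManagedServer()
server.publish({"invoice_form": "v1"})
server.publish({"invoice_form": "v2", "report_view": "v1"})

client = ManagedClient()
client.synchronise(server)   # device comes online and catches up
print(client.version)        # 2
```

Contrast this with classic client-server, where the same two updates would have meant visiting every desktop; here the maintenance burden stays at the server, which is the nub of the ‘rich client' argument.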
As that shift suggests, increased use of client software will inevitably increase the role of client operating systems. There are also paradoxical networking issues to consider: more caching and updating on the client means faster and more reliable networks are required; but better networks will actually decrease the pressure for more client-side logic. Clearly, the client-server debate has not run its course.