The return of the fat client?

Over the past five years, the client-server model has been in steady decline. As business applications have been rewritten to deliver functionality from servers to browser interfaces, the amount of application logic executed on the PC has been cut to a minimum – and in some cases to zero.

Aside from making applications accessible from any browser, the appeal in terms of systems administration has been obvious: with no business application code on the PC, client-server's biggest headache – the installation and maintenance of code on individual clients – goes away. However, the pendulum is swinging back. Analysts, including David Smith at Gartner, maintain that two major influences – web services and peer-to-peer (P2P) technologies – will lead to a resurgence of the ‘fat client', or as its supporters prefer to call it, the ‘rich client'. The difference this time round: the deployment and maintenance burden will be minimal.

Smith's argument is based on his belief that HTML web browsers have begun to show their limitations as application interfaces. First, web-based applications are unavailable when the user is disconnected from the Internet. Second, loading the browser with code has performance implications: web pages download more slowly with each extra function they perform. Moreover, quality of service depends on the underlying network infrastructure, something that is often outside IT's control.

 

Client-server evolution

Two-tier client-server: Popular in the early 1990s, two-tier client-server architectures store application logic on the client device, which accesses data from a server-hosted database. The benefit over mainframe and minicomputer systems was that application processing was local and therefore very fast, and the application could sport a graphical user interface. The major drawback: updates and bug fixes to the client software had to be carried out directly on each PC.

Three-tier client-server: As large, processor-intensive applications such as SAP's R/3 began to emerge in the mid-1990s, a third layer of processing was added between the client and the database server. This middle tier handles most of the application logic across multiple servers, while the client device processes rudimentary tasks, such as data verification. Although the client contains less logic than in a two-tier model, the problems of updating software remain.
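
In outline, the three-tier split looks something like the sketch below, with the tiers modelled as plain functions in a single file for illustration. Every name here is invented, and in a real deployment each tier would run on its own hardware:

    // Client tier: rudimentary tasks only, such as data verification.
    function validateOrder(order: { customerId: string; quantity: number }): boolean {
      return order.customerId.length > 0 && order.quantity > 0;
    }

    // Middle tier: the application logic, which in a real system would be
    // spread across multiple servers.
    function priceOrder(order: { customerId: string; quantity: number }): number {
      const unitPrice = lookUpUnitPrice(order.customerId); // data-tier call
      const discount = order.quantity > 100 ? 0.9 : 1.0;   // business rule lives here, not on the client
      return order.quantity * unitPrice * discount;
    }

    // Data tier: a stand-in for the server-hosted database.
    function lookUpUnitPrice(customerId: string): number {
      return 42;
    }

    const order = { customerId: "acme", quantity: 150 };
    if (validateOrder(order)) {
      console.log(`Order total: ${priceOrder(order)}`);
    }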

Underpinning the issue is the fact that HTML was conceived as a page mark-up language, not an efficient programming language. As users demand access to increasing amounts of application functionality through their web browsers, this can become a real problem for IT departments. "There is only so much HTML you can write before you end up re-writing the entire program," explains Evan Stein, director of European application development at credit rating company Standard & Poor's.

Rich client

To solve these problems, Smith says IT developers will increasingly store chunks of business logic and data, such as role-specific settings for various applications, on the client platform. To supplement that, client devices will also dynamically draw functionality from servers in the form of web services.

Although this architecture has an increased reliance on client-side software, the traditional problems associated with client software, such as installation and maintenance, could be avoided, says Smith. Web services and peer-to-peer software both enable administrators to maintain central control of distributed desktops. The web services-based components in such a set-up would be drawn ‘on-the-fly' from servers, while P2P provides mechanisms for ‘synchronising' the objects held on different desktops.
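
A minimal sketch of that ‘on-the-fly' pattern might look as follows, assuming a hypothetical component server at example.com; a production version would need versioning and security on top:

    // Local cache standing in for the settings and components held on the client.
    const cache = new Map<string, string>();

    // Draw a component from the server when connected; fall back to the
    // locally held copy when the network is unavailable.
    async function loadComponent(name: string): Promise<string> {
      try {
        const res = await fetch(`https://example.com/components/${name}`);
        if (!res.ok) throw new Error(`server returned ${res.status}`);
        const body = await res.text();
        cache.set(name, body); // refresh the local copy for offline use
        return body;
      } catch {
        const cached = cache.get(name);
        if (cached === undefined) throw new Error(`${name} is not available offline`);
        return cached;
      }
    }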

Indeed, this hybrid of client-server, peer-to-peer and web services has already been designed into Microsoft's new .Net framework and its Windows XP operating system. For example, in Windows XP, the client software is not a fixed version of the package, per se, but a recent cache of the latest code being managed on the organisation's server, explains Peter Bell, business strategy manager for Microsoft's .Net developer group. This means that updates to the client software can be done centrally at the server and synchronised when the client device is online. It is, Bell argues, "a swing back to the edge of the network".
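
Bell's synchronisation model might be sketched like this (again a minimal illustration; the version endpoint, bundle URL and on-disk cache file are all invented):

    import { readFile, writeFile } from "node:fs/promises";

    const SERVER = "https://example.com/app"; // hypothetical update server

    // Load whatever the client last cached; default to version 0 on first run.
    async function readLocalCache(): Promise<{ version: number; code: string }> {
      try {
        return JSON.parse(await readFile("cache.json", "utf8"));
      } catch {
        return { version: 0, code: "" };
      }
    }

    // When online, pull down any code updated centrally at the server;
    // when offline, the client simply keeps running its cached copy.
    async function syncClientCache(): Promise<void> {
      const local = await readLocalCache();
      try {
        const remote = await (await fetch(`${SERVER}/version`)).json();
        if (remote.version > local.version) {
          const code = await (await fetch(`${SERVER}/bundle`)).text();
          await writeFile("cache.json", JSON.stringify({ version: remote.version, code }));
        }
      } catch {
        // Offline or server unreachable: synchronise on the next connection.
      }
    }

    syncClientCache();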

As that shift suggests, increased use of client software will inevitably increase the role of client operating systems. There are also paradoxical networking issues to consider: more caching and updating on the client means faster and more reliable networks are required; but better networks will actually decrease the pressure for more client-side logic. Clearly, the client-server debate has not run its course.

 

Curl paddles into the wave

Microsoft's .Net framework may be the most prominent web services design to draw on the concept of the ‘rich client' architecture, but it is not the first to do so. Curl Corp, a start-up founded on technology developed by the Massachusetts Institute of Technology (MIT) and by the inventor of the web, Tim Berners-Lee, is considered a pioneer in this area with its Surge runtime and development environments. The company has garnered considerable interest following reports of major productivity gains generated by the implementation of its ‘web/client' platform at flagship customer Siemens.

 

 
