The grid groundswell

The languid, post-bubble state of the IT market belies a foundation-shifting movement well underway.

Grid computing and related technologies promise to replace the existing architectural status quo and may be quietly shaking the industry out of its complacency. Grid now counts all major systems vendors, and a healthy and growing number of start-ups, as proponents. It is gaining intellectual acceptance in boardrooms not for its headline-making scientific applications – of which there have been many – but because of one very simple truth: it makes better use of computing power that is already in place. If businesses and service providers both feel that they over-indulged in the 1990s, then grid offers a tonic of more efficient resource utilisation.

Grid computing uses peer-to-peer (P2P) networking, systems management and middleware technology to let applications share available processing power regardless of physical location. It implies comfortable evolution rather than disconcerting – and potentially expensive – revolution. The concept gains validity when put in the context of surrounding technological developments. Both the Napster-fuelled interest in Internet-enabled P2P computing and the more prosaic development of space-saving blade servers for data centre applications tie in to the grid evolution. Grid is also a pillar of grand, industry-driving concepts, such as self-healing, autonomic systems and network management schemes.

"Grid is the third phase of Internet computing," says Mike Nelson, director of Internet technology and strategy for IBM. "The first was communications, including email and remote file transfers; the second was the web, which was content-oriented and a one-to-many model; and the third is many-to-many, which we were all introduced to by Napster," says Nelson.

Most convincingly, the technology to deploy grid exists and is proven in practice. But grid applications to date have largely been scientific and confined to closed communities of interest, with a few high-profile public exceptions. Grid's commercial potential is the subject of much debate, and the obstacles to wide-scale deployment are formidable.

The resource-sharing concept behind grid is, in truth, as old as computer networking. The idea is to use all available central processing unit (CPU) cycles to accomplish tasks, rather than limiting application access to a single system. According to systems vendor Sun Microsystems, CPU utilisation on its workstations rarely exceeds 20% of available capacity; even with server applications, utilisation rarely exceeds 40%. Sun claims that deploying its 'Grid Engine' software in workgroup environments drives utilisation of available resources above 90%. These figures are generally accepted within the industry.
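
A back-of-the-envelope sketch shows what those figures imply. The 20% and 90% utilisation numbers come from the article; the workgroup size in the Python snippet below is purely illustrative.

```python
# Illustrative arithmetic only: the 20% and 90% utilisation figures are
# from the article; the workgroup size is a hypothetical example.

workstations = 100           # machines in a hypothetical workgroup
siloed_utilisation = 0.20    # typical per-machine utilisation (per Sun)
grid_utilisation = 0.90      # pooled utilisation Sun claims for Grid Engine

work_done_today = workstations * siloed_utilisation   # 20 node-equivalents
work_possible = workstations * grid_utilisation       # 90 node-equivalents

print(f"Throughput gain from pooling: {work_possible / work_done_today:.1f}x")
# Throughput gain from pooling: 4.5x
```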

"IT managers base their modelling on peak demand, so silos of resources are over-provisioned," says Rick Hayes-Roth, chief technology officer of Hewlett-Packard's Software Solutions group. Grid deployment solves that problem, saving money and using the processing power delivered by Moore's Law more efficiently.

The concept, therefore, sits nicely with the corporate IT belt-tightening prevalent today. It also fits the notion that there is a massive over-supply of telecommunications bandwidth – both within corporate firewalls and between them. In that way, the major contributors to the technology market slowdown – over-investment in enterprise infrastructure and telecoms networks – can be recast as selling points for grid.

But efficiency and related cost savings represent just one side of the grid rationale. The other is that the computation power it enables creates the opportunity to accomplish unprecedented tasks. It is this vision that drove the early grid thinkers.

Work on what is now known as grid dates back to the 1970s. Like the Internet itself, grid research derives from efforts funded by governmental organisations; the European Commission was one of grid's earliest patrons, according to Wolfgang Gentzsch, Sun's director of grid computing. Grid has its roots in the study of distributed resource management and high-performance computing (HPC). The applications were then – and many still are – project-based, where communities of interest, usually scientific and among universities, needed to pool resources toward a common goal.

Vendor backing

Established set

  • Compaq Provides a services programme for grid development and deployment, using Platform Computing's distributed computing system.
    www.compaq.com
  • Hewlett-Packard Announced a plan in April 2002 to fold grid into its HP Utility Data Centre policy-based systems management scheme.
    www.hp.com
  • IBM Has a well-documented history in the development of distributed computing and, subsequently, grid computing. More recently, it publicised its intent to "grid-enable" its entire product line as part of its eLiza autonomic computing plan.
    www.ibm.com
  • Sun Microsystems Offers a complete grid computing solution, branded Grid Engine, via its acquisition of German start-up Gridware in 2000. It is focusing on workgroup cluster and enterprise applications.
    www.sun.com/grid
  • Platform Computing Has been delivering distributed computing systems since 1992 for specialised, high-performance applications. It offers a software suite for distributed computing that supports grid applications.
    www.platform.com

Start-up set

  • Avaki Emphasises wide-area access to data and applications, using its grid computing software. It is first targeting life-sciences applications and stresses the importance of so-called data grids as well as compute grids. Founded in 1998; venture-funded and relaunched as Avaki in 2001.
    www.avaki.com
  • DataSynapse Provides high-performance computing systems using clustering and distributed resource management technology. It is focused on the financial services market. Closed series B venture funding round in 2001.
    www.datasynapse.com
  • Entropia Uses an Internet-based grid model for enterprise applications, employing unused PC processing resources. Founded in 1997.
    www.entropia.com
  • Noemix Is a small, five-employee company, founded in 1998, that develops distributed resource management tools and platforms, focusing on performance planning.
    www.noemix.com
  • Parabon Computation Delivers a system for taking advantage of unused PC processing cycles either locally or on the Internet.
    www.parabon.com
  • United Devices Develops distributed computing software and services. It is focused on the Internet grid model, but also targets enterprise applications. Founded in 1999.
    www.ud.com


Grid projects were overlooked by much of the wider IT community for many years, particularly after client-server computing models gained support during the 1980s. But highly publicised grid applications have emerged in recent years. Notable examples include the DataGrid and EUROGRID projects, both funded by Brussels. Even more celebrated are the global FightAIDS@home and SETI [Search for Extraterrestrial Intelligence]@home projects.

Grid interest accelerated among academics and systems vendors after the delivery of the 'Globus Toolkit' in the late 1990s. Developed by the non-profit Globus Project of the Argonne National Laboratory and the University of Southern California's Information Sciences Institute, the toolkit is public-domain software that offers a set of security, resource directory, resource management, data management and communications functions required to implement grid.
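
Those functions can be pictured as the surface of a grid node's interface. The skeleton below is a hypothetical Python illustration of the categories listed above; it is emphatically not the Globus Toolkit API.

```python
# Hypothetical skeleton of the functional areas a grid toolkit covers:
# security, resource directory, resource management, data management and
# communications. Names are invented for illustration; this is NOT the
# actual Globus Toolkit API.

class GridNode:
    def authenticate(self, credential):
        """Security: verify a caller's credential before granting access."""
        ...

    def advertise(self, directory, attributes):
        """Resource directory: publish this node's CPUs, memory and load."""
        ...

    def submit(self, job):
        """Resource management: queue a job against available cycles."""
        ...

    def stage_data(self, source_url, dest_path):
        """Data management: move input files to where the job will run."""
        ...

    def report(self, channel):
        """Communications: stream status and results back to the submitter."""
        ...
```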

One of the most alluring aspects of grid is that, while a radical departure in some ways, it encompasses concepts familiar to technologists. It is, at heart, distributed computing – something most IT organisations have been deploying for a decade or more. It also takes advantage of server clustering and load balancing, and forces the issue on policy-based management techniques.
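
The familiar pattern underneath is easy to picture: idle workers pull tasks from a shared queue, so load balances itself across the pool. Below is a hypothetical sketch of that core loop, not any vendor's scheduler.

```python
# Minimal sketch of grid-style work sharing: worker "nodes" pull tasks
# from a shared queue whenever they are idle, so load balances itself
# across the pool. Hypothetical illustration, not a vendor's scheduler.
import queue
import threading

tasks = queue.Queue()
for n in range(20):                  # 20 units of work to distribute
    tasks.put(n)

results = []
lock = threading.Lock()

def node(name):
    while True:
        try:
            n = tasks.get_nowait()   # an idle node claims the next task
        except queue.Empty:
            return                   # queue drained; node falls idle
        value = n * n                # stand-in for real computation
        with lock:
            results.append((name, n, value))

workers = [threading.Thread(target=node, args=(f"node-{i}",)) for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(f"{len(results)} tasks completed across {len(workers)} nodes")
```

Real grid middleware layers scheduling policy, security and data staging on top of this basic loop.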

Market opportunities
The first step for the major systems and software players will be to focus their commercial marketing efforts on departmental applications. Although terminology varies from company to company, there is agreement on three real and potential grid markets: enterprise (or campus), communities of interest, and global.

Enterprise grids are generally viewed as providing a genuine market opportunity today. The resource-sharing technology exists and the oft-stated barriers to grid deployment – security concerns and platform heterogeneity – are more easily overcome when the application resides within a company's firewall.

The benefits of using grid computing for corporate applications are many and varied. First, businesses can get more out of the computers in place. This can, of course, yield financial savings. But it also can yield competitive advantages. In the cut-throat pharmaceuticals industry, for example, companies are deploying grid to dramatically reduce the time it takes to do molecular modelling. By dynamically pooling all available computing resources, these companies can reduce time-to-market by many months. In addition, grid can enable applications – and even businesses – that would otherwise require access to costly supercomputing resources.

The advantages of grid computing for enterprise IT are also far reaching, say those selling the systems. The capability to effectively share resources and manage entire corporate systems with a single view simplifies the process of achieving IT nirvana: truly matching technology to business objectives by using policies. If, for example, a life sciences company is beaten to market in one area of genetic research, it can quickly shift grid-enabled computing resources to other, more promising projects by adjusting policies. The risk of wasted IT capital is potentially eliminated.
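
The policy mechanism the vendors describe amounts to little more than a priority table that a scheduler consults. A minimal hypothetical sketch, with invented project names and shares:

```python
# Hypothetical sketch of policy-driven reallocation: grid share moves
# between projects by editing a policy table, not by moving hardware.
# Project names and percentages are invented for illustration.

policies = {
    "genetics-project-a": 60,   # % share of the pooled grid
    "genetics-project-b": 30,
    "admin-batch":        10,
}

def reallocate(policies, loser, winner, share):
    """Shift a percentage of the grid from one project to another."""
    assert policies[loser] >= share, "cannot shift more than a project holds"
    policies[loser] -= share
    policies[winner] += share

# Project A is beaten to market, so its cycles go to the more
# promising project B overnight:
reallocate(policies, "genetics-project-a", "genetics-project-b", 50)
print(policies)
# {'genetics-project-a': 10, 'genetics-project-b': 80, 'admin-batch': 10}
```

The point of the sketch is that reallocation becomes a configuration change rather than a procurement cycle.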

There is nearly universal agreement that this is a viable present-day market. "The technology is absolutely in place today," says Ian Baird, chief business architect of Canada's Platform Computing, a grid systems company that has been selling distributed computing software into the corporate market since 1992.

But there is disagreement on applicability. Conservatives contend that the commercial grid market is really confined to departmental clusters or virtual workgroups, rather than complete enterprises. "At the departmental level, there are no obstacles," says Sun's Gentzsch. "At the enterprise level the obstacles are not technological, but instead [relate to] the complexity of the IT environment. There's a lot of work to be done there."

Indeed, the heterogeneous nature of most corporate environments demands a standards-based approach to grid, which does not yet exist. The widespread adoption of Internet Protocol (IP) and of Microsoft's Windows desktop operating system is, however, widely credited with reducing that complexity. And in February 2002 an IBM-led standards effort, the Open Grid Services Architecture (OGSA), was established; much of the industry has since endorsed it.

In addition to establishing standards for application-layer interoperability, the OGSA also addresses concerns about grid's co-existence with 'web services' architectures such as Microsoft's .NET, Sun's Sun ONE and IBM's WebSphere platforms. Web services technology allows business transactions to run on servers scattered over the Internet, while grid distributes and co-ordinates application processing across many systems. The two, therefore, should be complementary, not competitive. For that to happen, however, an integrated development approach from the ground up is required.

The next phase will be extending grid beyond the corporate firewall. That will be a big step, say experts. Universities in Europe and elsewhere are already using grid technology to benefit research projects that require vast processing power. Strong arguments are also made that like-minded but separate businesses in, for example, the aerospace industry could benefit immediately from sharing computing resources.

But security concerns and policy administration are seen as profound obstacles to a lucrative market any time soon. For a technology market to blossom fully, say veterans, it must become meaningful beyond high-end, scientific applications. Grid will have to penetrate the financial sector, for example – an industry where security concerns over grid's resource-sharing mechanisms are likely to form the primary focus of debate. "The notion of money transfer and risk will greatly affect the commercial market for grid," says Vernon Turner, group vice president of server systems at IDC, the market research group. He says that a high-street bank and a top-tier insurance institution may have every reason to co-ordinate work on complex transactions using a grid – but such collaborative projects may be limited by security concerns.

Industry hype
Marketing hyperbole reaches its peak when proponents discuss grid computing in its global context. Grid technology enables 'utility computing', as IBM refers to it. Rather than building and managing private IT systems, companies will be able to use a third-party grid to access computing cycles, data storage and applications logic on demand. "Enterprises have all the reasons to adopt utility computing," says HP's Hayes-Roth, "and economies of scale will dictate [third-party] data centres."

The same principle could be applied to the consumer market. There is a rare consensus among the major systems vendors on this future for grid and the computing world. But at the same time, these companies are quick to temper their enthusiasm by acknowledging one simple reality: this is not going to happen tomorrow.

"Service grids are years away from development," says Platform's Baird. Peter Jeffcock, Sun's grid computing group marketing manager, is even more blunt: "We don't see a lot of demand for global grid in the near term."

Nevertheless, the major players – IBM, Sun, HP and Compaq, as well as Platform and a growing number of start-ups – do not hesitate to characterise the aggregate, long-term market opportunity for grid computing as 'huge'. But specific numbers on the eventual size of the market are difficult to come by, and the companies are equally vague about timing. Dave Fish, president and CEO of Avaki, a venture capital-funded grid software start-up based in Massachusetts, is one of those willing to go out on a limb. "The huge market will be in the 2005-2010 timeframe," he predicts.

Ultimately, he says, the commoditisation of high-performance, high-density processing platforms makes it "inevitable" that grid's influence will extend beyond specialised scientific applications.

That helps to explain why systems vendors are pushing grid. After all, the eventual, large-scale adoption of grid seems to run contrary to their business models. "We're pursuing a strategy that foresees the commoditisation of computing resources," HP's Hayes-Roth admits. "Grid will reduce the aggregate demand worldwide for boxes. But we see this as inevitable, so we embrace it."

Full to capacity

The ability to parcel out application logic across multiple processor systems implies a suitable communications infrastructure. The requirement is more acute when so-called data grids are assembled, linking stores of information across multiple systems.

In local environments, grid's bandwidth demands are not an issue – and are not likely to be for some time. Technical workstations today often have dedicated 10Mbit/sec or 100Mbit/sec connections to local networks, which are trunked into gigabit switch-based backbones. High-powered campus networks are also the norm.

The plot thickens, however, when global, or Internet-based grids enter the picture. At least one person studying the issue thinks grids will have a profound effect on public network bandwidth requirements. Robert Cohen, an economist with Cohen Communications Group in New York, believes that Internet traffic will grow by 400 times between 2001 and 2008 because of the increased use of peer-to-peer applications that grid enables. This estimate dwarfs most other forecasts.

"There will be a sea change in Internet growth over the next seven to eight years," he says. He goes on to assert that backbone bandwidth prices to end-users will drop by 75% annually over the next few years, thereby accelerating the adoption of wide-area grids across the world.

Quantifying 'huge'

The most powerful systems companies in the world are not shy about their commitment to grid computing. Compaq, Hewlett-Packard (HP), IBM and Sun Microsystems all pledge support for the technology across their product lines, and all say their financial opportunity is significant.

But how significant? Precise numbers are hard to come by. None of the major industry trackers has published figures that break grid computing out from other categories of IT spending. The problem, to date, may be deciding what to track, as grid involves several types of technology.

Still, the major players do try to provide some guidance on what they are shooting for.

Peter Jeffcock, grid computing group marketing manager at Sun, says that the technical computing market in 2001 was approximately $12 billion [€13.6 billion], and that somewhere between 20% and 40% of companies using technical computing systems have deployed grid.

Mike Nelson, director of Internet technology and strategy at IBM, takes a different, though no more precise, approach. Nelson says that grid computing will represent 5-10% of what companies spend on information technology (IT) within five years.
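
Rough arithmetic shows how far apart those framings sit. The $12 billion figure and the percentages come from the quotes above; the $1 trillion total IT-spend figure below is a placeholder assumption, not a number from the article.

```python
# Working the two quoted framings into dollar ranges. The $12bn figure
# and the percentages are quoted above; the $1 trillion total IT-spend
# figure is a placeholder assumption, not from the article.

technical_market = 12e9          # Sun: technical computing market, 2001

# Sun's framing: 20-40% of technical-computing users have deployed grid.
# (Treating a deployment percentage as a spend share is itself a leap.)
sun_low, sun_high = 0.20 * technical_market, 0.40 * technical_market

# IBM's framing: grid at 5-10% of total IT spend within five years.
assumed_it_spend = 1e12          # placeholder assumption
ibm_low, ibm_high = 0.05 * assumed_it_spend, 0.10 * assumed_it_spend

print(f"Sun's framing: ${sun_low / 1e9:.1f}bn to ${sun_high / 1e9:.1f}bn")
print(f"IBM's framing: ${ibm_low / 1e9:.0f}bn to ${ibm_high / 1e9:.0f}bn")
# Sun's framing: $2.4bn to $4.8bn
# IBM's framing: $50bn to $100bn
```

On those numbers, the two framings differ by an order of magnitude.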

Go figure.
