The instantly responsive enterprise

It is Orlando, Florida, and a credit card holder makes a routine purchase over the phone. The payment goes through without a query. A few minutes later, the same card is used to attempt another purchase. This time, the attempt comes from Taipei, Taiwan. What does the automated credit approval system do?

What it should do is combine the two events and create a “possible fraud event”. That message will alert the authorisation system and any other systems that need to know.

 
 

Event-driven apps in action today

  • A ‘build to order’ manufacturing system: an event (‘an order’) triggers actions such as a requisition for parts.

  • A network or systems management system: an event (a failure) triggers the system to send out alerts.

  • A business activity monitoring system: several business events (a build-up of orders for certain products) combine to trigger a real-time analytical report.

  • A dynamic airline pricing system: each new booking ‘event’ triggers a new pricing calculation for the next buyer.

  • A process control system: if thresholds are passed, the system informs the management system that action is needed.

  • An automated stock trading system: an event (such as a price being exceeded) triggers the system to buy/sell.

  • Computer operating systems: an event (the mouse pointer is over an icon) triggers a system response.

     


    And it will happen so fast that, for the second purchaser, at least, the transaction will be stopped.

    But that will probably not happen today. Credit card companies have all kinds of systems for recognising and tracking fraud, but most can only react to a very limited number of events in real time. They don’t have what is being called an ‘event-driven architecture’.

    But they soon will, if the IT analyst group Gartner is correct in its predictions. EDA, says Gartner analyst David McCoy, who cited the above example at the company’s recent annual integration and web services event, is “the next big thing”.

    By 2008, he and his fellow analysts think event processing will be mainstream, with most new business systems in large companies set up to emit vast amounts of event information. “Applications are going to start to get very chatty,” says McCoy.

    And it is not just applications. Leading thinkers such as Nicholas Negroponte of MIT, and Glover Ferguson, the chief technology officer of Accenture, have forecast that billions of RFID chips, remote sensors, and even a whole world of virtual objects and ‘avatars’ will soon start bombarding their monitoring systems with their latest news.

    That kind of information is valuable, but only to those who have set up an IT architecture that is flexible and powerful enough to use it. As ever, the arrival of this new acronym comes with an imperative: those that use EDA will gain important financial and strategic benefits, in terms of agility, shortened process time, simplicity and speed of reaction. The rest will risk obsolescence.

    The imminent arrival of the EDA has analyst groups and suppliers scrambling to work out the implications. Gartner has made EDA a central theme in its strategic advice and predicts a huge hype wave is about to break: “This will be the subject of conferences in the next few years. This will be the cover theme of magazines,” said McCoy. (It already is.)

    Meanwhile suppliers are already working on their systems, and marketing departments on their angles. IBM has developed a set of standards and procedures called ‘Common Event Infrastructure’, has submitted some of these standards to the Oasis standards body, and is promising new EDA products later this year (see box ‘EDA: A technology primer’). Tibco, whose publish/subscribe integration systems are already partially event-driven, is expected to deliver an event management product soon; Oracle has introduced an ‘event infrastructure’ to handle large numbers of RFID messages. Dozens of start-ups are working on specialist products, and systems management companies such as BMC and Computer Associates are busy working out how their event-handling systems management products can be extended to the application level, especially in areas such as business service management.

    In fact, there is so much event-driven technology out there already, both in the labs and working at customers’ premises (see box), that EDA scarcely seems to justify the soubriquet of “the next big thing”. Moreover, “the technical prerequisites are already there in most large enterprises. Any company of $250 million [per annum revenue] and up has an IT infrastructure that can support EDA,” says Roy Schulte, the Gartner analyst who is credited with developing much of the thinking on EDA.

    But some technology changes will be required – as will, for most organisations, a lot of new architectural and strategic business thinking.

    “Today, all enterprises are event-driven to some extent already, but mostly implicitly and without real thought,” says Schulte. An event-driven enterprise, he says, “has a deliberate strategy of employing [systems] design concepts that allow it to respond to as many different events as possible.”

    Businesses should rebuild their systems based on a consciously event-driven architecture, rather than use ad hoc, tactical systems, he says.

    On demand

    If the technical leap to EDA is not huge, then neither are the practical differences between an event-driven enterprise and a traditional business necessarily very dramatic. But the impact can be huge.

    To illustrate the point, Schulte cites the difference between a manufacturing company that builds to inventory and one that builds to order. The latter is, in effect, the event-driven company.

    “Companies like Dell can manufacture to order within a 24-hour cycle. They save a lot of money by not having lots of products in their inventory,” says Schulte.

    Schulte believes that similar event-driven processes are beginning to emerge right across the commercial spectrum. “We didn’t just think of this [EDA] ourselves. It is based on what some of our leading edge clients tell us they are doing,” says Schulte.

    Some of these applications – such as programmed trading, which began on Wall Street in the early 1990s – are well known. But the key change that Gartner and others advocate is for event handling to be open and standards-based, so that different systems can send event information to each other as standardised messages.

    But what are the architectural changes that need to be made? The consensus among software suppliers is that these will not need to be dramatic, although some suppliers will have to make major changes to their systems.

    A key point emphasised by those involved in this area – especially in the all-important field of middleware – is that the EDA is the sibling of the SOA (service-oriented architecture). The idea of the SOA is to deliver software functions as loosely-linked services that can be plugged, unplugged or combined to form new applications. The idea of the EDA is that organisations can respond instantly to any relevant event.

    “The best use of SOA comes with use of, and understanding of, EDA. These two platforms together form the basis of the real-time, event-driven enterprise,” says Gartner analyst Yefim Natis.

    In fact, the infrastructure needed to support EDA includes the use of web services as a key set of communications and interface standards, along with the orchestration and management capabilities of integrating middleware products. Integration suites and business process management software tools are also likely to play a key role in handling the plethora of intra-process messages that event-driven working generates.

    That is not to say it is all in place. Although, as Ross Nathan, the CTO of SeeBeyond, points out, “Standard middleware products have been doing EDA since it started,” there are still likely to be improvements needed in event handling, complex event management, workflow and state handling, and in the important area of rules engines (used to program systems to handle complex events automatically).

    There is also some work to be done on standards. SOA architectures today, using web services, work through synchronous conversations, in which a service is requested and a reply returned. EDA throws out large numbers of asynchronous messages that demand no reply. Web services standards to deal with this have yet to be fully specified.

    The design and configuration of EDA systems will also need to be different, because polling systems for information – as most middleware products do now – creates too much traffic. “To keep the traffic down, the application will have to tell us something has happened,” said Nathan. Either way, a robust and capacious network will be needed. The only truly exotic element in the EDA prescription is the use of complex event processing (CEP) – the pattern recognition and resolution technology which enables a wide variety of different events to be interpreted and translated into an equally diverse set of responses.

    With the exception of ‘business activity monitoring’ and some specialist applications, Gartner is advising most clients to be wary of CEP at present, owing to product and infrastructural immaturity.

    CEP’s roots lie in the defence industry, where it was developed to provide missile guidance systems with the ability to distinguish between friend and foe in real time. Since then it has seeped into commercial use at the heart of real-time financial trading systems and into some network management products.

    More recently though, CEP technology has begun to be productised by specialist developers who are applying it to a variety of commercial applications, and surrounding it with tools that make it much more accessible to business users.

    In the UK for instance, Apama’s Event Modeller provides financial traders with a dashboard which allows them to set and reset the rules that their electronic systems use to interpret and respond to real-time market feeds. In the US, Metatomix’s SMARTE system integrates federal and local government intelligence systems, providing a joined-up, real-time view of essential security and surveillance information.

    It is the adoption of EDA and CEP by the bigger suppliers, however, that will really establish the technology. Two suppliers stand out – Tibco, with the evolution of its publish/subscribe systems, and IBM.

    Jason Weisser, IBM’s corporate VP of enterprise integration, says his company’s on-demand operating environment, akin to Gartner’s concept of EDA, “is predicated on the use of SOA as the adaptation layer” between business processes and a set of loosely coupled systems services, linked to a complex event processing engine that controls the distribution of these services between the competing processes.

    IBM’s CEP engine, presently code-named Whitewater, is set to appear late in 2004. It will provide the last of three capabilities that IBM believes are essential to the realisation of true on-demand or event-driven working.

    The first of these, the ability of systems to sense and respond to different stimuli, is already ‘baked into’ IBM’s autonomic computing systems management vision, says Weisser. The second, an adaptive capability that allows systems to respond to changing events based on past experience, is also already embedded within IBM’s early on-demand offerings. The third capability, which Whitewater will deliver, will introduce a proactive element to systems behaviour by diluting what Weisser calls the “specificity” that most conventional systems require to respond to events.

    At this point, says Weisser, the on-demand operating environment moves into the realms of the business process layer, providing business users with a way of applying business rules to systems in a much less restrictive way.

    When will all this happen?

    Gartner’s view is that EDA will become part of mainstream systems planning within four years, and will be commonly used in new applications in that timeframe. But, as with SOA and web services, there are huge migratory, architectural and business strategy issues that must be tackled – not all at once, but step by step. “It will take at least 20 years for the notion of EDA to come close to the potential of what can be achieved,” concludes Schulte.

       
     

    The event-driven architecture: A technology primer

    Early in 2003, Gartner analyst Roy Schulte surprised CIOs and IT architects by declaring that, before most of them had even made a start on implementing the service-oriented architecture (SOA), its successor was at hand: the event-driven architecture (EDA).

    The EDA is all about designing and implementing systems that enable a business to respond to events as they happen. And because events in a complex, real-time world happen an awful lot, such systems must be robust and scalable.

    Gartner has been at pains to emphasise that parts of the EDA are not new – many different types of systems send out alerts when certain events happen. Perhaps the biggest difference between EDA and earlier models lies in its use of ‘publish/subscribe’ technology. Enterprise application integration (EAI) and SOA are mainly ‘pull’ architectures, where applications that require information must request it from a server. With EDA, applications register for certain business events, and are then informed immediately whenever those events occur. (In this sense, the events are ‘pushed’.)

    This allows applications to react promptly and appropriately to time-critical situations, while using IT resources in the most efficient way. For example, if a shopper has his or her credit card declined by a retailer because a purchase takes it over the limit, the credit card issuer would have an opportunity to raise the credit limit, or negotiate some arrangement to prevent a repetition.

    That opportunity is best exploited by striking while the iron is hot – not by writing a letter that arrives days or weeks later. A bank equipped with EDA systems could send the customer an email or text message within seconds, perhaps even before the threatened transaction is abandoned. The ability to respond to critical events without delay could easily confer a decisive competitive advantage.

    Vivek Ranadivé, the founder of Tibco, suggested a similar idea in his 1999 book The Power of Now. In a section entitled ‘The Event-Driven Revolution’, Ranadivé admits that “although the most complete implementation of the event-driven architecture utilises a real-time software integration infrastructure, any wise business leader can benefit from this winning approach.”

    Ranadivé continues: “Being event-driven is also a state of mind: a keen, continuous scanning of the horizon to anticipate events that change the status quo, and then applying event-driven tools to either shape change to the company’s advantage or surf the changes one can’t control in order to be first to the beach.”

    Ranadivé contrasts the “passive, ‘query me’ client/server technology used in most companies” with the “‘active’, even somewhat aggressive” event-driven infrastructure – which he then identifies with publish-subscribe, a paradigm that Tibco pioneered.

    A similar, but more advanced, idea is complex event processing (CEP), expounded by Stanford professor David Luckham in his book The Power of Events. Luckham argues that vast amounts of valuable information are latent in today’s distributed information systems; a new breed of software tool is needed to collect and present the data.

    One of Gartner’s first comments about business events came in its ‘Hype cycle for application integration and platform middleware’, published in May 2003. This report places ‘complex event-driven applications’ in the ‘Technology Trigger’ region, to the far left of its well-known hype-cycle diagrams. That means that, in its view, EDA will be increasingly hyped, then will disappoint and disillusion, before finally entering the sun-drenched uplands of the “Plateau of Productivity”. That process will take several years – maybe five to ten.

    Gartner analyst Roy Schulte describes CEP as “sophisticated aggregation of multiple events”, and a business event as “a meaningful change in the state of a business or application system”.

    “Event-driven applications are those in which processing is triggered by the arrival of push-based information coming from outside of the component that performs the function”, he wrote in a recent report. One example: business activity monitoring (BAM) applications, which respond to events reported by the underlying, integrated systems.

    There are already more than three dozen companies involved in pioneering work in this area, with a powerful group of start-ups complemented by established leaders in business intelligence, real-time integration and process management. (For more details, see the main article.)

    One technology that is expected to spark more use of event-driven systems is RFID (radio frequency identification) sensors. These can be used to report back large numbers of events that need a response. Gartner fellow Tom Austin told delegates at a recent US conference that EDA would have to be able to handle the terabytes of data produced by ubiquitous RFID technology.

    Although some sceptics may say that they have seen a lot of this before, EDA is different in one key respect: it should be able to monitor large numbers of events – millions, or even billions – and detect so-called ‘complex events’, made up of specified patterns of simple events.

    As with many state-of-the-art IT concepts, EDA and CEP have entered the realm of possibility only because of huge advances in processing power, storage capacity and bandwidth availability. In 1990, even the most powerful commercial computers were just not fast enough to carry out all the necessary computation, while networks were too sluggish to deliver a constant stream of messages enterprise-wide.

    In a schematic EDA set-up, simple and complex business events are generated by ‘sources’ and immediately detected by ‘event listeners’ programmed to watch for those specific events. Whenever an event occurs, it is promptly distributed through a publish/subscribe ‘event channel’, which notifies all those applications (‘handlers’) that registered an interest in that particular type of event. The channel could be implemented using a variety of distributed computing techniques or standards, such as Corba, proprietary message-oriented middleware such as IBM’s WebSphere MQ, Java messaging services (JMS) or web services.

    Another key differentiator of EDA lies in which events it addresses. Most things in the world of computing are event-driven in one way or another, but while EAI or SOA deal with the routine, everyday processes or events required to fulfil the activities of the value chain, EDA is there to take care of those relatively infrequent, exceptional conditions that must be addressed without delay if an opportunity is to be grasped or a failure averted.

    Adding EDA to an existing SOA might present some problems, but none that are likely to be insuperable. The two extra components required are a publish/subscribe mechanism and a complex event detection engine that can be programmed to recognise potentially important patterns of events. Such software is available from a number of sources – which is where some avoidable confusion is liable to set in.

    One of the shortcomings of the analyst-generated acronyms (as opposed to industry standards) is that they can be all too flexible, permitting many overlapping interpretations. A mixed bag of vendors claim to offer EDA solutions today, ranging from IBM with its WebSphere MQ Event Broker and Neon Systems with Shadow Event Publisher to the ‘event servers’ of specialist start-ups like Agent Logic, iSpheres and Rhysome.

    Gartner, in particular, is promoting the EDA with missionary zeal. “As enterprises run into the limitations of SOA,” predicts Gartner’s Yefim Natis, “they will turn to the architecture of events to complement and enhance their software environment”.

    But if companies have got along without EDA so far, why should they need it now? The answer: It is not so much a matter of doing existing IT tasks better, but of undertaking wholly new tasks – which, if well implemented, could give the business a head start over competition. That is primarily a business decision, not a matter for the IT department.

    If the technology is not entirely new, then nor are the concepts. EDA and CEP, for example, have much the same goals as those set out by Stafford Beer in his work for the Chilean government in 1971-73. Beer had a vision of not only corporations, but entire national economies wired for real-time data collection, allowing the notification of significant events within minutes or seconds. For a variety of reasons, ranging from hardware to politics, it came to nothing at the time.

    Today, the pre-conditions are at last in place to start realising Beer’s far-sighted plans. Most enterprises have efficient, flexible TCP/IP networks and plenty of powerful computers. They have passed successively through the stages of client/server, three-tier client/server, web infrastructure and J2EE or .NET based application servers. SOA has been generally accepted as the next architectural objective; and, conveniently, it forms an excellent foundation for EDA and CEP.

    So innovators and early adopters should find that EDA is technically within their grasp – although it most certainly cannot be deployed overnight. The most important question that has to be asked is a business one: Is the business ready, or even disposed, to take advantage of the potential benefits EDA can deliver? It would be worse than useless to spend heavily on an automated source of real-time information, unless the organisation is prepared to act on it.

     

     
       
