In 2003, when the conference circuit was bursting with talk of SOA (service oriented architecture), analyst company Gartner introduced another term: EDA, the event driven architecture.
The EDA, Gartner stressed at the outset, is not an alternative to the SOA. It should be thought of as a complementary approach to application and infrastructure design, a way of enhancing SOA's power. When deployed together, they would help to make businesses far more flexible, agile, efficient and customer friendly. Gartner went further: by 2008, it said, EDA would be mainstream, widely hyped at first, but then commonly adopted and providing some big business benefits.
It hasn’t quite worked out that way. Almost everybody in IT management knows all about SOA, many are deploying it and almost everyone intends to. But the term EDA is not widely used and it remains little understood.
But that doesn’t mean that Gartner was wrong, although it may have overestimated the speed of take up. EDA is about the rapid processing of event information. Many of the core technologies in this area are being used aggressively by a number of big organisations, and suppliers of EDA technology are claiming strong take up of their products. Tibco, Progress (Apama), Oracle, IBM and others already offer products in this market.
Perhaps not many of today’s applications can yet be described as full-blown “architectural” implementations. But it is possible today to find advanced event driven applications in financial services, IT systems management, telecoms, retail, gaming, fraud detection, supply chain management and many other areas.
Event driven systems are not at all new. An obvious example is ‘program trading’, or ‘algorithmic trading’, which has been used by financial institutions for nearly two decades. If a share price or an index drops at a certain rate, or below a certain level, for example, this is treated as an event, and this occurrence triggers an automated reaction – such as a “sell” order.
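The trigger logic described above can be sketched in a few lines. This is an illustration only, not a real trading system; the names `make_price_trigger`, `on_tick` and `place_sell_order` are invented for the example.

```python
# Minimal sketch of a threshold-based trading trigger. A price update is the
# event; falling below the threshold triggers an automated "sell" reaction.

def make_price_trigger(threshold, place_sell_order):
    """Return a tick handler that fires a sell order when price drops below threshold."""
    def on_tick(symbol, price):
        if price < threshold:
            place_sell_order(symbol)
    return on_tick

orders = []
handler = make_price_trigger(100.0, lambda sym: orders.append(("SELL", sym)))
handler("ACME", 105.0)  # above the threshold: no action taken
handler("ACME", 98.5)   # below the threshold: a sell order is placed
```

Real program-trading systems layer risk checks, order routing and market data feeds on top of this basic event-and-reaction shape.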
Similarly, a missile tracking system may not wait for a human to decide what to do when the screen lights up – it just orders the firing of the defensive missiles. Closer to home, most systems management tools today use event-alerting or some kind of “event driven” agents, both to manage systems and to inform people.
Arguably, event driven systems lie at the heart of computing. Even simple devices such as PCs are event driven: operating systems, for example, are built around “event loops” that handle a series of interrupts from the keyboard, mouse or the network.
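The “event loop” idea is simple enough to show in miniature. The toy dispatcher below is a sketch of the pattern, not any particular operating system's implementation; the event types and handlers are invented for the example.

```python
import queue

# Toy event loop: handlers are registered per event type, then events are
# drained from a queue and dispatched to every matching handler.
handlers = {}

def on(event_type, fn):
    handlers.setdefault(event_type, []).append(fn)

def run(events):
    q = queue.Queue()
    for ev in events:
        q.put(ev)
    log = []
    while not q.empty():
        etype, payload = q.get()
        for fn in handlers.get(etype, []):
            log.append(fn(payload))
    return log

on("key", lambda k: f"key pressed: {k}")
on("mouse", lambda pos: f"click at {pos}")
results = run([("key", "a"), ("mouse", (10, 20))])
```

An operating system's loop services hardware interrupts rather than a Python queue, but the structure – wait for an event, look up its handler, dispatch, repeat – is the same.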
The event driven architecture uses many of these ideas, but it is a scaled up, architectural approach to software and data management that gives the management of events a central place in the IT architecture.
In such an architecture, events are treated much like services, with open systems and interfaces designed to make best use of the vast and varied amount of information delivered.
Modern organisations aspire to be able to react to events (such as sudden changes in stock prices, or supply shortages) in near real time, or, even better, they would like to actually anticipate events before they happen, so that appropriate action can be taken.
EDA involves building a distributed mechanism that collects information from large numbers of events (in some cases, very large numbers) and then feeds it into software programmes that recommend or initiate actions based on these events. Critically, the systems should be able to react immediately – there should be no need to analyse reports from a database.
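The pipeline just described – events stream in, a decision function reacts immediately, no database round trip – can be sketched as follows. The event shapes and the stock-reorder rule are invented for this example.

```python
# Illustrative-only sketch of the EDA pattern: each incoming event is examined
# as it arrives, and an action is recommended straight away rather than being
# written to a database for later analysis.

def decide(event):
    # Recommend an action based on the event itself (or on in-memory state).
    if event["type"] == "stock_low" and event["units"] < 10:
        return {"action": "reorder", "sku": event["sku"]}
    return None

def process(stream):
    for event in stream:
        action = decide(event)
        if action:
            yield action

events = [
    {"type": "stock_low", "sku": "A1", "units": 4},
    {"type": "price_change", "sku": "B2", "units": 500},
]
actions = list(process(events))
```

A production system would distribute this across many collectors and consumers; the point of the sketch is only that the reaction happens in-stream, as the event arrives.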
In recent years, several major developments have made event processing both a viable technology and a strategically important approach in IT.
First, influenced by the work of Professor David Luckham of Stanford University in his book The Power of Events, software suppliers are now building CEP (complex event processing) platforms that have become the engine at the heart of an EDA. These systems are designed to quickly process complex bundles of events, creating clear outcomes from otherwise unconnected inputs.
A related innovation is that high-level, possibly graphical, tools are now being developed that will eventually enable business people to create their own event-driven processes and rules, and to integrate these decision-making steps into business processes. Some observers have likened this development to the invention of the spreadsheet: relatively non-technical business managers will be able to decide what should happen if x, y and z all happen within, say, 4 minutes – and program it into their systems.
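The “x, y and z within 4 minutes” rule above is a classic temporal correlation, and a sliding window is one simple way to implement it. This is a sketch under invented names (`RuleXYZ`, `push`), not any vendor's rule engine.

```python
from collections import deque

# Hedged sketch of a temporal rule: fire when events X, Y and Z all occur
# within a sliding window (4 minutes, per the example in the text).
WINDOW = 4 * 60  # seconds

class RuleXYZ:
    def __init__(self, required=("X", "Y", "Z"), window=WINDOW):
        self.required = set(required)
        self.window = window
        self.recent = deque()  # (timestamp, event_type) pairs, oldest first

    def push(self, ts, event_type):
        """Record an event; return True if the rule fires."""
        self.recent.append((ts, event_type))
        # Discard events that have fallen out of the window.
        while self.recent and ts - self.recent[0][0] > self.window:
            self.recent.popleft()
        seen = {etype for _, etype in self.recent}
        return self.required <= seen

rule = RuleXYZ()
rule.push(0, "X")            # not yet
rule.push(60, "Y")           # not yet
fired = rule.push(120, "Z")  # all three arrived within 4 minutes
```

Commercial CEP platforms express such rules declaratively (often in SQL-like event query languages) rather than in hand-written window code, but the underlying mechanics are similar.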
A third strand to the EDA revolution is that the underlying infrastructure is now available to deliver asynchronous event information – perhaps using some kind of enterprise service bus – in large volumes, in near real time. Previously, this would have involved very expensive, inflexible and specialist systems. Increasingly, all this event information – from sources such as trading systems, web interactions, RFID chips and remote sensors – will be dealt with using standard, open interfaces, making it easier and cheaper to build such systems.
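The delivery infrastructure the paragraph describes is usually some form of publish/subscribe. The toy bus below is a stand-in for that idea; a real enterprise service bus adds persistence, routing and delivery guarantees that this sketch omits, and the topic names are invented.

```python
# Minimal topic-based publish/subscribe sketch. Producers publish to a topic;
# every subscriber to that topic is invoked with the message.

class Bus:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers.get(topic, []):
            handler(message)

bus = Bus()
received = []
bus.subscribe("rfid.read", received.append)
bus.publish("rfid.read", {"tag": "0xA1", "dock": 7})
bus.publish("web.click", {"page": "/home"})  # no subscriber for this topic
```

The decoupling is the point: the RFID reader publishing events neither knows nor cares which downstream systems consume them.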
The power of event technology is further enhanced because it so neatly complements at least two other popular approaches: Business process management, which involves modelling and implementing automated or semi-automated processes at a high level; and SOA, which effectively provides an infrastructure into which event based systems can plug. By plugging event data into business intelligence tools, it also becomes possible to replay events to analyse and simulate different outcomes.
Analysts say there is much more work to be done – not least in opening up applications so that they communicate event information better; and in building resilient, scalable systems. But as more and more information is communicated in real time, the EDA, even if it doesn’t eventually fly under that flag, will inevitably be widely adopted.