Apama has inverted the conventional approach to real-time data analysis. Typically, data is entered into a database and indexed before it is analysed. This approach, says John Bates, president and chief technology officer of real-time analytics company Apama, works best for “reasonably static scenarios”, and is neither sufficiently scalable nor fast enough for real-time analysis, especially when thousands of streams of data are entering a database concurrently.
In contrast, the Apama Matching Engine (AME) catches and analyses data before it reaches the database. Apama looks for patterns in data streams, whether the user is interested in a single change in a particular data value or in a combination of changes across multiple data streams. Furthermore, AME identifies recurring patterns in data, alerting users to take action when previously specified scenarios occur.
Bates says Apama's approach to analysis is more scalable than conventional approaches, because it avoids the problems of performance degradation related to indexing large numbers of incoming information streams.
AME does not store data, just queries, and is not meant to replace databases – in fact, information from databases can be put through the engine to analyse a mixture of real-time and historic data.
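The inverted model described above can be illustrated with a minimal sketch. This is not Apama's actual implementation, and the class and function names here are invented for illustration; it simply shows the core idea of holding registered queries in memory and matching each incoming event against them as it streams past, without storing or indexing the events themselves.

```python
class MatchingEngine:
    """Illustrative query-over-streams engine: stores queries, not data."""

    def __init__(self):
        # The only state kept is the set of registered queries.
        self._queries = []  # list of (predicate, callback) pairs

    def register(self, predicate, callback):
        """Register a pattern (a predicate over events) and an action."""
        self._queries.append((predicate, callback))

    def on_event(self, event):
        """Each event passes through once; nothing is persisted or indexed."""
        for predicate, callback in self._queries:
            if predicate(event):
                callback(event)


# Usage: alert when a hypothetical stock price crosses a threshold.
engine = MatchingEngine()
alerts = []
engine.register(
    lambda e: e.get("symbol") == "ACME" and e.get("price", 0) > 100,
    lambda e: alerts.append(e),
)
for event in [{"symbol": "ACME", "price": 95}, {"symbol": "ACME", "price": 105}]:
    engine.on_event(event)
# alerts now holds only the event that matched the registered pattern
```

Because per-event work depends only on the number of registered queries, not on an ever-growing store of indexed data, this shape avoids the indexing bottleneck the article describes.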
According to Bates, AME can handle tens of thousands of transactions per second without the need for specialised hardware. This, he says, makes the system more cost-effective than rivals. "What we can do on three Sun servers would cost a customer $13 million in mainframe computers [using any other system]," he claims.
Apama's industry partners include Sun Microsystems, IBM and Oracle, while several large investment banks are currently piloting its software.
AME was commercially launched in mid-2001, and the challenge for Apama now is to convince companies that its unconventional approach offers noticeably improved performance.