Much of the current “big data” buzz has focused on strategic analysis: aggregating large data sets to spot trends that can inform business strategy.
But the growing volume, velocity and variety of data that businesses are producing can also be applied more tactically.
A field of technology that has come to be known as “operational intelligence” (OI) seeks to extract data from operational processes, analyse it quickly and feed the results back into the process to improve outcomes. And judging by a recent spate of acquisitions in the data management and middleware industry, this is a field that vendors expect to be extremely lucrative.
What is operational intelligence?
The term “operational intelligence” has become a catch-all term for data infrastructure technologies that allow businesses to monitor business events in “real time” (a phrase that should always be treated with some skepticism), analyse those events and take some automated action.
That action might be initiating a new step in a business process, or simply highlighting the event to a human operator.
According to IT analyst company Ventana Research, “increasingly employees and processes in business operations and front-line sales, service and support functions need to be able to detect and respond to events as they are happening. OI is a set of event-focused information gathering and delivery processes that can meet this need.
“Used properly, OI enables people to make better, faster decisions and makes it possible for automated processes to respond effectively to events by using business rules and incoming event information.”
There is one industry that is already a voracious consumer of OI: financial trading. Investment firms analyse stock market activity in split-second detail to spot meaningful patterns. These might be alerted to a human trader, or used as a trigger for automated trading.
- See also: Self-service business intelligence
But as more and more business processes rely on a growing volume of data, and the speed of business accelerates, the applications of OI are increasing.
Mobile telecommunications is a big market for OI, says Robin Bloor, founder of IT analyst firm The Bloor Group.
“The big thing for the mobile telcos is predicting customer churn,” he explains. “If you can spot when people are likely to switch to a rival, then you can make them an offer.”
The conventional approach has been to aggregate customer records, segment them into groups, and offer members of each group particular deals. OI, by contrast, allows telcos to monitor individual customers' activity in real time and send them a new offer as soon as they exhibit behaviour associated with churning.
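The churn-detection pattern Bloor describes can be sketched in a few lines. This is a hypothetical illustration only: the event names, scoring rule and threshold are all invented, and a real telco system would run this logic continuously against a live event stream.

```python
# Hypothetical sketch: flag a telco customer for a retention offer as soon
# as their recent activity shows behaviour associated with churn.
# Event names and the threshold are invented for illustration.

CHURN_SIGNALS = {"balance_check", "competitor_site_visit", "complaint_call"}

def churn_score(events):
    """Count churn-associated events in a customer's recent activity."""
    return sum(1 for e in events if e in CHURN_SIGNALS)

def maybe_make_offer(customer_id, recent_events, threshold=2):
    """Return a retention offer as soon as the score crosses the threshold."""
    if churn_score(recent_events) >= threshold:
        return f"offer:discount:{customer_id}"
    return None

print(maybe_make_offer("c42", ["complaint_call", "balance_check", "top_up"]))
```

The contrast with the conventional approach is that the decision is made per customer, per event, rather than once per segmentation cycle.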
Another industry making ample use of OI is advertising. Digital advertising platforms build a profile of each web user based on their browsing history and, in the time it takes to render a web page, auction the advertising space off to whichever advertiser most wants to reach that kind of consumer.
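The auction step can be sketched as follows. This is a simplified, hypothetical model: real-time bidding exchanges commonly use a second-price rule (the winner pays the runner-up's bid), and the advertiser names and bid values here are invented.

```python
# Hypothetical sketch of the real-time-bidding step: given bids from
# advertisers targeting this user's profile, pick the highest bidder.
# Uses a second-price rule, a common (but not universal) auction design.

def run_auction(bids):
    """bids: dict of advertiser -> bid. Returns (winner, price_paid)."""
    if not bids:
        return None, 0
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, top_bid = ranked[0]
    # Second-price rule: pay the runner-up's bid, or your own if unopposed.
    price = ranked[1][1] if len(ranked) > 1 else top_bid
    return winner, price

winner, price = run_auction({"shoes_co": 12, "cars_co": 30, "bank_co": 25})
print(winner, price)  # cars_co 25
```

In production this entire exchange, profile lookup included, completes within the page-render budget of a few hundred milliseconds.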
In fact, Bloor says, any major website operator is likely already doing OI, tailoring the content of the site to each user in real time by analysing their behaviour and reacting automatically.
“When latency between acquiring knowledge and acting comes down below hours, you need data streaming”
The umbrella term “operational intelligence” covers a multitude of technologies, including business activity monitoring (BAM) and complex event processing (CEP). Both monitor data sources for significant changes, or a meaningful correlation of changes. BAM is usually associated with dashboards that allow managers to oversee operations, while CEP is more commonly used for ultra-low latency applications such as high-frequency trading.
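The “meaningful correlation of changes” that CEP looks for can be illustrated with a minimal sketch. The event shapes and the 60-second window here are invented; real CEP engines express such rules declaratively and match them against live streams.

```python
# Minimal sketch of the CEP idea: watch a stream of events and fire only
# when two related events occur close together in time.
# Event shapes and the 60-second window are invented for illustration.

def correlate(events, window=60):
    """events: iterable of (timestamp, name, order_id) tuples.
    Yield order_ids where a 'payment_failed' follows an 'order_placed'
    within `window` seconds."""
    placed = {}  # order_id -> timestamp of the order_placed event
    for ts, name, order_id in events:
        if name == "order_placed":
            placed[order_id] = ts
        elif name == "payment_failed" and order_id in placed:
            if ts - placed[order_id] <= window:
                yield order_id

stream = [(0, "order_placed", "o1"), (10, "payment_failed", "o1"),
          (20, "order_placed", "o2"), (200, "payment_failed", "o2")]
print(list(correlate(stream)))  # ['o1']
```

The point is that neither event is significant on its own; the correlation between them is what triggers an action.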
An increasingly significant platform technology for OI is data streaming, the processing and analysis of data in memory before it is written to disk. This allows systems to react very quickly to new events.
“As soon as the latency between acquiring knowledge and acting comes down below hours, you're moving into an architecture that demands data streaming,” Bloor explains.
One of a spate of recent acquisitions in the operational intelligence space saw middleware vendor TIBCO acquire StreamBase, a US company that sells a data streaming platform designed for “real-time” analytics applications.
Speaking to Information Age after the deal in June, TIBCO marketing director Ivan Casanova said the acquisition augments its existing CEP assets. “Our traditional event processing technology allows you to set up a rule that says, when a customer does X and when another piece of information from another system is Y, then execute Z,” he explained. “Stream processing is different. It allows you to take a data stream, such as a stock ticker, and say, when there is a 5% differential from the 30-day running average for a particular stock, I would like to be alerted to that or execute a trade,” Casanova said.
“That's a different model to our traditional event processing systems,” he said. “It's a different set of data, with a different programming model.”
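The running-average alert Casanova describes can be sketched as a stream-processing loop. This is a hedged illustration, not StreamBase's API: the window is held in memory per stock, each tick updates it, and an alert fires when the price deviates from the average by 5% or more. The ticker data is invented.

```python
# Hedged sketch of the stream-processing pattern described above:
# maintain a 30-tick running average per stock in memory and alert when
# the latest price deviates from it by 5% or more. Data is invented.

from collections import defaultdict, deque

WINDOW = 30          # ticks kept in the running average
THRESHOLD = 0.05     # 5% deviation triggers an alert

windows = defaultdict(lambda: deque(maxlen=WINDOW))

def on_tick(symbol, price):
    """Process one tick entirely in memory; return an alert or None."""
    w = windows[symbol]
    alert = None
    if w:
        avg = sum(w) / len(w)
        if abs(price - avg) / avg >= THRESHOLD:
            alert = f"ALERT {symbol}: {price} vs avg {avg:.2f}"
    w.append(price)
    return alert

for p in [100, 101, 99, 100, 106]:   # last tick is 6% above the average
    a = on_tick("ACME", p)
    if a:
        print(a)
```

Because the state lives in memory and each tick is handled as it arrives, the reaction time is bounded by the processing of a single event rather than by a batch cycle, which is the architectural point of data streaming.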
“We should all be focused on taking the 'complex' out of complex event processing”
Casanova added that StreamBase brings with it a set of tools designed to make it easier for developers to build applications that incorporate stream processing. This will help expand the penetration of the technology, he said. “We should all be focused on taking the 'complex' out of complex event processing,” he said.
Not long after TIBCO's acquisition of StreamBase was announced, German middleware supplier Software AG announced its intention to acquire Apama, the CEP company currently owned by Progress Software.
The acquisition brings Software AG a “mature event processing platform”, Forrester Research analyst Stefan Ried wrote after the deal was announced. Software AG already owns some CEP technology, Ried noted, but it is “lightweight” and “basic” compared to Apama.
Software AG's challenge is to achieve what Progress evidently failed to do – integrate Apama into its portfolio of products. “Software AG's acquisitions over the last five years have been very closely integrated into the existing middleware stack,” he wrote. “If Software AG does the same with Apama, it will be of huge benefit for customers facing hybrid integration scenarios.”
Another company building what might be described as an operational intelligence platform is Actian. The company was built around Ingres, a pioneering relational database developed in the 1970s that was acquired by CA in 1994 and sold to private equity in 2005.
- See also: Advertising at the speed of light
Since re-emerging as Actian Corporation in 2011, the company has acquired a number of data-focused start-ups. Most significantly, in April it acquired ParAccel, which developed the massively parallel database that underpins, among others, Amazon.com's data warehouse service Redshift. Bloor (who has worked for Actian) says the company is focused on providing “fast analytics” – the ability to perform complex analysis with ultra-low latency – which will form part of the operational intelligence stack.
One company focused explicitly on OI is Vitria. An enterprise integration software provider founded back in 1994, the company was taken private in 2007 to focus on what it describes as “the world's first operational intelligence suite”.
According to a recent Gartner Magic Quadrant, this suite – now named simply Operational Intelligence – is capable of supporting “very high-volume applications because of scalable, grid-based architecture that leverages MapReduce principles”.
Such dedicated OI suites are rare, though, and for the foreseeable future OI is likely to pan out in much the same way as business intelligence has: a variety of “best-of-breed” suppliers providing components of the OI stack, one day giving way to a small number of dominant enterprise suppliers.