Having empowered executives with business intelligence and data warehousing technologies, many organisations are now looking to mirror that success in the operational data arena. Here, however, they find themselves contending with existing data access, analysis and reporting techniques that are less formal but highly valued by operational staff.
And while there are many good reasons for these decentralised techniques to be replaced, building a business intelligence capability that can support operational data use is a highly complex and very expensive task.
In certain circumstances, a third approach is advisable. BI tools that apply reporting and analysis directly to the databases that sit underneath ERP and other core business applications can give operational managers the benefits of BI while ensuring that the data they are accessing is current and consistent.
BI moves downstream
The tools that operational managers – in finance, in logistics, in stock control – use to manage the processes they supervise are many and varied, but they usually revolve around a system of spreadsheets and are more often than not based on Microsoft’s Excel application. The case for operational BI states that these informal systems are insufficient, and that by introducing BI technologies such as OLAP queries, reporting and analytics into the operational arena, performance can be better managed.
“There is no quality control across all the various spreadsheets in an organisation, so data quality can really suffer,” explains David Stodder, an analyst for performance management expert Ventana Research.
Stodder adds that the unsophisticated nature of spreadsheets means that they are less effective tools for analysing data. “It is more difficult to query data from a spreadsheet,” he says, “or to get dimensional views of the same data.”
But this case can be difficult to make to operational managers who are familiar with spreadsheets, who know how to use them and who know how to manipulate them to produce the numbers they want.
“A lot of operations managers will look at BI and think to themselves, ‘We are already making good use of spreadsheets, why do we need this?’,” says Stodder.
Out of place
That reluctance is exacerbated by the fact that traditional BI tools are often inappropriate for an operational setting.
The tried-and-tested model of BI sees data from operational applications extracted and transported to a data warehouse. It is through the data warehouse structures that the reporting and analytics take place.
However, moving data from operational stores to the warehouse has an impact on performance and takes time. In most BI systems, the data is extracted periodically in batches, during periods of lower transaction volumes. That means users are always examining an out-of-date picture.
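The batch-extract model described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the `orders` and `warehouse_orders` tables, the cutoff time and the figures are all hypothetical, chosen only to show how transactions arriving after the extract stay invisible to the warehouse until the next run.

```python
import sqlite3

# Hypothetical schema: an operational 'orders' table and a warehouse copy.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, created_at TEXT);
    CREATE TABLE warehouse_orders (id INTEGER PRIMARY KEY, amount REAL, created_at TEXT);
""")

conn.executemany(
    "INSERT INTO orders (amount, created_at) VALUES (?, ?)",
    [(100.0, "2007-06-01 09:00"), (250.0, "2007-06-01 14:30")],
)

def batch_extract(cutoff):
    # Copy only rows created before the cutoff -- anything newer is
    # invisible to the warehouse until the next batch run.
    conn.execute("""
        INSERT INTO warehouse_orders
        SELECT * FROM orders
        WHERE created_at < ? AND id NOT IN (SELECT id FROM warehouse_orders)
    """, (cutoff,))
    conn.commit()

batch_extract("2007-06-01 12:00")

# A transaction arriving after the extract ran...
conn.execute("INSERT INTO orders (amount, created_at) VALUES (?, ?)",
             (75.0, "2007-06-01 18:00"))

live = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
stale = conn.execute("SELECT COUNT(*) FROM warehouse_orders").fetchone()[0]
print(live, stale)  # the warehouse count lags the operational store
```

Any report run against `warehouse_orders` between extracts reflects the previous cutoff, not the present moment, which is exactly the staleness problem operational users run into.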
As far as most executive decision-making is concerned, this is not too much of a problem: when running a monthly report, data that has been created in the past few hours need not be included in order to establish a trend.
In operations, however, this time lag can be an impairment. If customers call up wanting to know why products have not been delivered, for example, it is no good telling them where their delivery was yesterday. And a financial controller working to get an exact picture of where the company’s cash position stands at the end of a financial period cannot rely on a snapshot of the accounting data at the previous day’s close of play.
Traditional BI technology vendors argue that the solution to this is bigger, better and faster (and more expensive) systems. If an organisation requires real-time data, they argue, then it needs a data warehouse that is refreshed every few minutes throughout the day and that can hold vast amounts of data while still providing sub-second query responses.
Teradata is one of the world’s leading data warehouse vendors. It argues that ‘active data warehousing’ is the best way to provide operational staff with up-to-the-minute data. At the high end of its product range are databases that can hold many petabytes of data, and so keep a record of every operational event and make it instantly available for reporting, analysis or simple recall.
But Teradata’s products are aimed squarely at the world’s largest corporations. While he believes that one day ‘active data warehousing’ will become mainstream, Teradata CEO Mike Koehler admits that just 30 to 40 organisations have actually applied the model to its fullest extent.
Organisations that are not at the bleeding edge of technology adoption (or that do not have the deepest pockets) might find it hard to justify this kind of infrastructure investment when operational managers are content with Microsoft Excel.
There is an approach to operational BI that makes up-to-the-minute data available to reporting and analytics but that does not require a highly sophisticated data warehouse.
This approach works by treating the relational database that sits underneath a core business application as though it were a multidimensional OLAP database.
By understanding the inner workings of particular applications – the GL Company’s tools, for example, work with Oracle and JD Edwards ERP systems – it is possible to superimpose a layer of logic over the application database that makes it available for queried reporting and analysis.
This is only made possible by mapping out the application database in extremely fine detail. “We spend all of our time learning about database tables,” says Gomersall.
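The idea of superimposing a dimensional layer over a transactional database can be sketched as a SQL view. This is a toy illustration of the general technique, not the GL Company's product: the `gl_entries` table and its columns are invented for the example, and a real mapping layer would cover hundreds of application tables rather than one.

```python
import sqlite3

# Sketch: treat an application's live transactional table as an OLAP-style
# source by defining a dimensional view over it -- no data is moved.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE gl_entries (account TEXT, period TEXT, amount REAL);
    INSERT INTO gl_entries VALUES
        ('cash',  '2007-05', 1200.0),
        ('cash',  '2007-06',  800.0),
        ('sales', '2007-06', -800.0);

    -- The 'mapping layer': a view exposing the raw ledger as a cube of
    -- account x period, queryable as though it were dimensional data.
    CREATE VIEW account_period_cube AS
        SELECT account, period, SUM(amount) AS total
        FROM gl_entries
        GROUP BY account, period;
""")

# Queries against the view hit the live application tables directly,
# so results always reflect the current state of the operational data.
rows = conn.execute(
    "SELECT account, period, total FROM account_period_cube"
    " ORDER BY account, period"
).fetchall()
print(rows)
```

Because the view is evaluated at query time, a transaction posted a second ago shows up in the next report, which is the currency advantage the article describes. The trade-off is that every report adds query load to the operational database itself.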
The drawback of this approach is that it is best suited to situations in which users only need data from a single application – unlike traditional BI systems, whose power derives from their ability to integrate various sources. However, this is not as rare a situation as it once was, as many organisations have undergone a process of ERP rationalisation.
Despite this limitation, the technique of reporting directly from the application database challenges an assumption that has long been received wisdom in the BI field: that data must be moved to a staging database before it can be usefully queried.
That assumption, Gomersall argues, no longer holds. “If the data is in one system, why should I have to put it in another to report on it?” he asks.
Part of this assumption relates to processor performance: there was a time when an operational system could slow down dramatically if it was subjected to even the simplest query. But processor speeds have improved significantly since then, and transactional systems can now tolerate reporting queries alongside their regular workload.
So while it may at the moment be limited to certain circumstances, the approach exemplified by the GL Company’s tools points to a near future in which operational BI does not necessitate the building of costly extra infrastructure or the introduction of greater complexity – and is therefore much more likely to achieve adoption.