Over the last few years, numerous commentators have latched onto the grand concept of ‘real-time’ business. Business is changing so rapidly, they argue, that any executive worth their salary should demand a ‘live’ feed on a desktop dashboard of exactly how their operations are performing.
The reality for most is somewhat different: a weekly fixed-format report with little or no drill-down capability is more like it.
Certainly, there is no shortage of the raw, live data needed to drive such real-time dashboards. Enterprise resource planning (ERP) systems generate gigabytes of data daily, but it is almost universally kept separate from any analytical processes. These typically draw on historical and partial snapshots of the ERP databases, inevitably introducing a degree of latency into the analysis.
The question for today’s managers is: how much latency is acceptable?
The notion that executives are on an accelerating treadmill where all information is needed now suggests that no latency is the goal. But this is too simplistic. Reducing this time-lag is not a universal requirement.
At construction giant the Shaw Group, cutting out latency in its analysis of project financials was deemed a top priority. Not only did real-time views of project status keep a tight rein on costs, they also delivered a compelling competitive advantage.
Others can live with a delay. Instead of looking for real-time analysis across the board, managers at high street retailer Debenhams acknowledge that some delay between the recording of actual sales and the analysis of stock levels is acceptable. Its executives still see the business value in reducing latency: the company has moved from a monthly update of its data warehouse to a weekly one, and will move gradually towards a daily, and eventually a near real-time, update.
As is clear from such moves and the accounts throughout this Business Briefing, the fresher the data, the more confident the decision.