Supermarkets reported a run on US flags on 11 September 2001. Wal-Mart sold 116,000 on the day of the terrorist attacks, 225,000 the day after, and had stock to spare the day after that. But its rivals ran out on the 11th and were unable to gather further supplies, frustrating thousands of patriotic customers.
Wal-Mart reacted to events so quickly because it had access to the right information, at the right time.
Most businesses’ database and business intelligence technology only lets them do so much – and usually long after the ‘event’. Querying ‘live’ data – the information flowing through transactional systems such as call centres, enterprise resource planning systems or teller machines – slows down the performance of IT systems. Adding more computing power can partly address the problem, but it is an expensive solution and can introduce weaknesses into operational systems.
Instead, most businesses base their reactions on analysis of day-old – or in some cases even month-old – information that has been fed into a separate, offline data warehouse. By divorcing this data from the live system, analysts can carry out highly complex queries based on millions of processor-intensive mathematical calculations without affecting performance.
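The live/offline split described above can be sketched in a few lines of Python. The table name and data here are hypothetical, and SQLite's built-in online backup stands in for the mirroring step; the point is simply that heavy analytical queries run against a copy, never against the operational store.

```python
import sqlite3

# Hypothetical 'live' transactional store with a sales table.
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE sales (store TEXT, item TEXT, qty INTEGER)")
live.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("NY", "flag", 120), ("TX", "flag", 310), ("NY", "soda", 40)],
)
live.commit()

# Snapshot the live data into a separate 'warehouse' connection, so
# processor-intensive analysis never slows the operational system.
warehouse = sqlite3.connect(":memory:")
live.backup(warehouse)  # sqlite3's built-in online backup

# Complex queries now run against the copy only.
rows = warehouse.execute(
    "SELECT item, SUM(qty) FROM sales GROUP BY item ORDER BY item"
).fetchall()
print(rows)  # [('flag', 430), ('soda', 40)]
```

In a real deployment the snapshot would be an ETL job or replication feed into a dedicated warehouse, but the division of labour is the same.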
But using old data seems anachronistic in today’s fast-moving economy. Businesses often miss important ‘events’ in their operations – an inventory shortage, say, or a fraudulent transaction – that damage their bottom line. Sony, the Japanese electronics giant, fell victim to this syndrome in 2000, when a failure to identify a shortage of chips for its PlayStation 2 console meant it shipped half the number of units it had planned.
Emerging database and analytics technologies promise to solve that problem by giving users the best of both worlds. They are alerted to events in their core transaction systems as and when they happen, and can relate this information to events that have happened in the past. That is how Wal-Mart kept up with demand on 11 September. This knowledge can then be linked to reporting or alerting mechanisms that will trigger the appropriate reaction in either an employee or another system.
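The alerting mechanism described above amounts to evaluating rules against transactions as they arrive. A minimal sketch, with an invented threshold rule and invented data, might look like this:

```python
from dataclasses import dataclass

@dataclass
class Sale:
    store: str
    item: str
    qty: int

# Hypothetical rule: alert once when running demand for an item crosses a
# threshold, so a person or downstream system can react while the event
# is still unfolding rather than a day later.
THRESHOLD = 300
totals: dict = {}
alerts: list = []

def on_transaction(sale: Sale) -> None:
    """Called for each live transaction as it happens."""
    totals[sale.item] = totals.get(sale.item, 0) + sale.qty
    already_alerted = any(item == sale.item for item, _ in alerts)
    if totals[sale.item] >= THRESHOLD and not already_alerted:
        # Trigger the appropriate reaction: notify a buyer, reorder stock...
        alerts.append((sale.item, totals[sale.item]))

for s in [Sale("NY", "flag", 116), Sale("TX", "flag", 225), Sale("NY", "soda", 40)]:
    on_transaction(s)

print(alerts)  # [('flag', 341)]
```

Commercial products layer the same idea over transaction feeds at far greater scale, and crucially can join the live event against historical patterns held in the warehouse.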
Better still, these new technologies do not require companies to rip out their existing database or business intelligence investments and retrain staff. Some act as an interface between companies’ transaction systems and their data warehouses or marts, performing analysis on a mirror of the live data. Others embed themselves into operational databases or business intelligence tools from companies such as Business Objects. These are ‘stealth technologies’ that simply reinvigorate a number of systems most organisations already have in place. In a recession, this argument is a compelling one.
Suppliers waiting for the tornado are beginning to ‘feel the wind’. Dutch analytics software supplier Data Distilleries, for one, has just reported quarterly revenues up 100%. UK start-up Aruna is also earning rave reviews, claiming that one customer saw a return on investment in just three days.
And the market is about to get much bigger. Networked technologies such as wireless sensors and radio-frequency identification (RFID) tags will make it possible for businesses to monitor the slightest movement in their supply chains. Agent-based technology that can analyse these movements will then trigger an immediate, appropriate reaction to any risks or opportunities that present themselves. Such developments will extend real-time analytics from the production line right into consumers’ homes.
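An agent in this sense is just a small piece of logic watching a stream of readings and reacting on its own. A toy sketch, with hypothetical shelf locations and a made-up minimum-stock rule standing in for real RFID telemetry:

```python
# Hypothetical RFID shelf readings: tag counts per shelf location.
# An 'agent' inspects each reading and schedules a reaction when
# stock drops below a minimum, without waiting for a human query.
MIN_ON_SHELF = 10

def check_shelf(location: str, tag_count: int, actions: list) -> None:
    """Agent rule: schedule replenishment for any shelf running low."""
    if tag_count < MIN_ON_SHELF:
        actions.append(f"replenish {location}")

actions: list = []
readings = {"aisle-3": 4, "aisle-7": 25, "aisle-9": 9}
for location, count in readings.items():
    check_shelf(location, count, actions)

print(actions)  # ['replenish aisle-3', 'replenish aisle-9']
```

Multiply that by every pallet, shelf and lorry in a supply chain and the appeal – and the data volumes – become clear.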
General Electric’s CIO, Gary Reiner, calls this the ‘digital dashboard’ – an ever-changing interface that assesses situations and alerts the appropriate party when certain conditions are not met.
One day, all businesses that require up-to-the-minute analysis will have one of these ‘dashboards’. Valuable, real-time business analytics, until now the preserve of the Wal-Marts and GEs of this world, is being democratised.