Intelligent moves

So-called business intelligence (BI) technologies offer an established and well-understood method of deriving insight from data. According to the traditional model, data is replicated from transactional systems into a separate repository – a data warehouse – where it can be sliced, diced and analysed without slowing the performance of those systems.
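As a minimal sketch of that pattern, the illustrative Python below copies settled orders out of a transactional database into a separate warehouse file using sqlite3. All the names here – transactions.db, warehouse.db, the orders table and its columns – are hypothetical assumptions for illustration, not details of any system discussed in this article.

    import sqlite3

    # Connect to the (hypothetical) transactional database and to a
    # separate warehouse file, so analytical queries never touch the
    # live operational system.
    oltp = sqlite3.connect("transactions.db")
    warehouse = sqlite3.connect("warehouse.db")

    warehouse.execute("""
        CREATE TABLE IF NOT EXISTS fact_orders (
            order_id    INTEGER PRIMARY KEY,
            customer_id INTEGER,
            order_date  TEXT,
            amount      REAL
        )
    """)

    # Extract: read settled transactions from the operational system.
    rows = oltp.execute(
        "SELECT order_id, customer_id, order_date, amount "
        "FROM orders WHERE status = 'settled'"
    ).fetchall()

    # Load: replicate them into the warehouse, where they can be
    # sliced, diced and analysed without slowing the source system.
    warehouse.executemany(
        "INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?, ?)", rows
    )
    warehouse.commit()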

This is the approach that venue management company NEC Group took in order to derive greater value from its visitor data, as Andrew McManus, group IT director, and his colleague Adi Clark, head of insight, explained at the Information, Communication and Collaboration conference.

Before it launched its business intelligence project, the company, which operates such venues as Birmingham’s NEC and LG Arenas, was not capitalising on the customer data it collects.

“We had a huge amount of data in our organisation, but it was all over the place in silos,” McManus explained. This meant that the many thousands of visitors who passed through its venues were essentially anonymous.

By consolidating that data in a warehouse, NEC Group was able to build visitor profiles for particular events. Today, not only can the company target products and services at visitors, it can also use that information to market future attractions to previous ones.
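A profile of this kind may be little more than an aggregate query over the warehouse. The sketch below assumes a hypothetical star schema – fact_attendance, dim_visitor and dim_event tables in the warehouse.db file used above – and groups past attendees of one genre of event so that similar future attractions could be marketed to them.

    import sqlite3

    warehouse = sqlite3.connect("warehouse.db")

    # Hypothetical schema: one attendance row per visit, joined to
    # visitor and event dimension tables.
    profile = warehouse.execute("""
        SELECT v.visitor_id, v.email, COUNT(*) AS visits
        FROM fact_attendance a
        JOIN dim_visitor v ON v.visitor_id = a.visitor_id
        JOIN dim_event   e ON e.event_id   = a.event_id
        WHERE e.genre = 'arena concert'
        GROUP BY v.visitor_id, v.email
        ORDER BY visits DESC
    """).fetchall()

    # Each row is a (visitor_id, email, visits) profile a marketing
    # team could use to promote comparable upcoming events.
    for visitor_id, email, visits in profile:
        print(visitor_id, email, visits)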

But while this is reportedly an industry-leading deployment, McManus explained that the data warehouse project was not technically complex by today’s standards. “This was a very straightforward technical solution,” he said. “It wasn’t an easy exercise to get the information into the data warehouse, but we’re not talking about heavy technology.”

Those comments demonstrated the degree to which BI has become a mainstream concern. However, among some of the largest and most sophisticated users of analytics technology, the received wisdom of creating a single repository for data is being tested to its limits.

Achievable goals

Gerard Crispie is the former head of systems and infrastructure at HBOS’s Decision Science division, a unit that analyses the bank’s vast swathes of data to predict retail trends and design banking offers. At the event, Crispie recounted how the company redesigned the IT infrastructure that supports its Decision Engine, starting back in 2006.

“It was recognised that HBOS’s decisioning infrastructure lagged behind that of our competitors,” he recalled.

An alternative, more sophisticated infrastructure was therefore designed, at the core of which was a central data warehouse or ‘hub’. “Early on in the programme, it was thought that the data hub would be central to the decisioning infrastructure, and would even be used by other parts of the business too,” Crispie recalled.

The project was due to take many years to complete. Within months of starting, however, doubts were already emerging about its progress, especially regarding the data warehouse. “By the end of 2006, concerns arose concerning latency and scalability, and the timescales of delivery,” he said.

“These concerns led to a series of reviews, although the decision was taken to carry on,” Crispie explained. In April 2007, one part of the new infrastructure was successfully deployed, but the rest was scrapped due to concerns about the performance of the central data hub, as well as the time and money it would take to build.

However, a lot of work had gone into designing this decisioning infrastructure, so a compromise was reached: a simpler, less centralised system. By building a few data marts, based on datasets pulled from legacy systems, HBOS Decision Science was able to achieve many of the functional benefits of the redesign in just four months.
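A data mart of this sort can be sketched as a small, subject-specific store loaded directly from a legacy extract, rather than via a central hub. In the illustrative Python below, a flat-file export from a hypothetical legacy system (legacy_extract.csv, with invented column names) is loaded straight into its own mart.

    import csv
    import sqlite3

    # Build a small, subject-specific mart from a flat-file extract of
    # a hypothetical legacy system, instead of waiting for a central
    # data hub to be completed.
    mart = sqlite3.connect("retail_offers_mart.db")
    mart.execute("""
        CREATE TABLE IF NOT EXISTS customer_balances (
            customer_id INTEGER PRIMARY KEY,
            product     TEXT,
            balance     REAL
        )
    """)

    with open("legacy_extract.csv", newline="") as f:
        rows = [
            (int(r["customer_id"]), r["product"], float(r["balance"]))
            for r in csv.DictReader(f)
        ]

    mart.executemany(
        "INSERT OR REPLACE INTO customer_balances VALUES (?, ?, ?)", rows
    )
    mart.commit()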

“This experience made me rethink my personal beliefs about getting all the data into one place,” he said. “Of course, it is important to have your information in one place, but how you get it together is the key question.

“A number of organisations have spent many years and lots of money developing a unified customer view in a highly structured central data warehouse, but I now believe it’s better to be able to deliver something in six to 12 months,” he concluded. “If you have to cobble together several data sources, then so be it.”

Pete Swabey

Pete was Editor of Information Age and head of technology research for Vitesse Media plc from 2005 to 2013, before moving on to be Senior Editor and then Editorial Director at The Economist Intelligence...
