BT’s wrong numbers

Telecoms giant BT has endured a 10-year struggle with data quality, but its management team now believes the company has accumulated enough expertise to start offering it to clients. How did BT reach this point?

Back in 1997, BT’s management realised that data quality issues were beginning to inhibit its operations: a series of mergers and acquisitions had entangled its business processes; its client-server architecture had weakened centralised controls; and billions had been invested in systems that were underperforming.

The most pressing concern was the inaccuracy of customer data. Marketing campaigns yielded poor returns, as customer details were out of date or simply wrong. BT had even posted marketing material to trains that had telephones on board. The impact on customer service was just as dire.

But customer data was not the only bugbear. Financial operations had been impaired by poor data, and BT had difficulty managing its physical assets: it often did not know what network equipment it had around the country, or whether it was working properly. The result was wasted capital expenditure, with working equipment replaced needlessly.

Management decided that the reactive, fire-fighting attempts at data cleansing that were occasionally instigated by separate departments were insufficient.

“Data cleansing and data quality are not synonymous,” says Nigel Turner, BT’s head of ICT customer management delivery. “With data cleansing, you can reach a certain standard of accuracy, but if you don’t have a way to maintain that, the problem will just come back.”

To maintain the required level of quality, BT first standardised on a set of data quality tools from software maker Trillium. It also bought data profiling tools from Avellino, a company subsequently acquired by Trillium.
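Data profiling of this kind reduces to a handful of mechanical checks run across every column of a table. As a minimal illustration of what such a tool automates – a hedged sketch with entirely hypothetical records and column names, not BT’s or Trillium’s actual implementation – the Python below reports how complete each column is and which value patterns occur in it:

    from collections import Counter

    # Hypothetical customer records; in practice these would come from
    # a billing or CRM extract.
    records = [
        {"name": "A. Smith", "postcode": "EC1A 1BB", "phone": "020 7946 0001"},
        {"name": "",         "postcode": "EC1A1BB",  "phone": "020 7946 0001"},
        {"name": "B. Jones", "postcode": None,       "phone": ""},
    ]

    def profile(rows, column):
        """Report completeness and value-pattern frequencies for one column."""
        values = [r.get(column) for r in rows]
        filled = [v for v in values if v not in (None, "")]
        # Reduce each value to a coarse shape: letters -> A, digits -> 9.
        patterns = Counter(
            "".join("A" if c.isalpha() else "9" if c.isdigit() else c for c in v)
            for v in filled
        )
        print(f"{column}: {len(filled)}/{len(values)} populated")
        for pattern, count in patterns.most_common():
            print(f"  pattern {pattern!r} occurs {count} time(s)")

    for col in ("name", "postcode", "phone"):
        profile(records, col)

Even this toy version surfaces the kind of finding profiling exists for: the postcode column holds two different formats ('AA9A 9AA' and 'AA9A9AA'), exactly the inconsistency a cleansing rule would then standardise.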

These tools were used in proof-of-concept projects targeted at specific applications that gave a fast return on investment. This was a vital step in establishing “the case for industrial scale data quality management,” says Turner.

Getting senior management buy-in was one challenge, but driving data quality throughout the organisation was a far greater one, says Turner: “The hardest thing to change in data quality is not technology or processes but mindsets. You need to get people to think about data as a corporate asset and to take the management of that asset seriously.”

To smooth this path, employees were rewarded for maintaining the quality of data, not just the speed of data entry. They were also encouraged to fix any errors they encountered. These measures were a critical component of the data quality initiative, says Turner: front office staff might not see the consequences of poor quality data – such as an engineer being sent to the wrong address – but they are ideally placed to tackle it.

Data input systems were revamped, increasing controls so that inaccurate data became more difficult to enter. “Now people don’t need to be reminded about DQ,” says Turner. “They do it as a matter of course.”
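What such entry-time controls look like in code can be sketched simply – this is an illustrative example, not BT’s actual system, and the simplified postcode rule is an assumption. The idea is to reject malformed values and canonicalise valid ones before a record is ever stored:

    import re

    # Simplified UK postcode shape; the official rules are more involved,
    # but this is enough to reject obviously malformed entries at input.
    POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

    def normalise_postcode(raw: str) -> str:
        """Validate and canonicalise a postcode, raising on bad input."""
        cleaned = raw.strip().upper()
        if not POSTCODE.match(cleaned):
            raise ValueError(f"invalid postcode: {raw!r}")
        # Canonical form: a single space before the final three characters.
        compact = cleaned.replace(" ", "")
        return f"{compact[:-3]} {compact[-3:]}"

    print(normalise_postcode("ec1a1bb"))   # -> 'EC1A 1BB'

Validating at the point of entry is what separates quality maintenance from one-off cleansing: the check runs on every new record, so accuracy does not decay after a clean-up.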

The next stage was to shift the focus from critical applications to a more holistic, enterprise-wide framework. This culminated in the establishment of NAD, BT’s name and address database, which acts as a central repository used to populate all other applications, ensuring that a consistent record is used across the company.
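This is the classic single-source-of-truth pattern. In outline – with hypothetical names throughout, since BT has not published NAD’s internals – downstream systems hold only a stable key and read address details through the master store, so a correction made once propagates to every consumer:

    # Hypothetical master address store keyed by a stable identifier.
    MASTER_ADDRESSES = {
        "ADDR-000123": {"line1": "1 Example Street", "city": "London",
                        "postcode": "EC1A 1BB"},
    }

    class BillingRecord:
        """A downstream record that references, rather than copies, an address."""

        def __init__(self, account: str, address_key: str):
            self.account = account
            self.address_key = address_key

        def mailing_address(self) -> dict:
            # Always read through to the master, never from a local copy.
            return MASTER_ADDRESSES[self.address_key]

    bill = BillingRecord("ACC-42", "ADDR-000123")
    print(bill.mailing_address()["postcode"])    # EC1A 1BB

    # One correction in the master is seen by every consumer immediately.
    MASTER_ADDRESSES["ADDR-000123"]["postcode"] = "EC1A 2AB"
    print(bill.mailing_address()["postcode"])    # EC1A 2AB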

This database is also used by BT’s rivals, and plays an important role in ensuring that those companies can offer residential broadband services. That has some, such as broadband provider Bulldog, questioning the accuracy of the database. However, the final version went live in January 2006, and BT says it is functioning as promised.

“It took us a long time to get from the application focus to the enterprise-wide view of data quality, because we were early adopters,” says Turner. “But that journey can be made a lot quicker.”

Going forward, Turner says service-oriented architecture could prove either a blessing or a curse for data quality. “If we establish common data services, it will make the problem better. But if we fail to do that, and services get out of sync, it will be like client-server but worse.”
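What a common data service might look like in practice – again a sketch under assumptions, since Turner does not describe BT’s design – is a single interface that every application calls for customer records, leaving no local copies to drift out of sync:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Customer:
        customer_id: str
        name: str
        address_key: str   # reference into a master address store

    class CustomerDataService:
        """One service fronting customer data for all applications.

        Consumers call get()/upsert() instead of holding local copies,
        so there is no second version of a record to fall out of sync.
        """

        def __init__(self) -> None:
            self._store: dict[str, Customer] = {}

        def upsert(self, customer: Customer) -> None:
            self._store[customer.customer_id] = customer

        def get(self, customer_id: str) -> Customer:
            return self._store[customer_id]

    service = CustomerDataService()
    service.upsert(Customer("CUST-1", "A. Smith", "ADDR-000123"))
    print(service.get("CUST-1").name)   # A. Smith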
