How to tackle the great data quality challenge

Every year, Experian conducts a global research project to benchmark trends and interpret shifts in the data quality market. This year’s report is particularly interesting because it highlights a growing trend: businesses are increasingly aware of the potential of their data. More see it as a valuable asset they can use to harness business intelligence and, in turn, get closer to their customers to provide better products, services, experiences and, overall, value.

Interestingly, this correlates with research we conducted last year, in which CIOs stated they could increase their organisation’s profit by an average of 15% if their data were of the highest quality. Organisations that proactively manage data as an asset, with a joined-up approach to data quality, are the ones that reap the greatest value. However, the increasing volume, variety and complexity of data are adding to the challenge of data management.

Equally, it seems the cost of getting things wrong is growing: consumers – and regulators – are increasingly intolerant of data errors, and more companies report lost revenue that is eventually traced back to poor data.

> See also: 3 steps to implementing a data governance programme

The findings highlight that while businesses see the value of improving their data quality, more than 90% still find it challenging. That is the paradox we need to explore: the tension between the desire to harness the strategic value of accurate data and the obstacles businesses still face when they want to adopt more sophisticated data quality strategies.

To harness the true potential of all that data, there is a move towards more sophisticated data management, with more clearly defined ownership and processes, which in turn appears to be fuelling investment in associated technology. The value technological solutions can bring now seems to be recognised at last: 84% of all companies surveyed plan to make some sort of data quality solution a priority for their business during the next 12 months.

The question is, what will those technologies look like, and how will they help companies achieve greater data quality on the way to increasing revenue, improving efficiency and ensuring governance? The research shows that 88% of companies already have some form of data quality solution in place, yet nearly three-quarters of them plan to invest in new solutions in the coming year. This suggests to me that companies have not yet found what they need – an opinion supported by many of the research’s other findings.

93% of companies are actively looking for data quality issues, but the majority still say that problems are unearthed only when they are reported by employees, customers or prospects.

A third of companies find issues by analysing the results of marketing campaigns, and 50% of companies say that their biggest data quality challenge is fixing issues before they impact the business. Only 35% of companies have a centralised data quality strategy.

This tells me that the current, fragmented approach to data quality management makes collaboration difficult and isn’t providing the best return on investment. My own experience is that solutions are often too technical and don’t allow subject-matter experts to contribute effectively.

On average, respondents believe that 26% of their data might be inaccurate and that 23% of revenue is wasted as a result. The fact that businesses across the globe are not grinding to a halt over data quality issues suggests to me that such statistics are unrealistic.

I’d infer from this that people don’t really know the state of their data and are speculating because they can’t easily get accurate answers using existing approaches.

In fact, fewer than one company in four uses specialist data profiling and monitoring software to understand its data and detect issues, so hopefully these capabilities will be a top priority for the intended investments in new solutions.
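To make the idea of data profiling concrete, here is a minimal sketch of what a basic profiling pass might compute – per-field fill rates and distinct-value counts over customer records. The field names and sample data are hypothetical, and real profiling tools go much further (format validation, duplicate detection, drift monitoring); this only illustrates the kind of question such software answers automatically rather than waiting for a customer to report a problem.

```python
import csv
import io
from collections import Counter

def profile(rows, fields):
    """Basic data profiling: per-field fill rate and distinct-value count."""
    stats = {f: {"filled": 0, "distinct": Counter()} for f in fields}
    total = 0
    for row in rows:
        total += 1
        for f in fields:
            value = (row.get(f) or "").strip()
            if value:  # count a field as filled only if it is non-blank
                stats[f]["filled"] += 1
                stats[f]["distinct"][value] += 1
    return {
        f: {
            "fill_rate": stats[f]["filled"] / total if total else 0.0,
            "distinct_values": len(stats[f]["distinct"]),
        }
        for f in fields
    }

# Hypothetical customer records: one missing email, one blank postcode.
data = io.StringIO(
    "name,email,postcode\n"
    "Alice,alice@example.com,AB1 2CD\n"
    "Bob,,EF3 4GH\n"
    "Carol,carol@example.com,\n"
)
report = profile(csv.DictReader(data), ["name", "email", "postcode"])
print(report["email"]["fill_rate"])  # 2 of 3 records have an email address
```

A report like this gives non-technical data owners a simple, shared view of where the gaps are – exactly the kind of ‘self-service’ visibility the research suggests companies are still missing.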

The research also highlights a sharp correlation between increases in company profits and the degree to which companies use data quality solutions. Businesses need a comprehensive understanding of what data they have and why it is important to their organisations. They can then take a more sophisticated and proactive approach to data management, supported by ‘self-service’ technology that allows all the relevant players to contribute and collaborate in the most effective ways.

> See also: Is the data in your CRM a ticking time bomb?

One aspect that many organisations find challenging is knowing where to invest to gain the best results; often this leads to months of discussion before work even starts. Don’t be afraid to try – even relatively small steps can deliver tremendous results. Whether you are a multinational or an SME, there are now easily accessible technologies that will dovetail with your existing systems in hours rather than days, making it much easier to get telling results, fast.

With a defined data strategy, clear lines of responsibility and documented processes in place, the right technology can make a real difference to the quality and therefore the value of your organisation’s data. Within a relatively short space of time, you’ll start to see quantifiable results that have a direct, positive impact on your business.

Sourced from Derek Munro, head of product strategy, Experian Data Quality
