Why data gets better by change and not by chance

Up to 96% of customer contact data is partially inaccurate, according to the Sales and Marketing Institute and D&B. It is a shocking statistic – one that should have business leaders leaping from their seats in panic.

Can an organisation’s data really be in that bad a state? The short answer is yes. Over time, data decays at 2% per month, so a database is never static – it is constantly degrading.
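To make that decay rate concrete, here is a minimal sketch of how 2% a month compounds; the database size and the assumption of a constant rate are purely illustrative:

```python
# Illustrative only: how a steady 2% monthly decay rate compounds over a year.
monthly_decay = 0.02
accurate_records = 100_000  # hypothetical starting database size

for month in range(12):
    accurate_records *= (1 - monthly_decay)

print(f"Still accurate after 12 months: {accurate_records:,.0f} of 100,000")
# ~78,472 -- roughly a fifth of the database has gone stale in a year.
```

At that rate, an untended database loses around a fifth of its accuracy within a single year.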

Customers are constantly changing job roles, phone numbers and email addresses. Businesses, meanwhile, add duplicates, misspell entries and introduce bad data of their own. All of this costs time and money, and it’s a needless waste of resources.

It sounds obvious enough when written in black and white, but it’s alarming how many businesses are sitting back and doing nothing about it.

>See also: The chief data officer can eliminate the £9M data-quality black hole

Data is so critical to business operations that companies are increasingly employing people to watch over it.

A chief data officer (CDO) is a relatively new role, but it exists because someone needs to give data a voice. This is particularly important at boardroom level, where many stakeholders don’t fully understand the impact of poor data quality.

It’s time to give data credit as a business asset that cannot be squandered. Here are four ways to bring about change.

1. Assign clear ownership of your data

Data quality improvement is not something you can leave with the IT department. It’s not a purely technical issue. It’s not something that can be dealt with in a few days, or a process that can be run to schedule. You need human involvement across the entire business.

To make progress, you need to assign ownership of data quality within the business, led by a CDO, or someone else with influence at boardroom level. This is the first step in a wider culture change that should help people see data differently.

2. Set long-term goals for data quality

Data cannot be purified permanently during a one-off data quality project. For on-going data quality improvement, you need a coherent and complete strategy that runs alongside your normal operations.

You need to start with data capture, figure out where the errors are coming from, and put software in place to prevent those errors from being introduced.

You can then move on to using automated processes to catch verification errors and duplicates before they bed in. If someone’s job role changes, you should be able to reflect that quickly and without a great deal of manual intervention.
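As a rough illustration of what these capture-time checks might look like, here is a minimal sketch; the field names, the email rule and the sample records are assumptions for the example, not a reference to any particular tool:

```python
# Minimal sketch: validate basic fields and flag duplicates before a record
# reaches the database. All names and rules here are illustrative only.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def problems_with(record):
    """Return a list of issues found in a single contact record."""
    issues = []
    if not EMAIL_RE.match(record.get("email", "")):
        issues.append("invalid email")
    if not record.get("name", "").strip():
        issues.append("missing name")
    return issues

incoming = [
    {"name": "Jane Smith", "email": "jane.smith@example.com"},
    {"name": "jane smith", "email": "Jane.Smith@example.com"},  # duplicate
    {"name": "", "email": "not-an-email"},                      # bad capture
]

seen, accepted, rejected = set(), [], []
for record in incoming:
    issues = problems_with(record)
    key = (record["email"].strip().lower(), record["name"].strip().lower())
    if key in seen:
        issues.append("duplicate")
    if issues:
        rejected.append((record, issues))
    else:
        seen.add(key)
        accepted.append(record)

print(f"accepted {len(accepted)}, rejected {len(rejected)}")  # accepted 1, rejected 2
```

Real-world matching is fuzzier than an exact key comparison, but the principle is the same: catch the error at the point of capture, not months later.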

If you don’t stem the flow of new data errors, a one-off data quality initiative is like pulling the leaves off a weed in your garden. Yes, it solves the problem quite effectively in the short term, but unless you tackle the roots, that weed is going to re-grow sooner or later.

3. Join up and integrate your systems

Businesses often struggle with data quality because they store data in so many places. A common cause is that they can’t get the functionality they need from their CRM, so they start using spreadsheets alongside it.

Sometimes, businesses have legacy systems running on ancient servers, storing data nobody can access. Not only that, but legacy systems tend to be isolated and are sometimes completely unsupported.

If you don’t have an integrated view of a customer, you’ll never fully understand them. You’ll also miss opportunities to sell to them because their profile is fragmented across systems. To make effective decisions, data needs to be out in the open; it needs to be put to work, and systems need to be integrated to support that.

From sales to marketing, budgeting to support, your entire business should be working from a single customer view. You may have to digitally transform certain parts of your system to do this, but the efficiency gains and ROI are always worth the investment.

4. Enhance clean data with third-party sources

If you rely completely on your own data, you’re going to miss opportunities and leave the market wide open for your competitors. But if you plug your data into a third-party enhancement service, it can be enriched with information that gives you an edge.

Data quality assessment is made up of three key components: accuracy, completeness and timeliness. These three components never come together through chance. In order to maintain ROI, you’re going to have to invest in change.
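As a rough sketch of how those three components might be scored for a contact list – the field names, the 180-day freshness window and the use of a "verified" flag as a stand-in for accuracy are all assumptions for the example:

```python
# Illustrative scoring of accuracy, completeness and timeliness for a contact list.
from datetime import date, timedelta

contacts = [
    {"email": "a@example.com", "phone": "01234 567890", "verified": True,
     "last_updated": date.today() - timedelta(days=40)},
    {"email": "", "phone": "01234 567891", "verified": False,
     "last_updated": date.today() - timedelta(days=400)},
]

required = ("email", "phone")          # fields that must be populated
fresh_within = timedelta(days=180)     # assumed definition of "timely"

accuracy = sum(c["verified"] for c in contacts) / len(contacts)
completeness = sum(all(c[f] for f in required) for c in contacts) / len(contacts)
timeliness = sum(date.today() - c["last_updated"] <= fresh_within
                 for c in contacts) / len(contacts)

print(f"accuracy {accuracy:.0%}, completeness {completeness:.0%}, timeliness {timeliness:.0%}")
```

Tracking even simple scores like these over time shows whether the investment in change is paying off.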

Change is becoming more important as marketing evolves. Social media is inspiring new ways of talking to people, and better ways to advertise and connect.

>See also: How companies are losing millions through shoddy data

For example, marketers are increasingly looking at hyper-personalisation and highly tailored messaging, using online ads and social conversations. You simply cannot personalise your marketing message without knowing who you’re talking to. Data is the missing link.

The Internet of Things is increasing the rate of data acquisition. We are moving towards a real-time business environment, and 25% of businesses already have big data projects in production.

Customers don’t just expect you to make good, accurate decisions – they expect you to use their data to make those decisions near-instantly. If you’re leaving data quality purely to chance, you’re walking a very treacherous path because you can’t offer the focus and timeliness your competitors are offering.

It’s time for businesses to stop leaving things to chance, and prepare themselves for change. We can’t promise that it will be easy, but without change, you simply won’t have the insight to compete.
