How companies are losing millions through shoddy data

There’s a debate between quality and cost in practically every business transaction. As with deciding whether to remodel the kitchen with a friend or a hired contractor, the results often depend on skill level, time investment and willingness to absorb the risks associated with the task at hand.

What’s surprising, however, is that the same debate is happening in the C-suite regarding corporate data. It’s the same data that regulators demand meet strict compliance requirements, that the government checks against stringent legal standards and that organisations depend on to make smart financial decisions.

It’s the one item – well, millions upon millions of items – that can either make or break a business, and yet some still leave data quality up to chance.

That’s according to a year-long study on data quality released in late 2014 by industry research firm Gartner, which found that only a quarter of the companies surveyed enforced data quality standards. The same report shows that only 18% of organisations have formal metrics to measure data quality.
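To make that concrete, a 'formal metric' need be nothing more exotic than a measurable, repeatable check run over every batch of records. The sketch below computes two common ones, completeness and format validity; the field names, patterns and sample data are hypothetical, purely for illustration.

```python
# A minimal sketch of 'formal' data quality metrics: measurable, repeatable
# checks run against every batch of records. Field names, patterns and
# sample data here are hypothetical, purely for illustration.
import re

EMAIL = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")
MONEY = re.compile(r"\d+(\.\d{2})?")

records = [
    {"customer_id": "C001", "email": "a@example.com", "balance": "120.50"},
    {"customer_id": "C002", "email": "", "balance": "75.00"},
    {"customer_id": "", "email": "not-an-email", "balance": "abc"},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    return sum(bool(r.get(field)) for r in records) / len(records)

def validity(records, field, pattern):
    """Share of records where the field matches the expected format."""
    return sum(bool(pattern.fullmatch(r.get(field, ""))) for r in records) / len(records)

print(f"customer_id completeness: {completeness(records, 'customer_id'):.0%}")  # 67%
print(f"email validity:           {validity(records, 'email', EMAIL):.0%}")     # 33%
print(f"balance validity:         {validity(records, 'balance', MONEY):.0%}")   # 67%
```

Tracked over time, even simple numbers like these turn 'data quality' from a vague worry into something a team can set thresholds against and be held to.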

> See also: How data quality analytics can help businesses 'follow the rabbit'

Considering how vital data is today – from marketing efforts to risk management – the lack of data quality measures should be a wake-up call to business leaders. Gartner estimates the annual financial impact of poor data quality is $13.3 million, on average.

Combine that with market analysis firm IDC’s finding that data is doubling in size every two years, and the problem is only growing. Suddenly 'bad data' isn’t just a theory, but a multi-million-dollar liability.

Fortunately, there is still something organisations can do to set things right.

The different roads to data quality

Much like deciding how to remodel your kitchen, there are multiple ways to implement and execute data quality controls, ridding an organisation of 'bad data.' But just like the kitchen remodel, the path a company chooses involves trade-offs between cost, quality, time and resources.

The first option is to build custom controls from scratch. The primary benefit of a custom-built solution is obvious: it’s designed to address an organisation’s most pressing business needs. This is certainly an attractive advantage, but it presents major challenges as well.
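In practice, 'custom controls' often start life as scripts like the hedged sketch below: a hand-written reconciliation that compares record counts and totals between two feeds. The file layout, field names and tolerance are hypothetical, and a real control would need far more (scheduling, alerting, audit trails).

```python
# A minimal sketch of a hand-built control: reconciling record counts and
# totals between a source extract and a target system. The file layout,
# field names and tolerance are hypothetical, for illustration only.
import csv

TOLERANCE = 0.01  # hypothetical acceptable rounding difference

def summarise(path, amount_field):
    """Return (record count, summed amount) for one CSV extract."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return len(rows), sum(float(r[amount_field]) for r in rows)

def reconcile(source_path, target_path, amount_field="amount"):
    """Fail loudly if the two extracts disagree on counts or totals."""
    src_count, src_total = summarise(source_path, amount_field)
    tgt_count, tgt_total = summarise(target_path, amount_field)
    if src_count != tgt_count:
        raise ValueError(f"record count mismatch: {src_count} vs {tgt_count}")
    if abs(src_total - tgt_total) > TOLERANCE:
        raise ValueError(f"total mismatch: {src_total:.2f} vs {tgt_total:.2f}")
    return True

# e.g. reconcile("billing_extract.csv", "ledger_extract.csv")  # hypothetical files
```

Every edge case a script like this ignores (late files, duplicate feeds, schema drift) becomes more bespoke code to write and maintain, which leads directly to the cost problem below.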

Cost is the main issue. A custom data controls solution is expensive to develop, because it requires a wide range of expertise, and expensive to maintain, because knowledge walks out of the door when the developers who created the original application move on to other projects or jobs.

> See also: How to tackle the great data quality challenge

A lack of in-house expertise in developing controls is another significant issue. Deep knowledge of data controls is highly specialised, and IT leaders often develop their skillsets in other areas. Without the guidance of those who specialise in data controls solutions, critical pieces can fall through the cracks, causing wasted time or complete failure.

The second option is to purchase data control software to implement within your organisational infrastructure. In exchange for less direct oversight, purchasing pre-packaged controls software provides companies with the advantages of fast deployment and lower development costs. It also spares companies the need to invest time or money in contracting people who specialise in this technology.

Like anything else, pre-packaged solutions have drawbacks. Companies often choose this option to support one or more specific business processes. However, there is rarely a one-size-fits-all approach to data quality, and software will fail as a long-term solution if it cannot adapt to meet changing needs. This is an issue with pre-packaged solutions, as many require substantial resources to customise or to expand the controls beyond their limited scope.

Weighing quality and cost evenly

Think of it as the best of both worlds: splurging on a licensed kitchen contractor but skipping the custom cabinets to curb costs. In the same vein, organisations can buy a configurable controls solution that combines the benefits of a pre-packaged solution with the flexibility of building a custom controls solution from scratch.

The best configurable controls solutions are designed around best practices for each industry, incorporating room for expansion and continuous growth to maximise long-term value. By capitalising on vendor expertise, they lower development, maintenance, training and audit costs without bogging down internal IT teams in complicated data quality projects. That ultimately avoids homegrown mistakes without breaking the bank.
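What 'configurable' means in practice is that the rules live in configuration rather than in code, so adding a control is a settings change instead of a development project. The sketch below shows the idea in miniature; the rule names, fields and patterns are hypothetical.

```python
# A minimal sketch of a 'configurable' control: the checks live in
# configuration, not code, so new rules are added without redeploying.
# Rule names, fields and patterns are hypothetical, for illustration.
import re

RULES = [  # in practice, loaded from YAML/JSON or maintained in a UI
    {"field": "customer_id", "check": "required"},
    {"field": "email",       "check": "matches", "pattern": r"[^@\s]+@[^@\s]+\.[^@\s]+"},
    {"field": "balance",     "check": "matches", "pattern": r"\d+(\.\d{2})?"},
]

CHECKS = {
    "required": lambda value, rule: bool(value),
    "matches":  lambda value, rule: bool(re.fullmatch(rule["pattern"], value or "")),
}

def validate(record):
    """Return the list of rules this record fails."""
    return [r for r in RULES if not CHECKS[r["check"]](record.get(r["field"]), r)]

failures = validate({"customer_id": "C001", "email": "bad", "balance": "10.00"})
print(failures)  # -> only the email rule fails
```

The same generic engine then serves every new requirement that can be expressed as a rule, which is the flexibility that building from scratch and buying pre-packaged each only half deliver.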

> See also: Where is your organisation on the data quality maturity curve?

Implementing smarter data quality initiatives is a growing concern for organisations: according to the Gartner research, 86% of respondents now rank it a priority. From data governance to risk management, data controls are the backbone of business data today. And while just a quarter of companies currently enforce quality standards, the issue is finally climbing the priority list.

The stakes for data quality are now too high to risk getting left behind; smart companies will treat comprehensive data controls as a source of competitive advantage moving forward. One way or another, companies should assess what makes sense for them now, so they can implement data controls systematically across the organisation in a way that supports their long-term business goals.

Sourced from Jeff Brown, product manager, Infogix
