Data may be the fuel that powers modern business. But when its potency is diluted or contaminated, it can just as easily cause the business to slow, misfire or even stall altogether.
Ensuring the quality of core business data is one of the major challenges of IT – and, indeed, the business. In fact, many of the problems that organisations have encountered when rolling out corporate-wide enterprise applications, business intelligence programmes or data warehousing initiatives can be traced back to weaknesses in how they manage the quality of their data.
The good news, at least according to analysts, is that the practices and technologies designed to deliver data quality have matured to the point where managers’ trust in the data they work with is building.
In fact, the impetus behind many data quality improvement programmes flows from the frustration of middle and upper managers who feel their ability to work effectively is undermined by a lack of faith in their organisation’s data. As Ted Friedman, an analyst at Gartner, observes: “Awareness of the importance of data quality is increasing.” And that focus on quality is the bedrock of data governance drives to meet regulatory and internal requirements.
Of course, any amount of money can be spent – and squandered – on ensuring every last piece of data is accurate, complete, consistent, standard, unique and up to date. What analysts emphasise, though, is that information managers need to define what data quality means to their organisation and what benefits refining that quality will bring. “As a primary data governance deliverable, you must define the data quality rules, standards and service-level agreements unique to your organisation,” says Rob Karel of analyst group Forrester Research. “Once you define the scope, objectives, roles and responsibilities that will frame your data governance initiative, you can begin the long, painful but extremely valuable process of truly making your data a corporate asset.”
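Those organisation-specific rules Karel describes are, in practice, just executable checks against the quality dimensions listed above. A minimal sketch in Python – with the field names, thresholds and reference date all invented for illustration, not drawn from any particular product:

```python
from datetime import date

# Hypothetical customer records; field names are invented for illustration.
records = [
    {"id": 1, "email": "ann@example.com", "country": "GB", "updated": date(2024, 3, 1)},
    {"id": 2, "email": "",                "country": "UK", "updated": date(2019, 6, 12)},
    {"id": 3, "email": "bob@example.com", "country": "GB", "updated": date(2024, 1, 20)},
]

# Each rule maps a quality dimension to a predicate over one record.
# An organisation would define its own rules and pass thresholds (SLAs).
rules = {
    "complete": lambda r: bool(r["email"]),                    # mandatory field present
    "standard": lambda r: r["country"] in {"GB", "DE", "FR"},  # approved codes only
    "current":  lambda r: (date(2024, 6, 1) - r["updated"]).days < 365,
}

def score(records, rules):
    """Return the share of records passing each rule."""
    return {name: sum(rule(r) for r in records) / len(records)
            for name, rule in rules.items()}

print(score(records, rules))
```

Scores like these give a measurable baseline against which a service-level agreement can be monitored over time.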
So what do the more mature technology providers bring to the table to raise confidence in underlying data and hence enhance its value to the business? Among the most common features, Forrester cites:
• Data cleansing, formatting and standardisation tools that “promote consistency in data capture and reporting, and increase the effectiveness of data matching processes used to deliver uniqueness, verification and enrichment capabilities”;
• Data merging tools to consolidate and de-duplicate data;
• Data profiling tools to gain insight into the characteristics and scope of the data being held;
• Monitoring tools to ensure that rules governing quality are being dutifully applied.
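Two of the capabilities above – standardisation and de-duplication – can be sketched in a few lines of Python. The records, field names and UK phone-number handling here are hypothetical, chosen only to show the principle; commercial tools apply far richer, configurable logic:

```python
# Hypothetical contact records with inconsistent formatting.
raw = [
    {"name": " Alice Smith ", "phone": "020 7946 0018"},
    {"name": "ALICE SMITH",   "phone": "+44 20 7946 0018"},
    {"name": "Bob Jones",     "phone": "0161 496 0100"},
]

def standardise(rec):
    """Trim and case-normalise the name; reduce the phone number to digits
    and rewrite a +44 international prefix to the domestic form."""
    digits = "".join(ch for ch in rec["phone"] if ch.isdigit())
    if digits.startswith("44"):
        digits = "0" + digits[2:]
    return {"name": rec["name"].strip().title(), "phone": digits}

def deduplicate(records):
    """Standardise, then keep the first record seen for each (name, phone) key."""
    seen, unique = set(), []
    for rec in map(standardise, records):
        key = (rec["name"], rec["phone"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

print(deduplicate(raw))
```

Note that de-duplication only works once standardisation has run: the first two records above differ as raw strings but collapse to the same key after cleansing, which is why the two capabilities are usually sold together.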
Not every product is up to the job. Although there are many specialist data quality technologies, the options narrow when information managers have requirements for a robust, enterprise-class product that can support both batch and real-time quality services, complex and configurable matching and merging algorithms, and (in the case of customer data) multi-national address and name data standardisation and verification, according to Forrester’s Karel.
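The “complex and configurable matching” Karel refers to typically rests on fuzzy comparison of names and addresses rather than exact equality. A toy illustration using Python’s standard-library difflib – the 0.85 threshold is arbitrary, and enterprise-class products use far more sophisticated, tunable algorithms:

```python
from difflib import SequenceMatcher

def normalise(s):
    """Lower-case and collapse whitespace before comparing."""
    return " ".join(s.lower().split())

def is_match(a, b, threshold=0.85):
    """Treat two strings as the same entity when their similarity ratio
    meets the threshold. Real tools make this threshold configurable
    per field and per country."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

print(is_match("Acme Ltd", "ACME  Ltd"))    # same entity, different formatting
print(is_match("Acme Ltd", "Zenith Plc"))   # unrelated entities
```

Multi-national name and address handling is harder still – transliteration, local address formats and postal reference data all come into play – which is why the field of credible enterprise vendors is narrow.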
Identifying the best practice approaches and a capable set of technologies is arguably the easy part; making a business case for data quality programmes has always been more difficult.
However, today’s business climate is helping. Gartner’s Friedman recommends that IT and business leaders attempting to drive data quality improvements should “justify their efforts based on current business drivers and issues”, such as regulatory compliance, governance and cost-optimisation.
As he concludes: “Historically, only a minority of organisations have taken a proactive approach to managing data quality – the majority have endured the pains of poor-quality data and dealt with it in a reactive manner. However, with recognition of the impact of poor-quality data increasing as a result of contemporary business drivers, many are undertaking improvement efforts within the context of individual projects or occasionally on a broader, enterprise-wide basis.”
It may be overstating the case to say that the credit crunch is behind the move to more trustworthy, more valuable data, but it has certainly added momentum.