Searching for a Master Data Management plan

It was not meant to be this way. Back in the late 1990s, two watershed developments promised to eradicate the chaos caused by the so-called ‘data silos’ that had grown up in departments and subsidiaries.

One was enterprise resource planning (ERP), where a key selling point was the prospect that, by integrating much of an organisation’s core business functionality into a single suite – manufacturing, finance, supply chain, HR and more – ERP would create and draw on a central, consistent set of business data. As a result, the customer being billed by finance, for example, would be the same one being supplied by manufacturing.

The other source of hope was the enterprise data model. This was focused less on operational data and more on analytics. The goal here was to establish ‘a single version of the truth’ by mapping the architecture of corporate data across the entire enterprise.

While plenty of organisations bought into ERP and enterprise data modelling systems, few ever came close to realising those goals. The reality for the vast majority of businesses continues to be that data is dispersed across, and uniquely defined within, multiple different applications – and even within the same application.

Just ask Unilever. The consumer packaged goods (CPG) giant uses SAP’s ERP environment across its business units as standard, but because of the nature of its global business it is running (at the last count) 112 different ‘instances’ (separate implementations) of SAP. As one insider observes: “Each separate instance thinks it owns the definition of customer, price and so on. They are all designed on the idea that they own the data. And, it’s obvious, they can’t all be right.”

Unilever is by no means unique. Shell has some 45 instances of SAP, and 60 different implementations of JD Edwards’ software (now owned by Oracle).

But the multi-faceted nature of such large organisations means that rationalising that disparity is, in practice, beyond them. CPG multinational Nestlé is known to have spent years trying to move to a single instance of SAP – and is rumoured to have spent between $3.5 billion and $4 billion not quite getting there.

On a more modest scale, Shell tried to get just five of its country subsidiaries to work off the same SAP core: the $120 million project never delivered.

None of that amounts to failure, says Andy Hayler, founder and chief strategist at Kalido, a data management software company that has worked with both companies. Despite the pipe dreams of a single underlying database, people now “accept that this is the way of life”.

Waking up to that reality is one thing, but ignoring it is not an option. “Large customers already know this is a very big problem,” says Rich Clayton, VP of product marketing at business intelligence software vendor Hyperion. “The board is saying we need visibility into the business, but without standardised data they do not trust what they are seeing – or they shouldn’t.”

“In effect,” observes Ed Wrazen, VP of marketing for international at data quality software company Trillium Software, “ERP has become a legacy system, a system that is widely federated.”

And that is where reality has kicked in. “People have realised that their siloed systems are going to be part of the equation and that heritage systems [those outside the ERP suite] are necessary for running the business,” says Mark Massias, senior sales engineer at database and data integration specialist InterSystems. “That is part of a move away from rip and replace [of old systems] and towards integration.”

But living with that mix means organisations are making day-to-day and strategic decisions based on data that does not make sense.

An update to a customer record (say, a change of delivery address) in one database does not find its way to other databases that also use that information, so a query across both may count the customer twice. Likewise, the redefinition of the size of an item in the manufacturing process is not replicated to the logistics database – with obvious consequences when it comes to loading shipments of the re-sized goods.
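To make the double-counting concrete, here is a minimal sketch of the problem in Python, with invented record layouts and a deliberately naive uniqueness check – an illustration, not anyone’s production schema:

```python
# Two hypothetical application databases hold the same customer, but the
# delivery-address change reached only one of them, so a naive
# cross-system query treats one real customer as two.

billing_db = [
    {"id": "C-1001", "name": "Acme Ltd", "address": "3 New Street, Leeds"},
]
logistics_db = [
    # The address update never propagated to this system.
    {"id": "CUST-77", "name": "Acme Ltd", "address": "12 Old Road, Leeds"},
]

# Count "unique" customers by (name, address) across both systems.
combined = {(c["name"], c["address"]) for c in billing_db + logistics_db}
print(len(combined))  # prints 2 -- one real customer counted twice
```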

Such headaches have sent many IT people reaching for data cleansing tools. But those information governance efforts, though useful, need to go a step further, argues Emmanuel Sabourin, director of technology at supply chain software company i2. “The immediate need is to clean up the mess, but in trying to do so you realise that the mess quickly comes back. The important thing is to enforce proper processes so that the mess doesn’t come back, so you don’t create the mess in the first place.”
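What that enforcement could look like in practice is a check against the master record at the moment of creation, so the duplicate never gets in. The sketch below is a minimal illustration under an assumed record layout and a crude matching rule, not any vendor’s product:

```python
# Master records keyed by a normalised (name, city) pair -- an assumed,
# deliberately simple identity rule for illustration only.
master_customers = {("acme ltd", "leeds")}

def create_customer(name: str, city: str) -> bool:
    """Reject (or route for review) entries that match an existing master."""
    key = (name.strip().lower(), city.strip().lower())
    if key in master_customers:
        print(f"Rejected: {name} in {city} already exists in the master.")
        return False
    master_customers.add(key)
    print(f"Created master record for {name} in {city}.")
    return True

create_customer("ACME Ltd", "Leeds")  # duplicate caught at the point of entry
create_customer("Bolt plc", "Hull")   # genuinely new record accepted
```

Policed this way, the point of entry stays clean and data cleansing becomes a one-off exercise rather than a recurring one.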

Synching feeling

What has become clear is that there is a desperate need for control, for some kind of master system of record that lays down the law on how data entities are created, changed and distributed. And during the past two years vendors from all walks of the software industry have been working night and day – or buying start-ups – to begin to build such a master data management (MDM) system, knowing that the prize for the company that achieves the technological goal is huge.

But what exactly is MDM? Data integration software vendor Ascential (now part of IBM) offered this definition in a recent white paper: “MDM is the business process, applications and technical integration architecture used to create and maintain accurate and consistent views of core business entities across disparate applications in the enterprise.” It seeks to centrally control and distribute definitions of customers, brands, suppliers, financial indicators and so on.

But this is no look-up table: MDM has to establish the technical integration architecture for doing this in near real-time, with the MDM database checking and revising data constantly.

The idea is to push changes out to applications, says Bert Oosterhof, director of technology for the EMEA region at data integration software vendor Informatica, and ensure they are always in synch.

Those attempts to impose common business definitions need to be workflow-based and centred on a web-based repository where the ‘golden copy’ master data can be defined, published and distributed – not by IT but by the business people and operational systems that use and rely on that data.
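In outline, that hub-and-spoke, publish-and-subscribe arrangement might look like the following sketch. The class and method names are illustrative assumptions, not any vendor’s actual API:

```python
from typing import Callable

class MasterDataHub:
    """A toy 'golden copy' repository that pushes changes to subscribers."""

    def __init__(self) -> None:
        self.golden_records: dict[str, dict] = {}  # entity id -> attributes
        self.subscribers: list[Callable[[str, dict], None]] = []

    def subscribe(self, callback: Callable[[str, dict], None]) -> None:
        # Register a downstream application, e.g. one ERP instance.
        self.subscribers.append(callback)

    def update(self, entity_id: str, attributes: dict) -> None:
        # Revise the golden copy, then push the change to every subscriber
        # so the applications stay in synch.
        record = self.golden_records.setdefault(entity_id, {})
        record.update(attributes)
        for notify in self.subscribers:
            notify(entity_id, dict(record))

hub = MasterDataHub()
hub.subscribe(lambda eid, rec: print(f"billing got {eid}: {rec}"))
hub.subscribe(lambda eid, rec: print(f"logistics got {eid}: {rec}"))
hub.update("C-1001", {"name": "Acme Ltd", "address": "3 New Street, Leeds"})
```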

That is a tall order. “The business and IT challenges associated with the ongoing management of master data have plagued companies for years,” said Henry Morris, an analyst at industry watcher IDC. “Organizations finding success with master data management technology illustrate the fact that with proper planning and discipline, significant progress can be made and measurable benefits achieved.”

“Too many implementations are being foiled because of bad data; MDM will drive much better data governance,” adds Trillium’s Wrazen. This point is stressed by Ascential in its white paper: “At the very time your business requires instant access to high-quality information about core business entities, the facts that describe them are scattered to the far corners of more transaction structures, databases, data marts and spreadsheets than you can count. No wonder you have 11 million customer identification numbers but only 8 million customers.”
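That gap between identification numbers and actual customers is, at bottom, a matching problem. The sketch below illustrates the idea with a deliberately crude normalisation rule; real matching engines use far richer fuzzy logic:

```python
records = [
    {"id": "0001", "name": "Acme Ltd.", "postcode": "LS1 4AP"},
    {"id": "0002", "name": "ACME LTD",  "postcode": "LS1 4AP"},
    {"id": "0003", "name": "Bolt plc",  "postcode": "HU1 2AB"},
]

def identity_key(rec: dict) -> tuple:
    # Normalise case, punctuation and spacing before comparing -- an
    # assumed, simplistic stand-in for a real matching engine.
    name = rec["name"].lower().replace(".", "").replace(",", "").strip()
    return (name, rec["postcode"].replace(" ", "").upper())

unique_customers = {identity_key(r) for r in records}
print(f"{len(records)} customer IDs, {len(unique_customers)} actual customers")
# -> 3 customer IDs, 2 actual customers
```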

“The cost of people working with more and more systems with copies of the same [but inconsistent] information was getting out of hand. It’s a big step forward to keep a master record [of that information],” says Oosterhof at Informatica.

One example of that is Shell Lubricants, a specialist division of the petrochemical giant. It found that compounds created by its R&D department were being categorised as completely separate products, even when they were identical except for the way they were branded. When it implemented an MDM product across the division, it found that instead of selling 20,000 different product categories, it was actually selling only 6,000. The cost implications were significant.
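The mechanics of that discovery are easy to picture: group the catalogue on the physical attributes of each compound and ignore the branding fields. The field names below are illustrative assumptions:

```python
products = [
    {"sku": "A1", "brand": "Brand X", "viscosity": "10W-40", "base": "synthetic"},
    {"sku": "A2", "brand": "Brand Y", "viscosity": "10W-40", "base": "synthetic"},
    {"sku": "B1", "brand": "Brand X", "viscosity": "5W-30",  "base": "mineral"},
]

def physical_key(product: dict) -> tuple:
    # Everything except the branding fields defines the 'real' product.
    return (product["viscosity"], product["base"])

distinct = {physical_key(p) for p in products}
print(f"{len(products)} catalogue entries, {len(distinct)} real products")
# -> 3 catalogue entries, 2 real products
```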

To do that kind of job effectively, MDM needs to be a real-time control centre for data definitions. And it is only with the broad availability of high-quality bandwidth that companies could consider building a backbone that would drive master data to the relevant applications, says Christian Blumhoff, solution principal for MDM at ERP market leader, SAP. “Bandwidth is an enabler; it allows you to synchronise data [definitions] on the fly.”

That said, MDM is not all new. For many years, organisations have used product information management systems alongside their ERP systems, Blumhoff notes, and there have been specialist tools for integrating customer data and for rationalising financial information.

But these could not relate to data management outside of their own field, says Kalido’s Hayler. “The difference here is that MDM is a general purpose tool that is not specific to one entity type.”

That argument is echoed by analysts at IDC. MDM “is a mixture of old and new sectors, reflecting a history of enterprises attempting to deal with the challenge of providing a single view of the entities needed for conducting business and measuring performance,” says IDC’s Henry Morris. “What is new is the emergence of master data management infrastructure software that is purpose-built to support any and all types of master data, utilising a combination of techniques from data and content technologies.”

As that suggests, the people in charge of those processes should not be IT, but the users to whom the data means most.

“This fundamentally has to be a business problem,” says Kalido’s Hayler. And its rival in MDM, Hyperion, talks of moving the responsibility for managing changes in master data “out of IT to the business user’s desktop” and then automating those changes across the enterprise, [allowing] business users to “point and sync”.

Data stewards

Indeed, a recent straw poll of Kalido users found that the shift from IT to the users of the data was already underway. “There is an emerging trend for business executives to take ownership of MDM instead of the IT department,” the company observes. One user, Paul Highams, a technical architect at Unilever’s Global Solutions unit, says that the company’s changes to organisational structures and processes to improve the quality and timeliness of enterprise information hinge on MDM. “Master data management plays a key role in this strategy and is now the responsibility of finance – not the IT department,” says Highams.

Kalido’s poll found that the CFO and the finance department in general were the primary internal customers for MDM projects. Marketing chiefs and sales directors were the second biggest sponsors.

And which companies do those executives work for? Aside from pioneer adopters Unilever, Shell and BP Commercial, MDM implementations are appearing in the UK at HBOS, BP, BT and Carphone Warehouse.

But this is just the beginning of a long journey. Master data management is in its very early days, says Clayton. “It is a requirement that needs to be sewn into business processes.” But any large organisation that thinks it can do nothing “to keep it consistent” is kidding itself.

Pete Swabey
