Synchronised universe

In February 2006, when consumer goods giant Unilever announced it would sell off its Birds Eye and Iglo brands along with most of the rest of its frozen foods operations, CEO Patrick Cescau teed up the sale with talk of the business’s “good track record”, its “growth prospects” and its “proven leadership” across different markets. But, given the carefully defined subset of activities he was looking to carve out, those might have been difficult statements to back up.

While Unilever was keen to offload most of its frozen foods business, it had decided to ring-fence lucrative parts of the €2 billion unit, most notably the Wall’s and Heart ice cream brands and the highly successful Findus operation in Italy. With the overlap in customers, suppliers and even products, separating out those operations and making any kind of definitive statement on their financials would have been no easy task.

When the deal was finally completed in early November, the sale – to private equity group Permira Funds, for €1.72 billion – was even more specific than expected, covering frozen foods across just eight major European countries.

How was Cescau able to slice and dice the business in so many ways? Because, according to insiders, Unilever has spent years trying to build consistency into the way its different business units define their key data such as customer, product or supplier.

Unilever is just one of a couple of hundred pioneers of the still-emerging technology and process set known as master data management (MDM). IT and business leaders have long watched – often powerlessly – as common data has been defined in contradictory ways by different business units, subsidiaries and departments, making any kind of high-level analysis of performance difficult, if not impossible. In response, companies such as Unilever, Shell, Diageo and Carphone Warehouse have been applying MDM in an effort to rationalise how data is treated right across their organisations.

Subject independence

The business requirement for some means of establishing consistent views of data is hardly new – as decision-makers everywhere know. A typical scenario might be a meeting involving the finance director, the head of marketing and the production manager, in which the simple question of how many customers the organisation has is answered differently by each. The finance figure might come from the log of purchase orders held by the enterprise resource planning (ERP) system; the sales and marketing number might be drawn from the CRM system; the production system might show a confusing overlap between supplier and customer.
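To make the scenario concrete, here is a minimal sketch – all system contents are invented for illustration – of how three departmental systems can give three honest but different answers to the same question:

```python
# Illustrative only: three departmental views of "how many customers do we have?"
# All names and records are invented.

erp_purchase_orders = ["ACME Ltd", "ACME Ltd", "Smith & Co"]            # finance: who has ordered
crm_contacts = ["ACME Ltd", "Acme Limited", "Smith & Co", "Jones plc"]  # marketing: prospects too
production_partners = {"ACME Ltd": "customer",
                       "Smith & Co": "customer",
                       "Bolt Supplies": "supplier",
                       "Jones plc": "customer and supplier"}            # production: roles blur

print(len(set(erp_purchase_orders)))   # 2 - finance counts accounts with purchase orders
print(len(set(crm_contacts)))          # 4 - the CRM double-counts ACME via a naming variant
print(sum(1 for role in production_partners.values() if "customer" in role))  # 3
```

Each number is defensible on its own terms; without an agreed master definition of “customer”, none of them can be reconciled.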

Master data definitions 

Customer data integration

CDI is a data management process where all prospect and customer data can be distributed to points of interaction in a timely and accurate manner. (Experian)

External data synchronisation

The use of an agreed-upon protocol for the exchange of clearly defined information between trading partners. 

Global Data Synchronisation Network

A web-based, interconnected network of interoperable data pools and a global registry that enable companies to exchange standardised and synchronised supply chain data with trading partners. (GS1)

Internal data synchronisation

When the relevant attribute of a product (or any other key piece of data), such as part number or dimensions, is accessed anywhere within the enterprise, it needs to consistently have the same correct value. (FullTilt)

Master data management

The business processes, applications and technical integration architecture used to create and maintain accurate and consistent views of core business entities across disparate applications in the enterprise. (IBM)

Product information management

The provision of product information – frequently scattered throughout disparate departments – for use in more than one output medium or distribution channel.

Such a situation stems from the way systems have been built. “In the past, business leaders viewed data as owned by the application in which it ‘lived’,” says Colin Rickard, managing director for EMEA at data quality and integration software provider DataFlux. “The idea that SAP owns your accounts or Siebel owns your clients is all wrong. The data is not owned by the system. The core reference data should be available to all key systems, not just to one. Key bits of data need to be application-independent.”

To move beyond that, organisations need to de-couple their master data from their applications and put in place a master data layer, says Vincent Belliveau, sales director for master data management at IBM’s Information Integration Solutions.
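What such a layer means in code terms can be sketched very simply. The class and method names below are hypothetical – not IBM’s or any vendor’s API; the point is that applications hold only an identifier and resolve the authoritative attributes from a shared service, rather than keeping private copies:

```python
# Hypothetical master data layer: applications resolve core entities by ID
# instead of owning local copies. Names are illustrative, not a vendor API.

class MasterDataLayer:
    def __init__(self):
        self._golden = {}  # master_id -> authoritative attributes

    def register(self, master_id, attributes):
        self._golden[master_id] = dict(attributes)

    def resolve(self, master_id):
        # Every consuming application - ERP, CRM, billing - reads the same record.
        return dict(self._golden[master_id])

mdl = MasterDataLayer()
mdl.register("CUST-0001", {"name": "ACME Ltd", "country": "GB"})

erp_view = mdl.resolve("CUST-0001")
crm_view = mdl.resolve("CUST-0001")
print(erp_view == crm_view)  # True: one golden record, many consumers
```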

There have been efforts to create master data structures that pre-date MDM: in manufacturing, product information management (PIM) has been widely used for several years, as has customer data integration (CDI) in areas such as retail. Attempts to standardise and classify external data that moves between companies have centred on universal product codes such as the EAN (European Article Number) and UCC (Uniform Code Council) schemes, and on global data synchronisation identifiers such as the GTIN (Global Trade Identification Number) and GLN (Global Location Number).

But, while those have proved useful in specific areas (especially retail), they have been bound by their narrow focus, says Bill Hewitt, CEO of MDM and data warehousing software specialist Kalido. Any attempt to gain a corporate-wide view of, for example, product or customer is going to be thwarted if the definitions are only imposed on part of the business.

In contrast, the MDM products that are now emerging are “subject independent” – and they need to be. What is required, say analysts, is a mechanism interlinked with a set of processes that lays down the law on how key data is defined and kept consistent across all the applications that use it: a so-called ‘golden copy’.
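One illustrative way to build such a golden copy – a sketch only, with invented data and survivorship rules, not any particular vendor’s method – is to match candidate records from the source systems and let a rule decide which value survives:

```python
# Illustrative golden-copy construction: match records for the same customer,
# then apply simple survivorship rules. Data and rules are invented.

sources = [
    {"system": "ERP", "name": "ACME Ltd",     "updated": "2006-01-10", "vat": "GB123"},
    {"system": "CRM", "name": "Acme Limited", "updated": "2006-03-02", "vat": None},
]

def golden_copy(records):
    golden = {}
    for field in ("name", "vat"):
        # Survivorship rule: most recently updated non-null value wins.
        candidates = [r for r in records if r.get(field) is not None]
        candidates.sort(key=lambda r: r["updated"], reverse=True)
        golden[field] = candidates[0][field] if candidates else None
    return golden

print(golden_copy(sources))  # {'name': 'Acme Limited', 'vat': 'GB123'}
```

Real MDM products make the matching and survivorship rules configurable; the processes around them decide who is allowed to change the golden record.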

Buy in  

For many organisations, that may seem like a gargantuan undertaking. And, indeed, to date, take-up of MDM has been almost exclusively among large companies – perhaps not surprisingly, given that they are likely to have the most heterogeneous application environments.

But adoption has also been held back by a lack of understanding of the problem – and of whose problem it is anyway. “It is often difficult for senior management to explain why the organisation needs to do this,” says Hewitt. “But business needs to get involved in the integrity of its data” – not least to ensure it has the confidence to make decisions or fulfil compliance obligations.

The up-market nature of MDM at this stage is reflected in contract sizes. At Kalido, for example, the average licence sale price is running at $400,000; the company has around eight customers in production with its MDM software – among them Shell Lubricants, Unilever and beer maker Labatt. The company now derives 30% to 35% of its revenues from MDM, up from zero two years ago.

But Kalido CEO Hewitt thinks that demand has barely got underway. “This is potentially a large market, and one at an early stage,” he says. “The selling price is high because big companies appreciate that they are buying something very important” – something that can improve margins through effective supplier management, enable mergers by bringing clarity to financials, and deliver efficiency through the rationalisation of brands.

That gives analysts at Gartner the confidence to predict that within four years organisations will be spending around $1.06 billion annually on MDM software.

Viability

One important factor that will determine that take-up is the realisation that many of the technology pieces required to orchestrate data look-ups across the organisation are now in place.

Until relatively recently, the notion of being able to synchronise data definitions – to read and write backwards and forwards between many systems and stitch them together easily – was beyond most companies, says DataFlux’s Rickard. “It was just about possible through a lot of complex interaction, but that kind of knitting together was costly and typically one-off,” he says – and required lots of bandwidth. “But now, with a system such as SAP or Siebel that can consume web services, all of a sudden you can use web services to propagate data between many systems.”
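A rough sketch of what such web-service propagation looks like – the endpoint URLs are hypothetical, and a production system would add queuing and retries – might be:

```python
# Sketch of pushing a master data change to consuming systems over HTTP.
# Subscriber URLs are hypothetical; requires the 'requests' package.
import requests

SUBSCRIBERS = [
    "https://erp.example.com/masterdata/customer",  # e.g. an SAP-facing adapter
    "https://crm.example.com/masterdata/customer",  # e.g. a Siebel-facing adapter
]

def propagate(change):
    """Push one golden-record change to every consuming system."""
    for url in SUBSCRIBERS:
        resp = requests.post(url, json=change, timeout=10)
        resp.raise_for_status()  # in practice: retry or queue on failure

propagate({"master_id": "CUST-0001", "name": "ACME Ltd", "country": "GB"})
```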

“Do you want an up-to-date view of your customer, and make that available in real-time to agents who are interacting with your customer? Do you think that can do things for your business? Absolutely,” adds Rickard.

But enterprise-wide MDM is an elephant of a meal – and one that many companies will only want to tackle a bite at a time.

He cites the example of a multinational chemicals company that has no pretensions, at this stage, of obtaining a single view of all its data; it will be content with a single view of two crucial areas: the materials it buys and the people it buys them from.

“The data the company is looking at is very fragmented, so its best guess of, say, how much copper sulphate it is buying from a particular company is often out by a large margin,” says Rickard.

With only a fuzzy view, the chemicals company is unable to work out any savings it could make by using different suppliers. “It is hugely inefficient, leaving it unable to leverage information to drive down costs,” he says.

As part of several initiatives, the chemicals giant is looking to adopt the ‘eClass’ international classification and description of materials, products and services being championed by DaimlerChrysler, BASF, Audi and several other (mostly German) multinationals. As a key piece of master data, eClass is used to describe all materials handled by a company, with the idea that the definition is accepted both internally and in its supply chain. At this point, eClass includes 27,000 definitions of different materials.

How can such a standard be integrated into a business? Certainly, the idea of a purchase order clerk looking up a book containing 27,000 definitions is not credible. So the need is for the common definitions to be workflow-based and held in a web-based repository where the ‘golden copy’ master data can be defined, published and distributed.

“The only way to solve this is at the point at which the interaction is happening, to push out some sort of standard information to the user,” says Rickard.
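In code terms, that point-of-interaction push reduces to the purchase-order screen querying a searchable repository as the clerk types, so the standard code is captured without anyone browsing 27,000 definitions. The codes and descriptions below are invented stand-ins, not real eClass entries:

```python
# Illustrative lookup against a tiny, invented eClass-style repository:
# the PO screen searches descriptions and captures the standard code.

ECLASS_REPO = {
    "38-05-01-01": "copper sulphate, technical grade",
    "38-05-02-01": "zinc sulphate, technical grade",
    "24-01-01-01": "hexagon head bolt, steel",
}

def suggest(term):
    """Return (code, description) pairs whose description matches the term."""
    term = term.lower()
    return [(code, desc) for code, desc in ECLASS_REPO.items() if term in desc]

# The clerk types a phrase; the golden definition is pushed to the form.
for code, desc in suggest("copper"):
    print(code, "->", desc)  # 38-05-01-01 -> copper sulphate, technical grade
```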

In the case of the chemical company, the desire is to integrate eClass into its ERP system, while it takes the number of instances of SAP it runs down from 16 to one. “This is the application of one set of standards, in one place; the capture of an accurate purchase order, through the use of master data. This is critically important. It will have huge impact on their business in terms of margins because they can then apply analytics with some confidence that it will drive out value. But it is no ‘Big Bang’; it is one change to one part of their SAP system,” says Rickard.

That highlights one of the critical questions of MDM – who owns the data and who is willing to take responsibility to ensure its enterprise-wide quality.

“Too many businesses still see data quality as a costly problem owned by IT,” says Ian Charlesworth, an analyst at Ovum. “However, it’s no longer the responsibility of IT to ‘put it right’.”

Bill Hewitt at Kalido echoes the sentiment. “Data definitions should be managed independently of IT operations. Stewardship has to come from the business,” says Hewitt.

Ultimately, the IT organisation will still have to set the structure for MDM, but the actual stewardship of the data has to rest with its creators – the business.

