Attention to detail

It seems almost redundant to assert that one critical role of the IT department is to help the business make use of its data. Somehow, though, that objective has not always been the direct focus of IT projects; infrastructure upgrades, application deployments, hardware refreshes and the like have often taken precedence over data management. 

“Improving the quality of data has always been priority 11 out of ten,” jokes Rob Karel, principal analyst at Forrester Research. “It was something that everybody knew was important, but something else would always come up that people thought was more important.”

There are signs, though, that this might be changing. One is Information Age’s own Effective IT survey, conducted at the end of 2009. It found that master data management – whereby organisations establish a system of processes to ensure that a clean, reliable copy of critical data is maintained at all times – is the IT strategy that most respondents intended to deploy in 2010.
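The core idea behind master data management can be illustrated with a short sketch. This is a minimal, hypothetical example of building a single "golden record" from conflicting copies of a customer held in different systems; the records and the most-recent-wins survivorship rule are invented for illustration, not any vendor's implementation:

```python
# Toy MDM sketch: merge duplicate records from different systems into one
# clean "golden record" (all data and field names are hypothetical).
def build_golden_record(records):
    """Merge duplicates, letting the most recently updated non-empty value win."""
    golden = {}
    for record in sorted(records, key=lambda r: r["updated"]):
        for field, value in record.items():
            if field != "updated" and value:  # later, non-empty values win
                golden[field] = value
    return golden

crm = {"name": "ACME Ltd", "phone": "", "updated": "2009-06-01"}
erp = {"name": "Acme Limited", "phone": "020 7946 0000", "updated": "2010-01-15"}
print(build_golden_record([crm, erp]))
# {'name': 'Acme Limited', 'phone': '020 7946 0000'}
```

Real MDM systems wrap rules like this in governance processes: who defines the survivorship policy, and which system is trusted for which attribute.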

One might also view the flurry of activity in the data management market in recent months as a rapid reaction to changing customer priorities. This activity includes acquisitions: IBM acquired MDM vendor Initiate Systems in the last week of February 2010, and data integration supplier Informatica bought another MDM vendor, Siperian, the very next week. It also includes product launches; data quality vendor DataFlux announced a new data management platform in February 2010, only weeks after open source integration supplier Talend introduced its own MDM product. Clearly, suppliers in the space believe there is demand to be met.

However, Karel sees this activity not as a knee-jerk reaction to recent changes, but as a series of moves that have been in the works for months, if not years, and that were postponed by the economic downturn.

"A lot of the expensive IT systems haven’t achieved the promised return on investment. One of the root causes was the fact that employees did not trust the data that was flowing through the systems."
Rob Karel, Forrester Research

The process that Karel believes may have at last pushed data management into the top ten list of priorities has in fact been a long one. “A lot of the IT investments made in the past were very expensive, long-term investments such as massive ERP implementations or data warehousing and business intelligence projects,” he explains. “But they often didn’t receive the kind of cross-enterprise adoption that was required to get the promised return on investment, and one of the key root causes for that was the fact that employees did not trust the data that was flowing through the systems.”

The growing adoption of data management technologies seen in recent years has been, Karel argues, motivated by a desire to “rescue those strategic investments”. And the economic downturn has for many organisations acted as a “compelling event” that has driven them to launch such rescue missions. 

Mastering data

As far as software vendors are concerned, master data management is where the action currently lies: Forrester Research estimates that the MDM market will be worth $1 billion in 2010 and is growing at a rate of 20% a year. This helps explain why vendors from surrounding disciplines are encroaching into the MDM space.

According to Tommy Drummond, VP of product marketing at Informatica, the company’s $130 million acquisition of Siperian was prompted by high demand for MDM among its own customers. “Many of our customers are implementing MDM solutions at the moment to solve their challenges around compliance, better reporting and reducing cost,” he says.



He argues that the foundation of a successful MDM project is a data integration platform, comprising the integration and data quality tools Informatica already sells. The opportunities for cross-selling between Informatica and Siperian customers are therefore considerable, he says. 

The benefit of Siperian’s technology in particular is, Drummond claims, the speed with which it can be deployed. “The product allows you to import your existing data model, whereas our competitors define a data model that you need to conform with,” he says. “The result is a faster implementation time, and that’s a key benefit.”

Another company moving into the MDM space is DataFlux, a data quality vendor wholly owned by analytics giant the SAS Institute. It recently revealed a new data management platform that combines integration and master data management functionality with its existing toolset. 

It has been able to do this thanks to a decision by parent SAS to move a significant slice of its research and development resources and intellectual property based around integration technologies into its DataFlux subsidiary, in order to give those technologies a life of their own. 

A critical technological component of the data management platform that has allowed DataFlux to offer MDM features – and a direct result of research from SAS – is data federation, says CEO Tony Fisher. “When MDM first started, the idea was to persist the master data management in a separate store,” he explains. “That has turned out to be extremely impractical for a number of reasons, chief among which is that the real data and the system of record is in your applications, and you’ve got a potential disconnect between that and your MDM.”

Federation allows organisations to build a virtual master record that is always in sync with local application databases, Fisher explains. “Federation allows an MDM environment to create a virtual connection of the variety of sources that are part of the MDM system, which is why it is a key component of MDM.”
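Fisher's point can be made concrete with a small sketch. Here the master record is assembled from the source systems at query time rather than persisted in a separate store, so it can never drift out of sync with them; the source data, field names and precedence order are hypothetical:

```python
# Data federation sketch for MDM: the virtual master record is built by
# merging live source records on each lookup (all data is hypothetical).
SOURCES = {
    "crm":     {"cust-42": {"name": "Acme Limited", "segment": "Enterprise"}},
    "billing": {"cust-42": {"name": "ACME Ltd", "credit_limit": 50000}},
}

def federated_master(customer_id):
    """Assemble a virtual master record by merging live source records."""
    master = {}
    for system in ("billing", "crm"):  # later systems take precedence
        master.update(SOURCES[system].get(customer_id, {}))
    return master

print(federated_master("cust-42"))
# {'name': 'Acme Limited', 'credit_limit': 50000, 'segment': 'Enterprise'}
```

Because nothing is copied out of the sources, any update to a source record is reflected in the very next federated lookup, which is the synchronisation property Fisher describes.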

For Forrester’s Karel, the development by DataFlux of its new platform is a good move, but a catch-up move. “They are now doing what IBM, Informatica and SAP BusinessObjects have done for a number of years,” he says.

Open data movement

No software market would today be complete without a commercial open source vendor, and in the data management space that role is filled by French company Talend. Having built its own data integration and data quality products, Talend changed tack when it came to MDM and last year acquired the relevant technology from compatriot Amalto. It launched the resulting open source MDM product in January 2010. 

According to Talend co-founder and chief architect Fabrice Bonan, the MDM market is ripe for the kind of price disruption he says Talend has already caused in the integration and data quality spaces with its previous products. 

“Today, the MDM market is exactly the same as the data integration market five years ago, meaning that there are a few vendors with very high prices but with few differences in product,” he says. “We want to deliver the same message that we did with data integration: you don’t need to pay hundreds of thousands of bucks to start an MDM project.”

"The MDM market is exactly the same as the data integration market five years ago, meaning that there are a few vendors with very high prices but with few differences in product"
Fabrice Bonan, Talend

But Bonan argues that there is a reason to adopt open source MDM beyond the price point. In order to make sure that the rules defined in an MDM system – rules governing how an address should be formatted, for example, or which system of record takes precedence in the event of a conflict – are propagated throughout the data estate, those rules must integrate with other data management products such as integration and quality tools.

But in proprietary systems, Bonan claims, this is not always possible – at least not cheaply – even if the MDM tool and data quality tool are sourced from the same supplier. 

Talend’s MDM rules are defined in an XML format, meaning that not only can they be integrated into other standards-based data management tools but into applications as well, Bonan says.
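To show what an XML-defined, standards-based rule might look like in practice, here is a small sketch that parses one and applies it to records from two systems. The rule schema is invented for illustration and is not Talend's actual format:

```python
# Sketch of an XML-defined MDM rule: which system's value wins for a field,
# and how to transform it. The schema here is hypothetical, not Talend's.
import xml.etree.ElementTree as ET

RULE_XML = """
<rule entity="customer" field="postcode">
  <precedence>crm</precedence>
  <transform>uppercase</transform>
</rule>
"""

def apply_rule(rule_xml, records):
    """Pick the field value from the preferred system, then transform it."""
    rule = ET.fromstring(rule_xml)
    field = rule.get("field")
    preferred = rule.findtext("precedence")
    value = records[preferred][field]
    if rule.findtext("transform") == "uppercase":
        value = value.upper()
    return value

records = {"crm": {"postcode": "ec1a 1bb"}, "erp": {"postcode": "EC1A1BB"}}
print(apply_rule(RULE_XML, records))  # EC1A 1BB
```

Because the rule is plain XML, any tool that can parse it – a data quality engine, an integration job, or an application – can enforce it, which is the portability argument Bonan makes.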

Integration as a service

These moves by Informatica, DataFlux and Talend all point to a shift in the data management industry towards integrated platforms comprising data integration, data quality and MDM. IBM has already compiled such a platform – the recent addition of Initiate Systems, which operates in the healthcare sector, had more to do with a recent US government healthcare stimulus package than any gap in the IT giant’s technology suite.

That is not the only trend shaping data management technology, however. Last year, Informatica launched a cloud-based integration product – based on SaaS pioneer Salesforce’s application platform – in the US, and brought it to Europe in early 2010. 

The case for Informatica Cloud is simple: businesses that use Salesforce’s CRM tool or other SaaS-based applications eventually feel the need to integrate data contained in those tools with on-premise systems, and that is precisely what the service does.



Most Informatica Cloud customers use it to integrate CRM with on-premise ERP or finance apps, says Daniel Niemann, VP of business development for the service, while a few are doing cloud-to-cloud integrations and others on-premise to on-premise.

Mastering integration

But Niemann insists that this is more than just a point integration tool located in the most convenient environment for cloud apps. He believes that, just as SaaS applications such as Salesforce have allowed business units in corporations to acquire functionality independent of the IT department, Informatica Cloud will help business analysts – who typically design information architectures before handing them over to the IT department to implement – master their integration challenges by themselves.

“With a typical integration project, a business analyst would decide what fields have to go from one app to the other, and the changes that need to happen to that data, and they would describe all this in Excel,” he explains. “And they would then hand that over to some IT guy who would recreate that Excel spreadsheet in middleware.

“So with Informatica Cloud we’ve built a user experience in an integration product that the business analyst guy can use to actually put his integration into production,” Niemann says. This is not motivated by a desire to grant business analysts self-determination, he adds; it is rather the fact that their time costs less than that of middleware experts. 
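The spreadsheet hand-off Niemann describes can be reduced to a declarative field mapping that a tool could execute directly, rather than an IT team re-implementing it in middleware. The CRM and ERP field names and transformations below are hypothetical:

```python
# Sketch of a business analyst's field mapping, expressed declaratively:
# each CRM field maps to an ERP field plus a transformation (all hypothetical).
MAPPING = {
    # CRM field       -> (ERP field,       transformation)
    "AccountName":      ("customer_name",  str.strip),
    "AnnualRevenueUSD": ("annual_revenue", float),
}

def map_record(crm_record):
    """Apply the mapping to one CRM record, producing an ERP record."""
    return {dst: fn(crm_record[src]) for src, (dst, fn) in MAPPING.items()}

crm_row = {"AccountName": "  Acme Limited ", "AnnualRevenueUSD": "1200000"}
print(map_record(crm_row))
# {'customer_name': 'Acme Limited', 'annual_revenue': 1200000.0}
```

The point of the declarative form is that the mapping itself is the specification: the analyst's decisions run in production without a translation step through middleware code.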

DataFlux’s Fisher says that components of his company’s new platform were designed to be used by so-called ‘business users’, i.e. non-IT experts. “We’ve done a number of things such as reporting and dashboarding that are designed to help the business user understand the IT infrastructure,” he says.

"We have seen a shift of the responsibility for data management from IT to the business user"
Tony Fisher, DataFlux

This has been motivated by what Fisher describes as “a shift of the responsibility for data management from IT to the business user. What we see now is that the business user is having a very active role in technology decisions, especially around data.”

What Fisher is referring to is by far the most important shift currently under way in the field of data management – much more significant than system design or vendor consolidation. This is the slowly developing recognition that the business must take ownership of and responsibility for the quality and accuracy of data. 

Data governance

“The vast majority of data management enquiries I field from my clients are not about technology,” says Forrester’s Karel. “They’re about the soft stuff: how do we organise ourselves, how do we build a business case, how do we engage the business, how do we define return on investment, how do we decide roles and responsibilities. This is what my clients are feeling pain around; this is data governance.”

The term data governance describes the best practices for handling data that organisations must adopt if they are to derive the most value from it, and minimise the risk to which they are exposed in doing so. It is, Karel says, “an incredibly immature set of practices”.

“Those organisations that have done it well have done it in a very targeted fashion, not enterprise-wide data governance because the problem is too great,” he explains. “If you aim to govern the 20% of the data that impacts 80% of your processes and operations, that’s a great start.”

Businesses seeking to solve deep-rooted data quality problems that have undermined their long-term IT investments must begin by sorting out whose responsibility it is to maintain which data, Karel says. Decisions about technology follow long after.

“All these vendors are tools vendors, so absolutely they are going to evangelise about the tools that support the data governance processes,” he says. “But the tools are not solving the problem. They are simply enabling a data governance process that must be designed and owned by the organisations themselves.”

So what does that mean for the industry? Will the gradual adoption and maturing of data governance lead to the anticipated MDM sales, or will customers discover that, with discipline and best practice, they can make do with traditional data integration and data quality tools? 

“That depends on the business objectives of each company,” says Karel. “For some organisations with complex environments, data governance programmes may absolutely lead to the development of a strategy that would recommend the use of MDM technologies.

“But for other organisations, where their master data requirements are really focused on enabling a trusted view in a targeted data warehousing environment or in a CRM application, data integration and data quality technologies may be all that they require.”

Pete Swabey
