The integration imperative

When three giants of broadcast and communications – NTL, Telewest and Virgin Mobile – came together in mid-2006 to create Virgin Media, the resulting group faced a mammoth IT challenge: how to integrate the 24 core databases that held its billing, customer and other critical information.

From the outside, this might have seemed an arcane problem, buried deep within the cogs of the group’s IT systems. But the stand-alone nature of these data sources was being felt at the frontline: call centre agents, for example, were having to access all 24 distinct systems just to respond to customer enquiries, creating anguish and frustration on both ends of the line.

Integration may be an age-old IT problem, but it is one now being thrust back into the limelight amid the many mergers and business consolidations sparked by the credit crunch and the pressure to hone efficiency. While many working in IT are reviewing their future, those in integration are expecting high demand for their services. As one freelance systems integrator currently working for a major UK bank told Information Age, there is “at least seven years” of solid work ahead for integration specialists like him following October’s corporate carnage.

Certainly, the market for data integration software is showing no sign of slowing. Gartner estimates that sales will rise at an average of 17% annually over the next four years, to hit $3.16 billion in 2012, up from $1.44 billion at the end of 2007. And the recent financial results of independent vendors are supporting that upbeat outlook – even suggesting that integration will see a ‘crunch dividend’. Informatica, for example, reported annual revenues up 16% to $455.7 million for 2008. As Gartner suggests: “Contemporary pressures are leading to an increased investment in data integration in all industries and geographic regions.”

Others are picking up on that sense. “Obviously the credit crunch is making people think carefully about their investments,” says Nick Millman, senior director of business intelligence at consultancy firm Accenture. “But we haven’t seen any decline in demand for data integration.” As well as merger-driven integration, organisations are under pressure to exploit the data they already have, he says, and that typically means pooling disparate data sources.

There is a realisation that data can hold greater value to a business than as something amorphous to be harvested and stored purely for legal or compliance reasons. This represents a seismic shift in the relationship between business and IT, and it is no accident that the rise of data integration has closely tracked the popularity of ‘data-use’ disciplines such as business intelligence and service-orientated architecture (SOA), with a ready supply of ‘clean’ data drawn from different sources a critical precursor to the success of either.

Vital spark

At a broader level, demand for integration has also been spurred by a growing appreciation that data is the lifeblood of the organisation – not just in terms of providing the basis for sales opportunities and business efficiency, but as a vital bedrock for business performance management and regulatory compliance.

“People sometimes ask whether IT matters, but not whether data matters,” says Informatica’s CEO Sohaib Abbasi, citing research showing that “71% of executives recognise data as their most important asset, but only 43% say they are fully using it to gain a competitive advantage.”

Integration has moved far beyond its basic function in ETL (extraction, transformation and loading). Increasingly, large data integration projects are undertaken as a prelude to the implementation of corporate-wide, standard business intelligence software.
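
For readers less familiar with the term, ETL simply means pulling data out of one system, reshaping it, and writing it into another. The sketch below is a minimal, hypothetical illustration of that pattern in Python; the file, table and field names are invented for the example and are not drawn from any product discussed here.

```python
# Minimal, illustrative ETL sketch. The CSV file, the SQLite table and
# its columns are hypothetical examples, not any vendor's implementation.
import csv
import sqlite3

def extract(path):
    """Extract: read raw customer rows from a legacy CSV export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: normalise names and e-mail addresses."""
    for row in rows:
        yield {
            "name": row["name"].strip().title(),
            "email": row["email"].strip().lower(),
        }

def load(rows, db_path="warehouse.db"):
    """Load: write the cleaned rows into a target table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS customers (name TEXT, email TEXT)")
    con.executemany(
        "INSERT INTO customers (name, email) VALUES (:name, :email)", rows
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("legacy_export.csv")))
```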

The value of any business intelligence application, of course, is hugely dependent on the data pumped into it: “Data integration is the glue that holds those data models together. It’s absolutely important to get the data right,” says Millman.

Integration is also viewed as a precursor to the adoption of service-orientated architecture (SOA). “I see SOA as a key trend [in data integration],” says Adrian Simpson, business consultant with SAP. “SOA is helping organisations realise the value they get from pulling together islands of data. Interoperability has become more mainstream as tools are increasingly designed to be flexible with open standards.”

While the market for data integration still supports several platform-independent companies, other leaders in the sector provide tools as part of a larger information management ‘stack’. “To a certain extent, the sector is commoditised,” explains SAP’s Simpson. “We compete with technology vendors selling a tool bag, and application vendors selling data integration as part of a process.”

The competitive landscape for data integration tools is consequently entwined with the portfolios of the large providers, such as IBM, SAP and Oracle. They have become major forces in data integration largely as a result of acquisitions: IBM through its buy-out of Ascential, SAP following its purchase of Business Objects, and Oracle as a result of its acquisition of Sunopsis.

Independent providers such as Informatica, iWay and ETI attribute a large degree of their success to their Switzerland-like position, offering a fat catalogue of connectors that can deal with a huge number of data sources and targets. Beyond offering independence, such vendors seek to differentiate and up-sell beyond pure integration by providing a suite of related offerings. The result has set the scene for further market convergence, with data quality and master data management (MDM) products in particular being added to existing information management portfolios across the sector.

Like integration, data quality has enjoyed something of a free ride on the coat tails of business intelligence (BI). In keeping with the old adage “garbage in, garbage out”, companies like SAP Business Objects see the technology as essential for building trust in BI systems: expensive BI applications are pointless if the data fed into them is riddled with duplicates, false entries and multiple versions of the truth.

“Data is only as good as it is when it is entered into the system,” says Millman. “There is a clear trend around data quality; if you embed it in the data integration layer it saves a lot of effort later. Many leading data integration tool vendors have built in a data quality [product].”
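
What ‘embedding quality in the data integration layer’ can look like in practice is sketched below: a validation and deduplication stage slotted in before data is loaded, reusing the hypothetical extract/transform/load functions sketched earlier. Again, the rules and field names are invented purely to illustrate the idea.

```python
# Illustrative data-quality stage embedded in the integration layer.
# The validation rule and field names are hypothetical examples.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_filter(rows):
    """Drop records with invalid e-mail addresses and collapse duplicates."""
    seen = set()
    for row in rows:
        email = row.get("email", "").strip().lower()
        if not EMAIL_RE.match(email):
            continue                      # reject: fails validation rule
        if email in seen:
            continue                      # reject: duplicate of an earlier record
        seen.add(email)
        yield {**row, "email": email}

# Used as one more stage in the earlier hypothetical pipeline, e.g.:
#   load(quality_filter(transform(extract("legacy_export.csv"))))
```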

SAS-owned data quality software vendor DataFlux has seen strong demand for its products ever since the importance of clean data dawned on the corporate world.

“[Data management] is a good market to be in and a hugely growing software sub-sector, growing at 25% to 30% year on year,” says Colin Rickard, DataFlux’s managing director for EMEA, explaining that compliance pressures over the last few years have indirectly made companies “very cognisant” of their data “and driven a lot of business for us”.

Independence remains as much of a priority in data quality as it does in integration. DataFlux CEO Tony Fisher observes that “SAP and Oracle are putting together pretty good platforms, which is fine if you are 95% Oracle or SAP, but most organisations are not. If you are going to have a data integration platform it has to integrate with everyone; you have to support the entire landscape.”

Question of independence

Agnosticism is at the heart of any discussion on data integration. “If you acquire a company where the IT is ‘an SAP shop’ where do you go to reconcile Oracle data?” asks Informatica’s general manager of data integration, Girish Pancha. “Do you trust Oracle? Do you trust SAP? Or do you trust an independent specialist? The number one alternative to our technology is hand-coding – so it’s a great place to be for us.”

The platform giants are quick to deny that any such issue exists: since integration must by definition be agnostic, they argue, any hint of vendor lock-in would be counterproductive.

“I don’t see it as an issue, it’s not one we come across,” says SAP’s Simpson. “An awful lot of companies run SAP, and we’re quite happy to go head-to-head with the best of breed [in data integration].”

The question of independence is complicated and depends on a company’s vendor ecosystem and the individual project, says Accenture’s Millman: “It’s hard to draw definite boundaries; you have to look at [the solution’s] connecting functionality. If you decide to go with a single vendor stack [for BI], then go with that stack [for integration].”

The market may be mature and dominated by familiar names, but new challengers are emerging with fresh propositions. Talend, for example, is coming to market with an open source data integration package, backed by $12 million in venture capital and with Business Objects co-founder and former CEO Bernard Liautaud guiding its management.

That team expects to meet the demand for independent data integration as some of the larger best-of-breed vendors are swallowed up.

“Everyone knows Informatica is not going to be independent for very long. They’re a prime acquisition target,” suggests Yves de Montcheuil, Talend’s vice president of marketing. “We think of ourselves as ‘at least as good’,” he says, adding that after just 18 months of operation the company has built up 400 paying customers including Virgin Mobile, AOL and Yahoo. “35% of our clients are larger companies, which is a good sign for the market,” he adds.

Certainly a key strength, and one that plays well in the integration arena, is the sheer number of connectors its open source community has created – the core currency of any integration solution.

“The developer community provides a very small part of the core product, about 20% to 30%,” says CEO Bertrand Diard. “But one third of the connectors come from the development community. [Our competitors] might have 30 to 40 connectors, while we have over 400. We are ready to play,” he concludes.
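
To make the connector point concrete, the sketch below shows the basic idea: each source or target system is wrapped behind the same small interface, so the integration logic never needs to know whether the data lives in a flat file, a database or a packaged application. The classes are hypothetical illustrations, not code from Talend or any other vendor.

```python
# Illustrative connector abstraction. The Connector interface and the
# CsvConnector class are hypothetical stand-ins for real adapters.
import csv
from abc import ABC, abstractmethod
from typing import Dict, Iterable

class Connector(ABC):
    """Common contract every source/target adapter must satisfy."""

    @abstractmethod
    def read(self) -> Iterable[Dict]:
        ...

    @abstractmethod
    def write(self, rows: Iterable[Dict]) -> None:
        ...

class CsvConnector(Connector):
    """One concrete adapter; a database or application adapter would look the same from outside."""

    def __init__(self, path: str):
        self.path = path

    def read(self):
        with open(self.path, newline="") as f:
            yield from csv.DictReader(f)

    def write(self, rows):
        rows = list(rows)
        if not rows:
            return
        with open(self.path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)

def copy(source: Connector, target: Connector) -> None:
    """The pipeline sees only the Connector interface, never the system behind it."""
    target.write(source.read())
```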

Raven Zachary, an analyst with The 451 Group, predicts Talend’s open source model could be “strongly disruptive” in such a highly controlled sector.

“Open source will make data integration more accessible, allowing companies to make better use of silos of information without the hindering effect of existing cost structures,” he says. “Generally the incumbent commercial vendors publicly ignore open source vendors. But when you have a conversation with their sales teams you notice they have been trained up to have a compelling argument against it.”

Accenture’s Millman acknowledges that the open source model for data integration “is being increasingly looked at,” but suggests it is “too early to say whether it will take off or not.”

And despite progress towards making integration tools more accessible outside major platform vendors, Gartner warns that “the market has not yet reached a point at which data integration is typically achieved via a single platform or suite. Technology buyers have been forced to acquire a portfolio of tools from multiple vendors to amass the capabilities necessary to address the full range of their data integration requirements.”

That is a situation likely to continue, at least in the short term. But the likelihood is that the strong demand expected over the coming years will gravitate towards vendors with the most comprehensive offerings.