Data quality woes fuel DataFlux’s rise

Data quality vendor DataFlux is unusual among acquired software companies. Unlike others, which are quickly absorbed into their new parent’s structure and become ‘added functionality’, DataFlux was allowed to retain its management and identity following its purchase by business analytics giant SAS in 2000. It has drawn on its parent’s resources to spread across the globe, but has otherwise been left alone as a profitable sideline.

Data quality is not a particularly exotic sub-sector of the software industry, and even the company’s CTO and co-founder Scott Gidley is quick to acknowledge it “lacks sex appeal”. But clean data is the foundation for any company that expects to extract value from internal or external data: investment in expensive BI tools is pointless if the data fed into them is riddled with duplicates, false entries and multiple versions of the truth.

Gartner puts DataFlux at the top of its game in this important but very specific function, placing it in the coveted top right corner of its Magic Quadrant for data quality tools – ahead of SAP’s Business Objects and the more data integration-focused Informatica.

But in today’s competitive software market, doing one thing well is no longer enough. DataFlux’s standalone period appears to be over: at its recent customer event the company announced it is developing ‘Project Unity’, a “next generation enterprise management platform encompassing data quality, data integration and master data management” that will draw on SAS’s experience in data integration and ETL development.

“[Data management] is a good market to be in – 25% to 30% growth per year,” says Colin Rickard, the company’s managing director for EMEA. “It’s a hugely growing software industry sub-sector.”

Much of that growth is driven by increasingly demanding compliance legislation, which according to DataFlux CEO Tony Fisher “has motivated many organisations to take a close look at their data.”

“Compliance has been a huge driver for the last 24 to 36 months,” he says. “It’s made a lot of businesses more cognisant of their data and driven a lot of business [for us]. For four or five years our primary purpose was evangelism, but now organisations are beginning to understand data as a corporate asset.”

Beyond legislative drivers, an abundance of mergers and acquisitions, particularly in the financial sector, promises golden days ahead for data integrators – although “right now I wouldn’t describe the economic climate as positive in any way, shape or form,” says Rickard.

“We’re in the knee-jerk reaction period where people stop everything they’re doing,” adds Fisher. “Projects get frozen while everyone assesses which way things are turning.”

However, Rickard says that while a “significant” number of the companies DataFlux deals with are cutting back on projects, “we are having conversations with banks, including some of those that have merged.”

“They have a lot of decisions to make and branding to sort out, then they will focus on increasing efficiencies,” he predicts.

While the storm blows over, the company is investing heavily in R&D. Fisher explains that DataFlux’s move to a more general line of business with ‘Project Unity’ will allow SAS to sell components of its technology rather than large-scale licences, a model that should broaden its customer base and bring it up against the likes of Informatica. But Fisher sees DataFlux’s data management platform competing primarily “with the big platform vendors like IBM,” and even Microsoft.

“SAP (with Business Objects) as well as Oracle are also putting together pretty good platforms, which is fine if you are 95% Oracle or SAP, but most organisations are not. If you are going to have a data integration platform it has to integrate with everyone; you have to support the entire landscape.”

This is perhaps the main reason why SAS allows DataFlux its autonomy: independence and vendor-agnosticism are important selling points in the data integration space, attributes that incumbents in the sector market heavily.

Master data stroke

Along with data quality, the related area of master data management (MDM) could well prove another strong revenue base for DataFlux. MDM is notoriously hard to implement, but the prospect of a ‘single version of the truth’ accessible in real time is an easy sell for most businesses – until they realise just how much time, effort and hand coding is required.

“MDM today is a complicated implementation,” Fisher acknowledges, explaining that the undertaking is probably something only a large enterprise could consider. “The tools are getting better, but today you need expertise in the organisation; I don’t really see that in the mid-market.”

DataFlux’s new ‘qMDM’ solution draws on its data quality roots – particularly its accelerators – and uses a process-based framework to allow a phased implementation rather than a single plunge.

“We built it to meet the needs of enterprises who are struggling with the commonly accepted ‘all-or-nothing’ approach to MDM that is prevalent in the market,” says Gidley.
