Teradata enjoys healthy appetite for analytics

If the 1990s and early 2000s were the age of applications, then the current period of enterprise IT is arguably the era of analytics. High-end corporate IT systems are no longer sold on the promise of process automation, but on the possibility of finding competitive advantage through detailed analysis of data.

But amid recession and constrained IT spending, is the business case for analytics strong enough to justify the not-inconsiderable infrastructure investment that goes with it? Judging by the financial performance of data warehousing technology vendor Teradata, enterprise organisations seem to believe that it is.

The company, spun out from US technology giant NCR in 2007, grew revenues by 12% year-on-year to $470 million in the three months ending 30 June 2010. This growth was driven by a number of new customer wins, Teradata said, albeit primarily for its North American division. And unlike most IT vendors now reporting strong year-on-year growth against a weak 2009, Teradata's performance last year was stable.

This all reveals that the value of sophisticated analytics is something that enterprise organisations believe in and are prepared to invest in, says Martin Willcox, Teradata’s director of platform marketing for Europe. “Business leaders, especially in financial services, tend to think a recession is the worst time to cut back investment in understanding your business,” he says.

Nevertheless, Teradata’s move in 2008 to diversify its product range with various more ‘affordable’ data warehouse appliances now seems well timed. Arguably, though, its hand was forced in this regard by competitors such as Netezza and Greenplum (acquired by EMC in July) that use cheap, commodity hardware to build their data warehouse appliances.

While he insists Teradata’s products are priced “competitively”, Willcox also argues that its market-leading technology translates to a lower total cost of ownership (TCO) for customers, in a number of ways.

One important characteristic of Teradata’s enterprise data warehouse product is its ability to support ‘mixed workloads’ – database queries of different kinds that use hardware resources in different ways – simultaneously.

In order to support mixed workloads, a database management system must be able to assess how much processing power each query is going to require, and prioritise queries according to the needs of the end-user. “There’s only so long a call centre agent can talk about the weather [waiting for a database query to run], but the analyst running a new customer segmentation model is probably going to press enter and go across the road to Starbucks anyway,” explains Willcox.

“You need to be able to differentiate between those user requirements and to treat them differently.” Most database management systems manage mixed workloads by classifying each query before it runs. “That is a valuable approach, and one that we adopt,” says Willcox. “But what you really need to do is continue monitoring and managing the query after submission, because there’s only so much you can tell before you send the work to the database. That is something we do well.”
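The two-stage approach Willcox describes, classifying a query before it runs and then continuing to monitor it after submission, can be sketched in a few lines of Python. The workload class names, CPU budgets and demotion rule below are purely illustrative assumptions, not Teradata's actual design:

```python
# Hypothetical sketch of two-stage workload management: classify each
# query before it runs, then keep monitoring it after submission and
# reclassify it if it overruns its class's budget. All names and
# thresholds here are illustrative, not Teradata's.

WORKLOAD_CLASSES = {
    "tactical": {"priority": 0, "cpu_budget_s": 1.0},    # call-centre lookups
    "analytic": {"priority": 2, "cpu_budget_s": 600.0},  # segmentation models
}

def classify(query_text):
    """Pre-submission classification: a crude static rule."""
    return "analytic" if "GROUP BY" in query_text.upper() else "tactical"

class RunningQuery:
    def __init__(self, query_text):
        self.workload = classify(query_text)
        self.cpu_used_s = 0.0

    def account(self, cpu_slice_s):
        """Post-submission monitoring: demote a 'tactical' query that
        turns out to be heavier than its class allows."""
        self.cpu_used_s += cpu_slice_s
        budget = WORKLOAD_CLASSES[self.workload]["cpu_budget_s"]
        if self.workload == "tactical" and self.cpu_used_s > budget:
            self.workload = "analytic"  # runtime reclassification

q = RunningQuery("SELECT balance FROM accounts WHERE id = 42")
q.account(2.5)  # the query used far more CPU than a tactical lookup should
print(q.workload)  # → analytic
```

The point of the runtime check is exactly the one Willcox makes: "there's only so much you can tell before you send the work to the database", so the static classifier alone is not enough.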

By supporting mixed workloads in this way, Willcox argues, the Teradata enterprise data warehouse can be used for all manner of different projects, and therefore deliver a lower TCO. “If you have mixed workload [functionality] that is naive, then in practice every time you build an application you need to build a new database because that’s the only way you can keep those workloads separated.”

Technology leadership requires constant vigilance. Recent developments in Teradata’s data warehouse product include the ability to manage geospatial data ‘natively’, and forthcoming versions will include tiered storage functionality, allowing the system to use fast but expensive solid-state storage for frequently accessed data and cheap but slow storage for the rest.
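The tiered-storage idea mentioned above can be illustrated with a simple access-frequency policy: data that is read often is promoted to fast solid-state storage, while cold data sits on cheap disk. The tier names, promotion threshold and demotion rule are assumptions for the sake of the sketch, not Teradata's implementation:

```python
# Illustrative sketch of tiered storage: frequently accessed blocks
# migrate to fast SSD, rarely accessed blocks stay on (or return to)
# cheap, slow disk. Threshold and tier names are assumed.

HOT_THRESHOLD = 100  # accesses per period before promotion to SSD

class TieredStore:
    def __init__(self):
        self.tier = {}      # block id -> "ssd" or "hdd"
        self.accesses = {}  # block id -> access count this period

    def read(self, block_id):
        self.tier.setdefault(block_id, "hdd")  # new data starts cold
        self.accesses[block_id] = self.accesses.get(block_id, 0) + 1
        if self.accesses[block_id] >= HOT_THRESHOLD:
            self.tier[block_id] = "ssd"  # promote hot block

    def rebalance(self):
        """End-of-period demotion: blocks not touched go back to disk."""
        for block_id in self.tier:
            if self.accesses.get(block_id, 0) == 0:
                self.tier[block_id] = "hdd"
        self.accesses.clear()

store = TieredStore()
for _ in range(100):
    store.read("customer_index")   # hot: promoted to SSD
store.read("archive_2001")         # cold: stays on disk
print(store.tier["customer_index"], store.tier["archive_2001"])  # → ssd hdd
```

The economics follow directly: only the small hot fraction of the warehouse pays the solid-state premium, while the bulk of the data sits on inexpensive storage.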

All of these features are designed to allow organisations to use the Teradata warehouse as their single data repository, thereby justifying the acquisition cost. One reason for doing this, Willcox says, relates to the environmental impact of IT systems: “It’s already the case in some locations that the energy costs for operating hardware over a three-year period exceed the acquisition costs, and that’s a trend that is only going to continue.”

“The approach that some organisations have of storing data a dozen times, so that there’s one copy in marketing, one copy in sales, one copy in production, etc, is simply not sustainable. You are not going to be able to store petabytes of data four or five times around the organisation – the energy cost will be too much.”
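The arithmetic behind that argument is simple to sketch. With entirely assumed figures (the article gives none), a back-of-envelope calculation shows how running costs can overtake acquisition costs over three years, and how each redundant copy of the data multiplies the bill:

```python
# Back-of-envelope sketch of the energy argument, with assumed figures:
# a storage array drawing 5 kW continuously at $0.15/kWh, against a
# $15,000 acquisition cost, and five copies of the same data.

POWER_KW = 5.0        # assumed continuous draw of one storage array
PRICE_PER_KWH = 0.15  # assumed energy price, USD
ACQUISITION_USD = 15_000
HOURS_3Y = 3 * 365 * 24

energy_cost_3y = POWER_KW * HOURS_3Y * PRICE_PER_KWH
print(round(energy_cost_3y))                      # → 19710, per copy over 3 years
print(energy_cost_3y > ACQUISITION_USD)           # → True: energy exceeds purchase
print(round(energy_cost_3y * 5))                  # → 98550, for five copies
```

Under these assumptions the three-year energy bill already exceeds the purchase price for a single copy, which is Willcox's point; duplicating the data five times simply multiplies the running cost fivefold.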

Pete Swabey
