The analytical eye

It goes without saying that presenting information in a visual form helps people to understand it. One need look no further than the numerous graphs that fill presentations, financial results and performance reports to see that data visualisation is an established part of mainstream business culture. 

A crude explanation of the value of presenting data in graphical form is that the human visual system processes information in parallel; a series of numbers presented visually can be perceived simultaneously, and so trends can be spotted immediately. The alternative approach, to read each number in turn and try to build a mental image of the trend, is constrained by the limits of the mind’s working memory. 

But while histograms, line charts and scatter plots are to be found everywhere in working life, the full potential of the visual system not only to understand but also to analyse information is still underused by business information systems, from desktop spreadsheet software to million-pound business intelligence architectures.

Stephen Few, founder and principal of Perceptual Edge, is an independent analyst and consultant whose work focuses on data visualisation. Few argues that the notoriously high rate of failure among business intelligence (BI) projects can be attributed, in part, to the failure of software vendors to design their tools to reflect the way human beings perceive information.

“Business intelligence is really just a new name for data warehousing,” he says. “The emphasis has always remained the same, and that is a focus on technologies for gathering and transforming data, supposedly making it ready for use but rarely taking that extra step to make data meaningful for people.”

Of course, top-of-the-range BI software allows users to create all manner of charts and graphs, with more effects and visual adornments than one could ever need. But this is not the same as designing a system that lets the user exploit the potential of their visual system to perform analyses they would be unable to do mathematically.

In fact, says Few, overdecorated graphs can have the opposite effect. “There has been this mad race among BI vendors to try to ‘out-sizzle’ one another with flashy visual effects without asking whether they are going to serve a purpose. And what happens in most cases is that not only does it not help, but it actually gets in the way.”

However, a small number of software vendors have built their technology on academic research into human-computer interaction and data visualisation, supporting ‘visual analytics’ at its most powerful. Two leading examples, Few says, are Tableau Software and TIBCO Spotfire.

An algebra of graphs

During the 1980s, while working for a then little-known company called Pixar, Pat Hanrahan, now a professor at Stanford University, was the chief architect of the RenderMan Interface, which defined an algebra for describing three-dimensional visual objects. While that system went on to underpin Pixar’s phenomenal success in computer-animated films, Hanrahan later developed another algebra, in this case for describing graphical representations of data.

Tableau Software, of which Hanrahan is now chief technology officer, is a data visualisation vendor whose tools are based on this technology. The algebra developed by Hanrahan is used by the Tableau toolkit to query a given database, so that core concepts in data visualisation – such as the width or colour of a bar in a chart – are built into the operations of the system at the most granular level.
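To make the idea concrete, the sketch below (in Python, and emphatically not Tableau’s actual VizQL) shows how a declarative chart specification of this kind might be compiled into a database query; the table and field names are hypothetical.

    # Illustrative toy 'visual algebra' only, not Tableau's VizQL.
    # Field and table names ('sales', 'region', 'orders'...) are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Spec:
        rows: str      # measure placed on the row shelf
        columns: str   # dimension placed on the column shelf
        colour: str    # field encoded as colour

    def compile_to_sql(spec: Spec, table: str) -> str:
        """Translate the visual specification into an aggregate query,
        so the graphic is defined at the same level as the data it draws."""
        return (
            f"SELECT {spec.columns}, {spec.colour}, SUM({spec.rows}) AS value "
            f"FROM {table} "
            f"GROUP BY {spec.columns}, {spec.colour}"
        )

    spec = Spec(rows="sales", columns="region", colour="product_line")
    print(compile_to_sql(spec, "orders"))
    # -> one row per (region, product_line) pair, ready to be drawn as a
    #    coloured bar chart; changing the spec changes both query and chart.

The point of such a scheme is that the chart is not an afterthought bolted onto a query result: altering a visual property alters the query itself.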

According to Dr Jock Mackinlay, Tableau Software’s director of visual analysis – who in the 1980s was part of a team at the Xerox PARC research facility that coined the term ‘information visualisation’ – this has two effects that greatly aid the user. 

The first is that it allows the system to analyse the data in order to decide which visualisation is most appropriate. “We’ve worked very hard on defaulting, so that you typically get the kind of view that a visualisation expert would recommend for you,” he says.
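The defaulting Mackinlay describes can be thought of as rules that map the types of the chosen fields to a sensible chart. Below is a minimal sketch of that idea, assuming a deliberately crude heuristic rather than Tableau’s actual rules.

    # A minimal sketch of automatic chart defaulting, assuming a simple
    # heuristic; real products use far more sophisticated rules.
    def default_chart(field_types):
        """Suggest a chart from the types of the selected fields
        ('quantitative', 'categorical' or 'temporal')."""
        kinds = sorted(field_types)
        if kinds == ["quantitative", "quantitative"]:
            return "scatter plot"
        if kinds == ["quantitative", "temporal"]:
            return "line chart"
        if kinds == ["categorical", "quantitative"]:
            return "bar chart"
        return "table"  # fall back to a plain table when unsure

    print(default_chart(["temporal", "quantitative"]))     # line chart
    print(default_chart(["categorical", "quantitative"]))  # bar chart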

Secondly, it means that the user can change the visualisation as and when they please. Most visualisation systems use a wizard-style interface, says Mackinlay: “You have to know from the beginning what view you need. In Tableau, you just have to know something about your data, and then your visual system allows you to explore.”

According to Stephen Few, the ability to change the visualisation ‘on the fly’ is the key to data exploration. “The problem with a lot of software is that to get from one view to the next the mechanics end up being so cumbersome that we’ve forgotten what we’re looking for by the time we get to the next view, so it doesn’t support our analytical functions.”


Mackinlay explains it this way: “If there are too many subtasks [involved in changing the view of data] they distract the user from the cognitive task of answering a given question. But if you remove the subtasks, and make it easy enough to use so that they can go through many views rapidly, they can actually concentrate on the question answering task.”

The argument is that allowing users to explore data visually enhances their analytical capabilities. Not only might users spot unexpected trends, they can also test their own hunches or hypotheses – or the claims made by others – by viewing data in multiple forms. “If it’s easy to explore, it’s easy to validate,” says Mackinlay.

A third use case for data visualisation software lies in demonstrating known trends and patterns to other people in an accessible and comprehensible fashion. The power of visual exploration, Mackinlay argues, allows users to find the perfect view to convey a trend. “If you are trying to author a story that involves data, you want a tool that allows you to explore quickly,” he says.

UK utility contractor Morrison Utility Services has used Tableau Software as a reporting tool for commercial and operational data since 2007. The software was chosen primarily for its cost, says business process and systems manager Alan Darnell, but the tool’s visual analytical functionality has allowed the company to provide customers with greater insight into its operations.

The company has a contract with the National Grid worth close to £1 billion, for example. “Every year we need to cost all the replacement works we do for them,” says Darnell. “Tableau allows us to show them the cost in any area on a map at whim, which is a really big deal for both us and them.”

“Our customers’ perception is that we’re very good with the management of our data and that we’ve got our fingers on the pulse,” he says. “And that is really helped by the fact that we’ve got these reporting tools.”

The tool also allows sales workers to monitor their own performance by analysing operational data on the fly, analysis that would have previously required the involvement of the IT department. “The big thing that Tableau has given us is that it has made reporting business driven not IT driven,” says Darnell. 

In-memory exploration

As with Tableau, TIBCO Spotfire’s ability to support visual exploration derives from innovation at the data level. The original Spotfire product grew out of research at the University of Maryland’s Human-Computer Interaction Laboratory, before being commercialised in 1997 and acquired by middleware vendor TIBCO in 2007.

“The original research was around trying to make data in databases easier to understand,” explains Brad Hopper, director for industry solutions at TIBCO. “The state of the art then, and still today, was [database query language] SQL, but it’s still a very poor interface for understanding data because it’s a programming language,” Hopper argues. “With SQL, you need to write a script to filter data; clearly, filtering information visually is more usable.”

By adopting a visual metaphor for database queries, Hopper says, “you can bring different data together very easily, without needing to know SQL, and find the relationships and trends easily.”
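As an illustration of the contrast Hopper draws (the column names are hypothetical, and this is not Spotfire’s implementation): in SQL each new question means editing and re-running a query, whereas a visual filter applies the same predicate continuously as the user drags a control.

    # Illustrative contrast only; column names are hypothetical.
    # In SQL, each new question means writing and re-running a query:
    #   SELECT * FROM orders WHERE revenue > 10000 AND region = 'North';
    # A visual filter expresses the same predicate as a control the user
    # drags, with the view updating as the thresholds change.

    def visual_filter(rows, min_revenue, region):
        """Apply the filter that a slider and a drop-down would drive."""
        return [r for r in rows
                if r["revenue"] > min_revenue and r["region"] == region]

    orders = [
        {"revenue": 12000, "region": "North"},
        {"revenue": 8000,  "region": "North"},
        {"revenue": 15000, "region": "South"},
    ]
    print(visual_filter(orders, min_revenue=10000, region="North"))
    # Dragging the slider simply re-applies the same predicate with a new
    # threshold, so the user never leaves the chart to re-ask the question.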


Spotfire’s alternative approach was an early example of a so-called ‘in-memory’ business intelligence tool, in which the data to be analysed is moved into random-access memory for faster analysis.

It is not, however, an entirely in-memory system; such systems, Hopper claims, are prone to crash when applied to large quantities of data. Instead, a summary aggregation of the data is brought from disk into memory and presented visually so that it can be filtered and explored. When the user ‘drills down’ into a more detailed subset of the dataset via the visual interface, the underlying data is pulled from disk into memory without, Hopper insists, any disruption to the user’s all-important train of thought.
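A minimal sketch of that hybrid pattern, using Python’s built-in sqlite3 as a stand-in for the disk-based store (the schema is hypothetical and this is not Spotfire’s implementation): an aggregate summary is loaded for the initial view, and detail is fetched only when the user drills down.

    # Sketch of the summary-in-memory, detail-on-demand pattern described
    # above. SQLite stands in for the disk-based store; schema is hypothetical.
    import sqlite3

    con = sqlite3.connect(":memory:")  # stands in for the on-disk store
    con.execute("CREATE TABLE orders (region TEXT, customer TEXT, revenue REAL)")
    con.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [("North", "Acme", 12000), ("North", "Birch", 8000), ("South", "Cedar", 15000)],
    )

    def load_summary():
        """Bring only an aggregate view into memory for the initial display."""
        return con.execute(
            "SELECT region, SUM(revenue) FROM orders GROUP BY region"
        ).fetchall()

    def drill_down(region):
        """When the user clicks a bar, fetch just that region's detail."""
        return con.execute(
            "SELECT customer, revenue FROM orders WHERE region = ?", (region,)
        ).fetchall()

    print(load_summary())       # the compact summary held and filtered in memory
    print(drill_down("North"))  # detail pulled on demand, behind the chart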

Most conventional business intelligence tools have ‘drill down’ functionality, but Hopper argues that these are still simply reports at lower levels of granularity; the user is not using the visual interface to navigate the actual data.

“Traditional business intelligence systems provide to the report the absolute minimum amount of information it needs to render the chart,” he says. “The ability to span multiple in-memory instances that populate themselves in response to visual cues is something that is very highly differentiated [for Spotfire].”

Hopper echoes Mackinlay’s point that traditional BI systems curtail visual exploration by requiring that the user knows what view will be most enlightening before generating a given chart. “The key difference between a visual analytic tool and a more traditional business intelligence tool is that with a BI tool you need to know which questions you want to answer in advance,” he says. “And you might be able to create the most glorious report to answer that specific question, but nine times out of ten somebody is going to look at that report and have another question and that is going to require that you go through the whole cycle again.”

Superficial understanding

Few says that although the field of data visualisation has been given a bad name “because some of the stuff sold under that term is just horrid”, suppliers such as Tableau Software and TIBCO Spotfire are gaining recognition, and users are beginning to realise what can be achieved.

“When I show people some of the more enlightening techniques that can be used in analysing data, there really are ‘aha’ moments when they realise that their reach could be dramatically extended if they just had a better tool to work with,” he says.

This has not escaped the notice of the conventional BI suppliers, says Few, and some have launched what they claim to be ‘visual analytics’ solutions. “The larger players are running into sales situations where their customers are saying ‘why can’t you do that’, and so they have been making efforts to emulate what can be done in tools like Tableau and Spotfire,” he says. “But because their understanding is superficial their efforts are just disastrous.”

Given these companies’ academic backgrounds, one wonders why the conventional BI suppliers don’t simply hire the brightest and best brains in data visualisation to make their products more accessible and exploratory. “They could certainly go out and hire the right people, but they haven’t done that yet and I don’t know why,” remarks Few. “Maybe in some cases, their egos are too big to accept the fact that they don’t have the skill set.”

Pete Swabey
