You wouldn’t drive a car staring at the dashboard – so why would you run your company that way?

When you’re driving a car, you spend most of your time looking out the windshield: What’s ahead? When to turn? Any unexpected obstacles in your way? Every now and then you glance at the dashboard. But that’s just to check how fast you’re going, whether you’ve got enough gas, that the engine isn’t running too hot, etc.

So why is it that in the world of Business Intelligence (BI) and visual analytics we primarily focus on data from internal systems, providing a view on internal operations and past performance, while largely leaving data on the external business environment and the future out of the equation?

The three waves of business intelligence

Corporate decision-makers feed on data from a wide and growing variety of sources. This was in fact the realisation that led to the first wave of BI software in the 1990s, which enabled companies to aggregate and make sense of the growing amount of data available. This was in the early days of the Internet, before it was recognised as a major force in the world, let alone as a source of data and intelligence.

As a result, BI soared as a technology tool focused on data from internal systems. When properly implemented, BI systems give companies valuable insight into every aspect of the operational process, from the performance of the call centre to the results of the latest efforts to fight customer churn.


In the early days, reporting largely took the form of custom-made reports in Excel, prepared by IT and delivered to executives and managers. Editing a report, let alone building a new one, was a task that required analysis, project management, a lot of overhead and many man-hours.

As reporting became more sophisticated and businesses sought opportunities to trim operational costs, the second wave of BI was born, enabling individuals to interact with data through more sophisticated business and data discovery platforms.

Decision-makers could formulate new questions and explore the data with a simple click of the mouse, no longer having to rely on pre-made assumptions or long-forgotten questions that were important when a report was conceived but no longer relevant.

These tools also emphasised the visual presentation of data, capitalising on the fact that human beings are visual creatures, able to consume an astonishing amount of information at once through our eyes and then quickly recognise patterns and notice anomalies.

However, the data preparation and layout of the visual analytics still required a technical skill-set beyond the grasp of the typical business user.

Currently, we are in the third wave of BI platforms – the age of self-service visual analytics. Business users now have increased access to powerful visual analytics tools, and – although still somewhat at the mercy of the organisation they work for – the internal data to analyse.

For most of BI, the Internet is still an afterthought

For the historical reasons mentioned above, BI platforms, especially the solutions that stem from the first wave of BI, still seem to treat the Internet as an afterthought. You can relatively easily sync BI platforms with internal databases through ODBC connections and aggregate data from a variety of on-premise enterprise systems. But if you want to integrate them with a simple online API – not so much.
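To make that contrast concrete, here is a minimal sketch in Python of the two paths: a conventional ODBC pull from an internal database next to the kind of hand-rolled glue code an online API typically still requires. The DSN, credentials, table and API URL are hypothetical placeholders, not references to any particular product.

```python
# A minimal sketch of the two integration paths described above.
# The DSN, credentials, table and API URL are hypothetical placeholders.
import pyodbc    # the classic ODBC route into an internal database
import requests  # plain HTTP glue code for an online API

# Internal data: most BI platforms handle this path out of the box.
conn = pyodbc.connect("DSN=warehouse;UID=bi_user;PWD=secret")
sales_by_region = conn.execute(
    "SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region"
).fetchall()

# External data: often needs hand-written glue like this, because a
# native connector for the API simply doesn't exist in the platform.
response = requests.get(
    "https://api.example.com/v1/exchange-rates", params={"base": "USD"}
)
rates = response.json()
```

The point is not the handful of lines of code, but that the second path usually lives outside the BI tool entirely and has to be built and maintained by hand.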

This is slowly changing, but the larger problem we need to work through is that much of the data on the Internet, whether from public sources, financial and economic databases or market research companies, doesn't even exist in APIs or other well-structured, machine-readable formats.

The problem is that a lot of this data is delivered in static formats: PowerPoint decks, Excel sheets, PDFs, not as structured, actionable data. As a result, too much of it ends up idling on hard drives with no practical way to search, compare or access it later, let alone to keep an eye on updates to the underlying data. This data, what I call Market Intelligence in aggregate, has therefore not made its way into BI in any meaningful way.
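As an illustration of the gap between static deliverables and actionable data, here is a hedged sketch of pulling an Excel deliverable into a structured, queryable form with pandas; the file name, sheet name and column names are all illustrative assumptions, not a real dataset.

```python
# A minimal sketch, assuming a hypothetical research deliverable that
# arrives as an Excel workbook rather than through an API. The file,
# sheet and column names are illustrative.
import pandas as pd

# Read the static file into a structured table (pandas uses openpyxl
# behind the scenes for .xlsx files).
market_share = pd.read_excel("vendor_market_report.xlsx", sheet_name="Market share")

# Once structured, the figures can be filtered, joined and revisited
# when the underlying data is updated, rather than idling on a drive.
emea = market_share[market_share["region"] == "EMEA"]
print(emea.groupby("vendor")["share_pct"].mean().sort_values(ascending=False))
```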

Seeing the whole story that lives within your data

External data is valuable to meaningful business intelligence in its own right, but the real power comes when you combine internal and external data. When you can see, enrich and analyse your internal data in the context of the external, you can see how all the pieces of the operation work together and how each affects the other.

For example: How does the weather affect my supply chain? I have my sales per region, but how well am I really doing given the population and demographics in those regions? What socio-economic factors characterise the areas where my stores do well? How are currency rate fluctuations affecting my quarter?
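As a hedged sketch of the second question, sales per region set against population, here is how internal and external data might be joined with pandas; both input files and their columns are hypothetical placeholders.

```python
# A minimal sketch of enriching internal sales figures with external
# demographic data. Both input files and their columns are hypothetical.
import pandas as pd

sales = pd.read_csv("internal_sales_by_region.csv")    # columns: region, revenue
census = pd.read_csv("external_census_by_region.csv")  # columns: region, population

# Join the internal and external views on a shared region key.
combined = sales.merge(census, on="region", how="left")
combined["revenue_per_capita"] = combined["revenue"] / combined["population"]

# A region that looks strong in absolute revenue may look very
# different once population is taken into account.
print(combined.sort_values("revenue_per_capita", ascending=False))
```

The same pattern extends to weather feeds, currency rates or socio-economic indicators: the external source and the join key change, but the principle does not.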

We've already mentioned a variety of external data sources. Public data alone can be valuable when it comes to providing context to the operational data. But syndicated data is also big business.

According to Outsell, a research company that studies the information and research market, traditional Market Research alone is worth close to $30 billion. Add IT research, web and social media analytics, and the number is closer to $50 billion. And that's leaving out financial databases, sensor data, geospatial and medical data, all of which are already crucial to many organisations' business, but largely disconnected from the operational data coming from internal systems.


It is clear that the visual analytics platforms of the future will need the capability to bring all this data into the same place, within the same systems, at the fingertips of the business users that need it, whether that is visualising the latest sales numbers, looking at market share tracking from a research firm, or combining the two to see whether the internal numbers stack up against competitors' projected numbers.

In 2016, we are going to see a mash-up of BI and Market Intelligence platforms, a mash-up that will be transformational for both industries:

BI vendors will have to deliver not only the software but also the data that helps their customers understand their business, while Market Intelligence providers will have to learn how to distribute their data in a more modern way than they currently do.

Together, these intelligence platforms will have to figure out how to deliver joint value in a way that best suits both of them and, more importantly, their customers. Maybe we will call this union 'Unified Intelligence'? Whatever it ends up being called, we are in for an interesting transformation of these two major industries, so buckle up and enjoy the ride.

Sourced from Hjalmar Gislason, VP of Data, Qlik
