From global to local: how enterprises can embrace truly ‘pervasive BI’

As business intelligence (BI) becomes mainstream, there is a general trend for it to stop being the province of specialists. Many organisations are now moving beyond narrow, focused analytics projects into broader adoption that sees data embedded into thousands of everyday decision-making points and workflows throughout the enterprise. This has given rise to the buzzword ‘pervasive analytics’.

Pervasive analytics can be defined as business intelligence that is deployed at all levels of an organisation and across it, from finance and marketing to HR and manufacturing; from the boardroom to the shop floor. It also provides huge value when deployed both inside and outside the organisation, sharing information with other businesses, customers, suppliers and partners. Pervasive analytics offers the mechanism to turn information into actionable insight, and is inextricably linked to agile growth.

As a term, it is superseding ‘big data’ by promising a more ‘focused data’ approach, one that ensures the different data streams deliver the right information to the right person at the right time.

> See also: Breaking out: how BI is moving beyond the enterprise's four walls 

‘Big data – although it offers an organisation a mass of useful data – is difficult to consume by business users, and often the interfaces are more suitable for data scientists,’ explains Richard Neale, EMEA marketing for BI software firm Birst. ‘Pervasive analytics, due to its more structured approach, allows all business users to access their data.’

‘Because more employees are involved in the decision-making process and more departments have easy access to the information, pervasive analytics allows organisations to make quicker decisions and gather information faster.’

Cultural shift

In pervasive analytics, the process around the management and monitoring of business intelligence is ingrained, fluid, constantly ongoing, and happens in real time.

As Heleen Snelting, data scientist at TIBCO Software, explains, this approach can facilitate a cultural shift by broadening out the concern and responsibility of data insight from being the sole domain of data scientists and specialists to an organisation-wide interest, established as a valuable asset at the heart of business change.

‘A critical feature in facilitating this is the role of self-service analytics, which is intrinsic to achieving greater accessibility to the latest information,’ she says. ‘By equipping every individual across an organisation to drive the business metrics through the use of data, the opportunities for growth, competitive advantage and enhanced business value expand vastly.

‘This is further compounded by the benefits of learning from historical intelligence and applying that knowledge to real-time data to influence what is about to happen, in turn avoiding or minimising risk and optimising opportunity.’

CIOs can empower the business to better leverage BI by making agile analytics pervasive and delivering predictive analytics at every point of customer engagement to drive new levels of innovation and business performance.

Tangible evidence suggests that pervasive analytics is far from hype. Leading organisations have already begun to realise significant returns, borne out by findings from the Boston Consulting Group, which, having surveyed 1,500 global senior executives, found that information-driven enterprises that leverage analytics generate 12% higher revenues than their counterparts.

And according to figures from the Aberdeen Group, organisations with pervasive BI saw an average 24% increase in profit across operations as a result of improved performance.

What is important?

But reaching this state is not an easy task. For the vision of pervasive analytics to come to fruition, organisations have to embrace automation, flexibility and agility. As businesses change strategies, methodologies and priorities, what is useful for making decisions today may not be important tomorrow.

‘This lack of flexibility, especially when dealing with a rich technology map, has taken costs through the roof, making traditional analytics quite rigid,’ says Oracle’s EMEA director of big data, Michael Connaughton.

‘The industry goal is to make this more agile, by creating tools that help companies find the right balance between operations, customer experience and impact, cost-profitability analysis, automation and innovation.’

According to Connaughton, there are five key elements for a pervasive analytics project to come to fruition. First, data needs to be captured, cleaned and pre-processed to ensure that you have the right information, in the right quantities, to work with. If businesses can only get some of the data within a given time frame and cannot complete the workflow, or if the data is incomplete or riddled with inaccuracies or duplication, then they should not be using it to make decisions.
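As a purely illustrative sketch of this first step (not code from Connaughton or Oracle), the snippet below runs basic completeness and duplication checks on a hypothetical orders extract before it is allowed anywhere near a decision; the file, column names and thresholds are assumptions.

```python
# A minimal sketch of pre-use data quality checks, assuming a hypothetical
# orders extract loaded with pandas. Column names and thresholds are
# illustrative only.
import pandas as pd

def is_fit_for_decisions(df: pd.DataFrame,
                         required_cols=("order_id", "customer_id", "amount"),
                         max_missing_ratio=0.02,
                         max_duplicate_ratio=0.01) -> bool:
    """Return True only if the extract is complete enough to act on."""
    # All required columns must be present.
    if not set(required_cols).issubset(df.columns):
        return False
    # Reject extracts with too many gaps in key fields.
    if df[list(required_cols)].isna().mean().max() > max_missing_ratio:
        return False
    # Reject extracts riddled with duplication.
    if df.duplicated(subset="order_id").mean() > max_duplicate_ratio:
        return False
    return True

orders = pd.read_csv("orders_extract.csv")  # hypothetical source file
if not is_fit_for_decisions(orders):
    raise ValueError("Extract failed quality checks; do not use it for decisions")
```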

Secondly, models must be validated and refreshed to make sure they are accurate and represent the current situation and trends.
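As a hypothetical illustration of that second step, the sketch below scores an existing model against a hold-out of recent data and refreshes it only when performance has drifted; the scikit-learn estimator and the 0.75 accuracy floor are assumptions made for the example, not figures from the article.

```python
# A minimal validate-and-refresh sketch, assuming a scikit-learn style
# classifier and recent labelled data. The accuracy floor is illustrative.
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

def validate_and_refresh(model, X_recent, y_recent, X_train, y_train,
                         min_accuracy=0.75):
    """Retrain the model if it no longer represents the current situation."""
    score = accuracy_score(y_recent, model.predict(X_recent))
    if score < min_accuracy:
        # Performance has drifted: refresh on the latest training window.
        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return model
```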

> See also: How scaling up data is the key to effective BI

‘Industrialisation,’ says Connaughton, ‘is another key step. What works for a 'canned' sample or test may not work on real-world datasets, or may not scale out. In a development or data lab environment, many conditions are relaxed in order to achieve rapid results; however, it is critical to ensure that the whole process gets revamped and duly optimised. Integration with other systems is part of this activity too.’

After the solution is industrialised, businesses must also make sure that it is properly operated. Automation plays a critical role in avoiding human mistakes when running analytical processes.

After creating any models, businesses need to make sure they are tested, validated, industrialised, and properly automated and operated, says Connaughton. ‘If models are not launched and refreshed, outcomes will not be produced and it can lead to instability and a representation of an old business picture. Ultimately, the outcomes of the models need to serve the decision-making process.’

Finally, outcomes must be actionable and used for decision-making, he says. ‘As part of this, it is critical to link models with business intelligence tools or monitoring systems to provide visualisations that make the data accessible.’

Traditional attitudes to BI and analytic solutions have always made assumptions about the way different people in organisations work with data, resulting in strict roles. Many BI solutions and IT architectures have offered separate products to present data, such as dashboards and discovery tools, aimed at specific audiences.

For example, visual discovery platforms are used by those who use data to answer questions and deliver insights to the company. However, with pervasive analytics being used for an entire organisation, different departments will no longer have their own targeted data. ‘This will suit some parts of an organisation,’ says Neale, ‘but for today’s modern mobile and data-savvy workforce, traditionally structured IT architectures will need to change.’

Traditional IT structures also hold much of their data across multiple spreadsheets, which are managed by many departments, present many views of the same data and allow individuals to edit it.

‘Therefore, this data cannot be trusted to make timely decisions as it could be inaccurate,’ says Neale. ‘This can be resolved by automating the modification of these multiple data sources to gain real-time visibility in order to make better decisions, hence why adopting pervasive analytics is so beneficial to an organisation.’
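A minimal sketch of what that automation might look like, assuming pandas and a set of hypothetical departmental spreadsheets that share one layout (the record_id and last_updated columns are invented for illustration):

```python
# Consolidate departmental spreadsheets into one deduplicated, governed view.
# File paths and column names are hypothetical.
import glob
import pandas as pd

frames = [pd.read_excel(path) for path in glob.glob("departments/*.xlsx")]
combined = pd.concat(frames, ignore_index=True)

# Keep the most recently updated row per record, so every team sees the same
# current figures rather than its own edited copy.
combined = (combined.sort_values("last_updated")
                    .drop_duplicates(subset="record_id", keep="last"))

combined.to_csv("governed_view.csv", index=False)  # single trusted source
```

Run on a schedule, a job along these lines gives every department the same up-to-date view rather than competing copies of the truth.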


> See also: Five reasons why BI and analytics are the top CIO priority

Just as they have been the bane of big data projects, data silos are undoubtedly the number-one enemy of pervasive BI. Data stored in disparate, unconnected sources cannot easily be put to work to produce meaningful insights. However, networking data sources together can help companies avoid this problem.

‘Rather than dumping all information into a data lake which makes it hard for business users to navigate, the sources can be joined together and analysed to spot patterns that can lead to insights,’ says Neale.
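As a hypothetical illustration of that approach, the sketch below joins invented CRM and finance extracts on a shared key and surfaces one simple pattern; the file names and columns are assumptions, not a description of Birst's product.

```python
# Join two sources on a shared key and look for a pattern, rather than
# dumping everything into a data lake. Files and columns are hypothetical.
import pandas as pd

crm = pd.read_csv("crm_accounts.csv")          # hypothetical CRM extract
finance = pd.read_csv("finance_invoices.csv")  # hypothetical finance extract

joined = crm.merge(finance, on="customer_id", how="inner")

# Simple pattern: which account segments are slowest to pay?
by_segment = (joined.groupby("segment")["days_to_pay"]
                    .mean()
                    .sort_values(ascending=False))
print(by_segment.head())
```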

The biggest challenge exists when there is central data that is critical to the whole business – finance data, say – while other sources either exist within specific teams or are brought in from outside.

Rather than combining data centrally in a monolithic data warehouse, which is time consuming and likely to end with user frustration, it is worth looking at how to bring data together in a smarter way. Centrally governed data can be mixed with individual data to provide the flexibility that users are looking for.

Using cloud-based BI, companies have the opportunity to create virtual instances of data that network different forms of that data into one place. As Neale explains, ‘networked BI’ creates a network of interwoven BI instances that share a common analytical fabric.

‘This enables organisations to expand the use of BI across multiple regions, departments and customers in a more agile way, and empowers these decentralised teams to augment the global analytical fabric with their own local data,’ he says. ‘The result is local execution with global governance, eliminating data silos once and for all and dramatically accelerating the delivery of BI across the enterprise.’


Ben Rossi
