In the early 2000s, Doug Laney, vice president at research firm Gartner, defined the three Vs of big data: volume, velocity and variety.
Since then, as the big data trend has developed, industry expert Mark van Rijmenam has added four more Vs: veracity, variability, visualisation and value, each addressing a further demand of big data.
So what are all these new Vs and what do they mean for big data? Has the concept of big data really changed?
Veracity, the first new V, is essential for big data.
For the analysis to be correct, the data itself must be accurate.
The sheer volume and velocity of big data mean that it is statistically likely to contain a large number of errors.
Good business analytics software should already have a data validation process. This means that, from the beginning, errors and discrepancies are spotted, so only high quality data is subsequently analysed, reported and actioned.
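To make the idea concrete, here is a minimal sketch of what such an up-front validation step might look like. The field names and rules are illustrative assumptions, not part of any particular product:

```python
# Illustrative sketch: reject records with missing or impossible values
# before they reach the analysis stage. Field names are hypothetical.

def validate(record):
    """Return True if the record passes basic quality checks."""
    required = ("customer_id", "amount", "timestamp")
    if any(record.get(field) is None for field in required):
        return False  # reject incomplete records
    if record["amount"] < 0:
        return False  # reject impossible values
    return True

raw = [
    {"customer_id": 1, "amount": 19.99, "timestamp": "2016-01-04"},
    {"customer_id": 2, "amount": None,  "timestamp": "2016-01-05"},
    {"customer_id": 3, "amount": -5.00, "timestamp": "2016-01-06"},
]

# Only clean records are passed on to be analysed and reported.
clean = [r for r in raw if validate(r)]
print(len(clean))  # only the first record survives
```

Real validation layers check far more (formats, duplicates, referential integrity), but the principle is the same: filter at the point of entry so that only high-quality data is analysed downstream.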
Variability is an extension to Laney’s original variety.
Big data collected from multiple sources can take a variety of different formats.
Data can now be collected from transactions, social media comments or sensors. While this large amount of data can be beneficial to companies, it can be overwhelming if they don’t know how to properly action it.
For example, in the retail industry companies improve their brand perception and customer loyalty by monitoring variables such as purchasing habits, social media interaction and in-store complaints.
The problem with variability, especially for social media, is that most analytics software cannot register the meaning of a tweet in context, and so cannot correctly evaluate whether it expresses a positive or negative reaction.
More advanced analytics software performs sentiment analysis on this kind of data, using algorithms that interpret the context of a message and decipher the intended meaning of its words.
This makes the data collected accurate and means that it can be visualised correctly, as either a positive or a negative view.
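The simplest form of the idea can be sketched with a toy lexicon-based scorer. Production sentiment analysis uses far more sophisticated, context-aware models; the word lists and messages below are purely illustrative assumptions:

```python
# Toy lexicon-based sentiment scorer (illustrative only).
# Real analytics platforms use context-aware algorithms.

POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"broken", "slow", "terrible", "refund"}

def sentiment(text):
    """Classify a message as positive, negative or neutral."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Love the fast delivery"))   # positive
print(sentiment("My order arrived broken"))  # negative
```

A bare word-counting approach like this fails exactly where the article says simple tools fail, on sarcasm, negation and slang, which is why context-aware algorithms matter.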
Although analysing big data can yield a wealth of useful information, many business leaders will struggle to make use of it if it is not presented in an easy-to-understand format.
Powerful business analytics software is able to analyse data from multiple sources and convert it into one manageable stream of data.
Businesses can then use this to change their processes quickly and easily, rather than having to manually correlate data across hundreds of files, documents and databases.
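The merging step described above can be sketched in a few lines. The sources and field names here are hypothetical; the point is simply that records keyed on a common identifier can be folded into one combined view instead of being correlated by hand:

```python
# Sketch: fold records from multiple (hypothetical) sources into one
# combined view keyed by customer ID, rather than correlating manually.

transactions = [{"customer": "C1", "spend": 120.0}]
complaints   = [{"customer": "C1", "issue": "late delivery"}]

combined = {}
for record in transactions:
    combined.setdefault(record["customer"], {})["spend"] = record["spend"]
for record in complaints:
    combined.setdefault(record["customer"], {})["issue"] = record["issue"]

# combined["C1"] now holds both the spend figure and the complaint,
# ready to be analysed as a single stream.
print(combined["C1"])
```

Analytics software does this at scale across hundreds of files, documents and databases, but the underlying operation is the same join on a shared key.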
One of the specific pressures of big data is speed.
With the rise of the Internet of Things (IoT), more and more devices are becoming equipped with sensors that can feed back data. Although data analysts would traditionally use manually generated reports, new monitors such as smart meters demand near real time reporting.
This is clearly impossible to do without the use of analytics software and means that quick and easy visualisation interfaces are essential.
Although many analytics programmes can capture data in real time, some software still relies on SQL queries to perform searches. Even for the most technologically savvy employees, writing these queries takes time.
This means that the results are not truly representative of real-time reporting.
Software such as Connexica’s CXAIR features a user-friendly search-engine style interface, which allows anyone to generate accurate, up to date, reports and means that employees can view data and make decisions instantly.
Smart visualisation is truly the first step to the democratisation of business intelligence — the ability for anyone in the business, regardless of their technical ability, to gain actionable insights from big data.
This means that, instead of spending hours poring over complicated reports, business leaders can roll out self-service analytics, making data accessible to everyone from administration clerks to C-level leaders.
The value of big data is in the analysis, not in the data itself.
Research firm McKinsey predicts that big data has a potential annual value of $250 billion to Europe's public sector. However, that data is worthless if it cannot be effectively actioned.
For example, the UK Mid Kent Services (MKS), a local authority partnership consisting of Maidstone, Swale and Tunbridge Wells Borough Councils, used Connexica’s CXAIR business analytics software to deal with a staggering volume of data.
The data consisted of over 20 million car parking records, 500,000 service-call records and 65,000 council tax records. Certainly not an easy task.
In the past, administrative staff had to spend hours combining data from multiple sources to prepare reports for managers, a time-consuming and inefficient manual process.
Using CXAIR, managers could see all of this information in one format, meaning it could be transformed into actionable data.
By eliminating the manual processing, MKS was able to deliver reports more quickly and accurately, with information that could be actioned in a more timely manner.
Just as Van Rijmenam's additional Vs help us better understand the complexities of modern big data, they also point to how that data should be analysed.
Given the increase in the amount of data that businesses must now manage, it makes sense that they use effective ways to gather, sort and analyse it. This is where good business analytics software is indispensable – with the right software and the right plan, maintaining manageable and actionable data is no longer a daunting task.
Greg Richards, sales and marketing director, Connexica