Big data in the enterprise

Why big data?

The current ‘it’ thing is artificial intelligence (AI). Although AI has been around for decades, only recently has it progressed into mainstream consumer environments. Due to its high cost of entry, the industry has mostly been dominated by brands with deep pockets and access to massive amounts of data; that is because AI is nothing without today’s other great buzz phrase: big data.

AI with limited data is often no more than a set of rules that returns rudimentary answers. Data is instrumental in helping AI systems learn how humans think and feel, and it also allows data analysis to be automated. Without enough data – AI’s raw material – we would see something similar to that terrible example of “AI-powered” help, Microsoft’s Clippy.

>See also: Why do big data projects fail?

However, with the recent explosion of data, algorithms can now be trained to deliver a better result and help us do our jobs more efficiently.

What is big data, and how is it applied?

An example of what AI can do when powered by big data is Google’s ever-evolving translation service. Over ten years ago, Google moved from a rules-based system to a statistical, learning-based AI system – using billions of words from real conversations and text to build a more accurate translation model. However, that was just the beginning.

Now businesses in all industries are joining the likes of Google. Today many fashion retailers, such as ASOS, offer AI-powered services to anticipate customers’ needs and provide better service.

To help ASOS’ customers express their own sense of style, the retailer uses AI image-recognition software such as Wide Eyes to analyse customer photos – locating items such as hats, skirts and handbags – and recommend relevant collections from its current catalogue. This near-instant analysis has been made possible by training the software on thousands of images.

As demonstrated above, the user-experience benefits of using big data to help customers describe what they want are self-evident, but that’s only the beginning. The applications of big data span many different industries. Many brands are now even using big data to make better marketing decisions, building tools such as customer lifetime value (CLV) models.
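To make the idea concrete, here is a minimal Python sketch of what a CLV estimate boils down to. The formula (average order value × purchase frequency × expected lifespan) is a common textbook simplification, not any particular brand’s production model, and all figures are invented:

```python
# A deliberately simple customer lifetime value (CLV) estimate.
# Real models (often built on big data) add churn probabilities,
# discount rates and per-segment behaviour; this is just the core idea.

def customer_lifetime_value(avg_order_value, purchases_per_year, expected_years):
    """Estimate CLV as average yearly revenue times expected customer lifespan."""
    return avg_order_value * purchases_per_year * expected_years

# Hypothetical customer: £40 per order, 5 orders a year, retained for 3 years
print(customer_lifetime_value(40.0, 5, 3))  # 600.0
```

A marketer could compare this figure against the cost of acquiring or retaining a customer segment to decide where to spend.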

>See also: How big data and analytics are fuelling the IoT revolution

AI and big data algorithms – such as random forests, cosine similarity and deep recurrent neural networks – can analyse all possible influencing factors, surface those that make the most impact, and tell you whether you should spend your marketing dollars encouraging repurchase among certain customer segments.
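Of the algorithms named above, cosine similarity is the simplest to show concretely: it scores how alike two feature vectors are, regardless of their magnitude. A minimal Python sketch, using invented purchase-count profiles for two customers:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical per-category purchase counts for two customers.
# Parallel vectors (same tastes, different spend) score close to 1.0;
# vectors with nothing in common score 0.0.
print(cosine_similarity([3, 0, 1], [6, 0, 2]))
print(cosine_similarity([1, 0], [0, 1]))
```

In a recommendation setting, customers with similar profiles can be shown items the other has already bought.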

Each of these AI applications requires a lot of data to be successful. The big five – Google, Apple, Facebook, Amazon and Microsoft – don’t just have big data: they have petabytes of data recording our every digital movement.

That being said, big data and AI are not beyond the reach of the rest of us. It’s not just sheer volume that matters, but the quality of the data. Take the datasets available via Transport for London as an example: exposing its historic journey data is a great initiative, enabling beautiful visualisations like Oliver O’Brien’s Tube Heartbeat.

However, this alone doesn’t give you much insight into what customers are experiencing, where they are going, or the reasons for delays and failures. To derive interesting insights into the why, you need to marry data with context – weather, events and other factors that could affect transport.

For those of us outside the big five, is it too late? Do they have all the big data sewn up? No; we’ve seen many big brands (some outlined above) join the big data game.

>See also: The information age: unlocking the power of big data

However, data should be retained and guarded: it is an asset that deserves recognition on your balance sheet. Used properly, it can give you a competitive advantage over others.

How do you process big data?

Early big data processing used techniques like MapReduce, but data scientists need higher-level tools that require less programming to draw correlations between different data sets and solve scientific, social or industrial problems.
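For readers who haven’t met the pattern, MapReduce splits work into a map step that processes each record independently and a reduce step that merges the partial results. A toy word count in plain Python (not a distributed implementation – the invented documents stand in for records spread across a cluster) sketches the idea:

```python
from functools import reduce
from collections import Counter

def map_phase(document):
    """Map step: turn one document into its per-word counts."""
    return Counter(document.lower().split())

def reduce_phase(counts_a, counts_b):
    """Reduce step: merge two partial word counts into one."""
    return counts_a + counts_b

# Hypothetical documents standing in for records on different machines
documents = ["big data needs context", "context makes data useful"]
word_counts = reduce(reduce_phase, map(map_phase, documents))

print(word_counts["data"])     # 2
print(word_counts["context"])  # 2
```

In a real cluster the map calls run in parallel on many machines and the reduce step combines their outputs; frameworks exist precisely to hide that distribution from the programmer.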

Apache Spark is a leader in this area, providing elegant and simple ways to express complex analyses: you can run them on small sample data sets quickly, then run the same analysis on big data sets by distributing the tasks across many machines.

The big data killjoy

The General Data Protection Regulation (GDPR) comes into full force across Europe in May 2018 and will replace existing data protection guidance. The legislation is intended to protect individuals’ personally identifiable information (PII) by unambiguously stating what customers are signing up for when providing their data. Moreover, individuals gain tighter control over their data, including specific rights to erasure, to access ‘their’ data records and to change their consent.

>See also: How can a business extract value from big data?

For example, if a supermarket requires a customer’s personal data to fulfil a specific service they have asked for, that’s one thing; but keeping that data afterwards and using it to target the customer for marketing purposes, long after the service has been actioned, requires specific, actionable consent to be granted.

Some experts predict half of all consumer data stored today could become redundant, or will need to be deleted, to be compliant with the new regulation (Information Age). So should we give up on big data? No. Although we will no longer be able to capture as much data as before with vague statements about what we intend to do with it, GDPR brings an opportunity to fine-tune the customer value exchange, engender trust and loyalty from the customer, and make every piece of data matter. At a practical level it may mean making an effort to recapture consent and restate the intent for processing in advance of May 2018.

Although this seems like a lot of trouble in the short term, harnessing big data using AI is worth the effort; firms that are not embracing such technologies are already lagging behind in productivity terms and losing out to the competition.


Sourced by Andrew Liles, CTO at Tribal Worldwide
