Is deep analytics finally a reality for the masses?

Until fairly recently, it was only financial sector businesses with deep pockets that could utilise truly huge amounts of diverse data in real time.

Deploying streaming analytics and event processing technology, banks and other institutions are using complex algorithms and low-latency messaging to take split-second decisions as they trade in the markets.

Now, however, a confluence of factors means the same technology is being adapted for use in very different sectors, such as telecoms, manufacturing or retail, where individual businesses generate masses of data.

One of the key dynamics has been the falling price of parallel processing chips, which has occurred just as their computational power has increased hugely, following Moore’s law about the doubling of transistors in integrated circuits every two years.

>See also: Harnessing the power of community in analytics

Similarly, the cost of memory has fallen markedly, so that it is no longer prohibitive for a business to place several terabytes of its data in RAM.

In simple terms, this means medium-sized businesses can now conduct tasks that were previously out of reach and required big farms of servers.

These new opportunities have led to a rapid change in attitudes towards in-memory computing, since it now takes a few hours to load and index huge amounts of data that would previously have taken days.

All kinds of data can now be stored and used in an entirely transient way that would have been inconceivable a few years ago when memory and processing power were so much more expensive.

Traditionally, companies were only able to analyse a limited amount of their internally generated data in a timely fashion. Now, however, they potentially have the capacity to draw in data from external sources, combine it with their own historical records going back decades, and submit it all to comparison and pattern matching.

Indeed, in-memory data management finally makes big data consumable when used in conjunction with the technologies that were formerly only cost-effective for stock market trading and high-performance banking operations.

Where breakthrough innovation is happening now is in combining this computational power and in-memory management with pattern-matching technology, using advanced time-based algorithms – creating a smart data layer.

A simple example of how this works is in the supermarket sector, where it is often said that if bananas are not sold within ten minutes, something is seriously amiss within a store.

Using smart data layer technology, it is possible to analyse the mass of till data to show that sales of bananas in a store are not currently matching the usual pattern. This then triggers a stock-take using in-store sensors and, if necessary, an alert to the manager via a smart badge so that action can be taken to put banana sales back on track.
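To make that flow concrete, here is a minimal sketch of how such a rule might work, assuming the till feed arrives as timestamped sale events. The class, field names and thresholds are illustrative only, not taken from any particular product.

```python
from collections import deque
from datetime import datetime, timedelta

# Illustrative rule: compare recent banana sales against the usual pattern
# for this time of day and trigger a stock-take alert if they fall behind.
class BananaSalesMonitor:
    def __init__(self, expected_per_10min, shortfall_ratio=0.5):
        self.expected = expected_per_10min       # usual sales for this time slot
        self.shortfall_ratio = shortfall_ratio   # alert if below 50% of usual
        self.window = deque()                    # (timestamp, quantity) events

    def record_sale(self, ts, quantity):
        self.window.append((ts, quantity))
        self._trim(ts)

    def _trim(self, now):
        cutoff = now - timedelta(minutes=10)
        while self.window and self.window[0][0] < cutoff:
            self.window.popleft()

    def check(self, now):
        self._trim(now)
        sold = sum(q for _, q in self.window)
        if sold < self.expected * self.shortfall_ratio:
            return (f"ALERT: only {sold} bananas sold in the last 10 minutes "
                    f"(usual ~{self.expected}); trigger stock-take")
        return None

# Example: the usual pattern says ~40 bananas every 10 minutes at this hour
monitor = BananaSalesMonitor(expected_per_10min=40)
monitor.record_sale(datetime(2016, 5, 10, 10, 5), 12)
print(monitor.check(datetime(2016, 5, 10, 10, 10)))
```

In a real deployment the "usual pattern" would itself come from historical till data held in memory, rather than the fixed figure assumed here.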

This is a straightforward example, but there are two further ways in which this technology is being used to bring major gains. One is in the creation of long-term business rules to govern processes, such as supply chain functions, manufacturing or complaint-handling.

In pharmaceuticals, for example, businesses are able to use this technology to track their supply chains and spot gaps in handling where fakes may have been substituted or stock stolen. They can also use it to flag up when consumption rates have changed.

Equally, by deploying smart data layer technology to monitor the information streaming from sensors, manufacturers of silicon wafers and high-grade copper wire have been able to lock in huge improvements and reduce waste. The technology ensures that processes are always operating optimally, by adapting to changing conditions in real time. For one wire manufacturer, waste has dropped tenfold, resulting in huge savings and a product of much higher quality.
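As an illustration of the kind of rule involved, the sketch below watches a simulated sensor feed and suggests a correction when the rolling average drifts from the target. The measurements, tolerances and suggested actions are assumptions for the example, not details from the manufacturers mentioned above.

```python
import statistics

# Illustrative streaming rule: watch a sensor feed (e.g. wire diameter) and
# flag drift from the target so the process can be corrected before material
# is wasted. Target, tolerance and actions are assumed values.
TARGET_DIAMETER_MM = 2.00
TOLERANCE_MM = 0.02

def monitor_sensor_stream(readings, window=20):
    """Yield an adjustment suggestion whenever the rolling mean drifts."""
    buffer = []
    for reading in readings:
        buffer.append(reading)
        if len(buffer) > window:
            buffer.pop(0)
        rolling_mean = statistics.mean(buffer)
        drift = rolling_mean - TARGET_DIAMETER_MM
        if abs(drift) > TOLERANCE_MM:
            yield {"rolling_mean": round(rolling_mean, 4),
                   "drift_mm": round(drift, 4),
                   "action": "reduce draw speed" if drift > 0 else "increase draw speed"}

# Example: a simulated feed that slowly drifts above the target diameter
feed = [2.00 + 0.002 * i for i in range(40)]
for alert in monitor_sensor_stream(feed):
    print(alert)
```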

The second area of application is much more transient: correlating, for example, customer relationship management information, marketing and website data for short-lived campaigns lasting 24 hours. A retail company can load all the transactions from the last 12 hours, use them for a closely targeted flash marketing campaign, and simply dispose of the data the following day.
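A minimal sketch of that transient pattern is below, using an in-memory SQLite database purely for illustration; the table, columns and campaign criteria are assumptions. The data exists only in RAM for the duration of the analysis and is discarded when the connection is closed.

```python
import sqlite3

# Illustrative transient analysis: load recent transactions into an in-memory
# store, query it to pick targets for a flash campaign, then discard it all.
conn = sqlite3.connect(":memory:")          # data lives only in RAM
conn.execute("""CREATE TABLE transactions
                (customer_id TEXT, store TEXT, category TEXT, amount REAL)""")

recent_transactions = [
    ("c1", "London-01", "groceries",   42.10),
    ("c2", "London-01", "electronics", 310.00),
    ("c3", "Leeds-02",  "electronics",  95.50),
]
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", recent_transactions)

# Target customers in one location for a short-lived electronics offer
targets = conn.execute(
    """SELECT customer_id FROM transactions
       WHERE store = 'London-01' AND category = 'electronics'"""
).fetchall()
print("campaign targets:", [row[0] for row in targets])

conn.close()   # disposing of the data: nothing is persisted to disk
```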

The algorithms within the smart data layer can look at the data, including public sentiment from social media, to help drive a rapid, short-lived campaign in a particular location. This kind of assisted decision-making has led to campaigns that have lifted sales immediately.

>See also: From insight to action: why prescriptive analytics is the next big step for big data

The smart data layer is a technology whose time has come, created by the perfect conjunction of cheaper and more powerful processing and memory, along with the use of advanced time-based algorithms.

However, if companies are to reap the benefits of these opportunities, they will need data scientists to tailor the data, rules and algorithms to their precise needs. They also require access to substantial amounts of data and to the hardware on which this technology rests. Most of all, they must have the expertise to integrate it all effectively.

Not only can this technology yield a huge return on investment, it is also capable of doing so within a few weeks, compared with the many months it might otherwise take an enterprise's IT department using traditional methods.

 

Sourced from Matt Smith, Software AG

