Clearing the blurred lines around real-time analytics

Despite the well-deserved attention paid to real-time analytics, many businesses would be hard-pressed to define what truly constitutes “real-time.”

When a merchandiser at a big box retailer talks about “real-time analytics”, for example, he or she may actually want a sales dashboard that is updated several times a day.

But when a marketing manager at a mobile telco talks about real-time analytics, he or she may want the capability to automatically send offers to customers within seconds of them tripping a geo-fence.

Similarly, someone in capital markets trading may have expectations of “real-time” systems that are measured in microseconds.

Since appropriate solutions to these different problems typically require very different architectures, technologies and implementation patterns, knowing which “real-time” you are dealing with really matters.

Before you get started, pause to consider

Real-time systems are often about detecting an event – and then making a smart decision about how to react to it.

The observe-orient-decide-act or “OODA loop” gives us a useful way to model the decision-making process.
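
As a rough sketch of how one pass around that loop might look for a single detected event – the function names, Event structure and placeholder context below are illustrative assumptions, not a standard API:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    kind: str                                   # e.g. "geo_fence_trip"
    payload: dict = field(default_factory=dict)

def observe(event: Event) -> dict:
    # Observe: capture the raw event as it arrives.
    return {"event": event}

def orient(observation: dict) -> dict:
    # Orient: enrich the observation with whatever context we can reach in time.
    observation["context"] = {"segment": "frequent_shopper"}   # placeholder context
    return observation

def decide(situation: dict) -> str:
    # Decide: choose a response (by human or machine -- see below).
    return "send_offer" if situation["event"].kind == "geo_fence_trip" else "ignore"

def act(action: str) -> None:
    # Act: execute the chosen response.
    print(f"executing: {action}")

# One trip around the OODA loop for one detected event.
act(decide(orient(observe(Event("geo_fence_trip")))))
```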


So what can a business leader do to minimise confusion when engaging with IT at the start of a real-time project?

Understand how we will detect the event that we wish to respond to.

Sometimes this is trivial. Other times it is rather tougher – especially if the “event” we care about is one where something that should happen does not, or one that represents the conjunction of multiple events from across the business.
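
The “non-event” case deserves a sketch of its own. One common pattern – assumed here, not prescribed by the article – is a watchdog that treats a missed deadline as the event itself; the order-confirmation scenario and the 60-second window below are purely illustrative:

```python
import time

# Track when each expected event is due, keyed by some business identifier.
deadlines = {}            # order_id -> deadline (unix seconds)
TIMEOUT_SECONDS = 60      # illustrative window, not a recommendation

def expect_confirmation(order_id: str) -> None:
    """Record that a confirmation for this order should arrive soon."""
    deadlines[order_id] = time.time() + TIMEOUT_SECONDS

def confirmation_arrived(order_id: str) -> None:
    """The expected event happened; stop watching for it."""
    deadlines.pop(order_id, None)

def check_for_non_events() -> list[str]:
    """The 'event' we care about: confirmations that should have arrived but did not."""
    now = time.time()
    overdue = [oid for oid, deadline in deadlines.items() if now > deadline]
    for oid in overdue:
        deadlines.pop(oid)
    return overdue
```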

Clarify who will be making the decision – man, or machine?

The Mark 1 eyeball has powers of discretion that machines sometimes lack. But its carbon-based owner is not only much slower than a silicon-based system; it can also only make decisions one at a time, one after another.

If a human is put in the loop, you normally move into “please-update-my-dashboard-faster-and-more-often” territory.

Being clear about decision latency is also important – how soon after a business event do we need to take a decision? And implement it?


You also need to understand whether decision latency and data latency are the same. Sometimes you can make a good decision now on the basis of older data. But sometimes you will need the latest, greatest and most up-to-date information to make the right choices.
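
A toy sketch may help separate the two latencies – the one-hour freshness threshold and the profile structure below are invented for illustration:

```python
import time

def decide(event_time: float, profile: dict) -> str:
    """Sketch only: the two latencies are independent dials."""
    now = time.time()
    decision_latency = now - event_time        # gap between the event and the decision
    data_latency = now - profile["as_of"]      # age of the data we decide on
    print(f"decision latency: {decision_latency:.1f}s, data latency: {data_latency:.1f}s")
    # A fast decision (low decision latency) can still rest on stale data
    # (high data latency), or vice versa.
    if data_latency < 3600:                    # illustrative one-hour freshness bar
        return "decide now on the cached profile"
    return "refresh the profile before deciding"

# An event from 2 seconds ago, decided against a profile refreshed 10 minutes ago.
print(decide(time.time() - 2, {"as_of": time.time() - 600}))
```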

Balance the often competing requirements of decision sophistication and data availability. Do you need to leverage more – and potentially older – data to take a good decision? Or can you make a “good enough” decision with less data?

Can you have your cake and eat it?

Consider this – you want to send an offer in near real-time when a customer comes within half a mile of a particular store or outlet.

You can do so solely on the basis that the customer has tripped a geo-fence – which means that the only information you need is their location, right now.
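
That simplest case can be sketched in a few lines: a plain distance check against the fence. The haversine formula is a standard great-circle approximation; the half-mile radius comes from the example above.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8
FENCE_RADIUS_MILES = 0.5          # the half-mile fence from the example

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def tripped_geo_fence(customer_lat, customer_lon, store_lat, store_lon) -> bool:
    # The only input we need is where the customer is right now.
    return haversine_miles(customer_lat, customer_lon, store_lat, store_lon) <= FENCE_RADIUS_MILES

print(tripped_geo_fence(40.7128, -74.0060, 40.7180, -74.0010))  # ~0.44 miles apart -> True
```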

But what if you first want to understand whether you have made the same offer to them before, how they did or didn’t respond, which offers other customers with similar past behaviours have or haven’t responded to in the last six months, and so on?

Then you also need to access other data – beyond their current location – that may be stored elsewhere, outwith the streaming system.
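
A hedged sketch of the difference: the fast path needs nothing but the event, while the “good” path has to call out to other systems first. All of the lookup functions below are hypothetical stand-ins for whatever CRM, offer log or model store actually holds this history:

```python
# Hypothetical stand-ins for systems that live outwith the streaming engine.
def fetch_offer_history(customer_id):
    return ["10pct_coffee"]                              # offers we've already made
def fetch_response_history(customer_id):
    return {"10pct_coffee": False}                       # ...and whether they took them up
def fetch_lookalike_responses(customer_id):
    return {"free_pastry": 0.31, "10pct_coffee": 0.05}   # uptake among similar customers

def fast_offer(location) -> str:
    """Fast path: decide on the current location alone -- nothing extra to fetch."""
    return "generic_welcome_offer"

def good_offer(customer_id, location) -> str:
    """Slower path: every lookup below adds latency before we can decide."""
    past = fetch_offer_history(customer_id)
    responses = fetch_response_history(customer_id)
    lookalikes = fetch_lookalike_responses(customer_id)
    # Drop offers this customer has already declined; rank the rest by
    # how well they worked on customers with similar behaviour.
    candidates = {offer: uptake for offer, uptake in lookalikes.items()
                  if not (offer in past and not responses.get(offer, False))}
    return max(candidates, key=candidates.get) if candidates else fast_offer(location)

print(good_offer("c42", (51.5, -0.1)))   # -> "free_pastry"
```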


In this case, the cost of choosing to give your customer a more sophisticated and personalised offer is the time it takes to fetch and process that data, so “good”, here, may be the enemy of “fast”.

Here there is a choice to be made between “OK right now” and “great a little later”. That trade-off is normally very dependent on use case, channel and application.

Playing the game

Of course, you can try to game the system – by pre-computing next-best actions for a variety of different scenarios.

This way, you can try to be both fast and good: when an event is detected, you merely fetch the result of a complex calculation already made with lots of data, instead of gathering the underlying data and running the numbers on the spot.
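
A minimal sketch of the pattern, assuming a batch job has already populated a lookup table of next-best actions – the segment and scenario keys are invented for illustration:

```python
# Pre-computed overnight (or hourly) from lots of historical data by a batch job.
NEXT_BEST_ACTION = {
    ("frequent_shopper", "near_store"): "free_pastry",
    ("lapsed_customer", "near_store"): "20pct_winback",
}

def react(segment: str, scenario: str) -> str:
    """At event time, all we do is a key lookup: fast AND informed by rich data,
    at the price of maintaining the table and losing per-event flexibility."""
    return NEXT_BEST_ACTION.get((segment, scenario), "no_action")

print(react("frequent_shopper", "near_store"))   # -> "free_pastry"
```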

But then the price you pay is reduced flexibility and increased complexity. And by definition, decision latency and data latency are different when a business “cheats” like this, because you’re making the decision based on the data from previous interactions, not the latest data.

There are different costs and benefits associated with all these options.

There is no wrong answer – they are all more or less appropriate in different scenarios. But make sure that you understand your requirements before IT starts evaluating streaming and in-memory technologies.


Sourced by Martin Willcox, director of big data, Teradata
