In the business of extracting value from data, parallels are often drawn to the book and film Moneyball. Admittedly, because it's such a great story, Moneyball is in danger of being overused as a case study in how to use data to create a competitive edge.
The film tells the story of how the Oakland A's general manager, Billy Beane, employs sophisticated analytical models of player statistics to drive his recruitment strategy.
In turn, the A's are able to outperform their peer group and, in fact, compete with teams spending far more money on players. The received wisdom from this story? That clever data analysis, like Rumpelstiltskin spinning straw into gold, can transform an organisation's competitive abilities.
Well, yes and no. What people often forget is that automated data analysis alone didn't turn the A's into a success.
Instead, it took a human, in the form of assistant GM Peter Brand, to translate the data output into something meaningful that allowed Billy Beane to identify and hire the right talent.
In short, Brand made the data relevant to the use case that Beane put in front of him – help find players who will win, but at a fraction of the budget it normally takes to be successful.
We have no shortage of data today. Quite the opposite, in fact: the frequent complaint is that people are drowning in it.
What they are actually struggling with is data relevance. All too often, business decision-makers have data pushed at them by their internal teams without context or a real understanding of that data.
It should be celebrated that organisations now recognise the value of data, but they must always remember that data only becomes truly valuable when it is presented with relevance and context.
The coming of age for the chief data officer is a reflection of organisations’ realisation that, while straightforward analysis can be automated, it takes a human to actually ensure the figures have some context – and therefore relevance – to the organisation.
For real-life examples of this in practice, you only have to look at how badly the pollsters got it wrong in both the Brexit vote and the recent US presidential election.
In both cases, pollsters fell into the trap of accumulating data and presenting it as quickly as possible to their various audiences without truly looking for the context behind what they were seeing.
In the US, people didn’t necessarily vote directly for Donald Trump; many voted against the ‘establishment’ as they saw it.
But this 'context' for voting preference wasn't picked up; instead, pollsters offered a straight choice of Trump, Clinton or Don't Know.
Would a more considered, contextual approach have helped the pollsters predict different outcomes in both cases? Perhaps not. But it would have provided them with a more balanced view on a prospective outcome.
This is the lesson to be learned from the Oakland A's and Peter Brand: your data strategy needs to be founded on the right mix of automation and intelligent human analytical input.
So do all you can to automate data management processes, but go the last mile and invest in the clever people who can then find the context behind the data.
At the same time, balance the desire to get analysis as quickly as possible with the need to ensure that you also get the right context and relevance. Do so and, like the Oakland A's, you may just turn your data into 'gold'.
Sourced by Michael Whitehead, CEO & Co-Founder, WhereScape