World Cup teams and corporations get ahead with analytics

The World Cup has come a long way since its inaugural 1930 tournament, which saw thirteen teams battle it out in the Uruguayan capital of Montevideo. European teams played off the back of a three-week trans-Atlantic sea voyage, and the final saw Uruguay clinch the trophy before an audience of 90,000. Fast forward to Russia 2018; private airlines replace ocean liners, and over 3 billion global fans are expected to tune in to the final. Football has become a truly globalised and sophisticated industry, with the financial stakes growing ever higher for a team to succeed amongst fierce competition.

When competing at this level, relying on talent and athleticism alone is too much of a risk for the manager and the board they report to. Football teams, with their armies of coaching staff, are increasingly harnessing the potential of technology, reinvesting their money in innovations that can help them stay ahead.

>See also: Sports data: why it can fall flat and why it doesn’t have to

Professional clubs now routinely collect vast quantities of data from training sessions and during matches. Sensors detect players’ movements along an X-Y axis, personal monitoring systems track each player’s physiological metrics such as heart rate, and ‘smart ball’ technology tracks the ball’s precise position on the pitch. As this World Cup has shown, a free kick can seal a match, so data is even collected on the point of contact to determine the spin on the ball.

Vast amounts of data are accrued in real time, but raw data is not intrinsically useful on its own. The true challenge lies in deriving patterns and trends from this mass of information that give real, actionable insights to help win the game. The answer to this problem is data analytics. Leading football clubs employ a ‘ghost team’ of scientists who use powerful analytics tools to explore the huge amounts of data. Here, a fast in-memory database makes it possible to sift through millions of rows of data, identify rich seams and build new analytics that interpret the data in myriad ways.
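As a minimal sketch of that sifting (the player IDs, coordinates and sampling rate below are all invented), even plain Python can turn raw X-Y fixes into a simple insight such as distance covered per player:

```python
import math
from collections import defaultdict

# Hypothetical tracking fixes: (player_id, t_seconds, x_metres, y_metres).
# A real feed would contain millions of rows per match.
samples = [
    ("P7", 0.0, 10.0, 20.0),
    ("P7", 0.1, 10.4, 20.3),
    ("P7", 0.2, 10.9, 20.9),
    ("P9", 0.0, 40.0, 30.0),
    ("P9", 0.1, 40.1, 30.1),
]

def distance_covered(samples):
    """Sum the straight-line distance between consecutive fixes per player."""
    last = {}
    totals = defaultdict(float)
    for player, t, x, y in sorted(samples, key=lambda s: (s[0], s[1])):
        if player in last:
            px, py = last[player]
            totals[player] += math.hypot(x - px, y - py)
        last[player] = (x, y)
    return dict(totals)

print(distance_covered(samples))
```

An in-memory analytics database would run the equivalent aggregation in SQL across millions of rows rather than a handful.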

>See also: How the NFL brought game-changing tech to its London matches

Players receive tailor-made training programmes based on their individual stats, which range from in-game performance to physical condition, and a player’s programme can be adapted to lower their risk of injury. Advanced analysis of players’ data provides a dynamic team model that factors in the strengths and weaknesses of each player. These are then aggregated to support the manager’s observations and create a winning team strategy.
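A toy sketch of such an aggregated team model, with entirely hypothetical players and ratings, might average each metric across the squad to surface a collective weakness:

```python
# Hypothetical per-player ratings (0-100) produced by the analytics pipeline.
squad = {
    "keeper":   {"passing": 55, "stamina": 60, "duels": 70},
    "defender": {"passing": 70, "stamina": 75, "duels": 85},
    "striker":  {"passing": 80, "stamina": 90, "duels": 65},
}

def team_profile(squad):
    """Average each metric across the squad to expose collective weaknesses."""
    metrics = {}
    for ratings in squad.values():
        for name, value in ratings.items():
            metrics.setdefault(name, []).append(value)
    return {name: sum(vals) / len(vals) for name, vals in metrics.items()}

profile = team_profile(squad)
weakest = min(profile, key=profile.get)
print(profile, "-> focus training on:", weakest)
```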

The German national team is one squad that doesn’t shy away from digitalisation. Despite its hardship in Russia 2018, the team credited data analytics for its 2014 victory. Its data science team used descriptive analytics to scrutinise its match history and found that its average time in possession was 3.4 seconds. This left its players vulnerable to offensive tactics, so in 2014 the team trained hard to reduce possession time to just 1.1 seconds and outmanoeuvre the opposition. By adjusting its style of play, the team maintained an advantage over its opponents throughout the competition and finished as victors.
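The descriptive step itself is straightforward to sketch. Assuming touch events with receive and release timestamps (the values below are invented to echo the 3.4-second figure), the statistic is just a mean over touch durations:

```python
# Hypothetical touch events: (t_receive_seconds, t_release_seconds).
# Values invented to echo the 3.4-second average the team measured.
touches = [(3.0, 6.5), (10.0, 13.2), (20.0, 23.5), (30.0, 33.4)]

def mean_possession(touches):
    """Average time on the ball per touch: the descriptive statistic itself."""
    durations = [release - receive for receive, release in touches]
    return sum(durations) / len(durations)

print(round(mean_possession(touches), 1))  # → 3.4
```

The analytics value comes not from the arithmetic but from collecting reliable event data and acting on the trend it reveals.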

>See also: What can businesses learn from sports clubs that use technology?

Football is just beginning to embrace the power of data analytics, but data-driven strategies are not just limited to sport. Any competitive business can get ahead by gaining insights that help them respond rapidly to changes in the market or consumer behaviour, just as a football team responds to the changing conditions on the pitch.

Large organisations need to be aware of what is happening in the present moment, but with multiple products, sales channels and global operations, keeping a proverbial “finger on the pulse” of the business is a complex challenge at huge scale. Employees record and produce internal information every second, which can be analysed against the market environment using the organisation’s marketing data and third-party subscriptions. Building an infrastructure to aggregate and correlate these sources on the fly makes it possible for Business Intelligence (BI) or data science teams to perform timely analytics. Leading data-driven businesses use the resulting actionable insights to identify opportunities early, improve operational processes or target potential customers – think Black Friday sales, online gaming or sports betting.
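As a hedged sketch of that on-the-fly correlation (the dates, sales figures and market index are all invented), joining an internal source with a third-party feed can be as simple as matching on a shared key such as date:

```python
# Hypothetical internal sales log and third-party market index, keyed by date.
sales = {"2018-07-13": 1200, "2018-07-14": 1800, "2018-07-15": 2600}
market_index = {"2018-07-13": 98.0, "2018-07-14": 101.5, "2018-07-15": 107.2}

def correlate(sales, market_index):
    """Join the internal and external sources on their shared dates."""
    return [
        {"day": day, "units": sales[day], "index": market_index[day]}
        for day in sorted(sales.keys() & market_index.keys())
    ]

for row in correlate(sales, market_index):
    print(row)
```

At production scale the same join runs inside the analytics database, but the principle of aggregating and correlating sources is the same.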

>See also: Cloud-based data analytics taking over the world of professional football

If timely (often called real-time) analysis weren’t enough, the emerging technology of prescriptive analytics is being used to predict the future. By combining statistical modelling, neural networks and machine learning techniques to interrogate real-time and historical data, prescriptive analytics can suggest courses of action to a business and predict the outcome of those decisions. Ah, the long-awaited money-making machine? Not so fast!
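To make the idea concrete, here is a deliberately tiny, hypothetical sketch: a least-squares line stands in for the statistical model, and the “prescription” is simply the candidate action with the best predicted outcome. The discount and sales figures and the margin are invented:

```python
# Toy history of (discount_pct, units_sold) observations. A real system would
# fit statistical or machine-learning models over far richer data.
history = [(0, 100), (5, 130), (10, 155), (15, 170)]

def fit_line(points):
    """Least-squares slope and intercept: a stand-in for the predictive model."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    slope = (sum((x - mx) * (y - my) for x, y in points)
             / sum((x - mx) ** 2 for x, _ in points))
    return slope, my - slope * mx

def prescribe(history, candidates, margin=2.0):
    """Predict profit for each candidate discount and recommend the best one."""
    slope, intercept = fit_line(history)

    def predicted_profit(discount):
        units = intercept + slope * discount          # predicted demand
        return units * margin * (1 - discount / 100)  # margin shrinks with discount

    return max(candidates, key=predicted_profit)

print(prescribe(history, candidates=[0, 5, 10, 15, 20]))  # → 20
```

The caveat in the text applies here too: the recommendation is only as good as the model, and this linear toy would happily extrapolate beyond where its four data points can be trusted.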

Getting prescriptive analytics right requires arming your “shadow team” of data professionals with the means to analyse large data sets and identify the right data quickly and responsively. This will help them build the analyses and machine learning models they need and verify that they are answering the right questions. Finally, your “machine” needs to feed itself. That means building a performant data engine to suck in all this real-time data and simultaneously underpin the sort of responsive, up-to-date business intelligence dashboards your players need to win against their opponents.

Currently, there’s much discussion about training machine learning algorithms (justifiably so), but the missing ingredient is often speed: crunching the volumes of data required demands contemporary infrastructure. Businesses have invested hugely in migrating from proprietary monolithic databases, pouring their data into Hadoop data lakes and data warehouses, but many now lack the common data analytics infrastructure to bring agile and responsive analytics to life. Modern databases employ massively parallel processing (MPP) technology to combine server resources, increase throughput and intelligently load data into memory.
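The partition-and-merge idea behind MPP can be sketched in a few lines. The example below uses Python threads purely to illustrate the pattern; a real MPP engine distributes the partitions across server processes and machines:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(shard):
    """Each worker aggregates its own partition, as an MPP node would."""
    return sum(shard)

def parallel_total(rows, workers=4):
    """Partition the rows, aggregate each partition concurrently, then merge.
    The threads here only illustrate the partition-and-merge pattern; a real
    MPP engine spreads the partitions across server processes and machines."""
    shards = [rows[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, shards))

print(parallel_total(list(range(1_000_000))))  # same answer as sum(range(1_000_000))
```

The merge step works because summation is associative; the same divide-aggregate-merge shape underlies distributed GROUP BY, counts and averages.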

>See also: Place your bets: do odds improve as sports enter the predictive age

It’s also helpful to look past traditional organisational silos. In many organisations, the BI team still does the hard graft that keeps the business running with BI tools and SQL queries, while the data science team (where present) is under pressure to score a winning goal by employing new techniques and different query and programming languages (R, Python, MapReduce). If you have a single source of truth in a data lake or warehouse, it’s imperative that you make that data available for both teams to use on their terms, with their tools. Would a celebrated team player like Leicester City give its strikers this information but leave its midfielders in the dark?
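One way to picture that shared access (the table, players and figures are invented): the same store serving both the BI team’s SQL and the data science team’s Python, here using an in-memory SQLite database as a stand-in for the data lake:

```python
import sqlite3

# A tiny in-memory "single source of truth" (a stand-in for the data lake);
# the table and figures are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE goals (player TEXT, minute INTEGER)")
db.executemany("INSERT INTO goals VALUES (?, ?)",
               [("Vardy", 12), ("Mahrez", 48), ("Vardy", 77)])

# The BI team's view: plain SQL over the shared store.
top_scorers = db.execute(
    "SELECT player, COUNT(*) AS n FROM goals GROUP BY player ORDER BY n DESC"
).fetchall()

# The data science team's view: the same rows pulled into Python for modelling.
rows = db.execute("SELECT player, minute FROM goals ORDER BY minute").fetchall()
second_half = [player for player, minute in rows if minute > 45]

print(top_scorers, second_half)
```

Both teams query one copy of the data with their own tools, which is the point: no silo, no divergence between the dashboard and the model.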

Ultimately, data analytics exist to empower people to make better tactical choices in a changing environment. It is a practice of exploration, gaining perspective and making proactive decisions that allow any organisation to become more agile and responsive, and stay ahead of the game.

By André Dörr, Data Engineer at Exasol

Andrew Ross

As a reporter with Information Age, Andrew Ross writes articles for technology leaders, helping them manage business-critical issues both for today and in the future.
