How stream processing and data analytics can bring smart city projects to life

There is significant buzz around smart cities, and some exciting technology projects are coming to fruition as more and more governments take serious steps towards modernising their infrastructure to cope with the growing populations of megacities around the world. Most of these government projects aim to provide smarter services and a better quality of life for citizens.

Notable examples include Moscow, where CCTV cameras helped solve more than 3,500 crimes in 2018, a reported increase of 16% compared with 2017; Las Vegas, where the city council is piloting countless new technologies in public spaces in a bid to boost its smart city brand; and the City of Canterbury Bankstown in south-west Sydney, which has secured funding to progress its smart waste management project.

This comes as no surprise: driven by government expenditure on smart city initiatives, the overall smart city platforms market is expected to grow from $104.6bn in 2018 to $223.3bn by 2023, at a CAGR of 16.4%.

How do real-time data and stream processing help smart city projects provide better infrastructure and a more sustainable future for a city? And why should governments consider using stream processing frameworks for their smart city initiatives?

First and foremost, Wikipedia defines a smart city as an urban area that uses different types of electronic data collection sensors to supply information which is used to manage assets and resources efficiently. This includes data collected from citizens, devices, and assets that is processed and analysed to monitor and manage traffic and transportation systems, power plants, water supply networks, waste management, law enforcement, information systems, schools, libraries, hospitals, and other community services.

This means that in a smart city, various connected devices generate data continuously, and that data needs to be analysed in real time using a scalable and robust data processing framework. Stream processing with Apache Flink is a natural fit for these situations because the framework is able to address the specific requirements of a smart city project.

In smart city projects, data is continuously produced

Smart city projects rely on numerous and varied connected devices that send valuable information about what is happening in a city or urban area. Connected devices could be anything from trams, buses and other modes of public transport sending real-time traffic and congestion information, to CCTV monitoring across different locations in the city, to location data from mobile devices. This data is generated as continuous streams of events, which makes stream processing the natural choice: a continuous flow of data can be processed in real time without relying on static query/response approaches that add extra complexity and latency to computation and decision making.
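
As a concrete illustration, the snippet below is a minimal sketch, not taken from any of the projects mentioned, of how such a continuous stream of events might be consumed with Apache Flink's DataStream API. The Kafka topic name ("vehicle-positions"), the broker settings and the use of plain strings for events are illustrative assumptions.

    // Minimal sketch: read a continuously produced stream of vehicle position
    // events from Kafka and hand it to Flink for real-time processing.
    // Topic name, broker address and the string event format are assumptions.
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    import java.util.Properties;

    public class VehicleStreamJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");
            props.setProperty("group.id", "smart-city");

            // The source never "finishes": events keep flowing in and the job runs
            // continuously instead of answering one-off query/response requests.
            DataStream<String> rawEvents = env.addSource(
                    new FlinkKafkaConsumer<>("vehicle-positions", new SimpleStringSchema(), props));

            rawEvents.print(); // placeholder for real parsing and analytics

            env.execute("Continuous vehicle position processing");
        }
    }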

Data and events could arrive at the processing system with delays or out of order

Because of the distributed environment in which data is generated in a smart city project, being able to process it based on event time is crucial. For example, in areas with limited network connectivity (such as stretches between underground stations) or during extreme weather conditions, data can arrive at the processing system with delays or out of order. With Apache Flink's support for event-time stream processing, the data and analytics teams of such smart city projects can process data based on the timestamp each event carries (essentially, when the event actually took place) to detect patterns and make any necessary adjustments, for example to traffic lights or telematics screens at bus stops, or to send appropriate communications to the public.
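
To make the event-time idea concrete, here is a minimal sketch of how a Flink job can be configured to work on the timestamp each event carries rather than its arrival time, tolerating out-of-order arrival through watermarks. The PositionEvent type and the two-minute lateness bound are assumptions made for the example.

    // Minimal sketch: assign event-time timestamps and watermarks so that late
    // or out-of-order events are still processed based on when they happened.
    // The PositionEvent type and the two-minute bound are illustrative assumptions.
    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.streaming.api.datastream.DataStream;

    import java.time.Duration;

    public class EventTimeSetup {

        // Simplified event carrying the time at which it actually happened.
        public static class PositionEvent {
            public String vehicleId;
            public double lat;
            public double lon;
            public long eventTimestampMillis; // when the event occurred, not when it arrived
        }

        public static DataStream<PositionEvent> withEventTime(DataStream<PositionEvent> events) {
            return events.assignTimestampsAndWatermarks(
                    WatermarkStrategy
                            // tolerate events arriving up to two minutes late or out of order
                            .<PositionEvent>forBoundedOutOfOrderness(Duration.ofMinutes(2))
                            // use the timestamp carried by the event itself
                            .withTimestampAssigner((event, recordTimestamp) -> event.eventTimestampMillis));
        }
    }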

In the majority of cases, decisions are based on specific time windows

Because of the continuous, real-time nature of computation and architecture in a smart city project, data and analytics teams need to slice information into specific time windows to detect anomalies, find specific patterns and make any necessary transformations that ultimately enable the city to react to situations and urban events in real time. For example, if connected buses report no location change for more than a 20-minute window, there is probably an emergency in the wider area, and the police should be alerted or navigation around the area should be adjusted.
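
The sketch below illustrates this kind of windowed check under the same assumptions as the earlier event-time example: the stream is keyed by bus, grouped into 20-minute event-time windows, and an alert is emitted for any bus whose reported position does not change within a window. The event type, alert format and zero-movement threshold are hypothetical.

    // Minimal sketch: flag buses that report no location change within a
    // 20-minute event-time window. Event type and alert format are assumptions.
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;
    import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
    import org.apache.flink.util.Collector;

    public class StationaryBusDetector {

        // Same illustrative event type as in the earlier sketch, repeated here so
        // the snippet stands alone.
        public static class PositionEvent {
            public String vehicleId;
            public double lat;
            public double lon;
            public long eventTimestampMillis;
        }

        public static DataStream<String> detect(DataStream<PositionEvent> busPositions) {
            return busPositions
                    .keyBy(event -> event.vehicleId)
                    .window(TumblingEventTimeWindows.of(Time.minutes(20)))
                    .process(new ProcessWindowFunction<PositionEvent, String, String, TimeWindow>() {
                        @Override
                        public void process(String vehicleId, Context ctx,
                                            Iterable<PositionEvent> events, Collector<String> out) {
                            double minLat = Double.MAX_VALUE, maxLat = -Double.MAX_VALUE;
                            double minLon = Double.MAX_VALUE, maxLon = -Double.MAX_VALUE;
                            for (PositionEvent e : events) {
                                minLat = Math.min(minLat, e.lat); maxLat = Math.max(maxLat, e.lat);
                                minLon = Math.min(minLon, e.lon); maxLon = Math.max(maxLon, e.lon);
                            }
                            // No movement at all during the window: raise an alert.
                            if (maxLat - minLat == 0 && maxLon - minLon == 0) {
                                out.collect("ALERT: bus " + vehicleId + " stationary in window ending "
                                        + ctx.window().getEnd());
                            }
                        }
                    });
        }
    }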

A notable example of Flink's deployment in smart city projects is the city of Warsaw in Poland, which used Apache Flink to power its Vehicle Movement Analyser and Vehicle Delay Prediction systems as part of the city's VaVeL smart city project.

A different example is the city of Hangzhou in China. Hangzhou is a metropolis of more than 7m inhabitants and, a few years back, was ranked the fifth most congested city in China. The city has since dropped to 57th place in the congestion rankings, partly due to the deployment of City Brain, an artificial intelligence platform powered by Apache Flink.

The platform gathers data from different sources across Hangzhou, such as video from intersection cameras and GPS data on the locations of cars and buses in the city. City Brain then analyses the information in real time as it coordinates more than 1,000 road signals around the city with the aim of preventing or easing gridlock.

Within a smart city project, various IoT devices generate data continuously, and that data needs to be analysed within a short period of time. The key is having access to technology that can extract valuable intelligence from the large amounts of data produced in real time.

Written by Marta Paes Moreira, Product Evangelist, Ververica (formerly data Artisans)
