My TV is spying on me! How IoT innovators can strike a balance with privacy concerns

Voice-controlled smart TVs raised privacy concerns earlier this year with reports that people could unknowingly have their private conversations recorded in their own homes when voice recognition functionality was enabled.

The idea that a connected device might be snooping on your conversations is something straight out of George Orwell’s 1984, yet it is fast becoming an everyday reality: voice recognition is now commonplace in everything from car dashboards to mobile telephones.

The Internet of Things (IoT) holds significant potential for growth for a great number of innovative and creative companies. However, the potential for privacy intrusion where voice-activated features are used, for example, is also very real.

As we are surrounded by more and more devices in the home with networked ears and eyes, what precisely are the privacy obligations of companies with the ability to “snoop”?

>See also: Why privacy concerns will hinder trust in the Internet of Things

The legal framework

The relevant legal framework with which to assess the privacy and data protection issues raised by the IoT in the EU is composed primarily of Directive 95/46/EC (the “Data Protection Directive”).

The Data Protection Directive applies to all processing of personal data (which includes spoken voice data) carried out where a data controller is established in an EU country, or importantly in the context of the IoT, where a data controller makes use of equipment situated in the EU.

To recap, the “data controller” is the person (or entity) who determines the purposes and manner in which personal data is processed. In the context of connected TVs, the data controller could be, say, a TV manufacturer established in the EU or a TV manufacturer who is established outside the EU but who collects voice data of users in the EU via voice recognition functionality on a connected TV.

Where the Data Protection Directive applies, the data controller is then obliged to comply with the various obligations set out within it.

In the context of a connected TV manufacturer, this would include, for example, ensuring that any processing of voice data is “legitimate”, typically via the consent of its users.

The issue of what constitutes valid consent is a particularly complex area, with different views across the EU as to what it means and how it is obtained. However, it is questionable whether consent would be deemed valid if a notice that “voice data will be collected by a TV manufacturer when voice recognition functionality is enabled” was buried in a privacy policy, for example.
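For illustration only, here is a minimal sketch in TypeScript of how a connected-TV application might record an explicit, purpose-specific opt-in before enabling voice capture, rather than relying on a notice buried in a privacy policy. All names and the prompt wording are hypothetical, not any manufacturer's actual API.

```typescript
// Hypothetical sketch only: an explicit, purpose-specific opt-in recorded
// before any voice capture is enabled. All names here are illustrative.

interface ConsentRecord {
  userId: string;
  purpose: "voice-recognition"; // consent is tied to a single, stated purpose
  grantedAt: Date | null;       // null means no valid consent on record
}

const consents = new Map<string, ConsentRecord>();

async function requestVoiceConsent(
  userId: string,
  askUser: (message: string) => Promise<boolean>
): Promise<boolean> {
  const granted = await askUser(
    "Voice recognition sends things you say near the TV to the manufacturer's servers. Turn it on?"
  );
  consents.set(userId, {
    userId,
    purpose: "voice-recognition",
    grantedAt: granted ? new Date() : null,
  });
  return granted;
}

function mayCaptureVoice(userId: string): boolean {
  // No recorded, affirmative consent means no capture at all.
  return consents.get(userId)?.grantedAt != null;
}
```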

Further obligations on a TV manufacturer include processing the voice data only for the specified purposes for which it was collected, and not keeping it for any longer than is necessary to fulfil those purposes.
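In practice, purpose limitation and retention can be enforced in code. The following sketch assumes an illustrative 30-day retention window; the figure and the names are hypothetical, not a legal threshold.

```typescript
// Hypothetical sketch only: voice data is tagged with its purpose and
// purged once an assumed retention window has passed.

interface VoiceSample {
  userId: string;
  purpose: "voice-recognition";
  capturedAt: Date;
  transcript: string;
}

const RETENTION_DAYS = 30; // illustrative figure, not a legal threshold

function purgeExpired(samples: VoiceSample[], now: Date = new Date()): VoiceSample[] {
  const cutoff = now.getTime() - RETENTION_DAYS * 24 * 60 * 60 * 1000;
  // Anything older than the retention window is dropped rather than kept "just in case".
  return samples.filter(sample => sample.capturedAt.getTime() >= cutoff);
}
```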

The identity of the controller, the purposes of the processing, the recipients of the data (if any), the existence of the rights of a user to access their data, and so on, should all be set out in a clear and comprehensive manner in the data controller’s privacy policy. And the controller should ensure it actually holds the consents it believes it has before any collection or processing takes place.

Keeping data secure

Where personal data, including voice data, is collected, the Data Protection Directive goes on to provide that the controller “must implement appropriate technical and organisational measures to protect personal data”.

Consequently, any controller of voice data remains fully responsible for the security of that data. If security flaws resulting in data breaches stem from inadequate design or maintenance of the devices used, this would engage the responsibility of the data controller.
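The Directive does not prescribe specific technologies, but encrypting voice data at rest is one common technical measure. A minimal sketch follows, using Node.js's standard crypto module with AES-256-GCM; key management, rotation and access control are assumed and out of scope.

```typescript
// Hypothetical sketch only: one common "technical measure" is encrypting
// voice transcripts at rest (here with AES-256-GCM via Node's crypto module).

import * as crypto from "crypto";

interface EncryptedRecord {
  iv: Buffer;   // unique nonce per record
  tag: Buffer;  // authentication tag, detects tampering
  data: Buffer; // ciphertext
}

function encryptTranscript(plaintext: string, key: Buffer): EncryptedRecord {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decryptTranscript(record: EncryptedRecord, key: Buffer): string {
  const decipher = crypto.createDecipheriv("aes-256-gcm", key, record.iv);
  decipher.setAuthTag(record.tag);
  return Buffer.concat([decipher.update(record.data), decipher.final()]).toString("utf8");
}
```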

In terms of sanctions for data breaches, there has been a recent push in the EU for more aggressive fines and enforcement, with regulators across Europe expressing the view that too many companies take a half-hearted approach to data protection compliance.

Expected over the coming months is a new Data Protection Regulation for the EU (the “Regulation”), which will replace the existing Data Protection Directive and usher in sweeping changes designed to beef up and alter the current regime.

A key part of the Regulation is the proposal for substantially larger fines for data protection breaches, in the range of 2% to 5% of global turnover, or up to €100 million.

Fines for serious breaches have already increased significantly in the UK in recent years, with each offence now punishable by a fine of up to £500,000, while fines in other EU states, such as Spain, France, the Netherlands and Germany, can be equally significant.

Civil action against data controllers where breaches occur is also a realistic prospect. Particularly worthy of note is an increasing trend in EU countries, such as the UK, to permit privacy claims through the courts even where no financial loss has occurred, significantly broadening the circumstances in which data protection litigation can be brought and damages awarded.

>See also: Get ready: the Internet of Things is the final nail in privacy’s coffin

Privacy by design

The bottom line in light of the potential risks involved is that companies manufacturing IoT devices and otherwise providing smart services need to be thinking about ‘privacy by design’, which has been a key mantra coming out of Europe for some time now.

Essentially, companies must now demonstrate that they are taking data protection seriously at the design and implementation stage.

In practice, it is necessary to perform security assessments on systems and services as a whole, in addition to training staff and having policies in place dealing with key issues such as data handling, data access for users, breach notification and so on.

Whilst well-drafted user-facing privacy policies can help, far greater levels of transparency about data processing will also be necessary, along with clearly signposted opt-outs and user controls.
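As a closing illustration, a minimal sketch of what such user controls might look like in code: an off switch that takes effect immediately, plus handlers for access and deletion requests. The function and store names are hypothetical.

```typescript
// Hypothetical sketch only: clearly signposted user controls, including an
// immediate off switch and handlers for access and deletion requests.

interface UserSettings {
  voiceRecognitionEnabled: boolean;
}

const settings = new Map<string, UserSettings>();
const storedTranscripts = new Map<string, string[]>();

function setVoiceRecognition(userId: string, enabled: boolean): void {
  // The toggle takes effect immediately, not at the next firmware update.
  settings.set(userId, { voiceRecognitionEnabled: enabled });
}

function handleAccessRequest(userId: string): string[] {
  // A user can see every transcript held about them.
  return storedTranscripts.get(userId) ?? [];
}

function handleDeletionRequest(userId: string): void {
  // Erase the user's voice data and switch the feature off.
  storedTranscripts.delete(userId);
  setVoiceRecognition(userId, false);
}
```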

When investigating a violation, enforcers are unlikely to have much sympathy for organisations that have taken a lackadaisical approach to compliance.

Conversely, demonstrating that efforts have been made to empower users regarding choice over how their data are used, to update old policies and to retrain employees should help to reduce the risk of regulator fines and civil action.


Sourced from Rafi Azim-Khan and Steven Farmer, Pillsbury Law
