Privacy in the digital age: honouring the customer

Protecting data is more important than ever in an era of stricter regulations, where businesses have an obligation to honour the privacy of their customers.


Protecting personal data should be a top priority for organisations across a range of industries. Currently, data breaches are dominating headlines, and while companies suffer corporate embarrassment, financially the consequences are minimal.

From 25 May 2018, however, that will all change. The EU’s General Data Protection Regulation (GDPR) will come into effect from this date, and from that point organisations that don’t protect customer data adequately will be committing financial suicide.

At the moment fines are tame. TalkTalk was fined £400,000 by the Information Commissioner’s Office (ICO) in October after 157,000 personal account details were stolen a year earlier.

If the same data breach occurs post-GDPR, the fine will be closer to £70 million. From a business standpoint, therefore, it will be crucial to protect personal data.

Robert Hoffmann, CEO of 1&1 Internet SE, suggests that Tesco Bank, after it was hacked in November this year, would be liable for a fine upwards of £1.94 billion if the regulation were in effect today.
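The headline figures above follow from the GDPR's upper fine tier: the greater of a flat cap (€20 million in the regulation) or 4% of global annual turnover. As a rough sketch, with an assumed turnover figure rather than an audited number:

```python
def max_gdpr_fine(annual_turnover: float, flat_cap: float = 20_000_000) -> float:
    """Upper tier of GDPR fines: the greater of a flat cap
    or 4% of global annual turnover (same currency assumed for both)."""
    return max(flat_cap, 0.04 * annual_turnover)

# Assuming TalkTalk's annual turnover is roughly £1.8bn, 4% comes to
# about £72m – consistent with the 'closer to £70 million' estimate above.
print(max_gdpr_fine(1_800_000_000))  # 72000000.0
```

The function name and the turnover figure are illustrative assumptions; only the two-tier maximum itself comes from the regulation.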

The initial financial fallout, however, is probably not even the worst part. In a world of fierce competition, with disruptors coming from all angles, maintaining customer loyalty is more important than ever.

Using TalkTalk as the example again, after the breach its share price fell and it was fined, but it also began to lose customers (and lots of them).

By May 2016, TalkTalk’s profits had halved and 98,000 broadband users had left the provider. The situation has steadied since then, and in August it added 48,000 mobile users and 36,000 fibre customers to its books.


However, this was most likely a result of slashed deals and an expensive marketing campaign.

The London Underground is lined with TalkTalk promotions. The point is, protecting customer data should be a top business priority moving forward, because post-GDPR no prisoners will be taken.

However, financial fallouts aside, businesses have an ethical responsibility to protect customer data, and new regulations will ensure this.

The ethical challenge

Consumer data is the figurative lifeblood of many organisations. It sustains them. A company must make a competitive return for its shareholders, but it should do so in an ethical way.

The same principles that guide an individual’s understanding of right and wrong should determine business practice.

The line between using a customer’s data and abusing it is fine, but there is a distinction. ‘Fair use of customer data,’ explains Jocelyn Paulley, director of Gowling WLG, ‘is when the person or business collecting the data has been open and transparent about what they intend to do with that data. Where there is transparency, there will be no exploitation.’

At the beginning of 2016, this issue was raised in the US. A group of privacy watchdogs called on the Federal Communications Commission (FCC) to increase privacy regulations on broadband providers.

The groups argued that broadband providers such as Comcast, AT&T and Verizon would exploit the wide-ranging data they had access to in order to target people with advertisements, according to Reuters.


They added that the issue could ‘increase the potential for discriminatory practices derived from data use’.

This scenario is one example of how easy it would be to cross that invisible ethical line that resides within an individual’s mind and a business’s core of operations.

If as an organisation you are using customer data for one thing, like providing broadband, and you then begin using that data to enhance revenues through other modes, like advertising, an ethical issue arises.

Drawing a line

Using customer data in the right way but also to the benefit of the organisation is achievable.

Personalising offers or customising promotions is not an abuse if a company is transparent about how it will use a person’s data.

It shouldn’t be moving in the shadows. The ICO highlights the necessity of transparency in complying with both the Data Protection Act 1998 (DPA) and, even more so, the impending GDPR.

The most common way to provide this information to a customer is via a privacy notice. Under the current law of the DPA, an organisation must detail who they are, what they are going to do with a person’s information and who the information will be shared with.
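Those three disclosures could be modelled as a simple checklist. A minimal sketch, with hypothetical field names (the DPA itself prescribes no data format):

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyNotice:
    """The DPA's three baseline disclosures (field names are hypothetical)."""
    controller_identity: str                        # who we are
    purposes: list = field(default_factory=list)    # what we will do with the data
    recipients: list = field(default_factory=list)  # who we will share it with

    def is_complete(self) -> bool:
        """True only if all three baseline disclosures are present."""
        return bool(self.controller_identity and self.purposes and self.recipients)

notice = PrivacyNotice(
    controller_identity="Example Broadband Ltd",
    purposes=["providing broadband service", "billing"],
    recipients=["payment processor"],
)
print(notice.is_complete())  # True
```

A notice missing any of the three elements would fail the check, mirroring the idea that these are foundations rather than optional extras.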

These are the basic foundations on which all privacy notices should be built. Post-GDPR, however, these basic moral principles, like the more stringent financial consequences, will be expanded and enhanced.

This is representative of the hyperconnected state of the world. As data has become increasingly accessible, so has the scope to abuse it.

‘The real challenge,’ says Paulley, ‘is how to tell people as clearly and simply as possible about how their data may be used without that information becoming an annoyance and ruining the user experience, and how to keep that information current when it constantly changes as companies adopt new plans and find new uses for data.’ GDPR is rising to this challenge.

Privacy post-GDPR

GDPR includes rules on giving privacy information to data subjects in Articles 12, 13 and 14, according to the ICO.

These are more detailed and specific than in the DPA and place an emphasis on making privacy notices understandable and accessible.

The GDPR says that the information provided to people about processing personal data must be concise, transparent, intelligible and easily accessible; written in clear and plain language, particularly if addressed to a child; and free of charge.

Indeed, in this new regulatory landscape, ‘brands will need to be clear, unambiguous and precise as to what they will do with data, including a separate permission to use data to “profile” an individual’, remarks Antony Humphreys, key account manager at Adestra.


‘An individual should have the right to access personal data held by the business and to have his or her personal data deleted. These are the requirements set forth in the EU’s GDPR,’ comments Carl Spataro, VP, deputy general counsel and chief privacy officer at MobileIron.
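The two rights Spataro describes – access and erasure – could be sketched as two operations over a store keyed by data subject. Everything here (the store layout, the function names) is an illustrative assumption, not any vendor's implementation:

```python
# In-memory stand-in for wherever personal data actually lives.
records = {
    "alice@example.com": {"orders": [101, 102], "marketing_optin": True},
}

def subject_access(store: dict, subject_id: str) -> dict:
    """Right of access: return a copy of everything held on the subject."""
    return dict(store.get(subject_id, {}))

def erase_subject(store: dict, subject_id: str) -> bool:
    """Right to erasure: delete the subject's data; True if anything was held."""
    return store.pop(subject_id, None) is not None

print(subject_access(records, "alice@example.com"))
print(erase_subject(records, "alice@example.com"))   # True
print(subject_access(records, "alice@example.com"))  # {}
```

In a real system the hard part is the next section's theme: knowing every store that holds a given person's data in the first place.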

GDPR enforces stricter rules and is a response to the privacy needs of today, but even the best intentioned might fall at the first hurdle.

Securing the data should be the first step in an organisation’s journey towards compliance, but it is also the step most prone to risk and failure.

Readers will be aware of the daily battle that organisations face trying to stave off cyber attacks and subsequent data breaches.

This constant fight against cybercriminals is where privacy in the digital age will be won or lost. Identifying and locating the data that is at risk is the first challenge in securing it.

In this regard, the ethical practice of a company could be regarded as a moot point. ‘No matter how honest a firm’s approach, there is often a technology issue – not an ethical one – that causes a lack of data visibility,’ says Tamzin Evershed, legal director at Veritas.

‘Over half (59%) of data held by UK companies is dark, meaning that no one, including senior management, knows its contents or even where it is stored.

This problem is only set to get worse, with an expected 44 zettabytes of data in the world by 2020.’ It is true that at the moment the situation looks quite bleak – a view shared by Jamie Gallagher, general manager at RelianceACSN.

‘The security industry is broken. Companies employ a mix-and-match patchwork of tools that has lulled them into a false sense of security.

But the industry has failed to educate organisations on how to manage their security and get the basics right first.

This has led to a complete lack of understanding around the value of the digital information held by most organisations.’


However, there is a flip side. Before 2000, fears of the Millennium Bug prompted organisations to make sweeping changes to their systems, using the deadline as an opportunity to modernise them.

GDPR should have the same effect. Currently, businesses are on the losing side of the cyber battle. But the impending GDPR has, to an extent, shaken board members, or the C-suite, out of their apathy.

Jes Breslaw, director of marketing and strategy, EMEA at Delphix, acknowledges this: ‘Data governance is coming to a head. Companies are going to be forced to have, from the top down, a data protection policy that seeks to put people, process and tools in place to deal with this.’

Privacy in the digital age

The digital economy is becoming increasingly entwined with the traditional economy, and it is driving progress and innovation.

Data is the essential cog in this machine, and GDPR goes some way to protect it. Every organisation must have ethical codes in place when dealing with the privacy of customers’ personal data.

Most organisations adhere to this, and for those that don’t, the new data protection officer (DPO) – a role many organisations will be required to appoint under GDPR – would most likely find out and report them.

These individuals will own responsibility for a company’s data and are protected if they blow the whistle.

The biggest threat to privacy in the digital age, however, comes from lacklustre cyber security defences.

As the 2018 GDPR deadline approaches, there will be a greater demand for more ferocious and effective security services and solutions – ones that can detect a threat as well as prevent it.

Advanced algorithms, AI and advanced machine learning will increasingly represent privacy in the digital age.
