Balancing data privacy with ambitious IT projects

When GDPR was adopted in 2016, ahead of coming into force last year, and Privacy by Design was set to become a legislative requirement for the first time, there was an outcry amongst the business IT elite that hundreds of digital transformation projects could be derailed. Many had been scoped and designed years before. Retrofitting Privacy by Design was in many cases impossible, as it would require stopping the project, re-evaluating it and, quite often, starting again from scratch.

After all, digital transformation aims to free data to flow around the business, while Privacy by Design looks to ensure its confidentiality. Surely the two goals are mutually exclusive? Surely the need for Privacy by Design is destined to at least restrain digital transformation projects and their impact on the business, if not stop them in their tracks?

Clearly, this was hyperbole. As with many business problems, a practical and effective compromise is both necessary and entirely possible.

Balancing act

For digital transformation projects that look to use sensitive data (such as employee records or customer profiles), this balance is incredibly difficult to find, and difficult even to describe. It is most clearly explained through practical examples that show where the balance has been missed, and the consequences that followed.

We have included two scenarios below, based on our experience. The errors these companies made might seem obvious, but look again dispassionately. Are you certain none of your projects are heading towards similar issues?

Scenario one: the over-ambitious retailer

A major European retailer was in the midst of a sizeable digital transformation project that aimed to analyse its workforce’s use of various physical and digital resources in order to design a more productive working environment. It then planned to use AI to identify redundant, inadequate and over- or under-used resources.

Most intriguing was that the retailer then planned to incorporate employee HR records into the analysis, alongside the interactions with those resources. Cross-referencing usage reports from online tools and IoT data from access cards and hardware with employee attendance, training and performance ratings would produce a clear, objective analysis of who had productivity problems. This output would then automatically determine who was entitled to pay rises and promotions. The goal was to remove any danger of bias (conscious or otherwise) amongst line managers in performance reviews.

Clearly this was a delicate use of data. And while the technical infrastructure used was robust and secure, protection of individuals’ privacy had not been key to the project’s design. Privacy was in fact the antithesis of some of the project’s objectives. The most alarming part was that employees were the subjects of automated decision-making without any form of consent or even awareness, contrary to GDPR. This meant that the project had to be abandoned, refocused and redesigned, at great cost to the business and some individuals’ internal reputations.

Fundamentally, this was a case of over-ambition and insufficient collaboration. The first stage of the project – the analysis of resource use – was perfectly legitimate, and highly valuable, assuming the data was at least pseudonymised if not anonymised. It was the step into automated profiling and the use of individuals’ data that caused the most problems.
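
To illustrate the point only (this is not the retailer’s actual pipeline, and the field names and key handling below are assumptions), a minimal Python sketch of pseudonymising employee identifiers before the resource-usage analysis might look like this:

    # Illustrative sketch only: replace direct employee identifiers with keyed
    # pseudonyms before resource-usage analysis. Field names are hypothetical.
    import hashlib
    import hmac
    import secrets

    # A secret key held outside the analytics environment; whoever holds it can
    # re-identify individuals, so access to the key must itself be governed.
    PSEUDONYMISATION_KEY = secrets.token_bytes(32)

    def pseudonymise(employee_id: str) -> str:
        """Return a repeatable, keyed pseudonym for a direct identifier."""
        return hmac.new(PSEUDONYMISATION_KEY, employee_id.encode(), hashlib.sha256).hexdigest()

    usage_records = [
        {"employee_id": "E1042", "resource": "meeting-room-3", "minutes": 55},
        {"employee_id": "E2211", "resource": "design-suite", "minutes": 120},
    ]

    # Keep only the pseudonym for analysis; the mapping back to real identities
    # stays with the key holder, not the analytics team.
    analysable = [
        {"subject": pseudonymise(r["employee_id"]), "resource": r["resource"], "minutes": r["minutes"]}
        for r in usage_records
    ]
    print(analysable)

Even then, pseudonymised data remains personal data under GDPR (it only falls outside the regulation once it is truly anonymised), so the key and any re-identification process still need to be protected and governed.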

Scenario two: the dangerously naïve developer team

A global organisation that produces medical devices for the healthcare market used IoT technology to monitor and record the usage of every individual device, for product development and preventative maintenance. Despite this relatively benign purpose, the nature of the medical devices and the broad approach to data collection meant the usage data the developers were gathering was inherently sensitive. Healthcare data is classified as “special category” data under GDPR, as well as under other regulations, which brings additional prohibitions on its use and heightened penalties for its mishandling.

More concerning was that neither the patients, the healthcare professionals nor the business were aware of the collection and use of the data. No framework was in place to govern its collection, use or storage. No processes were documented. Furthermore, the business had not yet appointed a data protection officer.

Once the legal teams began their GDPR preparations, they quickly discovered this data use. Rightly, they immediately insisted that development cease, regardless of the imperfections the development team had already identified in the devices and software, and of the cost to the business of not rectifying them.

It was a perfect example of an ambitious, well-meaning digital project overreaching itself because privacy had not been incorporated into its design from the outset. The results were dramatic: delays to product development, disaffected users, frustrated investors and additional costs in re-architecting the product.

What should these companies have done?

The two examples above show the consequences of paying insufficient attention to data privacy in sophisticated IT projects. But there is just as much danger in applying the law imprecisely and restraining the business’ activities unduly. This is why modern IT projects require privacy architects.

These are people who can assess the business objectives of a project and recognise both the privacy legislation it is subject to and the requirements that legislation entails. Crucially, this is not just privacy consultancy.

Privacy architects are unusually skilled in both privacy and technology, able to contribute to the technical aspects of IT projects just as readily as they advise on the necessary regulations, meaning Privacy by Design need not hamper the business’ objectives.

Without such oversight, digital transformation projects run the risk of exposing the business to non-compliance, loss of customer trust and unnecessary expense. Whose project and reputation will be next?

Written by Sophie Chase-Borthwick, director of Data Ethics & Privacy at Calligo
