If you are heading for the beach the next day, and you are worried about that spare tyre around your waist, there is no fix. You can try to disguise the problem: breathe in a lot, wear a baggy t-shirt with vertical stripes, or spend the day in the water. Or you could come clean, of course, and just own up to the fact that you are not as skinny as you would like. But if you are heading for the beach later in the year, say six months on, then you could turn to the gym, and indeed to a healthy diet. It seems that privacy by design is like that. It does not have to be a cost to a company; it can give a company a real commercial edge, in just the same way that going to the gym can give you more energy. But it won't provide any tangible benefits the next day.
He also does something that is quite refreshing for a privacy advocate: he speaks human, unlike so many of his peers, who speak in tongues, citing "GDPR rule 11.7.2, which states 'bla bla'" and then launching into a sentence so long that it seems to stretch into eternity.
It's one of the oddities of GDPR: it requires privacy policies and opt-in procedures to be easy to understand, yet it sets out that requirement in words that seem designed to cure insomnia.
Privacy by design is the idea that when organisations develop new products or services, they think about the privacy implications from the start, in much the same way that a builder thinks about the foundations of a new property before putting up the walls. Privacy by design does not have to be 'the' founding stone of a new product, but it does have to be 'a' founding stone.
But that requires work. "Well, I've been likening privacy by design to things like going to the gym," he says. "I can tell you how to work out, I can tell you how to do cardio, I can tell you how to lift weights and build muscle. You have to be motivated to get off the couch and go to the gym. (Are you a gym instructor, too, Jason? — Ed) Similarly with a company, the benefits are long-term. In the future, you're going to have stronger products, you're going to have better products, your customer base is going to trust you more, but it's a long-term payoff.
"I recently went on a hike with a cousin of mine and he bailed on the first day, he couldn't make it up the mountains carrying his backpack, even though we told him six months before that he needed to start training and getting in shape, but he didn't. It's not like he could say 'oh well, let me go and exercise tomorrow and I'll come back the next day and I'll be all better and ready to hike'. No, he should have put in the work six months earlier."
Practices what he preaches
Jason sort of applied the idea of privacy by design to his own training. He was an IT guy, armed with a degree in mathematics and information systems. "But I wanted to get more into privacy, and everyone I saw in that business was a lawyer, so I went to law school and became a qualified lawyer," even though he has never practised.
What has that got to do with privacy by design? Well, it's a good metaphor. Understanding privacy, wanting to be an expert in privacy, was the foundation for his law studies. He studied law with privacy considerations always there, in the back, and occasionally front, of his mind. Just like privacy by design, privacy was no afterthought; it was the key impetus for taking the degree.
It helps if you are called Gerard
We then turn to some bad news if your name isn’t Gerard or you don’t play lacrosse.
"There was a recent news story about some AI being used in recruiting and it said that the best indicators of success in a job were being named Gerard and playing university lacrosse." (Is that why Steven Gerrard did so well for Liverpool? – Ed)
Of course, the finding is not, shall we say, 100% accurate. It turns out that some people do quite well for themselves, even if their name isn't Gerard, or lacrosse isn't their sport.
And that brings us to data. It is in a company's interests to ensure the data it uses to make decisions is accurate, is not drawn from biased samples and does not work against diversity. It seems obvious, but Jason says that companies are not applying these common-sense ideas.
"I was at an event recently and there was somebody speaking about using artificial intelligence, I think it was in recruiting or something, and a woman raised her hand and asked how you prevent discrimination. And the person speaking was just clueless; they had no idea. They just didn't understand what she meant.
"But this person on the stage, who was purporting to be some kind of expert in artificial intelligence, was just so narrowly focused that they had no clue such issues were even a possibility. People have to be aware of that, otherwise they can't build in controls to prevent it."
Privacy by design and the IT people
So we asked Jason to think about privacy by design from the point of view of IT: the CTO and the IT team. What do they need to bear in mind?
Jason refers to Dan Solove's taxonomy of privacy. Jason explains: "He has identified 16 different types of privacy violations, grouped into four categories. So there's information collection, such as surveillance and interrogation; there's information processing; information dissemination; and invasions, which are not information privacy violations but they're privacy violations nonetheless."
"When I train people, I talk to them and they're like 'oh, I didn't think of that'. So just getting them to think about these kinds of things before they make decisions is a critical first step. Because again, we're so used to doing the same thing over and over again that it becomes rote, and then we don't realise that what we do may have implications down the line."
Then there is Jaap-Henk Hoepman, a professor from the Netherlands, who has a series of 26 tactics, grouped into eight strategies, for mitigating different types of privacy violations. Jason explained: "So his strategies are things like minimising data; separating data; giving users control; enforcing policies and demonstrating compliance with your policies. These 26 tactics give practical, real-world steps that an engineer can take, and we can show that 'oh, we can use this tactic to mitigate this violation by this potential actor'. And again, concrete steps, not high-level principles that are hard to reach."
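For the IT readers, the combination of the two frameworks can be pictured as a simple design-review checklist. Here is a minimal sketch in Python; the category and strategy labels follow Solove's taxonomy and Hoepman's published strategies, but the mapping of strategies to violation categories is a hypothetical illustration, not Jason's or Hoepman's actual table:

```python
# A sketch of a privacy-by-design review checklist: Solove's four
# categories of privacy violations paired with Hoepman's eight design
# strategies. The pairings in MITIGATIONS are illustrative only.

SOLOVE_CATEGORIES = [
    "information collection",    # e.g. surveillance, interrogation
    "information processing",
    "information dissemination",
    "invasions",                 # e.g. intrusion, decisional interference
]

HOEPMAN_STRATEGIES = [
    "minimise", "separate", "abstract", "hide",
    "inform", "control", "enforce", "demonstrate",
]

# Hypothetical mapping: which strategies a reviewer might reach for
# when a proposed design risks a given category of violation.
MITIGATIONS = {
    "information collection": ["minimise", "hide"],
    "information processing": ["separate", "abstract"],
    "information dissemination": ["hide", "inform", "control"],
    "invasions": ["control", "enforce", "demonstrate"],
}

def review(risk_categories):
    """Return the sorted set of strategies to consider for the given risks."""
    suggestions = set()
    for risk in risk_categories:
        suggestions.update(MITIGATIONS.get(risk, []))
    return sorted(suggestions)
```

The point of a structure like this is the one Jason makes: the engineer gets concrete, named steps to reach for before a decision is made, rather than a high-level principle to interpret after the fact.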
Then there is the issue of complying with regulations around the world.
GDPR is different from California's impending regulation, which is different from Brazil's law, from Canada's rules and from the Indian approach. How can you possibly comply with them all?
For Jason it boils down to privacy by design.
If you follow the principles he talks about, "you won't have to worry about the fact that you're operating in 100 different jurisdictions with 100 different laws, because you'll cover yourself for 90% of that. So I think it's a good core."
And finally there is decisional interference. Jason says: "If you look at the Cambridge Analytica scandal, the problem there wasn't just a researcher collecting information and sharing it with Cambridge Analytica, nor just Cambridge Analytica developing psychographic profiles; the big concern was Cambridge Analytica's manipulation of people, trying to manipulate their votes."
“And that is what’s called decisional interference — interfering with people’s private affairs, with their private decision-making as autonomous individuals.”
It’s the kind of thing that would either have George Orwell turning in his grave, or maybe laughing sardonically and saying: “I told you so.”
But then they didn't have privacy by design in 1948, when Orwell's most famous book was written. His main protagonist, Winston Smith, didn't benefit from privacy by design. Then again, to paraphrase the first sentence of 1984: when "the clocks strike thirteen", it is not too late to start implementing privacy by design, but it would be a good idea to start straight away.