Digital self defense: Is privacy tech killing AI?
AI needs data. Lots of it. The more data you can feed a machine-learning algorithm, the better it can spot patterns, make decisions, predict behaviours, personalise content, diagnose medical conditions, power smart everything, and detect cyberthreats and fraud. AI and data make for a happy partnership: “The algorithm without data is blind. Data without algorithms is dumb.” Even so, some digital self defense may be in order.
But AI is at risk. Not everyone wants to share, at least not under the current rules of digital engagement. Some individuals disengage entirely, becoming digital hermits. Others proceed with caution, using privacy-enhancing technologies (PETs) to plug the digital leak: a kind of digital karate chop, a digital self defense. They don’t trust website privacy notices; they verify them with tools like DuckDuckGo’s Privacy Grade extension and, soon, machine-readable privacy notices. They don’t tell companies their preferences; they enforce them with tools like Baycloud Bouncer, and they search anonymously using AI-powered, privacy-protective search engines and browsers like DuckDuckGo, Brave, Cliqz and Firefox. These rebels use Privacy Badger to block cross-page tracking by invisible trackers, and they navigate the world using Google alternatives like OpenStreetMap, messaging apps like Signal, and zero-knowledge cloud providers like Tresorit and Cozy to collaborate and store data.
Don’t just check that box. What if regulatory compliance actually enhanced business innovation and performance?
“For companies, regulatory compliance does not mean just checking a box to keep the dogs at bay. It can mean achieving performance benchmarks that move the company forward,” says Ronald Lear, the CMMI Institute’s Director of IP Development.
Digital self defense
These individuals aren’t MI5 agents. Nor are they unified by a single ideology. They’re fed up with the pervasive online behavioural tracking that now follows them into their offline lives.
This behaviour started with the stunning 2013 Edward Snowden revelations of sweeping US government surveillance programs that gave the NSA “unprecedented [military] access to civilian communications”, allowing it to slurp emails, photos, videos, live chats and more from the servers of Apple, Yahoo, Google, Skype, and Facebook. Similarly invasive tactics by the UK intelligence agency GCHQ were also revealed, spurring greater privacy awareness on both sides of the pond. That awareness changed the trajectory of EU data protection law, leading to the pro-consumer, privacy-focused culture reflected in today’s GDPR.
Tech giants rushed to prove their privacy creds by offering encryption as a default, declaring user privacy a priority. They positioned themselves as trusted data guardians. And consumers gave them their trust. No need for digital self defense when you have trust.
Fast-forward to today, and the Cambridge Analytica scandal has revealed something perhaps even more disturbing than government surveillance: Surveillance Capitalism. Fuelled by the Adtech industry, online trackers and shady data brokers scrape, package, analyse, profile, auction, exchange, and weaponise our digital identities – or distorted versions of them – to deliver ‘precision marketing’ to influence our behaviours.
Amazon knows more about us than we do about ourselves. Facebook can predict our next sentence, and, more frighteningly, our vote. The Cambridge Analytica scandal revealed how Adtech microtargeting techniques paired with fake news and psych-ops tactics could be marshalled to influence voter decisions in the Brexit Referendum and the 2016 US election.
Adtech’s new darling is location-based marketing, which tracks and maps users’ moves, following them into their offline worlds. By combining app data with other sources they’ve collected and compiled, trackers infer intimate details, create rich profiles, and segment users by race, sexual preference, financial status, age, eating habits, substance abuse history, political affiliation, exposure to hate content, and infertility. Those profiles are then peddled in the dark alleyways of Real Time Bidding platforms, with potentially harmful effects.
Why companies must become custodians of customer and internal data
We’re more than the sum of our zeroes and ones….
The profiles aren’t always accurate, and targeting decisions can be outright offensive: bail bond ads targeting users with African-American-sounding names. Big brands like Pepsi promoted on Nazi websites. One social media user who’d been relentlessly targeted with adoption ads after her stillbirth made a public plea to tech giants to fix their algorithms: “If you’re smart enough to realise that I’m pregnant, that I’ve given birth, then surely you’re smart enough to realise that my baby died, and can advertise to me accordingly, or maybe, just maybe, not at all.”
Precision marketing is creepy. It may be harmful. And it’s not really working. It’s “commoditising human relationships at the expense of customer understanding” notes Tricia Wang, a technology ethnographer. She claims 70% of chief marketing officers feel Adtech isn’t delivering valuable customer insights. The New York Times actually saw ad revenue increase after blocking ad exchanges in the EU. Big Data gives the macro picture – an incomplete and inaccurate one – while missing the human narrative. Ironically, Adtech has purged customer-centricity from marketing. It can’t see the human for the zeroes and ones.
For tech’s sake: Reconciling emerging tech and the GDPR
In the lead-up to ‘G-Day’, critics warned GDPR would have a chilling effect on innovation and called on regulators to abandon core GDPR principles in favour of emerging tech. But by pitting privacy against innovation they missed the mark on both. Ironically, their pleas revealed a striking resistance to change and a vigorous defence of ‘business-as-usual’. There is no need for tech and GDPR to be at odds. Privacy lawyer, Abigail Dubiniecki, takes up the story.
And they’re going on the offence
Blackbelts like Pernille Tranberg of Data Ethics are helping consumers and companies find win-win alternatives to corporate surveillance. A digital self defense teacher of sorts, she’s showing young cybernauts the basics so they can avoid a Truman Show future. Consumers can repatriate their data with tools like Yo-Da, Digi.Me and TapMyData, and transact on their own terms at a fair market value with tools like Advantagious.
Website tools like Perimeter block unknown or unsafe third party trackers while permitting privacy-friendly ones. Brave helps publishers get more revenue per view through direct rewards, and visitors can get paid to view privacy-friendly ads. Analytics providers like Matomo give website owners rich analytical data that they own, and helpful privacy controls they can leverage.
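The selective-blocking idea behind tools like these can be sketched in a few lines: allow first-party requests, allow known privacy-friendly third parties, and block known trackers. This is a minimal illustration only; the domain lists, function names, and the crude domain matching are the author's assumptions, not any vendor's actual rules.

```python
# Minimal sketch of selective third-party tracker blocking.
# The blocklist/allowlist entries below are hypothetical examples.
from urllib.parse import urlparse

BLOCKLIST = {"tracker.example", "ads.example"}      # known tracking domains (illustrative)
ALLOWLIST = {"friendly-analytics.example"}          # privacy-friendly third parties (illustrative)

def registrable_domain(host: str) -> str:
    """Crude approximation: treat the last two labels as the site's domain."""
    return ".".join(host.split(".")[-2:])

def should_block(page_url: str, request_url: str) -> bool:
    page = registrable_domain(urlparse(page_url).hostname or "")
    req = registrable_domain(urlparse(request_url).hostname or "")
    if req == page:          # first-party request: always allowed
        return False
    if req in ALLOWLIST:     # privacy-friendly third party: allowed
        return False
    return req in BLOCKLIST  # block only known third-party trackers
```

A real extension would use the Public Suffix List rather than the last-two-labels shortcut, and would update its lists continuously, but the decision logic is essentially this simple.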
Why privacy by design is like going to the gym
Jason Cronk, an expert on privacy by design, a crucial part of the GDPR that is now being rolled out worldwide, talks to Information Age about AI, GDPR and privacy in the data age. He starts by comparing privacy by design to going to the gym.
Companies that wage war on their consumers will eventually lose.
Adversarial? Yes. But it doesn’t have to be. Instead of disabling ad blockers or duping users into consent with deceptive interfaces and dark patterns, advertisers could try respecting people’s choices. Collaborate rather than compete. Get to know your customers. Make it human. Otherwise you’re fighting a losing battle.
We risk a digital crisis in 2019 akin to the 2008 banking crisis, warns data privacy lawyer
Is privacy killing AI?
No, but Adtech will if we’re not careful.
When you treat every online interaction like a digital one-night stand, don’t expect brand loyalty. People want to reclaim their agency. They’re tired of being watched and now they’re watching the watchers. They are voting with their adblockers: with 1.7B people blocking ads, it’s “the biggest boycott in human history.”
Privacy is humanising AI, and it can save Adtech. Privacy-friendly tools level the playing field, encouraging people to share more. So, advertisers: analyse anonymous or encrypted data, and decentralise AI. Put the human back in AI to unlock the elusive customer insights Surveillance Adtech can’t deliver.
GDPR anniversary: has the regulation backfired? What next?
Abigail Dubiniecki is a speaker, educator, and privacy specialist at My Inhouse Lawyer.