Digital self defense: Is privacy tech killing AI?

AI needs data. Lots of it. The more data you can feed a machine learning algorithm, the better it can spot patterns, make decisions, predict behaviours, personalise content, diagnose medical conditions, power smart everything, and detect cyber threats and fraud. AI and data make for a happy partnership: “The algorithm without data is blind. Data without algorithms is dumb.” Even so, some digital self defense may be in order.

But AI is at risk. Not everyone wants to share, at least not under the current rules of digital engagement. Some individuals disengage entirely, becoming digital hermits. Others proceed with caution, using privacy-enhancing technologies (PETs) to plug the digital leak: a kind of karate chop, a digital self defense. They don’t trust website privacy notices; they verify them with tools like DuckDuckGo’s Privacy Grade extension and, soon, machine-readable privacy notices. They don’t tell companies their preferences; they enforce them with dedicated tools, and they search anonymously using AI-powered, privacy-protective search engines and browsers like DuckDuckGo, Brave and Firefox. These rebels use Privacy Badger to block cross-site tracking by invisible trackers, navigate the world using Google alternatives like OpenStreetMap, message over Signal, and collaborate and store data with zero-knowledge cloud providers like Tresorit and Cozy.
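
For a sense of how the blocking half of that toolkit works, here is a simplified sketch of the heuristic a tracker blocker such as Privacy Badger applies: if the same third-party domain keeps appearing as you browse unrelated sites, it is probably tracking you and gets blocked. The threshold and domain names below are illustrative assumptions, not the extension’s actual code.

```typescript
// Simplified sketch of a Privacy Badger-style heuristic: block a third-party
// domain once it has been observed on several unrelated first-party sites.
// The threshold and domain names are illustrative assumptions.

const TRACKING_THRESHOLD = 3; // sites on which a third party must be seen before blocking

// For each third-party domain, the set of first-party sites where it appeared
const observations = new Map<string, Set<string>>();

function recordThirdPartyRequest(firstPartySite: string, thirdPartyDomain: string): void {
  if (firstPartySite === thirdPartyDomain) return; // same site, not a third party
  const sites = observations.get(thirdPartyDomain) ?? new Set<string>();
  sites.add(firstPartySite);
  observations.set(thirdPartyDomain, sites);
}

function shouldBlock(thirdPartyDomain: string): boolean {
  return (observations.get(thirdPartyDomain)?.size ?? 0) >= TRACKING_THRESHOLD;
}

// The same tracker shows up on three different sites, so it gets blocked
recordThirdPartyRequest("news.example", "tracker.example");
recordThirdPartyRequest("shop.example", "tracker.example");
recordThirdPartyRequest("blog.example", "tracker.example");
console.log(shouldBlock("tracker.example")); // true
```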

Digital self defense

These are some of the tools of digital self defense (AKA Surveillance Self-Defense), and their growing use means privacy tech may be gaming AI algorithms by skewing the Big Data inputs they depend on.

These individuals aren’t MI5 agents. Nor are they unified by a single ideology. They’re fed up with the pervasive online behavioural tracking that now follows them into their offline lives.

This behaviour started with the stunning 2013 Edward Snowden revelations of sweeping US government surveillance programmes that gave the NSA “unprecedented [military] access to civilian communications”, allowing it to slurp emails, photos, videos, live chats and more from the servers of Apple, Google, Skype and Facebook. Similarly invasive tactics by the UK intelligence agency GCHQ also came to light, spurring greater privacy awareness on both sides of the pond. That awareness changed the trajectory of EU data protection law, leading to the pro-consumer, privacy-focused culture reflected in regulations such as the GDPR.

Tech giants rushed to prove their privacy credentials by offering encryption as a default and declaring user privacy a priority. They positioned themselves as trusted data guardians, and consumers gave them their trust. No need for digital self defense when you have trust.

Fast-forward to the Cambridge Analytica scandal in 2018, which revealed something perhaps even more disturbing than government surveillance: Surveillance Capitalism. Fuelled by the Adtech industry, online trackers and shady data brokers scrape, package, analyse, profile, auction, exchange and weaponise our digital identities – or distorted versions of them – delivering ‘precision marketing’ designed to influence our behaviours.

Amazon knows more about us than we know about ourselves. Facebook can predict our next sentence and, more frighteningly, our vote. The Cambridge Analytica scandal revealed how Adtech microtargeting techniques, paired with fake news and psy-ops tactics, could be marshalled to influence voter decisions. This led to Facebook temporarily banning political advertising on its platform earlier this year.

Adtech’s new darling is location-based marketing, which tracks and maps users’ movements, following them into their offline worlds. By combining app data with other sources they’ve collected and compiled, marketers infer intimate details, build rich profiles and segment users by race, sexual preference, financial status, age, eating habits, substance abuse history, political affiliation, hate content and infertility. Those segments are then peddled in the dark alleyways of Real-Time Bidding (RTB) platforms, with potentially harmful effects.
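
To give a flavour of what changes hands in those alleyways, here is a rough, hypothetical sketch of the kind of bid request an RTB exchange broadcasts to bidders, loosely modelled on the OpenRTB format. The field names, values and segment labels are illustrative assumptions, not data from any real exchange or broker.

```typescript
// Illustrative, hypothetical RTB bid request (loosely OpenRTB-shaped).
// Every bidder that receives it sees the location and audience segments,
// whether or not it wins the auction.

const bidRequest = {
  id: "auction-1234",
  imp: [{ id: "1", banner: { w: 300, h: 250 } }],   // the ad slot up for auction
  device: {
    ua: "Mozilla/5.0 (iPhone; ...)",                // truncated user agent
    geo: { lat: 51.5072, lon: -0.1276 },            // precise location from the phone
  },
  user: {
    id: "pseudonymous-user-789",                    // persistent pseudonymous ID
    data: [
      {
        name: "example-data-broker",                // hypothetical segment provider
        segment: [
          { id: "s1", name: "frequent fast-food buyer" },
          { id: "s2", name: "recently visited a payday lender" },
        ],
      },
    ],
  },
};

console.log(JSON.stringify(bidRequest, null, 2));
```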

We’re more than the sum of our zeroes and ones…

The profiles aren’t always accurate, and targeting decisions can be outright offensive: bail bond ads targeting users with African-American-sounding names. Big brands like Pepsi promoted on Nazi websites. One social media user who had been relentlessly targeted with adoption ads after her stillbirth made a public plea to tech giants to fix their algorithms: “If you’re smart enough to realise that I’m pregnant, that I’ve given birth, then surely you’re smart enough to realise that my baby died, and can advertise to me accordingly, or maybe, just maybe, not at all.”

Precision marketing is creepy. It may be harmful. And it’s not really working. It’s “commoditising human relationships at the expense of customer understanding,” noted Tricia Wang, a technology ethnographer, who claimed 70 per cent of chief marketing officers feel Adtech isn’t delivering valuable customer insights. The New York Times actually saw ad revenue increase in 2019 after blocking ad exchanges in the EU. Big data gives the macro picture – an incomplete and inaccurate one – while missing the human narrative. Ironically, Adtech is in danger of purging customer-centricity from marketing. It can’t see the human for the zeroes and ones.

And they’re going on the offensive

Black belts like Pernille Tranberg, co-founder of Data Ethics, are helping consumers and companies find win-win alternatives to corporate surveillance. A digital self defense teacher of sorts, she’s showing young cybernauts the basics so they can avoid a Truman Show future. Consumers can repatriate their data with tools like Digi.Me and Tapx, and transact on their own terms at a fair market value.

Website tools like Blockthrough block unknown or unsafe third-party trackers while permitting privacy-friendly ones. Brave helps publishers earn more revenue per view through direct rewards, and visitors can get paid to view privacy-friendly ads. Analytics providers like Matomo give website owners rich analytical data that they own, along with helpful privacy controls they can leverage.
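
As a flavour of those controls, here is a minimal, privacy-leaning setup sketch for Matomo’s JavaScript tracker. The host (matomo.example.com) and site ID are placeholders, and while the commands reflect Matomo’s documented _paq API, treat the exact configuration as illustrative and check the current docs before relying on it.

```typescript
// Minimal, illustrative privacy-leaning Matomo tracker setup.
// Host and site ID are placeholders; verify commands against Matomo's docs.

const _paq: unknown[][] = ((window as any)._paq = (window as any)._paq || []);

_paq.push(["disableCookies"]);       // collect analytics without setting cookies
_paq.push(["setDoNotTrack", true]);  // skip visitors whose browser sends Do Not Track
_paq.push(["setTrackerUrl", "https://matomo.example.com/matomo.php"]);
_paq.push(["setSiteId", "1"]);
_paq.push(["trackPageView"]);        // record the page view under those constraints
```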

Companies that wage war on their consumers will eventually lose.

Adversarial? Yes. But it doesn’t have to be. Instead of disabling ad blockers or duping users into consent with deceptive interfaces and dark patterns, advertisers could try respecting people’s choices. Collaborate rather than compete. Get to know your customers. Make it human. Otherwise, you’re fighting a losing battle.

Is privacy killing AI?

No, but Adtech will if we’re not careful.

When you treat every online interaction like a digital one-night stand, don’t expect brand loyalty. People want to reclaim their agency. They’re tired of being watched, and now they’re watching the watchers. They’re voting with their ad blockers: with 1.7 billion people blocking ads, it’s “the biggest boycott in human history”. And with the likes of Apple and Google shifting away from third-party cookies, advertisers will need to find new ways to leverage consumer data while retaining trust.

Privacy is humanising AI, and it can save Adtech. Privacy-friendly tools level the playing field, encouraging people to share more. So, advertisers: analyse anonymous or encrypted data, and decentralise AI. Put the human back in AI to unlock the genuine customer insights that surveillance Adtech only pretends to deliver.
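
‘Decentralise AI’ can sound abstract, so here is a toy sketch of one way to do it: federated learning, in which models train on users’ own devices and only aggregated weight updates, never the raw personal data, leave the device. The numbers and model shapes below are purely illustrative.

```typescript
// Toy federated-averaging step: combine locally trained model weights,
// weighting each client by how much data it holds. Values are illustrative.

type Weights = number[];

interface ClientUpdate {
  weights: Weights;   // weights after local, on-device training
  samples: number;    // number of local examples used
}

function federatedAverage(updates: ClientUpdate[]): Weights {
  const totalSamples = updates.reduce((sum, u) => sum + u.samples, 0);
  const averaged = new Array<number>(updates[0].weights.length).fill(0);
  for (const u of updates) {
    const share = u.samples / totalSamples; // larger clients contribute more
    for (let i = 0; i < averaged.length; i++) {
      averaged[i] += share * u.weights[i];
    }
  }
  return averaged;
}

// Three devices contribute updates; the server never sees their raw data
const globalModel = federatedAverage([
  { weights: [0.2, -0.1], samples: 100 },
  { weights: [0.25, -0.05], samples: 300 },
  { weights: [0.15, -0.2], samples: 50 },
]);
console.log(globalModel);
```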

Abigail Dubiniecki is a speaker, educator, and privacy specialist at My Inhouse Lawyer.
