Big breaches and data leaks occur with disappointing frequency. We often see incidents (which may or may not be breaches) in which millions of records fall into the wrong hands or are misused. Yet when large swathes of personal information are leaked so regularly, people often respond with apathy, because they don’t see the personal impact of these breaches.
If a fraudster takes out a mortgage using someone else’s details, it is nearly impossible to pin down where those details originated. So it is treated like any other fraud and quickly forgotten. In contrast, people across the globe have reacted to the Cambridge Analytica scandal with outrage. What makes this situation different?
For many years, advertisers and marketers have been trying to understand who their customers are. Big data has allowed that information to be collected and analysed at scale.
>See also: Why do big data projects fail?
Many companies with access to a dataset of habits or personality traits will use it to improve their products and offerings. Video streaming sites can recommend shows and movies they think you’d be interested in, based on your own viewing history and on what others with similar interests have watched. Shopping advertisers use similar algorithms to predict buying patterns.
For example, purchasing a pair of shoes will often lead to shoe polish recommendations. While not perfect, these legitimate uses mostly work well for both sides. They help the retailer upsell or keep the viewer glued to a site for longer, and the consumer enjoys the convenience of “hand-picked” recommendations.
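The "others with similar interests" idea above is the core of collaborative filtering. As a minimal sketch (with hypothetical users and titles, not any real service's data or algorithm), titles can be scored by how often they appear in the histories of users whose tastes overlap with yours:

```python
from collections import Counter

# Hypothetical viewing histories - illustrative data only.
histories = {
    "alice": {"drama_a", "thriller_b", "comedy_c"},
    "bob": {"drama_a", "thriller_b", "docu_d"},
    "carol": {"comedy_c", "docu_d"},
}

def recommend(user, histories, top_n=2):
    """Rank unseen titles by how often they appear in the histories
    of users with overlapping tastes (overlap size as a crude
    similarity weight)."""
    seen = histories[user]
    scores = Counter()
    for other, other_seen in histories.items():
        if other == user:
            continue
        overlap = len(seen & other_seen)  # shared titles = similarity proxy
        for title in other_seen - seen:   # only score what the user hasn't watched
            scores[title] += overlap
    return [title for title, _ in scores.most_common(top_n)]

print(recommend("alice", histories))  # -> ['docu_d']
```

Production recommenders use far richer signals and models, but the principle is the same: aggregate behaviour predicts individual preference.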
When data manipulates you
However, when huge amounts of data on individuals are aggregated into one place, it becomes possible to use that data to manipulate the individual.
In her 2013 paper, Analyzing the Chemistry of Data, Wendy Nather asked whether data should be treated like dangerous chemicals. Individual data elements may be inert, but when combined, they can create a toxic mix.
As Cambridge Analytica’s CEO Alexander Nix explained at a marketing conference in 2017, two individuals could have similar characteristics—the same age, gender, similar incomes and families, subscribing to the same newspapers—but have very different drivers of behaviour according to the OCEAN personality model (openness, conscientiousness, extraversion, agreeableness and neuroticism). By adding this knowledge to the mix, it is possible to understand what really drives people, and to play to their hopes and fears to target them accordingly.
>See also: Big data and analytics in the UK Police Force
It’s worth bearing in mind that targeted pushing of content to individuals is not at all unique to Cambridge Analytica’s practices. It is similar in principle to how a video provider would recommend a video, and it’s what nearly all social media networks do when they push content in a non-chronological order, or serve up selected advertisements.
However, there are two major differences in what Cambridge Analytica allegedly did. First, it violated its agreement with Facebook in obtaining the data profiles of an estimated 50 million Facebook users. Second, people felt manipulated at scale, believing that the company had fundamentally undermined the democratic process in several countries.
What we can learn from this
We’re living in unprecedented times. Big data analytics has made possible scenarios that used to be reserved for the movies. This is not necessarily a bad thing, and we will likely see some good uses arise in the future.
A few years ago, Target made headlines after it inferred that a teenage girl was pregnant before her parents knew. But imagine if the same kind of technology could alert you that you were about to have a heart attack before any physical symptoms occurred – thus saving your life.
Technology is a tool, and it can be used for good or nefarious purposes.
>See also: How Tesco is using AI to gain customer insight
The Cambridge Analytica incident won’t be the last of its kind that is met with public outrage. So, it’s worthwhile for any company that collects personal data about customers to remember the following:
1. Personal information relating to customers, employees and users is extremely valuable. Hackers will go to great lengths to obtain this data, so it must be protected accordingly.
2. Vetting partners and your supply chain is extremely important. If you are allowing a partner to access your company servers or a subset of your data, you need to ensure that the partners are who they say they are and will use the data for the intended purposes only. This goes well beyond a simple paper exercise.
3. Threat detection controls are vital to be able to identify when partners or insiders may be undertaking suspicious activities. By monitoring for untoward actions, an appropriate response can be taken to deal with any potential issues in a timely manner and before they become a larger incident.
4. Finally, any incidents where an individual’s privacy is compromised need to be responded to appropriately. GDPR is clear about the need for rapid response capabilities, and requires breaches to be reported to regulators (within 72 hours, where feasible) and to affected individuals where applicable. The response should involve clear and timely communication that specifies what happened, how it happened, and what steps the company is taking, as well as offering advice to affected individuals.
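The monitoring described in point 3 can start very simply. As a minimal sketch (with made-up access counts and a single statistical signal, not a real product's detection logic), a partner account whose daily data-access volume jumps far above its historical norm can be flagged for review:

```python
from statistics import mean, stdev

# Hypothetical daily record-access counts for one partner account
# over a normal week - illustrative data only.
baseline = [120, 135, 110, 150, 125, 140, 130]

def is_suspicious(today_count, history, z_threshold=3.0):
    """Flag an access volume far above the account's historical norm.
    A real deployment would combine richer signals (time of day, data
    types touched, export destinations), not a single z-score."""
    mu, sigma = mean(history), stdev(history)
    return (today_count - mu) > z_threshold * sigma

print(is_suspicious(500, baseline))  # unusually heavy pull -> True
print(is_suspicious(128, baseline))  # within the normal range -> False
```

The point is not the specific statistic but having any automated baseline at all, so that untoward activity surfaces before it becomes a larger incident.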
Sourced by Javvad Malik, security advocate at AlienVault