Emotional, biometric and wearable technologies are increasingly being used by mainstream brands.
This month saw big hitter Mastercard launch selfie-based biometric payments across Europe, allowing card users to verify transactions with a snap of their smartphone camera or a fingerprint instead of irritating passwords or PIN codes.
Meanwhile, Ford and Xaxis have been using emotional triggers around the UEFA Champions League to boost engagement in Vietnam.
The potential benefits of this kind of data for both sides of the transaction are clear.
Consumers get an enjoyable experience of a product or service that works with refreshing ease, while companies gain access to data about who the audience is and what they want.
But the technology is moving significantly faster than organisational processes, governmental legislation and corporate consciousness can adapt.
The Data Protection Act, which defines UK law on the processing of personal data, was not designed with this reality in mind.
The forthcoming EU General Data Protection Regulation that the UK will have to “adequately” meet does not mention emotions.
In the absence of specific regulations, organisations wanting to make use of emotional, biometric and wearable data should do so safely, ethically, and respectfully. After all, such data is the most personal of the personal: it is intimate and profoundly experiential.
Companies looking to step into the emotional, biometric and wearable data business should consider three key ethical points.
These were identified at a recent workshop hosted at Digital Catapult attended by data protection regulators, industry self-regulators, privacy officers of tech giants, CEOs of emotion–based start-ups and academics interested in data ethics. Namely:
1. Individual control is everything
There is arguably no information more intimate, unique and personal than that produced by our own bodies.
As such, companies should be giving the individuals producing this kind of data complete control.
Use of the data should not begin until people have expressly agreed to share it, and they should fully understand what it will be used for. Don’t be covert and creepy.
2. How consent is determined is vital
Consent is not a privacy panacea, and terms and conditions of course go mostly unread (it is unreasonable to expect people to read them).
Sincerity and simplification are needed. The key is ensuring consent is meaningful, and that use of data about emotions does not begin without a clear and informed affirmative answer, perhaps even offering a “light” version of the service if the user initially declines.
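The opt-in logic described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not any real vendor’s API: the `Consent` states and the function name are invented for the example. The point it encodes is that only an explicit affirmative answer unlocks emotion-data features; a decline, or simply no answer, defaults to the “light” experience that collects nothing.

```python
from enum import Enum

class Consent(Enum):
    GRANTED = "granted"    # explicit, informed yes
    DECLINED = "declined"  # explicit no
    UNSET = "unset"        # no answer yet

def select_experience(consent: Consent) -> str:
    """Gate emotion-data features on an explicit affirmative answer.

    Anything short of an express opt-in (a decline, or silence) falls
    back to a "light" version that uses no emotional or biometric data.
    """
    if consent is Consent.GRANTED:
        return "full"   # emotion data may be used
    return "light"      # no emotion data collected

print(select_experience(Consent.GRANTED))  # full
print(select_experience(Consent.UNSET))    # light
```

The design choice worth noting is that silence is treated identically to refusal: consent is something the user actively gives, never something inferred from inaction.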
3. View privacy as an opportunity, not a barrier
Managing personal data can be seen as hard, complex and merely a matter of compliance.
Done meaningfully, the prize is that ethics and authenticity, along with creativity, build reputations with hard-to-reach potential and existing customers.
Recent research shows that younger people in particular are willing to engage in new emotion-based interactive experiences, but they require control and sincerity from companies, which they see as currently lacking.
Be open and clear, build trust and strong relationships, showcase delightfully creative use of entrusted data, and be upfront. Don’t bury bad news: it will get found.
It’s early days, there’s potential: show leadership and don’t blow it. Emotion is a sensitive business.
Going forward, the appeal for organisations is clear. According to the IPA, emotive campaigns outperform informative campaigns on every business metric: 17% more profit, 30% greater market share and 19% higher sales.
The value in understanding and measuring emotional reactions is obvious. However, the desire to increase effectiveness must be balanced not just with legal compliance, but with good data ethics, sincerity and creativity.
Show respect, put people first, create captivating experiences, and enjoy the financial rewards of doing the right thing.
Sourced by Dr Andrew McStay, director of the Network for the Study of Media and Persuasive Communication at Bangor University