Ethical design thinking: empowering designers to drive ethical change

Ethical design thinking should be incorporated into organisations as technologies like AI and data sharing become more advanced.

Whether intentional or not, we are surrounded by unethical technology. Technology that excludes groups of people and exploits human psychology has made its way into our everyday lives and has impacted society in ways we never imagined. But it doesn’t have to be this way. We can drive ethical change if we bring a human perspective to technological innovation.

To do this, organisation leaders must turn to their designers and empower them to help rethink how the business innovates and to drive ethical change.

To find out more about this problem and what can be done to correct it, Information Age spoke to Lisa Woodley, vice-president of customer experience at NTT Data, who is raising awareness of the growing importance of ethics in customer experience design.

The technology problem: bias, manipulation and addiction

What exactly is the problem?

First, it’s important to take a step back and think about the different kinds of technology people use on a daily basis, and the data used to drive them.

“When you think about the devices we have, the invasiveness of technology and how connected we’ve become, it’s created this perfect storm of potential for good and potential for evil, just like any other invention,” says Woodley.

Artificial intelligence, for example, is being used to analyse data from all the different sources where people interact, such as social media websites, to build algorithms and make decisions. That’s a good thing, because it allows us to make decisions based on facts. The problem is that this same AI is inherently biased. We are aware of this; it is, after all, a product of the biased data it was trained on, and yet it is driving decisions such as who should receive a mortgage.
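To make that mechanism concrete, here is a minimal, purely illustrative Python sketch (the dataset, features and threshold are all made up, not taken from any real lending system): a classifier trained on historical approval decisions that were skewed against one group simply learns to reproduce that skew for otherwise identical applicants.

# Hypothetical sketch: a model trained on biased historical loan decisions
# reproduces that bias. All data here is synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic applicants: income (the legitimate signal) and a group marker.
income = rng.normal(50, 15, n)
group = rng.integers(0, 2, n)  # two demographic groups, 0 and 1

# Historical labels: approvals depended on income, but group 1 was also
# penalised by past human decisions -- the bias baked into the training data.
approved = (income - 10 * group + rng.normal(0, 5, n)) > 45

model = LogisticRegression().fit(np.column_stack([income, group]), approved)

# For identical incomes, the trained model now approves group 0 far more
# often than group 1, faithfully automating the historical bias.
test = np.array([[50, 0], [50, 1]])
print(model.predict_proba(test)[:, 1])  # approval probability per group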

“This is the first issue: we know AI is biased and we’re giving it complete control of important decisions,” continues Woodley.

The second issue arises from the sheer quantity of data that people are generating every day.

People don’t realise how much data they produce daily, or what it means that a website uses cookies. Add to that the amount of what Woodley calls “self-reported data” that people give up on Facebook, Twitter or Instagram.

“This starts to create data not only about who we are in our purchasing decisions, but it also creates deeper psychological models of us. These psychological models can then be used to manipulate us in ways most people don’t even realise,” she warns.

The third problem concerns the addictive nature of technology and, in particular, our reliance on mobile phones — all of which has been carefully designed.

It’s important to point out that designers did not intentionally set out to make people addicted to their phones or technology. But in the early days of web and mobile, ‘stickiness’ was something clients demanded of designers: businesses wanted people to stay on their websites or apps for as long as possible.

“In this reality, designers invented things like email or Facebook notifications, likes and comments – design patterns that tap into the addiction centre of your brain,” adds Woodley.

These three factors have come together to create a significant problem: addiction to technology causes people to generate ever more data, and they can’t help themselves even when they know how that data might be used. That data is then harvested and passed through algorithms that leverage artificial intelligence to make important decisions, despite the clear presence of bias, and to manipulate people to the detriment of their psychological wellbeing.

“We’re at a point with technology where we’ve gone further down the rabbit hole than we intended, and we need to now focus on pulling ourselves out and avoiding these negative scenarios, without losing all the positive benefits of AI, technology and connectivity,” says Woodley.

Where does responsibility fall?

In one sense, society as a whole is responsible for the current situation.

Until recently, mainly because of a lack of awareness, we didn’t even know there was an issue with our data. But scandals like Cambridge Analytica, where data was used for particularly nefarious ends — in this case, to influence the public’s political thinking — garnered a significant amount of attention and brought the issue of personal data use into the public eye.

But the reality is that businesses have been monetising people’s personal data and using it to influence them for years.

Again, Woodley is quick to state that she doesn’t believe the majority of companies set themselves up to exploit people in this way. “I think they just weren’t thinking about the people. In social media companies, for example, nobody was thinking about the long-term impacts of their platforms on society.

“They saw that the financial gains were tremendous and they could use users’ data to create a significant value proposition, which is the aim of every business,” she explains.

This created a snowball effect, where companies then pushed their designers to ensure users did not leave their website or platform — which is where the problems outlined above first started. Woodley believes the same situation happened with AI, where companies assumed their programmes were unbiased from the start.

“As technology started to get more sophisticated, it’s almost as if the sophistication of the technology outpaced people’s ability to comprehend the ethics of what they were doing,” she adds.

When thinking about responsibility in terms of the role of the designer, it’s important to understand that their job is to bring the human perspective together with the needs of the business and the possibilities of technology to create a great user experience. Unfortunately, designers have often been brought in too late to have a meaningful impact on what was being designed, and as a result they were never in a position to question what they were being asked to do. If a client wanted a sticky site, they would do everything in their power to make it happen. Unable to really question the long-term impact of their work, designers became unwitting enablers of the current situation.

However, designers can help drive the ethical change needed in technology to turn the situation around, if empowered by their businesses.

Ethical design thinking

Designers have started to recognise that some of what they have created is harming people.

They are now starting to look at the use of technology and its impact in the long term, with ethical design at the centre of their thinking.

Alongside designers’ own motivation, companies have accepted that AI bias exists and are changing how they harvest and use people’s data — and designers are central to this change in strategy.

“The core is really around pivoting from what can be done, with the designer coming in at a later stage, to thinking about what should be done, with the designer coming in at the beginning of the process,” says Woodley.

“The designer represents the human. They create what is consumed by the person, and so they should be the ones influencing the line between what the business wants, what is possible from a technology perspective and what is responsible from an ethical perspective,” she continues.

Design thinking, starting with empathy or the understanding of the human, needs to be at the forefront of future technology innovations and services.

We need to flip the current model. Instead of leveraging technology to achieve business goals without taking the human impact into consideration, we need to put the human at the centre of our technology endeavours.

“Ethical design thinking focuses on the goals of the human – what they need and what motivates them. It then aligns these goals with business aspirations, and looks to technology to meet them in a way that will benefit both,” explains Woodley.

This type of thinking will lead to more ethical use of technology. However, for this to become a reality, designers need to be elevated by their organisations and brought into conversations around the business’ goals earlier in the process.

“When I talk to companies about the need to incorporate ethical design thinking, it goes hand-in-hand with them raising up their designers to have more of a voice at the table – it’s about changing the order of things and elevating the designer by recognising that they represent the voice of the human.

“When you’re ideating on technologies and apps and things that you’re developing for people, don’t wait until you’ve finalised the idea before bringing the designer onboard. Bring the designer in upfront to help you figure out what that idea is, so that you have something that is more ethical and that will benefit both your business and the people who are the target for that technology,” Woodley concludes.

Nick Ismail

Nick Ismail is a former editor of Information Age (from 2018 to 2022) who moved on to become Global Head of Brand Journalism at HCLTech. He has a particular interest in smart technologies, AI and...
