Q&A: the importance of online content moderation for a diverse user base

Dex Hunter-Torricke, head of communications & public engagement at the Meta Oversight Board, spoke to Information Age about how the body maintains fair content moderation for diverse communities

The Meta Oversight Board (OSB) is an independent body that oversees content moderation on Facebook and Instagram, handling appeals against decisions made about material posted on these platforms. Maintaining fairness and accessibility for a diverse, worldwide user base is vital to ensuring that users feel safe, and that online communities continue to thrive.

Dex Hunter-Torricke, the OSB’s head of communications & public engagement, has served in the role since the board’s launch in May 2020, and has a long-standing passion for ensuring that underrepresented groups feel included and safe online. In this Q&A, Dex sheds light on his journey through the tech sector, the importance of content moderation, and how to make big tech more accountable, expanding on insights he shared at a recent panel at the Women in IT Summit UK.

You’ve had a range of impressive roles at big-name tech organisations. What got you into this line of work?

I was one of those young, idealistic and slightly naïve people who wanted to go and change the world.

My first job after university was at the United Nations, and at the start I figured I would spend my entire career in the public sector. But there, I quickly realised the limitations of the UN, and of the public sector generally. The UN comprises many institutions with brilliant people, but they are repeatedly hamstrung by a lack of resources, as well as vast amounts of bureaucracy. The other problem is that when we’re young and growing up, we don’t think about change in a sophisticated way. It wasn’t until later that I realised that governments don’t have a monopoly on change — it’s something we need a lot of different actors to focus on, and most change is driven by everyday people.

After the UN, I joined Google, and from there spent the majority of my career in tech. Although there were a number of superficial differences and different focuses [compared to the public sector], for example on products and services, there is a common struggle to improve people’s lives.

Having navigated that intersection between the public and private sectors, a common frustration I’ve had over the past few years is how little understanding there is within government around the long-term challenges that society faces. Post-pandemic, there will be bigger challenges: climate change, for example, will make Covid look like a walk in the park. I don’t think we’re even vaguely equipped to handle those problems.

From your experience at OSB, what key actions can we take to make big tech more accountable when it comes to content and impact?

The board began making decisions in January 2021. Since then, we’ve issued 23 case decisions, overturning Meta’s original ruling in 16 of them, and made 108 recommendations to what is now Meta.

What we’re now starting to see is a pattern of Meta slowly shifting towards better transparency and treatment of users. Of course, these are still baby steps — we’re very cognisant that the board’s capabilities are narrow, but important nonetheless. For example, we’ve examined Meta’s community standards and looked to ensure that those rules are clear to all users of Facebook and Instagram. That’s just the basic element of transparency: ensuring the rules treat people fairly. If you can’t read the rules and understand them, how can you possibly abide by them?

We’ve pushed Meta to translate its community standards into languages that are used by hundreds of millions of people but had never been included before, which the company has agreed to do. We’ve also got the company to agree to publish informal government requests for data. Meta had already committed to publishing formal requests, and does so, but we were keen to expose informal requests as well, so that researchers and policymakers pushing for transparency can see them.

A number of recommendations were made around the suspension of Donald Trump from Facebook and Instagram, because this was a highly consequential case not just for US citizens, but for users worldwide. One of the major results was telling Meta that it needs a real crisis protocol to deal with these big, novel situations. The company has agreed to do that, and to publish the framework in the next few weeks. We know that the protocol is already being used for cases such as the Ukraine crisis. Our work has covered a very broad range of issues, from Covid misinformation to hate speech around the world, and I expect that range of cases to continue to grow.

How would you say content moderation and regulation have progressed over the last few years, and what can we expect for 2023?

Much of the focus on regulation over the past few years has been on reacting to problems that have already emerged on social media. Initiatives like the EU Digital Services Act and the UK Online Safety Bill are examples of policymakers trying to play catch-up. There is no question that these measures are important — big tech companies themselves have admitted they need to be more tightly regulated.

Over the next couple of years, we should see policymakers look to be more proactive, responding to emerging technologies before it’s too late. One big, current example is the metaverse: it’s the buzzword of the year, but it’s also a real platform shift of enormous substance emerging in the industry, and the engineering has reached a stage where immersive and augmented worlds can be built. I don’t believe policymakers have kept up with those trends — some still chuckle when the metaverse is brought up, assuming it’s mere marketing speak. But I see this being as disruptive as the arrival of smartphones, if not more so, in both a positive and a negative sense.

What is key to closing the ever-present digital divide for those who can’t access digital tools, or the internet more broadly?

The way we think about the digital divide has changed: if you’d asked me about it 10 years ago, I’d have said it was purely about getting people connected to the internet. In 2000, there were only around 1 billion people using the internet, and now we’re over 4 billion, with that number continuing to rise. So I think we should be confident that the tech industry will close that gap in time.

The real focus will now shift to the quality of access within that huge community, which is neither equal nor fair for the majority. Most users globally aren’t wallowing in oceans of unlimited data or a diverse range of content experiences. We’re dealing with an internet where the vast majority of services from big tech platforms continue to be delivered only in English, or optimised for English. This means that entire communities around the world aren’t being given content or services that reflect the local linguistic and cultural nuances necessary to keep them engaged. That serves no one well, not even those of us enjoying the most advanced internet services, because the whole promise of having a ‘world wide web’ was that we would unleash productivity and creativity in every part of the world.

How have you seen diversity, equity, inclusion and accessibility issues evolve during your time in the tech industry – and why do you think things aren’t progressing in the way we’d hoped?

Progress has been slow. I don’t think anyone following DEI issues believes things are moving as quickly as we’d like, because there is a natural tendency to be reactive to problems. Much of the focus over the last few years has been on tackling whatever’s made the biggest headlines. While that’s incredibly important, DEI encompasses immense challenges that reflect the messiness and global nature of modern society.

Another big area I was very focused on when starting out in the tech industry is how socioeconomic diversity issues are treated. It’s not enough to hire talent from specific demographics if those hires reflect only the most privileged members of society. We need to ensure we reflect the full diversity of society.

I was the first in my family to receive a university degree, and when working at tech companies in California I was often the only European in the room. I would think to myself: ‘Why aren’t there more people like me in the room, providing a different point of view?’ We know we have to do better on that.

As communities globally become more economically, culturally and politically powerful, that will influence how companies cater to their interests in line with DEI policies, but we still have a long way to go.

Related:

How the regulation of big tech can affect your business — The UK’s pending Online Safety Bill and the EU’s Digital Services Act are designed to regulate big tech, but issues around ‘legal but harmful’ content and unintended consequences could affect your business.

The biggest diversity, equity and inclusion trends in tech — This article takes a look at the biggest trends in the tech sector relating to diversity, equity and inclusion.

Aaron Hurst

Aaron Hurst is Information Age's senior reporter, providing news and features around the hottest trends across the tech industry.