In a speech at the United Nations General Assembly later today, the UK Prime Minister – Theresa May – is expected to tell technology companies to go “further and faster” in removing extremist content.
To drive her point home, the prime minister will also host a meeting with world leaders and representatives of Facebook, Microsoft and Twitter in an effort to stamp out the circulation of hate-filled terror propaganda.
May will challenge social networks – like Facebook and Twitter – and search engines – like Google – to take down terrorist material within two hours. Throughout her time as prime minister, she has repeatedly called for tech firms to come together to help end the “safe spaces” that terrorists enjoy online.
Following recent terror attacks in the UK, ministers – including Home Secretary, Amber Rudd – have called for limits on end-to-end encrypted messaging, such as the service offered by WhatsApp. This is seen as one of those “safe” online spaces for terrorists to communicate, because messages cannot be intercepted and read by third parties. This, of course, raises the contentious issue of online privacy.
Addressing the UN, May will say the terrorists will never win, but that “defiance alone is not enough”.
“Ultimately it is not just the terrorists themselves who we need to defeat. It is the extremist ideologies that fuel them. It is the ideologies that preach hatred, sow division and undermine our common humanity,” she will say.
Internet giant Google said tech firms were doing their part in this fight, but that they need governments and users to help.
In her speech later today, the prime minister will acknowledge and praise the progress technology companies have made since the establishment in June of an industry forum to counter terrorism.
However, she will urge them to go “further and faster” in developing artificial intelligence and machine learning-led solutions that can automatically reduce the amount of time terror content remains available online.
The target for the UK, France and Italy will be one to two hours to take down terrorist content wherever it appears.
Whether this is taken seriously by tech companies will be judged in one month’s time, when ministers at a G7 meeting on 20 October will assess what progress has been made – if any.
Kent Walker, general counsel for Google, who is representing tech firms at May’s meeting later today, said the technology industry would not be able to “do it alone”.
“Machine learning has improved but we are not all the way there yet,” he told BBC Radio 4’s Today programme, in an exclusive interview.
“We need people and we need feedback from trusted government sources and from our users to identify and remove some of the most problematic content out there.”
Referring to bomb-making instructions, he said: “Whenever we can locate this material, we are removing it. The challenge is that once it’s removed, many people re-post it, or there are copies of it across the web.”
“And so identifying it – and distinguishing bomb-making instructions from similar-looking material that may be perfectly legal, perhaps documentary or scientific in nature – is a real challenge.”
Downing Street, however, is confident that the technology companies can achieve the full removal of terror content from the web. “These companies have some of the best brains in the world,” a Downing Street source said.
“They should really be focusing on what matters, which is stopping the spread of terrorism and violence.”