May's anti-internet response to terror called "embarrassing"

PM Theresa May has said the internet gives extremists a "safe space to breed"

Prime Minister Theresa May has used the terror attack in London over the weekend to take another swing at introducing backdoors to encryption, a common pattern that has drawn the usual criticism from the tech industry, with figures calling her response "lazy," "embarrassing" and "disappointing".

After the terror attack at London Bridge on Saturday evening, May once again called for Silicon Valley giants to do more against extremism.

She said: "We cannot allow this ideology the safe space it needs to breed. Yet that is precisely what the internet and the big companies that provide internet-based services provide. We need to work with allied, democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremism and terrorist planning. And we need to do everything we can at home to reduce the risks of extremism online."

Her comment touched on two points: first, the removal of extremist propaganda spread via social media; and second, the use of encrypted messaging tools for "planning" such attacks.


May and her government have made similar calls after the other two terror incidents in the country this year. After the Westminster attack, Home Secretary Amber Rudd said: "We need to make sure that organisations like WhatsApp, and there are plenty of others like that, don't provide a secret place for terrorists to communicate with each other". After the Manchester bombing, reports suggested the government planned to start enforcing aspects of the new Investigatory Powers Act that would force online companies to hand over user data via encryption backdoors, assuming the party is re-elected, that is.

Repeated encryption criticism

May's rhetoric was, once again, criticised. The repetition is "embarrassing," Paul Bernal, of the University of East Anglia, said on Twitter. "The knee-jerk 'blame the internet' that comes after every act of terrorism is so blatant as to be embarrassing," he said.

Professor Peter Neumann, director of the International Centre for the Study of Radicalisation at King's College London, said May's response was "lazy". "Most jihadists are now using end-to-end encrypted messenger platforms, e.g. Telegram. This has not solved [the] problem, just made it different," he said on Twitter. "Moreover, few people radicalised exclusively online. Blaming social media platforms is politically convenient but intellectually lazy."

He added: "In other words, May's statement may have sounded strong but contained very little that is actionable, different, or new."

Jim Killock, director of the Open Rights Group, said May's response was "disappointing".


"This could be a very risky approach," he said. "If successful, Theresa May could push these vile networks into even darker corners of the web, where they will be even harder to observe."

"But we should not be distracted: the internet and companies like Facebook are not a cause of this hatred and violence, but tools that can be abused," Killock added. "While governments and companies should take sensible measures to stop abuse, attempts to control the internet is not the simple solution that Theresa May is claiming."

Cory Doctorow, author and internet pundit, cited internet activist Aaron Swartz's call that it's "not okay to not understand the internet anymore". Doctorow said in a blog post: "That goes double for cryptography: any politician caught spouting off about back doors is unfit for office anywhere but Hogwarts, which is also the only educational institution whose computer science department believes in 'golden keys' that only let the right sort of people break your encryption."

Tech response


Leading tech firms stressed they were doing all they could to remove extremist content from their platforms, with Google saying it had spent hundreds of millions of pounds doing so.

Facebook told the BBC: "Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it - and if we become aware of an emergency involving imminent harm to someone's safety, we notify law enforcement."


Twitter echoed that, saying "terrorist content has no place" on its site, adding that it is working to expand its use of technology to automate the removal of such content.

However, online firms have been repeatedly caught out with extremist content on their sites, with the government pulling its own advertising after it was spotted running alongside hate-promoting videos on YouTube, and a government report slammed tech companies for "profiting from hatred". Of course, such content can be removed without tech firms needing to meddle with encryption.

