Facebook investors will urge firm to drop end-to-end encryption plans

Shareholders say plans could hinder efforts to detect and stop child exploitation cases

Facebook’s shareholders will urge the company to ditch plans to implement end-to-end encryption across its messaging systems, arguing that the move could hinder efforts to detect and stop child exploitation.

The investors are set to discuss and vote on Facebook’s management plans at the company’s annual shareholder meeting today, which is being held virtually due to COVID-19 safety measures.

The meeting will allow them to seek a board review of the company’s plans to use end-to-end encryption and its potential impact on child abuse victims.

Facebook CEO Mark Zuckerberg previously stated that “people’s private communications should be secure” and that the company’s implementation of end-to-end encryption will prevent “anyone – including us – from seeing what people share on our services”.

However, Michael Passoff, CEO of shareholder advocacy service Proxy Impact, said that “shareholders are legitimately concerned that Facebook's role as a facilitator of child abuse and exploitation will spiral even further out of control if it adopts end-to-end encryption without first stopping predators who prey on children”.

“Not only is it the right thing to do, but it is in the best interests of the company which may otherwise face legislative, regulatory, legal, advertising and consumer backlashes,” he added.

Zuckerberg’s plans to implement end-to-end encryption, announced last year, have previously raised concerns about children’s safety.

“When we were deciding whether to go to end-to-end encryption across the different apps, this was one of the things that just weighed the most heavily on me,” Zuckerberg said at Facebook’s internal Q&A session in October 2019.

However, he said he was "optimistic" that criminals could still be identified by other means, such as their patterns of activity.

According to The New York Times, Facebook Messenger alone was responsible for “nearly 12 million of the 18.4 million worldwide reports of” child sexual abuse material (CSAM) in 2018.
