
Facebook investors will urge firm to drop end-to-end encryption plans

Shareholders say plans could hinder efforts to detect and stop child exploitation cases


Facebook’s shareholders will urge the company to ditch plans to implement end-to-end encryption across its messaging systems, saying that it could hinder efforts to detect and stop child exploitation cases.

The investors are set to discuss and vote on Facebook’s management plans at the company’s annual shareholders meeting today, which is to be held virtually due to COVID-19 safety measures.

The meeting will allow them to seek a board review of the company’s plans to use end-to-end encryption and its potential impact on child abuse victims.

Facebook CEO Mark Zuckerberg previously stated that “people's private communications should be secure” and that the company’s implementation of end-to-end encryption will prevent “anyone -- including us -- from seeing what people share on our services”.

However, Michael Passoff, CEO of shareholder advocacy service Proxy Impact, said that “shareholders are legitimately concerned that Facebook's role as a facilitator of child abuse and exploitation will spiral even further out of control if it adopts end-to-end encryption without first stopping predators who prey on children”.

“Not only is it the right thing to do, but it is in the best interests of the company which may otherwise face legislative, regulatory, legal, advertising and consumer backlashes,” he added.

Zuckerberg’s plans to implement end-to-end encryption, announced last year, have previously raised concerns about children’s safety.

“When we were deciding whether to go to end-to-end encryption across the different apps, this was one of the things that just weighed the most heavily on me,” Zuckerberg said at Facebook’s internal Q&A session in October 2019.

However, he said he was “optimistic” that criminals could still be identified by other means, such as their patterns of activity.

According to The New York Times, Facebook Messenger alone was responsible for “nearly 12 million of the 18.4 million worldwide reports” of child sexual abuse material (CSAM) in 2018.

