Facebook purges tens of thousands of apps as part of Cambridge Analytica probe
Removal of software from just over 400 developers 'not necessarily related to data misuse', company claims
Facebook has suspended tens of thousands of applications from its platform as part of a wider investigation into data-sharing practices following the Cambridge Analytica scandal.
The App Developer Investigation, launched in March 2018, has involved reviewing all apps that have access to large amounts of information, as well as those that express a "potential" to break Facebook's policies.
However, despite suspending tens of thousands of apps, the social media giant said this didn't necessarily indicate they posed an active threat to users. Many of these apps, for example, were still in their testing phase, and some developers who did not respond to requests for information also saw their apps taken down.
"In a few cases, we have banned apps completely," Facebook's vice president for product partnerships Ime Archibong said in a statement on Friday.
"That can happen for any number of reasons including inappropriately sharing data obtained from us, making data publicly available without protecting people's identity or something else that was in clear violation of our policies."
Facebook takes a more in-depth look where it has particular concerns, including a background investigation of the developer and technical analysis of the app's activities.
The results of this probe could then mean a developer is required to submit to tougher questioning, or that further inspections of the app are needed.
"We have not confirmed other instances of misuse to date other than those we have already notified the public about, but our investigation is not yet complete," Archibong continued.
"We have been in touch with regulators and policymakers on these issues. We'll continue working with them as our investigation continues."
A Facebook spokesperson was unable to say at the time of writing why these apps are only being suspended now, given that the company's data-sharing policies were tightened in 2014, years before the scandal broke. It's also unclear how Facebook determines whether an app has the "potential" to breach its policies, or why apps that posed no harm to users were suspended.
One app that Facebook banned from its platform outright was myPersonality, which shared information with researchers and third-party companies, without adequate protections for user privacy. The developer also refused Facebook's request to audit the service.
Ironically, Facebook has itself been on the receiving end of such action in the recent past, with Apple stripping the company of its enterprise developer certificate in January for breaching Apple's app guidelines.
The iPhone maker took action because Facebook was marketing a data-gathering app, known as Research VPN, to consumers after distributing it through the Apple Developer Enterprise Program (DEP). This programme has more relaxed rules around data-sharing than the App Store, because it's intended only for internal employee testing and usage.
Facebook insists it has made "widespread improvements" to how the company evaluates and sets policies for developers. These measures include removing a host of application programming interfaces (APIs), and better-staffing teams that investigate malicious developers and illicit software.