SAN FRANCISCO: Facebook fired back after a series of damning Wall Street Journal reports said the company had failed to protect its users.
The company is under relentless pressure to keep its platform from becoming a vehicle for misinformation and hatred while remaining a forum for people to speak freely, a balance it has found difficult to strike.
A number of recent Wall Street Journal reports said the company knew its Instagram photo-sharing app was harming the mental health of teenage girls, and that its two-tiered moderation system allowed VIPs to bypass the rules.
One of the articles cited Facebook’s own research, saying that a change to its software in 2018 ultimately fueled political outrage and division.
But Facebook announced Tuesday that it has spent more than $13 billion over the past five years on teams and technology dedicated to combating abuse.
Around 40,000 people now work on safety and security at the Californian technology giant, a number that Facebook says has quadrupled since 2016.
“How technology companies grapple with complex issues is being intensely scrutinized, and often without important context,” Facebook said in a blog post.
The social network launched a website, about.facebook.com/progress, to showcase its anti-abuse work.
Facebook’s Nick Clegg also attacked the coverage in a blog post on Saturday, saying the articles were unfair.
“At the heart of this series is a downright false claim: that Facebook conducts research and then systematically and deliberately ignores it when the findings are inconvenient for the company,” he wrote.
The Journal stories, in part, cited studies commissioned by the company that had troubling revelations such as, “We make body image problems worse in one in three teenage girls.”
Clegg said the stories selectively used quotations in a way that provided a deliberately one-sided view of the company’s work.
“We will continue to ask ourselves the tough questions. And we will continue to improve our products and services as a result,” he said at the end of his post.
Facebook recently launched an initiative targeting groups of users who coordinate on the platform to promote real-world violence or spread conspiracy theories, starting with the takedown of a German network spreading Covid misinformation.
The new tool is designed to detect coordinated, malicious campaigns that pose a threat but do not violate the social media giant’s existing rules against hate groups, said Facebook security chief Nathaniel Gleicher.