- November 21, 2020
- Posted by: Bastion team
- Category: World News
SAN FRANCISCO • Facebook, for the first time, has disclosed numbers on the prevalence of hate speech on its platform, saying that out of every 10,000 content views in the third quarter, 10 to 11 contained hate speech – a prevalence of about 0.10 to 0.11 per cent.
The world’s largest social media firm, under scrutiny over its policing of abuses, particularly around the US presidential election, released the estimate in its quarterly content moderation report.
Facebook said it took action on 22.1 million pieces of hate speech content in the third quarter, about 95 per cent of which was proactively identified, compared with 22.5 million in the previous quarter.
The company defines “taking action” as removing content, covering it with a warning, disabling accounts, or escalating it to external agencies.
This summer, civil rights groups organised a widespread advertising boycott to try to pressure Facebook to act against hate speech.
The company agreed to disclose the hate speech metric, calculated by examining a representative sample of content seen on Facebook, and to submit itself to an independent audit of its enforcement record.
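The sampling approach described above can be sketched in a few lines. This is purely illustrative – Facebook has not published its methodology in code form, and the sample data, function name, and use of a simple normal-approximation confidence interval here are assumptions:

```python
import math

def prevalence_estimate(sample_views, is_hate_speech):
    """Estimate prevalence: the fraction of sampled content views that
    violate the policy, with a 95% normal-approximation margin of error."""
    n = len(sample_views)
    violations = sum(1 for v in sample_views if is_hate_speech(v))
    p = violations / n
    # 1.96 is the z-score for a 95% confidence level
    margin = 1.96 * math.sqrt(p * (1 - p) / n)
    return p, margin

# Hypothetical sample: 11 hate-speech views among 10,000 sampled views,
# matching the upper end of the reported 10-to-11 figure
sample = [True] * 11 + [False] * 9989
p, margin = prevalence_estimate(sample, lambda v: v)
print(f"Prevalence: {p:.4%} (± {margin:.4%})")
```

The point of sampling views rather than pieces of content is that prevalence weights violations by how often they are actually seen, which is why the figure is expressed per 10,000 views.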
On a call with reporters, Facebook head of safety and integrity Guy Rosen said the audit would be completed “over the course of 2021”.
The Anti-Defamation League (ADL), one of the groups behind the boycott, said that Facebook’s new metric still lacked sufficient context for a full assessment of its performance.
“We still don’t know from this report exactly how many pieces of content users are flagging to Facebook – whether or not action was taken,” said ADL spokesman Todd Gutnick. That data matters, he said, as “there are many forms of hate speech that are not being removed, even after they’re flagged”.
Rivals Twitter and YouTube, owned by Alphabet Inc’s Google, do not disclose comparable prevalence metrics.
Mr Rosen also said that from March 1 until the Nov 3 election, the company removed more than 265,000 pieces of content from Facebook and Instagram in the United States for violating its voter interference policies.
Facebook said it took action on 19.2 million pieces of violent and graphic content in the third quarter, up from 15 million in the second. On Instagram, it took action on 4.1 million pieces of violent and graphic content.