Facebook is responding to whistleblower Frances Haugen’s testimony by trying to reshape the narrative around hate speech on its platform.

The company’s Vice President of Integrity, Guy Rosen, published a defense of the company’s anti-hate efforts across the social network, arguing that the declining prevalence of hate speech matters more than the mere presence of such content.

Rosen said the prevalence of hate speech on Facebook has fallen by nearly 50 percent over the past three quarters, to 0.05 percent of content viewed, or about five views out of every 10,000.

“The narrative that the technology we use to combat hate speech is insufficient and that we are deliberately misrepresenting our progress is wrong,” he added.

“We don’t want to see hate on our platform, and neither do users or advertisers, and we are transparent about our work to remove it,” Rosen wrote. “What these documents make clear is that our integrity work is a multi-year journey. While we will never be perfect, our teams are constantly working to develop our systems, identify problems, and build solutions.”

That work means taking care to avoid mistakenly removing content, and limiting the reach of people, groups, and pages that are likely to violate its policies.

The Wall Street Journal’s report notes that internal documents show that two years ago, the company cut the time its human reviewers spent on hate speech complaints and made other changes that reduced the overall number of complaints.

This, in turn, helped create the appearance that the company’s AI was more successful at enforcing the rules than it actually was.

Rosen’s response also does not address Haugen’s claim that the company has resisted implementing safer algorithms that would reduce hateful interactions.

The company may well be making strides in reducing hate speech, but that is not the view of Haugen, who says the social media company is not doing enough.

The company sometimes runs into problems when content is mistakenly reported as hate speech, and an overzealous removal system can create incidents of its own. Likewise, by Rosen’s reasoning, hate speech will have only a limited effect if few people view a particular post.

In her testimony, Haugen emphasized that the company catches only a very small minority of offensive material; if true, that remains a problem even if only a small portion of users views the material.
