Social media companies say they are working hard to prevent hate speech from being posted on their platforms, and remove it when it is. But that’s an ongoing challenge as they operate in numerous countries with many languages and social contexts.

A new report from the nonprofit Center for Countering Digital Hate reveals anti-Muslim hate speech and misinformation still proliferate online. Imran Ahmed is the founder and CEO of the group. The following is an edited transcript of his conversation with Marketplace’s Kimberly Adams on CCDH’s latest research on the problem.

Imran Ahmed: We identified hundreds of bits of hatred on their platform. Now that in itself is problematic that we could find it so easily, but then we reported it to the platforms using their own reporting tools, so clicking “Report dangerous post.” We went back a few weeks later to check what action did they take? What we found was really disturbing, that 9 out of 10 times, even when notified about the most egregious hatred - glorifying the terrorist at Christchurch, very, very dangerous conspiracy theories and extreme forms of hatred - 9 out of 10 times, they took no action whatsoever.

Kimberly Adams: And do you have any sense how that compares to the way that these platforms respond to other types of hate speech?

Ahmed: It is very comparable to the work that we’ve done studying antisemitism and misogyny. And in fact, it’s very similar numbers to those that we saw when we looked at COVID conspiracy theories and vaccine misinformation.

Adams: How have the platforms responded to your report so far?

Ahmed: To date, most of the platforms haven’t responded. Twitter said, we know that we can do better. YouTube appears to have told people that they have taken down a few of the videos, but they didn’t tell anyone whether they took them down after our report came out, which is what we believe happened. So at this point, whether it’s legislation around the world coming into place, whether that’s in the United Kingdom with the Online Safety Bill, the European Union with the Digital Services Act, or a raft of bills that have been proposed in the U.S. Congress, I think we’re at the point now where it is time for them to take responsibility and show some accountability for the hatred that they allowed to proliferate on their platforms.

Adams: In Europe, the Digital Services Act is meant to hold social media companies more accountable for the content that’s on their platforms. What sort of impact do you think that could have on limiting anti-Muslim hate speech and hate speech in general?

Ahmed: Well, the key thing to most of the legislation is they hold them to account to the standards that they set for themselves. And so by saying that your failure to meet standards which you’ve incorporated into your community standards, you could be liable for damages. And that is, you know, that’s a perfectly reasonable way to regulate that industry.

Adams: What does this sort of hate speech look like across the different platforms? Are there differences in the way that it shows up depending on which platform you’re on?

Ahmed: The truth is that Twitter, for example, is used primarily to affect, because Twitter is really quite a small platform. It has about 200 million users, and they tend to be richer, wealthier elites. So it’s where you go to try and affect elite discourse and political discourse, media discourse on Muslims. Facebook is where you drip, drip misinformation.