The Biden administration wants to fight domestic terrorism. How can technology help?


Earlier this month, the White House published its very first strategy to combat domestic terrorism. The plan includes more funding for investigators and prosecutors, better information sharing between agencies, and efforts to address the underlying causes of violent extremism, such as racism and bigotry.

Technology also has a role to play. Joe Biden’s administration has said it will invest in programs to increase digital literacy and work with tech companies to make it harder to recruit terrorists online.

I spoke with Heidi Beirich, co-founder of the nonprofit Global Project Against Hate and Extremism, which works to expose and counter racism, bigotry and prejudice. I asked her what we know about how to stop extremist ideas from spreading online. The following is an edited transcript of our conversation.

Heidi Beirich (Photo courtesy of Val Downes)

Heidi Beirich: We know for sure that one thing works, and that's deplatforming: removing hate group material from the major tech platforms. There is a lot of evidence that it reduces the number of recruits and the amount of propaganda. And this is not just for white supremacists. It's also true of ISIS, for example, and al-Qaida, which have been massively deplatformed over the years. We have some evidence on the click-through rates of people who watch videos warning them about the dangers of these movements. There are also what are called "redirect programs," where you might search for something about white supremacy, and material comes up on maybe mental health issues or other things that are a kind of pathway to get away from, for example, something that glorified Hitler.

Amy Scott: And what do you think the click-through rates are telling us? Do we know that people are following, say, the mental health link?

Beirich: We know people are clicking on it. We have very little evidence of what happens after that point, which is really the most important point, isn’t it? Are they engaging with a mental health professional? Are they getting help? If they watch a video about the horrors of white supremacy, does it actually change their opinions? This is the kind of data we need to ensure the success of these programs.

Scott: And what could the government do about it?

Beirich: Well, I think it’s very important – and part of this has already started – for the government to make funds available to civil society organizations to start experimenting in this space.

Scott: I imagine government involvement is risky, though, in terms of people's perceptions. Say, a counter-programming video with a government stamp on it could be seen as propaganda.

Beirich: No doubt about it, and there have been several failed government propaganda projects. I mean, certainly during the Cold War, but we also saw it in the fight against ISIS. And in fact, the FBI not too long ago published (you know, I hate to say it, because I have a lot of friends at the FBI) a terrible website called Don't Be a Puppet, which was meant to stop young people from being radicalized, and it was absolutely not evidence-based and fundamentally ridiculous. So you don't want the heavy hand of government, and you don't want pure propaganda; you want facts. And you also need people trained in psychology to succeed at this.

Scott: Is there anything we can learn from the way other countries are approaching this?

Beirich: We have a lot to learn from our allies. Germany, for example, and Sweden too, have for decades had exit programs in place to help people leave extremist movements of all kinds, including things like neo-Nazism. They do a lot of work with young people to try to head off these problems, and they have been more direct in defending democracy against white supremacy and similar threats. These are all things the United States could learn from, and in fact, in Biden's recent strategy paper on countering domestic terrorism, there is a lot of talk about learning from the allies, so hopefully that is exactly what will happen.

Scott: The White House report, or strategy, talks about tackling the root causes of extremism, one of which is systemic racism. But there is a struggle right now over whether we can even teach about it in our schools. What can law enforcement actually do about those kinds of underlying causes?

Beirich: Yes, that's what's complicated about this critical race theory debate. I think it's very unfortunate, because at the root of the white supremacist problem is a failure to really understand the history of racism and the impact of racism in this country. I mean, obviously we wouldn't have this movement if we didn't still have deep strains of racism in the country. Now, when it comes to law enforcement, we need to use it to stop the most terrifying aspects of this threat, hate crimes, for example, and domestic terrorism, but we also need to reform the relationship between law enforcement and communities. It would be nice if law enforcement took hate crimes seriously in the many parts of this country where they don't, which would strengthen those ties and ultimately lead, hopefully, to a reduction in the racism that propels it all.

Scott: Why do you think the United States is so far behind in recognizing the threat of domestic terrorism and white supremacist extremists?

Beirich: Frankly, it was a political failure by administrations of both parties. After September 11, our entire government apparatus, the whole FBI, the intelligence agencies, domestic and international, turned their attention entirely to Islamic extremism. It was as if they had forgotten that Timothy McVeigh had blown up the federal building in Oklahoma City just a few years before, in 1995. And until 2014, very late in the [Barack] Obama administration, there was really no emphasis on this threat as it grew, as more and more people were being killed by white supremacists. I mean, there were serious government failures to stand up, pay attention and realize that this was not an either/or question about which type of terrorism should be our focus. It was an "and" question. And frankly, we should have known better. So now the problem is much worse than it was 10 years ago.

Related Links: More information from Amy Scott

We have more on that disastrous Don't Be a Puppet campaign Beirich mentioned. The interactive website was intended to teach teens how to recognize violent extremist messages and avoid being drawn in by them. But, as Laurie Goodstein reported in The New York Times in 2015, civil rights and religious leaders opposed its focus on Islamic extremism, fearing it would lead to the intimidation of Arab and Muslim students and chill free speech. The site is no longer active.

What about a technological solution to tackle hate speech online? MIT Technology Review reported on a new study that finds artificial intelligence isn't up to the task yet. Researchers tested a number of "state of the art" systems for detecting hate speech, and none of them performed well. Among the scenarios that stumped AI moderators: profanity used in otherwise harmless statements, slurs that have been reclaimed by the targeted group, and references to hate speech that are actually meant to counter it.
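To see why those scenarios are hard, consider a toy illustration (not from the study, and with hypothetical placeholder terms): even a simple blocklist filter, the crudest form of automated moderation, fails in exactly the ways the researchers describe, flagging harmless or counter-speech text while missing coded language.

```python
# Toy keyword-based "hate speech" filter. The blocklist terms here are
# hypothetical placeholders, not drawn from any real moderation system.
BLOCKLIST = {"idiot", "scum"}

def naive_filter(text: str) -> bool:
    """Flag text if it contains any blocklisted word (substring match)."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKLIST)

# False positive: profanity in an otherwise harmless, self-deprecating remark.
print(naive_filter("I keep calling myself an idiot for missing the bus"))   # True

# False positive: counter-speech that quotes the term in order to condemn it.
print(naive_filter("People who call others scum are wrong and should stop"))  # True

# False negative: hostile coded language with no blocklisted word at all.
print(naive_filter("Those people should go back where they came from"))     # False
```

Modern systems use machine-learned classifiers rather than blocklists, but the study's finding is that they still trip over the same category of problem: the harm depends on context and intent, not on the surface words alone.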

And, if the Biden administration wants tech companies to do more to stem extremism, it will need to speak to Alphabet’s YouTube. Earlier this year, USA Today reported on an Anti-Defamation League study that found that even after being pressured to remove extremist content, YouTube still recommended white supremacist videos to viewers who had watched similar content. In response, a YouTube spokesperson said that “the views this type of content gets from recommendations have dropped by more than 70% in the United States.”
