
How Facebook Uses Technology To Block Terrorist-Related Content

Facebook has created new tools for trying to keep terrorist content off the site. (Jaap Arriens / NurPhoto via Getty Images)

Social media companies are under pressure to block terrorist activity on their sites, and Facebook recently detailed new measures, including using artificial intelligence, to tackle the problem.

The measures are designed to identify terrorist content like recruitment and propaganda as early as possible in an effort to keep people safe, says Monika Bickert, the company's director of global policy management.

"We want to make sure that's not on the site because we think that that could lead to real-world harm," she tells NPR's Steve Inskeep.

Bickert says Facebook is using technology to identify people whose accounts were removed for sharing terrorist propaganda in violation of its community standards and who then try to return with fake accounts. She also says the company uses image-matching software to recognize when someone tries to upload a known propaganda video, blocking it before it reaches the site.

"So let's say that somebody uploads an ISIS formal propaganda video: Somebody reports that or somebody tells us about that, we look at that video, then we can use this software to create ... a digital fingerprint of that video, so that if somebody else tries to upload that video in the future we would recognize it even before the video hits the site," she says.

If it's content that would violate Facebook's policies no matter what, like a beheading video, then it would get removed. But for a lot of content, context matters, and Facebook is hiring more people worldwide to review posts after the software has flagged them.

"If it's terrorism propaganda, we're going to remove it. If somebody is sharing it for news value or to condemn violence, we may leave it up," Bickert says.

The measures come in the wake of criticism of how Facebook handles content. Last year, for example, Facebook took down a post of the Pulitzer Prize-winning photo of a naked girl in Vietnam running after a napalm attack. The move upset users, and the post was eventually restored. Facebook has also been criticized for keeping a graphic video of a murder on the site for two hours.

Morning Edition editor Jessica Smith and producer Maddalena Richards contributed to this report.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Heidi Glenn has been the Washington Desk’s digital editor since 2022, and at NPR since 2007, when she was hired as the National Desk’s digital producer. In between, she served as Morning Edition’s lead digital editor, helping the show’s audio stories find life online.