NATO Targets Disinformation Efforts

SCOTT SIMON, HOST:

Of course, NATO was created as a Western military alliance. But when NATO-sponsored experts met in Riga, Latvia, this week, it was to examine a non-military adversary of increasing concern to them - social media manipulation. The prevalence of what experts refer to as malign influence campaigns - the rest of us call it disinformation - is both growing and spreading.

Janis Sarts heads the NATO Strategic Communications Center of Excellence in Riga. Mr. Sarts, thanks so much for being with us.

JANIS SARTS: Thank you.

SIMON: Is disinformation, in your mind, the equivalent of a military threat?

SARTS: Certainly it is a national security threat. And in particular, it is increasing because of the way technology has transformed the overall information landscape. That also allows for new ways of manipulating information and using disinformation to affect our cognitive spaces. And that is, obviously, increasingly a national security threat.

SIMON: And how would it be a threat to national security?

SARTS: Well, because it undermines the internal societal discussion processes. These actors use such systems to create distraction around certain sets of fake news. And in these cases, that is meant to undermine the stability within the given society.

SIMON: Can you give us a for instance that you were able to notice?

SARTS: There was, a few years ago, a referendum for Catalonian independence. And then suddenly, bots - which seemed to be Venezuelan bots - started to spread these highly divisive images of atrocities by the police, most of them inaccurate. They were meant to deepen the crisis that was unfolding, thus creating fragility in Spain and using the opportunity to divide the society.

SIMON: This isn't just happening in Europe, is it?

SARTS: No, no, no, no. Actually, Europe is a small place. But, of course, in the United States, in Asia, in India, Mexico, Brazil - it's all over the place. The interesting twist, though, is that the core of this manipulation infrastructure is actually Russian-owned.

SIMON: And who does it? Is it foreign intelligence services?

SARTS: We know there is a very, very vibrant, vast ecosystem that sells the SERPs (ph). We estimate that most of that is commercial activity, but we believe that within that commercial activity you would also have other state-related actors. All of that is for sale. And to give you the context, we bought 50,000 engagements during the EU election for 300 euros. So it's really very accessible. And it is cheap.

SIMON: Social media companies have to bear special responsibility for this?

SARTS: Well, obviously, what this whole ecosystem is doing is undermining the credibility of what takes place within these platforms. So I would believe that is not in their business interest. But you clearly see that their current practice is not sufficient to confront that risk. I think it would also be important that there is some ability, from a public perspective, to get some transparency. Is what these social media companies say they're doing the real deal? Does it deliver? Because in most cases, we have to take their word for it.

SIMON: I think a lot of us have, obviously, heard about trolling and deepfakes. What technology particularly concerns you right now?

SARTS: Well, right now, what I'm really worried about is the way big data, AI and increased knowledge of how the human mind operates - the behavioral sciences - are coming together. You know, we recently ran an experiment together with one of the allied militaries where, in a military exercise, we tried to see whether, by scraping open-source data, we could identify which soldiers were participating in the exercise and, based on these openly available data sets, whether we could influence what soldiers do in a military exercise.

And what we were able to accomplish was to make soldiers disobey orders and leave the positions they were supposed to defend during a military exercise, which to me indicates the trajectory of the risk, because skilled actors, and the amount of data that we are leaving as people in the digital space...

SIMON: Yeah.

SARTS: ...With a good AI capability, can deliver a very significant ability to manipulate behavioral outcomes. And that is something we have to start solving very, very quickly from a national security perspective but also from an individual privacy and individual security perspective, because there are, of course, many ways one can use that for malign purposes.

SIMON: Janis Sarts heads the NATO Strategic Communications Center of Excellence in Riga, Latvia. Thanks so much for being with us, Mr. Sarts.

SARTS: Thank you very much. Transcript provided by NPR, Copyright NPR.
