How YouTube Is Changing Our Viewing Habits

SCOTT SIMON, HOST:

People watch 1 billion hours of video every day on YouTube. That's 10 times the amount they watched five years ago. And YouTube is now poised to surge above the 1.25 billion hours of television that Americans watch every day. Of course, YouTube personalizes recommended playlists. You see one cute cat video - six more will be lined up for you. You watch one video that purports to show the Abominable Snowman at Starbucks - you get 10 more videos about the snowman, the Loch Ness monster and UFO abductions. What are some of the implications of more and more people making YouTube more and more their source of information and entertainment? Dr. Zeynep Tufekci thinks about technology. She joins us now from the University of North Carolina, Chapel Hill. Thanks so much for being with us.

ZEYNEP TUFEKCI: Thank you for inviting me.

SIMON: So what do you foresee as a result of this change?

TUFEKCI: Well, the devil is in the details, as usual. A platform like YouTube has algorithms designed to recommend to you things that it thinks will be more engaging. And what I've found is that whatever I watched, it would push a more hardcore version of it, across the political spectrum. So something that I found really striking was that if I watched Donald Trump rallies, I would get recommended white supremacist conspiracy theories. And you have examples from radicalized, you know, extremists who, in some of their interviews, talk about going down the rabbit hole of YouTube.

SIMON: I remember somebody once sent me a video purporting to show that man never landed on the moon. And there - for a couple of days thereafter, I had eight or 10 similar videos, each of them more convincing than the last, showing me why man never landed on the moon. Now, I don't mind crawling out on a limb and saying, that is false - man landed on the moon. But at the same time, it makes you wonder, you know - people who don't consider themselves dumb will watch some of those videos and say, see, this proves it. I mean, that was the case with the person who sent it to me. Does YouTube or any other platform have some kind of responsibility not to put misinformation on their site?

TUFEKCI: So there's two things going on. On the one hand, we do have freedom of speech. So if somebody wants to claim we never landed on the moon, I can see that as freedom of speech. But there's no freedom to necessarily be recommended by YouTube, right? So what I see happening is what I find troublesome. If you watch something like that, YouTube could instead recommend to you something debunking those falsehoods, saying, hey, check this out, right? Or...

SIMON: I mean, after seeing that video, I got a lot of stuff saying 9/11 never happened.

TUFEKCI: Right, that's exactly right. So if you look at it from ISIS to white supremacists, it's the key platform that they put their stuff on. And they rely on that recommendation engine to draw more and more people, preferably young, vulnerable, gullible people, to their message.

SIMON: Do you know if YouTube is concerned about this? Have they indicated any qualms?

TUFEKCI: I have not reached out to them or heard from them yet. I will be writing more about this. But this is something every Silicon Valley platform has to consider. This is true for Facebook. This is true for YouTube. And there's a responsibility for platforms that are making so much money to consider: how can we balance this so that people still stay on our site, but with healthier stuff?

SIMON: Zeynep Tufekci, a contributing opinion writer for The New York Times and author of the forthcoming book "Twitter And Tear Gas: The Power And Fragility Of Networked Protest," thanks so much for being with us.

TUFEKCI: Thank you for inviting me.

Transcript provided by NPR, Copyright NPR.
