Studies describe impact of Facebook and Instagram on the 2020 election

STEVE INSKEEP, HOST:

Academic researchers got a chance to study something that many people assume is true, that social media makes us more partisan. You know the basic idea. People get fed things on Facebook. And each time they click on propaganda, computer algorithms track that and feed them more distortions, more stuff, more lies for more clicks - a feedback loop. Meta, the owner of Facebook, allowed researchers access to data to find out if this really leads to political polarization. They published findings in the journals Science and Nature. And Talia Stroud of the University of Texas at Austin is one of those who did the research.

Good morning.

TALIA STROUD: Good morning. Nice to be here, Steve.

INSKEEP: I'm sure it's hard to get to the bottom line here. But as best you can tell, does social media like Facebook put us all in silos where we keep getting stuff that confirms our existing beliefs?

STROUD: Well, there are nuanced findings here, and let me be clear about what we found. So first, when we looked at political news URLs posted on Facebook at least 100 times by U.S. adult users, we found that many political news URLs were seen and engaged with primarily by conservatives or liberals, but not both. And so that certainly speaks to polarization and fragmentation on the platform. However, when we reduced the amount of like-minded content seen by participants who gave us their permission to change their feed, we didn't find any significant effects on their levels of affective polarization.

INSKEEP: I want to explain what you're saying. You're saying that conservatives are getting a different news diet, different articles, than liberals. They're rarely reading exactly the same things. But when you gave them a more varied news diet, they still believed whatever they believed before.

STROUD: Yeah, that's it. When we reduced the content from like-minded users and pages and groups on the platform over a period of three months in the 2020 election, we didn't find that it had a significant effect on their levels of polarization.

INSKEEP: Now it's interesting. Meta is kind of excited about this research. They actively collaborated. They gave you access to data. As you mentioned, they allowed you to change people's feeds with their permission, and they've characterized it all in a certain way. They've said that this shows that it's not true that their algorithms drive polarization. Are they correct to say that their algorithms do not drive polarization?

STROUD: You know, I think that the research that we published on ideological segregation suggests that there is ideological segregation on the platform and that based on the political news URLs that people are sharing, there are differences in what news is being consumed by liberal and conservative audiences on the platform. Now, it is correct to say that in the experiment that we did on like-minded sources, that didn't affect political polarization, nor did our research where we removed reshared content or where we switched people's feeds from the standard Facebook feed and Instagram feed to a chronological feed where the most recent content appeared first. So I think there's a bit of a mixed bag here in terms of the relationship between platforms and the content that people see and the way they react.

INSKEEP: I wonder if there's a question about intensity here. It's not that you maybe were a liberal and suddenly became conservative because you saw some Facebook posts, or the reverse. But I wonder if, because of the nature of algorithms feeding you more and more stuff to which you're already predisposed, people just feel more intensely partisan, have a more intense dislike for the other side, because they are continually being told through the algorithm that the other person is evil and terrible and eats children, or whatever they're being told.

STROUD: That's one thing that we did analyze with the idea of affective polarization, because that's really trying to get at how you feel about members of your own party and members of the other party. Are you feeling more intensely toward the other side, more negative toward them and more positive toward your own side? In these studies, when we did these experiments, we didn't find that those levels increased.

INSKEEP: Oh, that's really interesting. So I've got just about 10 seconds. I'm curious if you use Facebook much yourself.

STROUD: I do use Facebook.

INSKEEP: OK. Do you find it useful?

STROUD: You know, I think there definitely are uses for Facebook, such as keeping up with friends and family. But I think this research really leads people to think, or I hope it leads people to think, about what the algorithm is surfacing for them and who they're connected with on the platform.

INSKEEP: Just be conscious that you're being manipulated. Talia Stroud, thanks so much.

STROUD: Thank you.

INSKEEP: She's a professor at the University of Texas at Austin. Transcript provided by NPR, Copyright NPR.
