© 2024 Ideastream Public Media


WKSU is a public media service licensed to Kent State University and operated by Ideastream Public Media.

Facebook's Own Research Says Its Apps Can Harm Mental Health. Senators Have Questions

LEILA FADEL, HOST:

Facebook's global head of safety fielded tough questions on Capitol Hill today as senators accused the company of concealing data confirming that Facebook and Instagram harm some young people's mental health. Here's Senator Richard Blumenthal.

(SOUNDBITE OF ARCHIVED RECORDING)

RICHARD BLUMENTHAL: It has weaponized childhood vulnerabilities against children themselves. It's chosen growth over children's mental health and well-being, greed over preventing the suffering of children.

FADEL: The hearing comes after a Wall Street Journal investigation uncovered Facebook's own research, which showed that the photo-sharing app led to body image issues or worse in many teens. In response, Facebook has said the research was taken out of context. Jeff Horwitz is part of the team that reported that investigation, and he joins us now.

Hi, Jeff.

JEFF HORWITZ: Hello.

FADEL: So before we get started, I should note that Facebook is an NPR sponsor. But let's start with what your investigation found about the impact of Facebook and Instagram on teens and mental health. What did you learn?

HORWITZ: So the company's been looking at this for a number of years. And what they found is that for most users, Instagram is perfectly fine. However, for users who come to the platform with some level of mental vulnerability, which is to say a lot of teenagers, it can be really problematic. And in particular for teenage girls, it can make body image issues worse. And in fact, they found that among users they surveyed who had thought about harming themselves in the last month, a non-trivial percentage - 6% of U.S. teenagers, 13% of British teenagers - traced the desire to kill themselves back to the app itself.

FADEL: Wow. Wow. And you've been following the hearing today. What's been happening?

HORWITZ: Well, Facebook, last night in advance of this, released a couple of the researchers' slide decks that we had cited in our reporting. We then released another four. And the interesting thing is Facebook undercut the legitimacy and value of its own research in a really surprising way. They basically said that it did not show the things that the researchers said, and that the researchers had overstated the value of their own work in these presentations to executives. So Facebook defended itself by saying simply that the research had been misconstrued. Obviously, members of the Senate on both sides did not see it that way.

FADEL: Tell us about the choice that Facebook made. Who did they send to this hearing today?

HORWITZ: So they sent Antigone Davis, a woman who is in charge of their safety efforts. She's not someone who would have been responsible for the problems that are alleged here. She would be the person trying to fix them. So it was an unusual choice, shall we say, for the company to send her out.

FADEL: And ultimately, what were the big questions she was facing?

HORWITZ: The major questions were why Facebook hadn't disclosed this and why it wasn't doing more as a result of this work. So I think the senators considered this to be proof that the company's products were in fact harmful, and they believed that Facebook had an obligation to do far more than it has - either to keep young people off its platform entirely if it's not safe, or to figure out how to make it safe for them. And they certainly did not like the idea of Facebook resurrecting an idea that was quite live until the beginning of this week, which was an Instagram platform meant for children under the age of 13. They considered it reckless and irresponsible to proceed, given what Facebook knows from its own work.

FADEL: And that's on indefinite pause right now, right?

HORWITZ: Yeah, they've said that they will bring it back and that this will move forward, but they are taking a break to consult with everyone and sort of try to make clear why they believe this is a good thing for the world and a good thing for children.

FADEL: So what happens next?

HORWITZ: Well, there was a bipartisan agreement that Facebook did not acquit itself well. And so there was also talk about changing the laws related to children's safety on the internet, which, you know, go back to the '90s and, candidly, don't really seem to contemplate the existence of social media in its current form. And some of the senators talked about how it had been a frustration that this hadn't happened yet - they've been working on it for a long time. But, you know, they suggested that there might now be an impetus and a momentum to get it done.

FADEL: So rare to hear the term "bipartisan agreement" these days. That was Wall Street Journal reporter Jeff Horwitz.

Thank you so much for your reporting.

HORWITZ: Thank you.

FADEL: If you or someone you know may be considering suicide, contact the National Suicide Prevention Lifeline at 1-800-273-8255.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.

Leila Fadel is a national correspondent for NPR based in Los Angeles, covering issues of culture, diversity, and race.
Mia Venkat
[Copyright 2024 NPR]