Under growing pressure, Meta vows to make it harder for teens to see harmful content

Facebook and Instagram parent Meta is facing pressure to make its apps safer for teens. (Lionel Bonaventure / AFP via Getty Images)

Meta is making changes to what teens can see on Instagram and Facebook. The company announced on Tuesday that it will start hiding certain types of content on both apps and restrict specific search terms on Instagram. The changes apply to all users under 18.

"Now, when people search for terms related to suicide, self-harm and eating disorders, we'll start hiding these related results and will direct them to expert resources for help," Meta stated in a blog post.

The new policies come as Meta is facing dozens of state lawsuits, possible federal legislation and mounting pressure from child safety advocacy groups to make its social networks safer for kids.

Meta says it removes or limits recommendations of certain types of posts for all users — things like nudity and drugs for sale. The company says it will now restrict teens from even coming across much of this content, including when it's posted by a friend or someone they follow.

Jean Twenge, a psychology professor at San Diego State University and author of the book Generations, says this is a step in the right direction but that it's still hard to police who is actually a teen on Facebook and Instagram.

"You do not need parental permission to sign up for a social media account," Twenge says. "You check a box saying that you're 13, or you choose a different birth year and, boom, you're on."

Twenge, who has consulted for lawmakers in their suits against Meta, says teens have experienced higher rates of depression, negative body image and bullying because of social media. She says studies show that teens who are heavy users of social media are about twice as likely to be depressed or to harm themselves as light users.

"There's clearly a relationship with spending too much time on social media and then these negative outcomes," Twenge says.

A Meta spokeswoman acknowledged people can misrepresent their ages on Facebook and Instagram. She told NPR that the company is investing in age verification tools and technology that can better detect when users lie about their age.

Last May, U.S. Surgeon General Vivek Murthy issued a warning about the risks social media poses to kids. He said the technology was helping fuel a national youth mental health crisis.

The move came as a bipartisan group of federal lawmakers, led by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn., amped up their campaign to pass the Kids Online Safety Act as quickly as possible. If enacted, the legislation would hold tech companies accountable for feeding teens toxic content.

A group of more than 40 states also filed lawsuits against Meta in October, accusing it of designing its social media products to be addictive. Their lawsuits rely on evidence from Facebook whistleblowers Arturo Bejar and Frances Haugen.

Bejar testified before a Senate Judiciary subcommittee in November saying Meta has failed to make its platform safer for kids, despite knowing the harm it causes. His testimony came two years after Haugen detailed similar findings in the Facebook Papers.

Copyright 2024 NPR. To see more, visit https://www.npr.org.

Dara Kerr
Dara Kerr is a tech reporter for NPR. She examines the choices tech companies make and the influence they wield over our lives and society.