
Meta, TikTok & other social media CEOs testify before Senate committee

Quick Read

  • Social Media Risks for Youth: Children’s advocates and lawmakers highlight issues like addiction, bullying, and exposure to harmful content on social media platforms.
  • Senate Judiciary Committee Hearing: CEOs of Meta, TikTok, X, and others testified about the impact of their platforms on young users.
  • Testimonies on Exploitation: The hearing included accounts from children and parents about negative experiences on social media.
  • Calls for Regulation: Senators Dick Durbin and Lindsey Graham expressed concerns over the safety of social media for children and the need for regulatory measures.
  • Platform Safety Measures: Social media executives discussed existing safety features and collaborations with nonprofits and law enforcement to protect minors.
  • Snapchat Supports Federal Bill: Snapchat endorsed a bill imposing liability on platforms recommending harmful content to minors, urging the industry to follow suit.
  • TikTok and X’s Policies: TikTok enforces age restrictions, and X (formerly Twitter) claims it does not target children, supporting legislation against child exploitation.
  • Advocates Demand More Action: Child health advocates argue that social media companies have not done enough to safeguard minors, calling for independent regulation.
  • Meta’s Legal Challenges: Meta faces lawsuits alleging it designed addictive features for children and failed to shield them from online predators.
  • Internal Emails Reveal Concerns: Emails between Meta executives reveal discussions about improving youth mental health protections amidst growing political concerns.
  • Meta Enhances Child Safety Features: Recent updates include hiding inappropriate content from minors and introducing measures to discourage late-night use among teenagers.
  • Critics Unconvinced by Meta’s Efforts: Experts argue Meta’s safety features are insufficient and not user-friendly, questioning the company’s commitment to child safety.
  • YouTube’s Absence Noted: Despite its popularity among teens, YouTube was not included in the Senate hearing, raising questions about the platform’s oversight.

The Associated Press has the story:

Newslooks (AP)

Sexual predators. Addictive features. Self-harm and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are dealing with on social media — and children’s advocates and lawmakers say companies are not doing enough to protect them.

On Wednesday, the CEOs of Meta, TikTok, X and other social media companies went before the Senate Judiciary Committee to testify as lawmakers and parents grow increasingly concerned about the effects of social media on young people’s lives.

From left: Discord CEO Jason Citron, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino and Meta CEO Mark Zuckerberg are sworn in during a Senate Judiciary Committee hearing on Capitol Hill in Washington, Wednesday, Jan. 31, 2024, to discuss child safety. (AP Photo/Manuel Balce Ceneta)

The hearing began with recorded testimony from kids and parents who said they or their children were exploited on social media.

“They’re responsible for many of the dangers our children face online,” U.S. Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. “Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”

Senate Judiciary Committee Chairman Sen. Dick Durbin, D-Ill., right, listens as ranking member Sen. Lindsey Graham, R-S.C., left, speaks during a hearing with the heads of social media platforms on Capitol Hill in Washington, Wednesday, Jan. 31, 2024, to discuss child safety. (AP Photo/Susan Walsh)

South Carolina Sen. Lindsey Graham, the top Republican on the Judiciary panel, echoed Durbin’s sentiments and said he’s prepared to work with Democrats to solve the issue.

“After years of working on this issue with you and others, I’ve come to conclude the following: social media companies as they’re currently designed and operate are dangerous products,” Graham said.

He told the executives their platforms have enriched lives but that it is time to deal with “the dark side.”

Discord CEO Jason Citron arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024 in Washington. (AP Photo/Jose Luis Magana)

Beginning with Discord’s Jason Citron, the executives touted existing safety tools on their platforms and the work they’ve done with nonprofits and law enforcement to protect minors.

Snapchat broke ranks ahead of the hearing, backing a federal bill that would create legal liability for apps and social platforms that recommend harmful content to minors. Snap CEO Evan Spiegel reiterated the company’s support on Wednesday and asked the industry to back the bill.

TikTok CEO Shou Zi Chew arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024 in Washington. (AP Photo/Mark Schiefelbein)

TikTok CEO Shou Zi Chew said TikTok is vigilant about enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, doesn’t cater to children.

“We do not have a line of business dedicated to children,” Yaccarino said. She said the company will also support the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.

Yet child health advocates say social media companies have failed repeatedly to protect minors.

Meta CEO Mark Zuckerberg arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024 in Washington. (AP Photo/Mark Schiefelbein)

“When you’re faced with really important safety and privacy decisions, the revenue in the bottom line should not be the first factor that these companies are considering,” said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had opportunities to do this before, they failed to do that. So independent regulation needs to step in.”

Meta CEO Mark Zuckerberg, arrives to testify before a Senate Judiciary Committee hearing on Capitol Hill in Washington, Wednesday, Jan. 31, 2024, to discuss child safety. (AP Photo/Manuel Balce Ceneta)

Meta was a central focus of the hearing, with the Menlo Park, California, tech giant being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms and has failed to protect them from online predators.

People hold photos of their loved ones as they sit in the audience before the start of a Senate Judiciary Committee hearing with the heads of social media platforms on Capitol Hill in Washington, Wednesday, Jan. 31, 2024, to discuss child safety. (AP Photo/Susan Walsh)

New internal emails between Meta executives released by Sen. Richard Blumenthal’s office show Nick Clegg, president of global affairs, and others asking Meta CEO Mark Zuckerberg to hire more people to strengthen “wellbeing across the company” as concerns grew about effects on youth mental health.

“From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the U.S., U.K., E.U. and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Clegg wrote in an August 2021 email.

The emails released by Blumenthal’s office don’t appear to include a response, if there was any, from Zuckerberg. In September 2021, The Wall Street Journal released the Facebook Files, its report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.

Meta has beefed up its child safety features in recent weeks, announcing earlier this month that it will start hiding inappropriate content from teenagers’ accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders. It also restricted minors’ ability to receive messages from anyone they don’t follow or aren’t connected to on Instagram and on Messenger, and added new “nudges” to try to discourage teens from browsing Instagram videos or messages late at night. The nudges encourage kids to close the app, though they do not force them to do so.

But child safety advocates say the companies’ actions have fallen short.

Snap CEO Evan Spiegel arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024 in Washington. (AP Photo/Jose Luis Magana)

“Looking back at each time there has been a Facebook or Instagram scandal in the last few years, they run the same playbook. Meta cherry picks their statistics and talks about features that don’t address the harms in question,” said Arturo Béjar, a former engineering director at the social media giant known for his expertise in curbing online harassment who recently testified before Congress about child safety on Meta’s platforms.

X CEO Linda Yaccarino arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024 in Washington. (AP Photo/Jose Luis Magana)

“Instagram promises features that end up hidden in settings that few people use. Why is ‘quiet mode’ not the default for all kids?” Béjar added. “Meta says that some of the new work will help with unwanted advances. It is still not possible for a teen to tell Instagram when they’re experiencing an unwanted advance. Without that information how can they make it safer?”

Google’s YouTube is notably missing from the list of companies called before the Senate on Wednesday, even though more kids use YouTube than any other platform, according to the Pew Research Center. Pew found that 93% of U.S. teens use YouTube, with TikTok a distant second at 63%.
