© 2023 WSHU

Blumenthal, Congress fear social media endangers kids

Sen. Richard Blumenthal, D-Conn., right, speaks to former Facebook data scientist Frances Haugen, center, during a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, on Capitol Hill, Tuesday, Oct. 5, 2021, in Washington. (Alex Brandon / Associated Press)

On Tuesday, U.S. Senator Richard Blumenthal (D-CT) questioned policy leads from TikTok, Snapchat and YouTube about how their social media platforms affect vulnerable young users.

Blumenthal, who chairs the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, led the hearing after a Facebook whistleblower testified before the same subcommittee earlier this month about the dangers of big tech companies and their algorithms. A video of that earlier hearing went viral after Blumenthal asked Facebook’s global head of safety whether the company would commit to “ending finsta” — fake accounts run by children on Instagram to get around their parents.

He wants urgent congressional action to protect kids and teens online. Blumenthal and Sen. Edward Markey (D-MA) will lay out an agenda for Congress on Wednesday to protect children’s online privacy, including passing their bipartisan legislation, the Children and Teens’ Online Privacy Protection Act.

Blumenthal said on Tuesday that these social media platforms all use algorithms that hurt children.

“The algorithms push emotional and provocative content — toxic content — that amplifies depression, anger, hate, anxiety because those emotions attract and hook kids and others to their platforms,” said Blumenthal.

Blumenthal warned parents not to trust big tech companies to keep their kids safe.

The three companies sought to distance themselves from the scrutiny facing Facebook by emphasizing their own safety provisions. Snapchat pointed out that one of its core protections is that it only allows users ages 13 and up.

“We make no effort, and have no plans, to market to children,” said Jennifer Stout, vice president of Global Public Policy for Snap Inc.

Michael Beckerman, TikTok’s vice president for the Americas, said TikTok has established numerous safety measures to protect young users in recent years.

“We work to create age-appropriate experiences for teens throughout their development,” said Beckerman. “For example, people under 16 automatically have their accounts set to private, they cannot host live streams and they can’t send direct messages on our platform.”

Beckerman said the company knows trust must be earned, and that it is seeking to earn it through action, transparency and accountability.

Leslie Miller, vice president of Government Affairs and Public Policy at YouTube, said that YouTube Kids, created in 2015, gives parents the tools to control the app for their children and is safe for both. She also said kids under 13 must be supervised by parents or they are not allowed on the site.

“Being different from Facebook is not a defense,” Blumenthal said, adding that the committee does not want a “race to the bottom” for child protection features among social media platforms.

“I want a market where the competition is to protect children,” he said.

Clare is a former news fellow with WSHU Public Radio.