(WASHINGTON) — Legislators on Wednesday introduced a bipartisan bill aimed at protecting children from the potentially harmful impacts of social media.
The bill, sponsored by Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., came after Congress held five hearings in recent months on the dangers of social media for children and teens aged 16 or younger, including one at which a whistleblower testified against Facebook (now Meta) about internal documents showing the tech giant prioritized profits over the mental well-being of children.
While the senators would not comment on the likelihood of the bill passing, they emphasized during a press conference on Wednesday that there is bipartisan support for it in the House and the Senate.
“What we’re doing in this bill is empowering those children and their parents to take back control,” Blumenthal said.
The Kids Online Safety Act of 2022 includes five major elements:
- Social media companies would be required to provide privacy options, let users disable addictive features and allow users to opt out of recommendations such as suggested pages or videos to “like.” It would also make the strongest privacy protections the default.
- The bill would give parents tools to track time spent in the app, limit purchases and help to address addictive usage.
- It would require social media companies to prevent and mitigate harm to minors, including self-harm, suicide, eating disorders, substance abuse, sexual exploitation and unlawful products for minors, like alcohol.
- Social media companies would be required to undergo independent third-party reviews that quantify the risk to minors, assess compliance with the law and determine whether the company is “taking meaningful steps to prevent those harms.”
- Social media companies would be required to give minors’ data to academic and private researchers, who would use it to study what harms children on social media and how to prevent that harm.
“The social media platforms have proven they are not going to regulate themselves. Because of that, we have put the effort into how do we make certain that this is a safer environment,” Blackburn said Wednesday.
Dr. Dave Anderson, a clinical psychologist at the Child Mind Institute, said the bill marks a sensible intersection of tech and public policy.
“I think politicians are taking what we know from the science and saying, ‘How do we build in these safeguards?'” Anderson said.
He said social media algorithms have evolved to show children ever more of what they are already interested in rather than a variety of viewpoints, a dangerous shift for children with mental health issues.
Meta, which owns Instagram and Facebook, had no comment on the legislation, but a spokesperson pointed to the company’s announcement in December that it is taking a stricter approach to recommendations for teens, including nudging them toward different topics if they have been dwelling.
A Snap Inc. spokesperson said the company spends a tremendous amount of time and resources to protect teenagers and has tools that allow kids to report concerning behavior and turn off location services, as well as resources for parents.
TikTok updated its policies on Tuesday to “promote safety, security, and well-being on TikTok,” according to a press release written by TikTok Head of Trust and Safety Cormac Keenan.
TikTok and Twitter did not respond to ABC News’ request for comment.
Copyright © 2022, ABC Audio. All rights reserved.