WASHINGTON, Nov. 7 (UPI) -- A Meta whistleblower sounded the alarm to Congress in testimony Tuesday about Instagram's product design, alleging that the popular social media platform knowingly created products at the expense of teens' mental health and safety.
In response, senators on both sides of the aisle committed to bringing key social media regulation bills to a floor vote by the end of the calendar year.
"I'm tired of waiting," said Sen. Josh Hawley, R-Mo., ranking member of the Senate Judiciary Committee's Subcommittee on Privacy, Technology and the Law.
Sen. Richard Blumenthal, D-Conn., the subcommittee's chairman, joined Hawley's pledge to bring a vote before 2024.
Former senior Meta engineer Arturo Bejar's appearance came weeks after 41 state attorneys general filed a federal lawsuit against the tech giant last month.
The lawsuit was spurred by issues Bejar raised during his time at the company, which he first joined in 2009, including internal data showing that one in eight Instagram users aged 13 to 15 had experienced an unwanted sexual advance on the platform within the previous seven days. But Bejar said he was ignored when he contacted the company's executives, including CEO Mark Zuckerberg.
"Meta knows the harm that kids experience on the platform," Bejar said. "The executives know that their measures fail to address it."
After seeing his own 14-year-old daughter subjected to repeated sexual advances while using the platform, Bejar said he was motivated to create change and returned to Meta in 2019.
He was appointed by Meta executives as a safety and well-being consultant and created goals for the company based on his daughter's experiences.
Bejar said Meta's safety measures, which rely on machine-learning models and automated processes to regulate user behavior, were not doing enough to catch inappropriate content.
"There are certain features being developed in response to public outcry which were, in reality, a placebo," Bejar said. "Rather than being based on user experience data, they were based on very deliberately narrow definitions of harm."
While at Meta, he developed the "Bad Emotional Experience Feedback" user survey, which found that 26% of users under 16 years old reported a bad experience on the app within the past week because of hostility based on race, religion or identity.
The survey also found that Instagram users reported witnessing bullying on the app at roughly 100 times the rate reflected in Meta's internal statistics.
After sharing these findings with executives in 2021, Bejar said, the company was not receptive. He went public with The Wall Street Journal about his experiences and Meta's response, which ultimately prompted legislators and attorneys general to take bipartisan action.
"These are no longer companies that started in a garage with two guys tinkering around with platforms on computers in their college dorm rooms," said Sen. Amy Klobuchar, D-Minn. "These are real lives that are being lost."
Klobuchar was joined by other lawmakers in emphasizing that these statistics represent real lives lost. Federal health authorities, including officials at the Centers for Disease Control and Prevention and the surgeon general, have pointed to the dangerous effects of social media on teen mental well-being amid rising suicide rates.
Citing dopamine-manipulating features and addictive algorithms, lawmakers likened the need for social media regulation to past actions taken against tobacco companies.
"Social media is looking more and more like tobacco," Blumenthal said. "We have another addictive product where Big Tech, like Big Tobacco, is saying, 'It's the parent's responsibility. It's the kids. It's not our product.'"
Sen. Lindsey Graham, R-S.C., highlighted Meta's influence, noting it spent more than $19.2 million on lobbying the federal government in 2021. Graham urged his colleagues to "follow the money trail" and reject funds from these companies to curtail their power in government until they change.
"They successfully shut down every meaningful piece of legislation every year," Hawley said. "We'll get all kinds of speeches in committee, we'll get speeches on the floor about how we have to act, and then this body will do nothing. Why? Money."
By bringing bipartisan legislation to the floor for a vote, Hawley said, lawmakers beholden to Big Tech money would be forced to make a decision on the record.
Lawmakers advocated for measures such as enabling users to opt out of algorithm-based recommendations and providing third-party researchers with algorithmic datasets from social media companies to better understand minors' safety.
Blumenthal assured the subcommittee that he would work with Senate Majority Leader Chuck Schumer, D-N.Y., to bring a vote for bills related to the issue before the end of the year.