US lawmakers unleashed a torrent of criticism against social media top executives at a Capitol hearing Thursday, blaming the companies for amplifying false content and calls to violence, while promising new regulations to stem rampant online disinformation.
The hearing, featuring the top executives of Facebook, Google and Twitter and the latest in a series of events examining the policies of tech platforms, got off to a stormy start as lawmakers rejected the CEOs' claims about their efforts to keep harmful content off their services.
"It's now painfully clear that neither the market, nor public pressure, will force the social media companies to take the aggressive action they need to take to eliminate disinformation and extremism from their platforms and therefore it's time for Congress and this committee to legislate," said Representative Frank Pallone, chairman of the House of Representatives panel holding the hearing.
"Rather than limit the spread of this information, Facebook, Google and Twitter have created business models that exploit the human brain's preference for divisive content to get Americans hooked on their platforms at the expense of the public interest."
At the hearing, held remotely over video, Democrats slammed the platforms for failing to act to stem misinformation about Covid-19 vaccines and incitement ahead of the January 6 Capitol violence. Republicans meanwhile revived complaints that social networks were biased against conservatives.
Republican Representative Bob Latta accused the firms of operating "in a vague and biased manner, with little to no accountability," relying on a law giving them a "shield" against liability for content posted by others.
"My constituents simply don't trust you anymore," said Republican Gus Bilirakis.
"People want to use your services, but they suspect your coders are designing what they think we should see and hear."
- Free expression vs. moderation -
The tech CEOs said they were doing their best to keep out harmful content.
"We believe in free expression, we believe in free debate and conversation to find the truth," said Twitter CEO Jack Dorsey.
"At the same time we must balance that with our desire for our service not to be used to sow confusion, division or distraction. This makes the freedom to moderate content critical to us."
Dorsey said in his written remarks that "every day Twitter grapples with complex considerations on how to address extremism and misinformation."
Facebook chief executive Mark Zuckerberg said dealing with false and harmful content is a complex challenge.
"People often say things that aren't verifiably true, but that speak to their lived experiences," he told the panel.
"I think we have to be careful restricting that. For example, if someone feels intimidated or discriminated against while voting, I believe that they should be able to share their experience... I don't think anyone wants a world where you can only say things that private companies judge to be true."
At the same time, the Facebook founder said, "we also don't want misinformation to spread that undermines confidence in vaccines, stops people from voting, or causes other harms."
Google CEO Sundar Pichai, whose company includes YouTube, defended the actions of the video platform, saying that after the January 6 violence it "raised up authoritative sources across our products on YouTube," and "removed livestreams and videos that violated our incitement to violence policies."
Pichai said Google's mission is "providing trustworthy content and opportunities for free expression, while combating misinformation. It's a big challenge."
Zuckerberg offered lawmakers a proposal to reform the liability shield known as Section 230, suggesting that platforms be required to have systems in place to filter and remove illegal content.
He maintained that Congress "should consider making platforms' intermediary liability protection for certain types of unlawful content conditional on companies' ability to meet best practices to combat the spread of this content."
Lawmakers said they would introduce their own proposals to reform Section 230.
"The regulation that we seek should not attempt to limit constitutionally protected freedom of speech, but it must hold platforms accountable when they are used to incite violence and hatred or, as in the case of the Covid pandemic, spread misinformation that costs thousands of lives," said Democratic Representative Jan Schakowsky.
Pallone meanwhile told the executives: "Your business model itself has become the problem and the time for self-regulation is over. It's time we legislate to hold you accountable, and that's what we're going to do."
Some lawmakers argued that platforms like Facebook use algorithms that amplify inflammatory content, and seek to keep people addicted to social media to boost revenue.
Representative Adam Kinzinger cited research saying Facebook algorithms "are actively promoting divisive, hateful and conspiratorial content because it engages users to spend more time."
Zuckerberg responded that "there's quite a bit of misperception about how our algorithms work and what we optimize for."
He added that "we are trying to help people have meaningful social interactions" but that "that's very different from setting up algorithms" that lead to addiction.