WASHINGTON – Former President Donald J. Trump repeatedly called for the repeal of the law that protects tech companies from liability for what people post. President Biden, as a candidate, said the law should be “revoked.”
But lawmakers aiming to weaken the law have begun to agree on a different approach. They are increasingly focusing on removing protections for specific types of content rather than making wholesale changes to the law or removing it altogether.
That still leaves them with a question with far-reaching consequences: What exactly should lawmakers cut back?
A bill introduced last month would strip the protections from content that companies are paid to distribute, like advertising, among other categories. Another proposal, expected to be reintroduced from the last congressional session, would allow people to sue when a platform amplifies terrorist content. And a third would withhold the law’s protection for a piece of content only when a platform fails to comply with a court order to remove it.
Even these more modest proposals to change the legal shield, Section 230 of the Communications Decency Act, could ripple across the internet. The changes could give companies like Facebook and YouTube an incentive to take down certain types of content while leaving others up. Critics also argue that the potential for unintended consequences is enormous, citing the 2018 law that stripped the immunity of platforms that knowingly facilitate sex trafficking, which they say made some sex work more dangerous.
“I think what we’re trying to say is, ‘How can you narrow some of the exemptions to 230 in a way that doesn’t take away speech rights?’” said Senator Mark Warner of Virginia, who introduced legislation to trim the law with a fellow Democrat, Senator Mazie K. Hirono of Hawaii.
Calls for change gained momentum after the January 6 attack on the Capitol, carried out in part by people connected with QAnon and other conspiracy theories that thrived on social media. Critics claim that the shield has allowed tech giants to ignore criminal activity, hate speech, and extreme content posted on their services.
The law protects websites from many lawsuits over content posted by their users or over how the sites choose to moderate that content. Passed in 1996, it enabled the rise of large online services because they did not need to take on new legal liability every time they added another of their billions of users.
Big tech companies have said they are open to changing the law, an attempt to shape revisions they see as increasingly likely. Facebook and Google, the owner of YouTube, have signaled they are willing to work with lawmakers on amendments, and a number of smaller companies recently formed a lobbying group to influence any changes.
Some narrow measures – like requiring content to be removed after a court order – may gain support from tech companies. But others, like stripping immunity from all advertising, probably won’t.
Many lawmakers say targeted carve-outs would let them tackle the worst disinformation and hate speech online without disrupting the entire internet economy, crushing small websites or infringing on free expression.
Representative Anna G. Eshoo, a California Democrat who has proposed removing the law’s protections from some content, said: “There isn’t any single law that fixes everything. When someone says repeal Section 230, the first thing it tells me is that they don’t really understand it.”
But many questions remain unresolved. Lawmakers must decide how deeply they want to cut into the platforms’ core business models, rather than simply encouraging better moderation. One way to strike at the core is to limit the shield when a post is amplified by proprietary algorithms that rank, sort and recommend content to users, as Ms. Eshoo’s bill would in some cases. Or, as in Mr. Warner’s bill, lawmakers could simply say that Section 230 does not apply to any advertising at all.
And they are grappling with whether any changes should apply only to the largest platforms, like Facebook and YouTube, or take effect across the entire internet. Smaller firms have argued that they should be exempt from many of the changes.
“I think we want to take the most modest step possible,” said Hany Farid, a professor at the University of California, Berkeley, who studies misinformation. “Give it a year or two, see how it plays out and adjust.”
Lawmakers have focused on targeted changes to the law before. In 2018, Congress passed legislation that removed Section 230 protections when platforms knowingly facilitate sex trafficking.
But Mr. Trump focused on full repeal. During his final weeks in the White House, he pushed Republicans to repeal the protections as part of an unrelated defense funding bill. His supporters and allies may be dissatisfied with the targeted changes proposed by Democrats, who now control both the Senate and the House of Representatives.
The White House did not immediately comment on the matter on Monday. But a December op-ed co-written by Bruce Reed, a deputy chief of staff to Mr. Biden, said that “platforms should be held accountable for any content that generates revenue.” The op-ed also said that while addressing specific types of content is a start, lawmakers would do better to make the platforms’ entire liability shield conditional on their properly moderating content.
Supporters of Section 230 say even small changes could hurt vulnerable people. They point to the 2018 anti-trafficking law: sex workers say it became more difficult to vet potential clients online after some of the services they used shut down over the new legal liability. Instead, they say, they must now risk meeting clients in person without first using the internet to gauge their intentions from a safe distance.
Senator Ron Wyden, an Oregon Democrat who co-wrote Section 230 while in the House, said measures meant to address right-wing misinformation could one day be turned against other political groups.
“Remember 9/11, and the knee-jerk reactions to that horrible tragedy,” he said. “I think it would be a huge mistake to use the disgusting, nauseating attack on the Capitol as a means to suppress free speech.”
Even a narrowed law could be extremely difficult to comply with, industry officials say.
“I appreciate that some policymakers are trying to be more specific about what they don’t like online,” said Kate Tummarello, executive director of Engine, an advocacy group for small companies. “But there’s no universe in which platforms, especially small platforms, will automatically know when and where illegal speech is happening on their sites.”
The issue may come to the fore as executives from Google, Facebook and Twitter testify later this month to the House Energy and Commerce Committee, which is looking at the future of the law.
Representative Cathy McMorris Rodgers of Washington, the committee’s top Republican, said: “I think it will be a big focus. Section 230 is really what’s driving it.”