WASHINGTON – Stuart Force said he found solace on Facebook after his son was stabbed to death in Israel by a member of the militant group Hamas in 2016. He went to the site to read the hundreds of condolence messages posted on his son’s page.
But just a few months later, Mr. Force decided that Facebook was partly responsible for the death, because the algorithms that power the social network helped spread Hamas’s content. He, along with relatives of other terrorism victims, sued the company, claiming its algorithms had aided the crimes by regularly amplifying posts that encouraged terrorist attacks.
The legal case ended unsuccessfully last year when the Supreme Court declined to hear it. But the debate over the power of algorithms has continued in Washington, where some members of Congress are citing the case in a heated argument over the law that shields tech companies from liability for user-posted content.
At a House hearing on Thursday with executives from Facebook, Twitter and Google about the spread of disinformation, some lawmakers are expected to focus on how the companies’ algorithms are written to generate revenue by surfacing posts that users are inclined to click on and respond to. And some will argue that the law protecting social networks from liability, Section 230 of the Communications Decency Act, should be changed to hold the companies accountable when their software turns their services from platforms into accomplices to crimes committed offline.
“The last few years have proven that the more extreme and misleading content social media platforms promote, the more money they collect from advertising and engagement,” said Representative Frank Pallone Jr., the chairman of the Energy and Commerce Committee, which will question the executives.
“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” added Mr. Pallone, a New Jersey Democrat.
Former President Donald J. Trump called for Section 230 to be repealed, and President Biden made similar remarks while campaigning for the White House. But a full repeal looks increasingly doubtful, with lawmakers focusing instead on smaller possible changes to the law.
Altering the legal shield to account for the power of algorithms could reshape the web, because algorithmic sorting, recommendation and distribution are pervasive across social media. The systems decide which links are displayed first in Facebook’s News Feed, which accounts are recommended to users on Instagram and which video plays next on YouTube.
The industry, free speech activists and other defenders of the legal shield argue that social networks’ algorithms are applied equally to all posts, regardless of the message. They say the algorithms operate only because of the content users provide and are therefore covered by Section 230, which protects websites that host people’s posts, photos and videos.
Courts have agreed. A federal district judge said that even “the most generous reading” of the allegations made by Mr. Force “places them squarely within” the immunity granted to platforms under the law.
A Facebook spokesperson declined to comment on the case but pointed to comments from its chief executive, Mark Zuckerberg, supporting some changes to Section 230. Elena Hernandez, a spokesperson for YouTube, which is owned by Google, said the service had made changes to its “search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations.”
Twitter noted that it has proposed giving users more choice over the algorithms that rank their timelines.
“Algorithms are fundamental building blocks of internet services, including Twitter,” said Lauren Culbertson, Twitter’s head of U.S. public policy. “Regulations must reflect the reality of how different services operate and how content is ranked and amplified, while maximizing competition and balancing safety and free expression.”
Mr. Force’s case began in March 2016 when his son, Taylor Force, 28, was killed by Bashar Masalha while out to dinner with graduate school classmates in Jaffa, an Israeli port city. Hamas, a Palestinian group, said Masalha, 22, was a member.
In the months that followed, Stuart Force and his wife, Robbi, worked to settle their son’s estate and clean out his apartment. That summer, they got a call from an Israeli litigation group with a question: Would the Force family be willing to sue Facebook?
After Mr. Force visited a Facebook page belonging to Hamas, the family agreed to sue. The lawsuit fit into a broader effort by the Forces to limit the resources and tools available to Palestinian groups. Mr. Force and his wife also allied with lawmakers in Washington to pass legislation restricting aid to the Palestinian Authority, which governs part of the West Bank.
Their lawyers argued in an American court that Facebook gave Hamas “a highly developed and sophisticated algorithm that facilitates Hamas’s ability to reach and engage an audience it could not otherwise reach as effectively.” The lawsuit said Facebook’s algorithms had not only amplified posts but had aided Hamas by recommending groups, friends and events to users.
The federal district judge, in New York, dismissed the claims, citing Section 230. The Forces’ lawyers appealed to a three-judge panel of the U.S. Court of Appeals for the Second Circuit, and two of the judges ruled entirely in Facebook’s favor. The third, Judge Robert Katzmann, wrote a 35-page dissent to that part of the ruling, arguing that Facebook’s algorithmic recommendations should not be covered by the law’s legal protections.
“Mounting evidence suggests that providers designed their algorithms to drive users toward content and people the users agreed with, and that they have done it too well, nudging susceptible souls ever further down dark paths,” he said.
Late last year, the Supreme Court declined to take up a separate case that could have tested the Section 230 shield. In a statement attached to the court’s decision, Justice Clarence Thomas called for the court to consider whether Section 230’s protections had been expanded too far, citing Mr. Force’s lawsuit and Judge Katzmann’s opinion.
Justice Thomas said the court did not need to decide at that moment whether to rein in the legal protections. “But in an appropriate case, we should do so,” he said.
Some lawmakers, lawyers and academics say recognition of the power of social media’s algorithms in determining what people see is long overdue. The platforms often do not reveal exactly what factors their algorithms use to make decisions, or how those factors are weighed against one another.
“Amplification and automated decision-making systems are creating opportunities for connection that would otherwise not be possible,” said Olivier Sylvain, a law professor at Fordham University who has made the argument in the context of civil rights. “They are materially contributing to the content.”
That argument has surfaced in a series of lawsuits contending that Facebook should be held responsible for housing discrimination because its platform could target advertisements according to a user’s race. A bill introduced by Representative Yvette D. Clarke, a Democrat of New York, would strip Section 230 immunity from targeted advertisements that violate civil rights law.
A bill introduced last year by Representatives Tom Malinowski of New Jersey and Anna G. Eshoo of California, both Democrats, would strip Section 230 protections from social media platforms when their algorithms amplify content that violates certain antiterrorism and civil rights laws. A news release announcing the bill, which is to be reintroduced on Wednesday, cited the Force family’s lawsuit against Facebook. Mr. Malinowski said he had been inspired in part by Judge Katzmann’s dissent.
Critics of the legislation say it could violate the First Amendment and, because algorithms are so ubiquitous online, could sweep up far more kinds of services than lawmakers intend. They also point to a more fundamental problem: regulating amplification out of existence would not eliminate the impulses that drive it.
“There’s one thing you can’t get away from,” said Daphne Keller, the director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center.