WASHINGTON – Stuart Force says he found solace on Facebook after his son was stabbed to death in Israel by a member of the Hamas militant group in 2016. He turned to the site to read the hundreds of messages offering condolences on his son’s page.
But just a few months later, Mr. Force had concluded that Facebook was partly responsible for his son’s death, because the algorithms that power the social network had helped spread Hamas’ content. He, along with relatives of other victims of terrorism, sued the company, arguing that its algorithms had aided the crimes by regularly amplifying posts that encouraged terrorist attacks.
The litigation ended unsuccessfully last year when the Supreme Court declined to take up the case. But arguments about the power of algorithms have echoed in Washington, where some members of Congress are citing the case in an intense debate over the law that shields technology companies from liability for content posted by users.
At a House hearing Thursday on the spread of misinformation, featuring the chief executives of Facebook, Twitter and Google, some lawmakers are expected to focus on how the companies’ algorithms are written to generate revenue by surfacing posts that users are inclined to click on and respond to. And some will argue that the law protecting social networks from liability, Section 230 of the Communications Decency Act, should be changed to hold the companies liable when their software turns the services into accomplices in crimes committed offline.
“The past few years have shown that the more outrageous and extremist content social media platforms promote, the more engagement and advertising dollars they bring in,” said Representative Frank Pallone Jr., the chairman of the Energy and Commerce Committee, who will question the chief executives.
“It is now painfully clear that neither the market nor public pressure will stop social media companies from escalating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” added Mr. Pallone, a Democrat from New Jersey.
Former President Donald J. Trump called for Section 230 to be repealed, and President Biden made a similar comment while campaigning for the White House. But a repeal looks increasingly doubtful as lawmakers focus on smaller possible changes to the law.
Altering the legal shield to account for the power of algorithms could reshape the web, because algorithmic sorting, recommendation and distribution are common across social media. The systems decide which links appear first in Facebook’s News Feed, which accounts are recommended to users on Instagram and which video plays next on YouTube.
The industry, free speech activists and other defenders of the legal shield argue that social media’s algorithms apply equally to posts regardless of their message. They say the algorithms work only on the content provided by users and are therefore covered by Section 230, which protects sites that host people’s posts, photos and videos.
Courts have agreed. A federal district judge said that even a “most generous reading” of Mr. Force’s allegations “places them squarely within” the immunity granted to platforms under the law.
A spokesman for Facebook declined to comment on the case but pointed to comments from its chief executive, Mark Zuckerberg, who has supported some changes to Section 230. Elena Hernandez, a spokeswoman for YouTube, which is owned by Google, said the service had made changes to its “search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations.”
Twitter noted that it had proposed giving users more choice over the algorithms that rank their timelines.
“Algorithms are fundamental building blocks for internet services, including Twitter,” said Lauren Culbertson, Twitter’s head of U.S. public policy. “Regulation must reflect the reality of how different services operate and how content is ranked and amplified, while maximizing competition and balancing safety and free expression.”
Mr. Force’s case began in March 2016 when his son Taylor Force, 28, was killed by Bashar Masalha while out to dinner with graduate school classmates in Jaffa, an Israeli port city. Hamas, a Palestinian group, said Mr. Masalha, 22, was a member.
In the months that followed, Stuart Force and his wife, Robbi, worked to settle their son’s estate and clear out his apartment. That summer, they got a call from an Israeli litigation group, which had a question: Would the Force family be willing to sue Facebook?
After Mr. Force spent some time on a Hamas Facebook page, the family agreed to sue. The lawsuit fit into a broader effort by the Forces to limit the resources and tools available to Palestinian groups. Mr. Force and his wife also teamed up with lawmakers in Washington to pass legislation restricting aid to the Palestinian Authority, which governs part of the West Bank.
Their lawyers argued in an American court that Facebook had given Hamas “a highly developed and sophisticated algorithm that facilitates Hamas’s ability to reach and engage an audience it could not otherwise reach as effectively.” The lawsuit said Facebook’s algorithms had not only amplified posts but had aided Hamas by recommending groups, friends and events to users.
The federal district judge in New York ruled against the claims, citing Section 230. The Force family’s lawyers appealed to a three-judge panel of the U.S. Court of Appeals for the Second Circuit, and two of the judges ruled entirely in Facebook’s favor. The third, Judge Robert Katzmann, wrote a 35-page dissent to part of the ruling, arguing that Facebook’s algorithmic recommendations should not be covered by the legal protections.
“Mounting evidence suggests that providers designed their algorithms to drive users toward content and people the users agreed with – and that they have done it too well, nudging susceptible souls ever further down dark paths,” he wrote.
Late last year, the Supreme Court declined to hear a separate case that would have tested the Section 230 shield. In a statement accompanying the court’s decision, Justice Clarence Thomas called for the court to consider whether Section 230’s protections had been expanded too far, citing Mr. Force’s lawsuit and Judge Katzmann’s opinion.
Justice Thomas said the court did not need to decide in the moment whether to rein in the legal protections. “But in an appropriate case, we should do so,” he said.
Some lawmakers, lawyers and academics say that recognition of the power of social media’s algorithms in determining what people see is long overdue. The platforms usually do not disclose exactly what factors the algorithms use to make decisions and how those factors are weighed against one another.
“Amplification and automated decision-making systems are creating opportunities for connection that are otherwise not possible,” said Olivier Sylvain, a professor of law at Fordham University, who has made the argument in the context of civil rights. “They’re materially contributing to the content.”
That argument has surfaced in a series of lawsuits contending that Facebook should be liable for discrimination in housing because its platform could target advertisements according to a user’s race. A bill introduced by Representative Yvette D. Clarke, Democrat of New York, would strip Section 230 immunity from targeted ads that violated civil rights law.
A bill introduced last year by Representatives Tom Malinowski of New Jersey and Anna G. Eshoo of California, both Democrats, would strip Section 230 protections from social media platforms whose algorithms amplified content that violated certain antiterrorism and civil rights laws. The news release announcing the bill, which will be reintroduced on Wednesday, cited the Force family’s lawsuit against Facebook. Mr. Malinowski said he had been inspired in part by Judge Katzmann’s dissent.
Critics of the legislation say it may violate the First Amendment and, because there are so many algorithms on the internet, could sweep up a wider range of services than lawmakers intended. They also say there is a more fundamental problem: Regulating algorithmic amplification out of existence would not eliminate the impulses that drive it.
“There’s this thing you can’t get away from,” said Daphne Keller, the director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center.