For years, a vicious cycle has turned: websites solicit damaging, unverified complaints about alleged cheaters, sexual predators, deadbeats and scammers. People slander their enemies. The anonymous posts appear high in Google search results for victims’ names. The sites then charge the victims thousands of dollars to take the posts down.
This cycle of slander has benefited the websites and the middlemen involved – and damaged the victims. Now Google is trying to break the cycle.
The company plans to change its search algorithm to prevent sites operating under domains such as BadGirlReport.date and PredatorsAlert.us from appearing in the results when someone searches for a person’s name.
Google also recently created a new concept it calls “known victims.” When people report to the company that they have been attacked on sites that charge for post removal, Google automatically suppresses similar content when their names are searched. “Known victims” also include people whose nude photos were posted online without their consent, allowing them to request the removal of explicit results for their names.
The changes – some already implemented by Google and others planned for the coming months – are a response to recent New York Times articles documenting how the slander industry preys on victims with the unwitting help of Google.
“I doubt it will be a perfect solution, certainly not right away. But I think it really should have a significant and positive impact,” said David Graff, Google’s vice president for global policy and standards and trust and safety. “We can’t police the web, but we can be responsible citizens.”
That represents a significant change for victims of online slander. Google, which accounts for about 90 percent of global online searches, has historically resisted letting human judgment play any role in its search engine, although in recent years it has bowed to pressure to fight misinformation and abuse appearing at the top of its results.
At first, Google’s founders saw its algorithm as an unbiased reflection of the Internet itself. It used an analysis called PageRank, named after co-founder Larry Page, to determine a site’s worthiness by assessing how many other sites linked to it, as well as the quality of those sites, based on how many sites linked to them in turn.
The philosophy was, “We never touch search, no way. If we start touching search results, it’s a one-way ratchet toward a regulated internet and we’re no longer neutral,” said Danielle Citron, a law professor at the University of Virginia. A decade ago, Professor Citron pressured Google to block so-called revenge porn from appearing when someone’s name was searched. The company initially resisted.
Google made its position clear in a 2004 statement as to why its search engine turned up anti-Semitic websites in response to searches for “Jew.”
“Our search results are created completely objectively and are independent of the beliefs and preferences of those who work at Google,” the company said in the statement, which was removed a decade later. “The only sites we omit are those we are legally compelled to remove or those that intentionally manipulate our results.”
Google’s initial interventions in its search results were limited to things like web spam, pirated movies and music (as required by copyright law), and financially sensitive information such as Social Security numbers. Only recently has the company been willing to take a more active role in cleaning up people’s search results.
The most notable case came in 2014, when European courts established the “right to be forgotten,” allowing residents of the European Union to ask search engines to remove information about them that they consider inaccurate or irrelevant.
Google unsuccessfully fought the court’s ruling. The company argued that its role is to make existing information accessible and that it did not want to be involved in regulating the content that appears in search results. Since the right was established, Google has been compelled to remove millions of links from the search results for people’s names.
More pressure for change came after Donald J. Trump was elected president. After the election, one of the top Google results for “final election vote count 2016” was a link to an article that falsely claimed Mr. Trump, who won the Electoral College, had also won the popular vote.
A few months later, Google announced an initiative to deliver “algorithmic updates to surface more authoritative content” and keep deliberately misleading, false or offensive information from showing up in search results.
Around that time, Google’s aversion to human tinkering with its results began to ease.
The Wayback Machine’s archive of Google’s policies on removing items from search results documents the company’s evolution. First, Google agreed to remove nude photos posted online without the subject’s consent. Then it began removing medical information. Fake pornography followed, then sites with exploitative removal practices, and then so-called doxxing content, which Google defines as “disclosing contact information with an intent to cause harm.”
According to Google, its removal-request forms receive millions of requests every year, but many victims are unaware they exist. That has allowed “reputation managers” and others to charge people for removals they could have requested for free.
Pandu Nayak, who leads Google’s search quality team, said the company started fighting sites that charge people to remove slanderous content a few years ago, in response to the rise of a thriving industry that posted people’s mug shots and then charged for their removal.
Google began ranking such exploitative sites lower in its results, but the change did not help people who had little other information about them online. Because Google’s algorithm abhors a vacuum, posts accusing such people of drug abuse or pedophilia can still appear prominently in their results.
Slander-peddling sites have relied on this dynamic. They could not charge thousands of dollars for removals if the posts did not damage people’s reputations.
Mr. Nayak and Mr. Graff said Google had not been aware of the problem until it was highlighted in this year’s Times articles. They said the changes to Google’s algorithm and the creation of the “known victims” classification would help address it. In particular, the changes will make it harder for websites to gain visibility on Google through one of their favored tactics: copying and reposting defamatory content from other sites.
Google recently tested the changes, with contractors performing side-by-side comparisons of new and old search results.
The Times had previously compiled a list of 47,000 people who had been written about on slander websites. In searches for a small sample of those whose results had previously been dominated by defamatory posts, Google’s changes were detectable. For some, the posts disappeared from the first page of results and from image results. For the others, most of the posts were gone, except those from a newly launched slander site called CheaterArchives.com.
CheaterArchives.com may illustrate the limits of Google’s new protections. Because it is fairly new, it may not yet have drawn complaints from victims, and such complaints are one way Google finds slander sites. In addition, CheaterArchives.com does not explicitly advertise post removal as a service, potentially making it harder for victims to get its posts scrubbed from their results.
Google executives say the company is not motivated solely by sympathy for victims of online slander. Instead, it’s part of a long-running effort by Google to combat websites that are trying to appear higher in search engine results than they should.
“These sites are fooling our system,” Mr. Graff said.
However, Google’s move is likely to raise questions about the company’s effective monopoly over what information is and isn’t publicly findable. Indeed, that power is part of the reason Google has so far been reluctant to interfere with individual search results.
“You could find anything that was legal to find,” said Daphne Keller, a lawyer at Google from 2004 to 2015 who worked on the search product team for part of that time and is now at Stanford researching how platforms should be regulated. Now Google is “just flexing its own muscle and deciding what information to disappear,” she said.
Ms. Keller was not criticizing her former employer, but lamenting that lawmakers and law enforcement have largely ignored the slander industry and its extortion practices, leaving Google to clean up the mess.
Professor Citron has argued that technology platforms have more power than governments to combat online abuse.
She has been impressed by Google’s changes, especially the creation of the “known victims” designation. Such victims are often posted about repeatedly, she said, and the sites compound the damage by scraping content from one another.
“I applaud their efforts,” she said. “Can they do better? Yes, they can.”
Aaron Krolik contributed reporting.