This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
The QAnon conspiracy theory, promotions of bogus health treatments and calls to violence based on false claims of election fraud have a common thread: Facebook groups.
These forums for like-minded people can be wonderful communities, whether for avid gardeners in the same neighborhood or parents of children with rare diseases. But for years, it has also been clear that the groups supercharge some people's tendency to get into heated online fights, to spread engaging information whether or not it's true, and to scapegoat others.
I don't want to oversimplify or blame Facebook groups for all the world's ills. (Read my colleague Kevin Roose's latest column for ideas on steering people who are polarized, or searching for a sense of purpose, toward healthier pursuits.) And reducing the harms of Facebook groups isn't as simple as the company's critics believe.
But many of the more toxic side effects of Facebook groups are a result of the company's choices. I asked several experts on online communication what they would do to reduce the downsides of groups. Here are some of their suggestions:
Stop automated recommendations. Facebook has said it would extend its pause on computer-generated recommendations for people to join groups related to politics. Some experts argue that Facebook should go further and stop computerized group suggestions altogether.
It might be great for Facebook to suggest a rose-growing forum to someone who posts about gardening. But over the years, Facebook's group recommendations have proved easy to manipulate and have pushed people toward increasingly fringe ideas.
According to a Wall Street Journal report, Facebook research from 2016 found that two-thirds of the people who joined extremist groups did so at Facebook's recommendation. My colleague Sheera Frenkel has reported that automated group recommendations were one of the ways the QAnon conspiracy theory spread.
Ending these computerized recommendations wouldn't be a silver bullet. But it's wild that activists and academics have been shouting for years about how harmful the recommendations are, and Facebook has only tinkered.
Provide more oversight of private groups. The social media researchers Nina Jankowicz and Cindy Otis have suggested that groups above a certain number of members shouldn't be allowed to stay private (meaning newcomers must be invited and outsiders can't see what's being discussed) without regular human review of their content.
“A lot of really malicious groups are unsearchable and invite-only, and that's a very serious problem,” Jankowicz told me.
Jankowicz and Otis have also pushed for more consistent labeling of groups and more transparency about who administers them. Political discussion groups are sometimes deliberately mislabeled by their owners as “personal blogs” to dodge the extra scrutiny Facebook applies to political forums.
Target habitual rule breakers. Renée DiResta, a disinformation researcher at the Stanford Internet Observatory, says Facebook needs to “act more assertively” against groups that repeatedly harass others or otherwise violate Facebook's rules. Facebook took several steps in this direction last year.
Jade Magnus Ogunnaike, a senior director at the racial justice organization Color of Change, also said that Facebook should stop using outside contractors to review material on the site. Converting those workers into employees would be fairer, she said, and could help improve the quality of oversight of what happens in groups.
Add some … librarians? Joan Donovan, research director of Harvard's Shorenstein Center on Media, Politics and Public Policy, has suggested that big internet companies hire thousands of librarians to offer people vetted information as a counter to the false information circulating in groups.
Superstars are not great at everything
Jeff Bezos likes to say that failure is healthy because people and companies learn from it. But sometimes failure is simply the result of a company's weaknesses, and there's nothing good about it.
In the last few days, there have been articles about both Amazon and Google failing to create successful video games of their own, even though they have nearly limitless cash and throngs of smart people.
The root causes of their failures are complex, but two themes stand out in what happened: cultural weaknesses and arrogance. (And in Amazon's case, an over-reliance on Bezos's nuggets of wisdom, known as “Jeff-isms,” like the one above.)
Here's what happened: Google said this week that it was shutting down its in-house video game development team. And Bloomberg News detailed why Amazon has repeatedly failed to create hit video games of its own.
Bloomberg's reporting describes Amazon's video game struggles as a reflection of the company itself: a fixation on data that distracted people from making games fun, and executives so confident in their company's expertise that they forced employees to use Amazon-built game development technology instead of industry-standard tools.
Google, too, for all of its successes, has ingrained habits that sometimes make it hard to break into unfamiliar fields. This week, the technology publication The Information reported on Google's difficulties selling cloud computing technology to big companies.
Google's engineers are treated like kings, and it's hard to persuade them to commit to the rigid three-year product road maps that corporate customers tend to like. Google's cloud business has struggled for years with the same fundamental problem: squaring Google's way of doing things with the entrenched habits of its corporate customers.
The magical (or maddening) thing about cash-rich superstar companies is that they can often turn failure into success. But Amazon's and Google's struggles in business areas beyond their core expertise are a reminder that money and smarts don't stop companies from being tripped up by their weaknesses.