Data released by Facebook last fall showed that during one week in October, seven of the 10 pages with the most engagement were political, including the pages of President Donald J. Trump, Fox News, Breitbart and Occupy Democrats.
Three years ago, Facebook said it would reduce the amount of content from news organizations and brands on the site, an overhaul it said would refocus the feed on interactions between friends and family. At the time, Zuckerberg said he wanted to make sure Facebook’s products were “not only interesting, but also good for everyone.” He also said the company would take those steps even if it meant hurting profits.
Even so, Facebook users have had no trouble finding political content. Non-governmental organizations and political action committees paid to show highly targeted political ads to millions of Americans in the months leading up to the November presidential election. Users created large numbers of private groups to discuss campaign issues, organize rallies and support candidates. And until recently, Facebook’s own recommendation system regularly suggested new political groups for users to join.
Facebook has walked back some of these practices in recent months. After polls closed on Election Day, the company stopped accepting new political ads. And after the deadly Capitol riot on Jan. 6, Zuckerberg said the company would stop recommending political groups in order to “lower the temperature” in global conversations.
Under the new test, a machine learning model will predict how likely a post is to be political, whether it comes from a major news organization, a political pundit, or a friend or relative. Posts judged political will be less likely to appear in a user’s feed.
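Facebook has not published the model or its ranking pipeline, but the mechanism described, a classifier that scores how likely a post is to be political and a ranking step that demotes high-scoring posts, can be illustrated with a toy sketch. Everything below is invented for illustration: the training examples, the penalty factor, and the function names are assumptions, not Facebook’s actual system.

```python
# A minimal sketch of classifier-based downranking. The tiny training
# set, the 0.5 penalty, and all names here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy labeled examples: 1 = political, 0 = not political.
texts = [
    "Senate passes new election bill",        # political
    "Vote for our candidate this November",   # political
    "Grandma's birthday party photos",        # not political
    "New recipe for banana bread",            # not political
]
labels = [1, 1, 0, 0]

# Train a simple text classifier that outputs a political-content probability.
vectorizer = TfidfVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(texts), labels)

def political_probability(post_text: str) -> float:
    """Predicted probability that a post is political."""
    return float(model.predict_proba(vectorizer.transform([post_text]))[0, 1])

def downrank(base_score: float, post_text: str, penalty: float = 0.5) -> float:
    """Reduce a post's feed-ranking score in proportion to how political it looks."""
    return base_score * (1.0 - penalty * political_probability(post_text))

for post in ["City council debates new zoning law", "Photos from our beach trip"]:
    print(post, "->", round(downrank(1.0, post), 3))
```

The point of the sketch is only the shape of the mechanism: a probability from a classifier feeds a multiplicative penalty in ranking, so a post is demoted in proportion to how political the model believes it is, rather than being removed outright.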
It’s not clear how Facebook’s algorithm will determine how political a piece of content is, or how significantly the changes will affect people’s feeds. Lauren Svensson, a Facebook spokesperson, said the company will continue to “refine this model during the test period to better identify political content, and we may or may not end up using this approach more permanently.”
It is also not clear what would happen if Facebook’s experiments show that reducing political content also reduces how much people use the site. In the past, the company has dropped or watered down algorithmic changes aimed at reducing the divisive and misleading content people see after determining that those changes caused people to use Facebook less.
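The trade-off described here is the kind that is typically evaluated with an A/B test: compare usage between users who see the reduced-politics feed and users who see the unchanged one. The following sketch is hypothetical; the per-user minutes and the 2% tolerance are invented for illustration.

```python
# Hypothetical A/B comparison: does the reduced-politics feed ("test")
# cost measurable usage versus the unchanged feed ("control")?
from statistics import mean

control_minutes = [52.1, 47.3, 60.8, 49.5, 55.0]  # daily minutes, unchanged feed
test_minutes = [50.2, 44.9, 58.1, 47.7, 53.3]     # daily minutes, less political feed

drop = (mean(control_minutes) - mean(test_minutes)) / mean(control_minutes)
print(f"Relative drop in usage: {drop:.1%}")

# A product team might only ship the change if the usage cost stays
# below some tolerance, e.g. a 2% drop (an invented threshold).
SHIP_THRESHOLD = 0.02
print("Ship change" if drop <= SHIP_THRESHOLD else "Revisit change")
```

In practice such a comparison would involve millions of users and statistical significance testing, but the decision structure is the same: a measured engagement cost weighed against a fixed tolerance.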