Facebook users who recently watched a video from a British tabloid featuring Black men saw an automated prompt from the social network asking if they wanted to “keep seeing videos about primates,” causing the company to investigate and disable the artificial intelligence feature that pushed the message.
On Friday, Facebook apologized for what it called “an unacceptable bug” and said it was looking into the recommendation feature to “prevent this from happening again”.
The video, dated June 27, 2020, was posted by The Daily Mail and features clips of Black men in altercations with white civilians and police officers. It has no connection to monkeys or primates.
Darci Groves, a former director of content design at Facebook, said a friend recently sent her a screenshot of the prompt. She then posted it on a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”
Ms. Groves said the prompt was “horrific and serious.”
Dani Lever, a Facebook spokesperson, said in a statement: “As we’ve said, while we’ve improved our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Google, Amazon and other tech companies have come under scrutiny for years for biases in their artificial intelligence systems, especially around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more difficulty identifying them, leading to incidents in which Black people have been discriminated against or arrested because of computer error.
In one example from 2015, Google Photos mistakenly labeled photos of Black people as “gorillas,” and Google said it was “genuinely sorry” and would fix the problem immediately. More than two years later, Wired discovered that Google’s solution was to censor the word “gorilla” from searches, while also blocking “chimp,” “chimpanzee,” and “monkey.”
Facebook has one of the world’s largest repositories of user-uploaded images for training face and object recognition algorithms. The company, which tailors content for users based on their previous browsing and viewing habits, sometimes asks people if they want to continue viewing posts in related categories. It is unclear whether messages such as “primates” are common.
Facebook and its photo-sharing app, Instagram, have struggled with other issues related to race. For example, after the UEFA European Championship in July, three Black members of the England national football team were racially abused on social media for missing penalty kicks in the championship match.
Race issues have also caused internal conflict at Facebook. In 2016, Mark Zuckerberg, the chief executive, asked employees to stop crossing out the phrase “Black Lives Matter” and replacing it with “All Lives Matter” in a communal space at the company’s headquarters in Menlo Park, California. Hundreds of employees also held a virtual walkout last year to protest the company’s handling of a post by President Donald J. Trump about the killing of George Floyd in Minneapolis.
The company then hired a vice president for civil rights and released a civil rights audit. In its annual diversity report in July, Facebook said 4.4% of its US employees were Black, up from 3.9% the previous year.
Ms. Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems was not a priority for its leaders.
“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,'” she said.