(OPINION) – Facebook on Friday said it had disabled its topic recommendation feature after the software mistook Black men for “primates” in a video on the social network.

According to France24 News, a Facebook spokesperson called it a “clearly unacceptable error” and said the recommendation software involved had been taken offline. “We apologize to anyone who may have seen these offensive recommendations,” Facebook said in response to an AFP inquiry.

“We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again.”

Facial recognition software has been blasted by civil rights advocates, who point out problems with accuracy, particularly when it comes to people who are not white.

The NYT reported that Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”

Ms. Groves said the prompt was “horrifying and egregious.” Dani Lever, a Facebook spokeswoman, said in a statement: “As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”

Google, Amazon and other technology companies have been under scrutiny for years over biases within their artificial intelligence systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, leading to incidents in which Black people have been discriminated against or wrongly arrested because of computer error.