Facebook AI mistook black men for primates

Facebook disabled its topic recommendation feature after the company’s AI mislabeled black men as “primates” in a video on the social network, the New York Times reported.

Facebook users who recently watched videos from a British tabloid featuring black men were shown an automatically generated prompt asking if they wanted to “keep watching primate videos.”

The video, dated June 27, 2020, was published by The Daily Mail and showed clashes between black men and white civilians and police officers. Although humans are among the many species in the primate order, the video had nothing to do with monkeys, chimpanzees, or gorillas.

A Facebook spokesman called it “clearly an unacceptable error,” adding that the recommendation software had already been disabled.

“We apologize to anyone who may have seen these offensive recommendations,” Facebook said in response to a request from Agence France-Presse. “We disabled the entire topic recommendation feature as soon as we realized this was happening. We will determine the cause and prevent this error from happening again.”

Facial recognition software has long been criticized by civil rights advocates, who point to accuracy problems, especially for people who are not white.

A screenshot of the recommendation was shared on Twitter by former Facebook content design manager Darcy Groves. “This ‘keep watching’ suggestion is just not acceptable,” Groves tweeted to her former Facebook colleagues, adding: “Blatantly.”


Alexandr Ivanov, Web Developer and Editor, earned his Licentiate Engineer in Systems and Computer Engineering from the Free International University of Moldova. Since 2013, Alexandr has been working as a freelance web programmer.
