After Facebook users watched a video from a British tabloid featuring Black men, they received an automated prompt asking whether they wanted to “keep seeing videos about Primates,” leading the company to investigate and disable the AI-powered feature that pushed the message.
Facebook apologized on Friday for the “unacceptable” blunder and said it was reviewing the recommendation feature to “prevent this from happening again.” The apology came after its artificial intelligence systems asked people who had watched a British tabloid video featuring Black men whether they wanted to keep seeing content about “primates.”
The video, posted by The Daily Mail on June 27, 2020, contained clips of Black men in altercations with white civilians and police officers. It had nothing to do with monkeys or primates.
Darci Groves, a former Facebook content design manager, said a friend had recently sent her a screenshot of the prompt. She shared it on a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was investigating the root cause.
In a statement, Facebook spokesperson Dani Lever said that while the company’s artificial intelligence has improved, it isn’t perfect and there is more work to be done, and apologized to anyone who may have seen the offensive recommendations.
In 2015, for example, Google Photos mistakenly labeled photos of Black people as “gorillas,” for which Google expressed regret and promised to fix the problem quickly. Wired later found that Google’s fix was to block the word “gorilla” from searches, along with “chimpanzee,” “chimp,” and “monkey.”
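To illustrate the kind of workaround Wired described, here is a minimal sketch in Python of label blocklisting applied after an image classifier runs; the function, data structure, and threshold values below are illustrative assumptions, not Google’s actual code.

    # Hypothetical sketch of a blocklist-style fix: instead of
    # retraining the model, suppress risky labels before they
    # reach the user. All names here are illustrative.
    BLOCKED_LABELS = {"gorilla", "chimpanzee", "chimp", "monkey"}

    def visible_labels(predictions):
        """Filter blocklisted labels out of a classifier's output.

        `predictions` is assumed to be a list of (label, confidence)
        pairs produced by some image-classification model.
        """
        return [(label, score) for label, score in predictions
                if label.lower() not in BLOCKED_LABELS]

    # The model's raw output still contains the label; the product
    # layer simply never shows it.
    raw = [("gorilla", 0.92), ("person", 0.88)]
    print(visible_labels(raw))  # [('person', 0.88)]

A filter like this suppresses the offending label but does nothing to correct the underlying model.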
Facebook’s facial- and object-recognition algorithms are trained on one of the world’s largest repositories of user-uploaded photographs. The company, which tailors content to users based on their past browsing and viewing habits, sometimes asks whether they would like to see more posts in related categories. It was unclear whether messages like the “primates” prompt were widely distributed.
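As a rough illustration of the mechanism described above, the sketch below shows how a “keep seeing videos about X” prompt could be generated directly from a video classifier’s top predicted topic; the function name, data structure, and threshold are assumptions for illustration, not Facebook’s actual system.

    # Hypothetical sketch of a topic-based follow-up prompt. Because
    # the top predicted topic flows straight into user-facing text,
    # a misclassification like "primates" surfaces verbatim.
    def topic_prompt(topic_scores, threshold=0.8):
        """Return a follow-up prompt if the top topic is confident enough.

        `topic_scores` maps topic names to model confidences; the
        structure and the 0.8 threshold are illustrative assumptions.
        """
        topic, score = max(topic_scores.items(), key=lambda kv: kv[1])
        if score >= threshold:
            return f"Keep seeing videos about {topic.title()}?"
        return None

    # If the classifier mislabels a video, the prompt repeats the error:
    print(topic_prompt({"primates": 0.91, "news": 0.42}))
    # prints: Keep seeing videos about Primates?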
Racial tensions have also fueled internal friction at Facebook. In 2016, Mark Zuckerberg, the company’s CEO, asked employees to stop crossing out the phrase “Black Lives Matter” and writing “All Lives Matter” in its place on a communal wall at its Menlo Park, Calif., headquarters. Last year, hundreds of employees staged a virtual walkout to protest the company’s handling of a post by President Donald J. Trump about the murder of George Floyd in Minneapolis.
For years, Google, Amazon, and other technology companies have faced criticism for biases in their artificial intelligence systems, particularly on issues of race. Studies have shown that facial recognition technology is biased against people of color and has a harder time identifying them, leading to incidents in which Black people were discriminated against or wrongly arrested because of a computer error.
Source: https://www.nytimes.com/2021/09/03/technology/facebook-ai-race-primates.html
Other Source: https://www.theregister.com/2021/09/06/in_brief_ai/