– Meta’s AI image generator Imagine accused of racial bias
– Issue discovered when it couldn’t create images of mixed-race couples
– Meta is not the first tech company to face criticism for racially biased AI
Meta’s AI image generator, Imagine, has come under fire for alleged racial bias after users discovered it was unable to produce pictures of mixed-race couples, such as an Asian man with a white woman. Despite its ability to turn written prompts into realistic images, the tool repeatedly returned images of Asian couples instead, raising concerns about how it was built and trained.
The apparent bias is especially striking given that Meta’s CEO, Mark Zuckerberg, is married to Priscilla Chan, a woman of East Asian heritage. Some users joked that they could successfully generate images of Zuckerberg and Chan using the tool. Meta did not respond to Business Insider’s request for comment at the time.
This is not the first time a major tech company has faced criticism over “racist” AI. Google previously had to pause its Gemini image generator after it produced historically inaccurate images, drawing accusations that it was overly “woke.” The episode highlights broader concerns that AI models absorb societal biases and stereotypes from their training data, potentially excluding or unfairly targeting certain groups.
Generative AI models like Gemini and Imagine are trained on massive datasets, and if images of mixed-race couples are underrepresented in that data, the model may struggle to generate them. The issue underscores the need for more diverse and inclusive datasets in AI development to avoid perpetuating biases and prejudices in the technology.
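To illustrate why underrepresentation matters, here is a minimal, hypothetical sketch of the kind of dataset audit researchers use to spot such gaps. The captions and descriptor lists are invented for illustration; a real audit would run over a model’s actual training corpus with far richer labels. Pairings that rarely or never co-occur in the data are exactly the prompts a generative model is most likely to get wrong.

```python
# Hypothetical sketch: counting demographic co-occurrences in image captions
# to surface underrepresented pairings. Data below is illustrative only.
from collections import Counter
from itertools import combinations

captions = [
    "an asian man and an asian woman at dinner",
    "a white man and a white woman hiking",
    "an asian woman reading in a cafe",
    "a white man playing guitar",
    "an asian man and a white woman walking a dog",  # rare pairing
]

descriptors = ["asian man", "asian woman", "white man", "white woman"]

pair_counts = Counter()
for caption in captions:
    present = [d for d in descriptors if d in caption]
    for pair in combinations(sorted(present), 2):
        pair_counts[pair] += 1

# Print every possible pairing and how often it appears in the data;
# zero or near-zero counts flag combinations the model has rarely seen.
for pair in combinations(sorted(descriptors), 2):
    print(pair, pair_counts.get(pair, 0))
```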