AI struggles with gender and race
Generative AI keeps messing up on important issues about diversity and representation — especially when it comes to love and sex.
According to one report from The Verge, Meta’s AI image generator repeatedly refused to generate images of an Asian man with a white woman as a couple. When it finally produced one of an Asian woman and a white man, the man was significantly older than the woman.
Meanwhile, Wired found that different AI image generators routinely depict LGBTQ individuals with purple hair. And when a prompt doesn't specify ethnicity, these systems tend to default to showing white people.
Generative AI tools are only as good as their underlying training data. If that data is biased, say, by overrepresenting queer people with highlighter-colored hair, the models will consistently reproduce the stereotype. It's incumbent on the proprietors of these models both to improve their training data, by buying or accumulating more comprehensive datasets, and, more importantly, to tweak the outputs so they are more inclusive and less stereotypical. That requires extensive testing, plus consultation with real people from diverse backgrounds about the harms of such representation failures.
But they need to be wary of overcorrection too: Google was recently condemned for generating images of Black and Asian Nazi soldiers, as well as Native Americans in Viking garb. AI can't yet grasp the complexities of historical and cultural representation — but the humans in charge need to.