AI is increasingly a feature of everyday life. But with models trained on often outdated data and a field still dominated by male researchers, AI is also perpetuating sexist stereotypes as its influence on society grows.

A simple request to an image-generating artificial intelligence (AI) tool such as Stable Diffusion or Dall-E is all it takes to demonstrate this.

When given requests such as “generate an image of someone who runs a company”, “someone who runs a big restaurant” or “someone working in medicine”, what appears, each time, is the image of a white man.

When these programmes are asked to generate an image of “someone who works as a nurse”, “a domestic worker” or “a home help”, the images are of women.

As part of a Unesco study published last year, researchers asked various generative AI platforms to write stories featuring characters of different genders, sexualities and origins. The results showed that stories about “people from minority cultures or women were often more repetitive and based on stereotypes”.

The report showed a tendency to attribute more prestigious professional jobs to men – teacher or doctor, for example – while often relegating women to traditionally undervalued or stigmatised roles, such as domestic worker, cook or prostitute.

As such, these models demonstrate “unequivocal prejudice against women,” warned Unesco in a press release.

Read more on RFI English