Scientists ask AI to generate human bodies—and the results reveal clear biases

AI-generated images. Credit: University of Toronto.

A new study from the University of Toronto has found that today’s artificial intelligence image generators often reproduce—and even exaggerate—common stereotypes about the human body.

When researchers asked popular AI models to create images of male and female bodies, the results leaned heavily toward narrow Western beauty standards and did not reflect real-world diversity.

The study, published in Psychology of Popular Media, examined 300 images generated by three major AI platforms: Midjourney, DALL·E, and Stable Diffusion.

The research team included postdoctoral fellow Delaney Thibodeau, research associate Sasha Gollish, master’s graduate Edina Bijvoet, Professor Catherine Sabiston, and Jessica E. Boyes from Northumbria University in the U.K.

The researchers found that when asked to create images of athletes, the AI systems consistently depicted extreme physiques.

Female and male athletes were shown with extremely low body fat, sharply defined muscles, and bodies that matched the “fit ideal” commonly seen in advertising and social media. Even when the prompts did not specify athletes, the generated body shapes remained highly unrealistic.

Gender stereotypes were also clear.

Female bodies were more likely to be portrayed as young, blonde, facially attractive, and dressed in revealing clothing such as swimsuits. Male bodies were often shown shirtless, heavily muscular, and with more body hair.

These repeated patterns suggest that the AI models draw from gendered imagery that already exists online, mirroring trends that place appearance over function and often promote harmful standards.

The study also highlighted a major lack of diversity. The overwhelming majority of the images were of young, white bodies.

No images depicted visible disabilities, and there was minimal variation in age or racial appearance. The researchers noted that when the prompt simply asked for “an athlete,” the AI generated a male body 90% of the time, pointing to a built-in bias toward male representation.

Professor Sabiston, director of the Mental Health and Physical Activity Research Centre, says these findings show how important it is to closely examine how emerging technologies might reinforce existing beauty norms.

If AI systems continue to produce such narrow and unrealistic imagery, they risk spreading harmful messages about what healthy or athletic bodies “should” look like.

The researchers argue that AI developers should adopt a human-centered approach, designing algorithms that consider gender, race, disability, age, and other factors. Without deliberate attention to inclusiveness, AI will continue to amplify rigid and exclusionary body ideals.

Users also play an important role. Sabiston encourages people to think carefully about how they word prompts and how they use AI-generated images in public settings. Viewers should remember that AI images do not reflect reality and may be shaped by deep-rooted biases in training data.

More research is needed to understand how exposure to AI-generated images affects people’s self-esteem, motivation, and body image. Still, the team is hopeful. As more diverse and inclusive images begin circulating online, it may encourage healthier attitudes toward body and weight diversity—and help shape future AI systems for the better.