AI Art: Lensa


Someone left a comment on one of our TikTok videos about how AI art sexualises women - so of course we had to explore it further.

Now, you may have heard of the app Lensa, which gained popularity at the end of last year with around 6 million downloads in December alone. For context, it's been downloaded around 25 million times in total, so almost a quarter of those downloads happened in that single month.

Lensa has been the subject of a number of articles describing how problematic it is because of the "inherent misogyny" of the AI portraits it creates.

The way Lensa works is by using a model called Stable Diffusion. Now, the thing you need to know about Stable Diffusion is that it's trained on unfiltered internet content, which means that the biases present in the images it's trained on will also be reflected in the AI pictures it generates.
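As a toy illustration of that point (and not Lensa's actual pipeline, which is far more complex), a generative model that simply learns the frequencies of depictions in its training data will reproduce any skew in that data. The labels and the 70/30 split below are made up for the example:

```python
import random
from collections import Counter

# Toy "training set": the attribute frequencies are deliberately skewed,
# standing in for biases scraped from unfiltered internet images.
training_data = (
    ["sexualised"] * 70 +   # over-represented style of depiction
    ["neutral"] * 30        # under-represented style of depiction
)

def train(images):
    """Learn the empirical distribution of depictions in the data."""
    counts = Counter(images)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

def generate(model, n, seed=0):
    """Sample n new 'images' from the learned distribution."""
    rng = random.Random(seed)
    labels = list(model)
    weights = [model[label] for label in labels]
    return rng.choices(labels, weights=weights, k=n)

model = train(training_data)
outputs = generate(model, 1000)
# The roughly 70/30 skew in the training data reappears in the outputs:
print(Counter(outputs))
```

A real diffusion model learns far richer statistics than a frequency table, but the principle is the same: whatever patterns dominate the training images dominate the generations.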

So back in December, the Guardian wanted to test this software, and they uploaded photos of three famous feminists into the app - Betty Friedan, author of The Feminine Mystique; Shirley Chisholm, the first Black woman elected to the U.S. Congress; and Amelia Earhart, the first woman to fly solo across the Atlantic Ocean. You can find the results in the article here.

So Betty Friedan was made to look kind of nymph-like, with a low-cut slip dress and full chest. Shirley Chisholm seems to have a wasp waist, and Amelia Earhart was depicted lying on a bed, maybe? Looking like she's naked, with her mouth slightly open.

Another user, at around the same time, submitted pictures of only her face and got back a bunch of photos which all seem to be sexualised fantasy images, with huge breasts and tight-fitting outfits.

So of course, I had to give it a go. And I did it twice, and got some really interesting results. The first time, I put in a bunch of photos of myself, from the shoulders up, taken between the ages of 14 and 23. I figured that if 13-year-olds are allowed to use the app, I should use photos that reflect users of that age. The AI didn't really create any sexual photos - at most, there was a necklace which kind of looked like a collar, and some off-the-shoulder dresses. But nothing too bad.

The second time, I used photos all taken within the past year, again only from the shoulders up. This time, I got a number of full-body pictures, and, most notably, some with tiny waists and big boobs. Now, although it's not as bad as I thought it might be, and not as overtly sexual as the photos from the article, it is still taking photos of my face and giving me a body - one that is consistently very skinny, with a tiny waist and big boobs.

Clearly, AI programs like these are not neutral. Maybe some adjustments have been made since December, because my results definitely weren't as bad as I was expecting, but the app still seems to have quite a narrow perception of what an 'attractive body' looks like. And this is because Lensa's model, Stable Diffusion, is learning from human-made, unfiltered and inherently biased data, and then packaging up all that information into images like these.

Let us know if you’ve tried this app, and what you think about it.

 
String Theory