Deepfake Pornography for Sale on Etsy (SFW Article Version)


Update, January 29th, 2024: Etsy appears to have removed the images referenced in this article.

This is the non-explicit version of the article, Deepfake Pornography is For Sale On Etsy. This version has fewer images, screenshots and links, and is a little shorter.

You can find the original explicit article here.

Etsy has deepfake porn for sale on its website, and in its app. Here are the highlights:

  • Non-consensual AI generated porn (NCAIP) is for sale from multiple sellers. Nude images of figures such as Taylor Swift and Emma Watson are available as paid downloads.
  • Some AI-generated images could be understood to depict minors
  • Paid ads for other Etsy stores run at the bottom of some of these pages, presumably without the knowledge of the store owner.
  • Etsy’s algorithms present more and more of this content to users. For instance, if you search for “Emma Watson” two suggestions are “Emma Watson no clothes” and “Emma Watson Horny”. Etsy also makes recommendations for other content to view, such as “Naked Young Women” and “NSfw AI Art”
  • Paid ads for kids’ clothing come up in searches for terms such as “Naked Young Woman”
  • There is no NSFW filter that I can find. For example, searches for SFW terms may turn up NSFW results, although I did not determine the frequency of this.

The three things I want to cover here are: (1) what deepfakes are, and why it matters that they are for sale on Etsy; (2) that Etsy has few if any safety standards in place to combat them; and (3) that Etsy itself appears apathetic to the issue.

Notes

I did my best not to rig the system in any way and believe that the algorithm-driven suggestions apply to any user, but there is no way to validate that without access to the source code. Certainly, that non-consensual AI porn exists is not a subjective issue. Many of the stores have “NSFW” in their name.

Sign up for updates on my newsletter

What Is A Deepfake?

Deepfakes are “the manipulation of facial appearance through deep generative methods.” In short, a deepfake is when an image has been made — such as with AI — or altered (Photoshop) so that a real person is shown in an unreal situation. In this case, we are talking about AI generating nude photos of people without their consent, and probably without their knowledge.

Are They Legal?

It is still an evolving area of law, especially in the United States, where the First Amendment protects nearly anything false or fake unless it causes real, direct harm.

Deepfake child porn (child sexual abuse material, CSAM) is illegal, but beyond that, in general, yes, they are legal. If somebody took some photos of you and turned them into a fake naked version of you online, you would have little recourse. A deepfake can still violate copyright (such as naked Elsas from Frozen) or potentially defame someone. Celebrities may have some recourse under the right of publicity, which allows them to prevent unauthorized commercial exploitation of their image, but that exists only at the state level, not federally.

Why It Matters

So what? It isn’t news that there is pornography on the internet, and it is nowhere near new that AI can generate realistic pornographic images of real people.

First, this just shouldn’t happen. Full stop. We shouldn’t need a discussion about the ethics of generating and selling deepfake pornography, we can just call it bad.

Second, it matters that it is for sale. It’s one thing to create the images, it is another to sell them. It is also reprehensible that Etsy — and hence public shareholders — will profit off of NCAIP.

Third, it really matters that it is on a mainstream website. Etsy isn’t a random website or the cesspools of Reddit or 4chan. It is a publicly traded company with $2.5bn in revenue.

Fourth, while some of this is legally gray, a lot of it is not.

Most of the issues are not complex, but there are many more and I simply don’t have time to cover all of them.

There is also a broader context: big tech can’t or won’t take action against predatory users if doing so might harm the bottom line.

Media Matters reported that X places ads for name brands next to white nationalist material. The Wall Street Journal reported that pedophiles actively use Instagram to find CSAM and other pedophiles. Several months after the original story, they again reported that Meta won’t keep pedophiles off of its platform, and Instagram delivers sexual content of children to adults.

Etsy

Etsy is known as a maker-community market. Handmade items, vintage items and highly creative artwork are the things you’ll find on Etsy: handmade jewelry, custom embroidery, paintings, prints, bags and purses, soap, spa bags, custom guitar straps, ornaments. It is a vast array of items, generally made by people rather than corporations, and mostly physical items.

About itself, Etsy says:

Etsy is the global marketplace for unique and creative goods. It’s home to a universe of special, extraordinary items, from unique handcrafted pieces to vintage treasures.

It is also a company with revenue of $2.5bn/year, valued at nearly $10 billion. It isn’t a small company.

Pornographic Deepfakes for Sale

Who and How?

I did not take a comprehensive survey, but the people who came up most often were:

  • Taylor Swift
  • Emma Watson (so often that Etsy suggests searching for her nude!)
  • Angelina Jolie
  • Emilia Clarke
  • Scarlett Johansson

However, there were at least 77 different female celebrities with nude deepfakes for sale.

The frequency of subjects is probably a combination of demand — these are for sale — and ease of AI generation. I’m not going to go into technical detail on how AI works, but the more pictures of someone a model can “see” during training, the more accurate it will be at creating images of that person later. If it sees someone enough, other photos may even begin to look like that person.

Every one of those is from a different NCAIP image for sale on Etsy as a digital download. Some are also marketed as prints. They range in price from a few cents per image to around $20 for packs of hundreds or thousands of images. AI nudes of Meghan Markle are $2.27 to $7.50, while Alessandra Ambrosio nudes are $6.

They are all digital downloads: you pay and have access immediately, downloaded from Etsy.

They Are Realistic

I don’t want to get tied up here: even poor-quality deepfakes should be off the table. However, today many — or most — of the images for sale could pass for photos, at least until you blow them up in size. They are also getting better, fast. It is likely only a matter of months before most of them are picture perfect, or nearly so.

Here is an example I generated with ChatGPT in about 30 seconds (actual images from Etsy are in the explicit version of this article):

They Are Easy to Make

It is easy to generate these photos, on your own computer or online.

On websites like Civitai you can download all you need to get started in AI-generated images. Different AI models are trained to generate different things1. Some are just the base to get started; others have specialties such as “Downblouse & Nipslip” which are “intended to cause wardrobe malfunctions… Downblouse produces a view down the subject’s shirt, while nipslip permits nipples to escape their prisons…”

Some are trained on specific people. Three of the highest all time rated celebrity models are Scarlett Johansson, Gal Gadot and Britney Spears. You can combine these models to generate the images you want.

This isn’t always bad, but the good use cases aren’t the focus today.

After you’ve got your models loaded, you need to tell the software what to make. The following is a simple prompt you can give, and the result. (Taken from Civitai)

Do:
Portrait of scarlett as a beautiful female model, georgia fowler, beautiful face, with short dark brown hair, in cyberpunk city at night. She is wearing a leather jacket, black jeans, dramatic lighting, (police badge:1.2)

How Many Deepfakes Are There on Etsy?

I’m not sure. When I first came across one I thought it was an anomaly. I was wrong.

As of this posting, there are at least 15 different stores selling deepfaked images of celebrities. The number is likely higher: I primarily looked through the titles of items for sale, so if an image showed a real person but didn’t use a real name (for example, “Blonde 99”), it would not have been counted.

Combined, there are at least 2400 sales across these 15 stores. Again, that is likely an undercount of the real number. There were at least three dozen more NSFW AI stores where I didn’t see explicitly named NCAIP. If there were photos of real people in those download packs, I didn’t find them because I didn’t want to spend days sifting through AI porn.

The Algorithm is Unhinged

The Etsy algorithm doesn’t just reinforce deepfake searches, it suggests them. Take this search for Emma Watson: as soon as I typed in “Emma Watson”, Etsy suggested I search for “Emma Watson with no clothes” or “Emma Watson horny”.

I’m not trying to game the system, so I selected the search “Emma Watson photo”, and I still got two AI-generated porn images of Emma Watson in the first 9 results!

“emma watson in a bikini” (remember, an Etsy suggestion) brings up an Etsy ad with John Legend, four ads for Etsy stores, three search results each with Emma Watson NCAIP (and one where she looks like she is 10), and one actual Emma Watson product.

I wonder if John Legend is cool with his photo endorsing Etsy right above deepfaked Emma Watson photos.

If you view one of these products, Etsy suggests even more such content at the bottom. Some are other products from the same store, others are for similar products at other stores.

In addition to the product suggestions at the bottom of many pages, Etsy suggests related categories to explore (basically canned searches) that I didn’t even know existed.

These categories include:

  • Breast
  • Milf model
  • Milf photography
  • Nsfw AI Art
  • NSfw Digital Art
  • Naked Woman
  • Sexy Images
  • Artistic Nudes
  • Nude Young Woman
  • Naked Young Woman
  • Nude AI Art

The search “naked young woman” leads to a variety of results.

  • Kids comfort clothes
  • A pornographic calendar
  • A physical art print of a woman in a 100% see through dress in the rain
  • Some of the NCAIP we have already reviewed
  • Bedroom decorations
  • An ad for “Berry Toddler Girl Kids Mockup”

For better or for worse, none of the suggestions above seemed tailored to previous viewing. The results were consistent both when I was logged in and had viewed many NSFW items, and when I opened a fresh incognito window free of that viewing history.

The only tailored suggestion I saw was when I clicked on a link from an email Etsy sent me, which had more AI porn.

Dealing With Sellers & Custom Requests

Like any good small business, you can go back and forth with the sellers, ask about custom orders and pricing, etc. “Jenny” — the most prolific shop I found — even has a helpful out of office message.

I contacted one seller and asked for custom works of known people but s/he declined because of the risk.

A second seller just pointed out that the prices were listed.

A third posted five new galleries of four photos each of real people within the hour. (I asked about custom requests; I didn’t ask for them.)

No Age Gates

There is no warning that the content contains nudity (or worse; more on that later). There is no requirement that you be 18 to enter. It is likely that this violates state laws2, and possibly the federal Communications Decency Act of 1996. This is particularly damning because (1) it is easy to create an age gate and (2) it is possible to come across this content with normal searches.
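To illustrate how little engineering an age gate actually requires, here is a minimal sketch. This is hypothetical logic, not Etsy’s actual code; the tag names, session structure, and return values are all assumptions for illustration:

```python
# Minimal sketch of a server-side age gate. All names here
# (MATURE_TAGS, the session dict, the page identifiers) are
# hypothetical, not drawn from any real marketplace codebase.

MATURE_TAGS = {"nsfw", "nudity", "mature"}  # assumed listing taxonomy


def needs_age_gate(listing_tags: set) -> bool:
    """A listing carrying any mature tag should sit behind a gate."""
    return bool(MATURE_TAGS & {t.lower() for t in listing_tags})


def serve_listing(listing_tags: set, session: dict) -> str:
    """Return which page to render: the listing itself, or an
    interstitial asking the visitor to confirm they are 18+."""
    if needs_age_gate(listing_tags) and not session.get("age_verified"):
        return "age_gate_interstitial"
    return "listing_page"
```

A visitor who confirms their age would have `age_verified` set in their session and pass through on subsequent requests. Note that states with ID-verification laws (see footnote 2) would require a stronger check than a self-attested click.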

And it is not just non-consensual AI content: a 3D figure of a naked Rey Skywalker comes up in the simple search “daisy ridley”.

In the full results, this figurine runs next to autographed photos, ads and costumes.

Even Reddit — which does not remotely shy away from controversial content — at least warns the user that Mature Content is ahead:

No Age Restrictions on iPhone

Etsy is rated as 12 and up on the Apple App Store. Which, as a whole, makes sense: they do sell lingerie and other things that would require an age rating, but it clearly isn’t rated for the mature content you can find on the app.

It is even an Editors’ Choice with 4.9 stars!

The same content from the desktop website is on the app.

The app shows NCAIP, other non-real person AI nudes, and a cartoon nude that seems more likely to be viewed by children.

Again: no gaming the system here, I logged in as a guest. I even deleted the app and re-downloaded it several times to make sure I was getting a typical experience.

Emma Watson is still there on the first page, and the search bar is just as aggressive in suggesting I view her nude in some form or fashion.

It’s actually possible to get to the AI porn without searching. If you go down the related items rabbit trail starting from something like a wedding lace dress, you can get there in a few minutes.

This also means that Apple indirectly gets a cut of any NCAIP images sold through the Etsy app on iOS.

Something For the Kids?

While not the topic of this article, I was a little surprised to see the amount of naked Disney Princesses available as well.

Young Deepfakes

It probably isn’t possible to determine the age of a completely made up person with a high degree of accuracy, but does it matter? I and literally every single state attorney general would argue yes. So what does that mean for Etsy?

It means that some of the AI content straddles the line between CSAM and not.

Legality aside, at a moral and ethical level, how do you feel about naked depictions of the people below? Should any of these images have nude depictions of the rest of the body? (And on Etsy!)

Emma Watson looks like what, 12, 13? Should any person endure fake-nude photos of themselves for sale online when they look like they are a teen?

Does Etsy Care?

I found a store that suggested I visit their other website: “I have removed some of the finest images due to the objection by Etsy. For more uncensored content, please visit our Gumroad site.”

The content left up — apparently without objection by Etsy — includes, “AI-generated Scarlett Johansson — Celebrity Photo Poster No. 1”, “AI-generated Alison Brie — Celebrity Photo Poster No. 2” among 103 other products, some of which looked to be real people without names attached.

103 AI generated porn images? And Etsy knows?

Nudity and mature content are specifically called out as prohibited items in the Etsy Terms of Service.

As a creative community, we tend to be fairly liberal about what we allow on Etsy. That said, we prohibit pornography, illegal or exploitative items, and used intimate items3.

Pornography of any sort is prohibited on Etsy, whereas mature content is restricted. Although pornography can be difficult to define, an item generally qualifies as pornography when it contains printed or visual material that explicitly describes or displays sex acts, sex organs, or other erotic behavior for the purpose of sexual arousal or stimulation.

But there is pornography: the “Nsfw photo” search brings up hardcore pornography from shops with words like “Adult,” “XXX,” and “Eroctic” in their names. Ads for kids’ portraits run next to softcore pornography. There is hardcore porn in the search results. (It is also the only search I found where the iOS app had a different set of results, dropping the worst of it.)

I reported several of these stores a few days before publishing; as of now, the reported stores all remain up and continue to sell deepfakes. Etsy did not respond to a request for comment.

Given the violations of privacy, law, Apple’s terms of service, their own terms of service, and common decency, I don’t know how you could argue that they care. They have to know about it (especially after I contacted their press department) — it’s a $10 billion company.

It seems that Etsy is either OK with deepfakes for sale, or doesn’t care (or both). And if they care so little about celebrities — who can sue them — will they care if photos of you and me crop up?

The most troubling line is CSAM: if there is a generated image of a person that could be under 18 but it isn’t clear, how should Etsy handle that? Should Etsy filter for them? (Yes)

It isn’t good enough that these are a fraction of the hundreds of thousands of items for sale. Etsy could put a stop to half of it through keyword filters — don’t allow “NSFW AI Photos” to be a product — and probably 90% – 95% of it through image detection. Etsy may not have an obligation to fight deepfakes across the internet; it does have an obligation NOT to promulgate them on its own site.
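The keyword-filter half of that fix is trivially cheap. Here is a sketch of what listing-title filtering could look like; the blocklist terms are illustrative assumptions drawn from the searches in this article, not any real moderation list:

```python
import re

# Illustrative blocklist built from terms seen in this article.
# A production moderation list would be far larger, maintained over
# time, and paired with image detection for evasive titles.
BLOCKED_PATTERNS = [
    r"\bnsfw\b",
    r"\bdeepfakes?\b",
    r"\bai\s+(?:nudes?|porn)\b",
    r"\bnude\s+ai\b",
]


def flag_listing(title: str) -> bool:
    """Return True if a listing title matches any blocked pattern."""
    text = title.lower()
    return any(re.search(pattern, text) for pattern in BLOCKED_PATTERNS)
```

A filter this simple would already catch the stores that advertise themselves with “NSFW” in the name; the remainder — titles like “Blonde 99” — is what image detection would have to handle.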


Want to re-publish this?

The text of this article is Creative Commons Licensed, CC-BY 4.0. You are free to:

  1. Share — copy and redistribute the material in any medium or format for any purpose, even commercially.
  2. Adapt — remix, transform, and build upon the material for any purpose, even commercially.

You must:

  1. Provide attribution by linking back to this page.
  2. Indicate if you made any changes (i.e., you added your own comments or edited this)

The images are provided as-is. AI-generated art is not itself copyrightable, but there is copyrighted material in some of the images and screen captures above. I believe their use here constitutes fair use, but that does not transfer the copyright to anyone, and you use the images at your own risk.

Notes

  1. I am going to simplify here instead of getting in the differences in models, LoRas, merged models, etc. ↩︎
  2. Some states require age verification, this goes beyond just clicking a button that you are 18. Tennessee is not one of those states and I do not have a good way of testing to see if Etsy requires verification for this content in states with these ID laws. ↩︎
  3. Used underwear was another product I came across researching this article. Yuck. ↩︎
