
How to Protect Rape Victims from PornHub

An excellent article in the New York Times highlights a significant problem: PornHub is infested with rape videos. Here we offer a solution.

The author of the article writes:

It monetizes child rapes, revenge pornography, spycam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.

The article is short on solutions, though. As a company dealing with stock photography and AI processing of the photos, we have a couple of things to add.

I think there could be a combination of administrative and tech solutions.

1. Require a signed model release form

For years, stock photography has done this with a standardized contract accepted by many (if not all) stock photo websites.

My colleagues and I used to sign it with our models (for our stock photography, not porn). It requires a brief conversation explaining what we do: “Your pictures could appear anywhere our clients want to use them. The policy only prohibits the use for anything pornographic,” then answering follow-up questions, sometimes reading the contract together, and signing. There are apps for that, where people go through several steps before they sign, and they get a copy.
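The signing flow described above can be sketched as a simple record that is only valid once every step is completed. This is a hypothetical illustration in Python; the field names are mine, not part of any standard release form:

```python
from dataclasses import dataclass

@dataclass
class ModelRelease:
    """Tracks the steps of a model release signing session.

    A release is complete only when the terms were explained, the model's
    questions were answered, the form was signed, and the model received
    a copy. Field names are illustrative, not a standard contract schema.
    """
    model_name: str
    terms_explained: bool = False
    questions_answered: bool = False
    signed: bool = False
    copy_given_to_model: bool = False

    def is_complete(self) -> bool:
        # Publishing should be blocked until every step is recorded.
        return all((self.terms_explained, self.questions_answered,
                    self.signed, self.copy_given_to_model))
```

Had our shoot used a gate like this, the missing conversation would have been caught before the session started rather than after.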

Usually, we sign it before shooting (a good time is while the model’s hair is being done). A real story: once a model was late, the whole crew was waiting, and we started in a hurry. I didn’t bring it up before the session ended (my first mistake). It turned out nobody had warned her we were shooting stock photography (my second mistake). In the end, we had to discard the material while still paying her the full fee.

Summary: model release forms are straightforward, well established, and effective. They work in stock photography and stock video. They could work for other photos and videos, too.

2. Detect the faces with AI

Face detection is mature technology and already works well.

Phase 1 would be banning people who don’t want to be published. I see it as an app called “Remove me from porn,” which verifies your identity and stores a fingerprint of your face in a blacklist. All pornographic websites would then check uploaded videos against this blacklist.

Phase 2 would be allowing only people who explicitly consented. Again, stock photography already works this way.
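Both phases boil down to comparing face embeddings against a registry. A minimal sketch in plain Python, assuming each detected face has already been reduced to a numeric embedding by some recognition model; the threshold and function names are illustrative, not a real API:

```python
from math import sqrt

# Illustrative cutoff; a real system would tune this per embedding model.
MATCH_THRESHOLD = 0.6

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_blacklisted(face, blacklist):
    """Phase 1: reject an upload if a face matches the opt-out registry."""
    return any(cosine_similarity(face, f) >= MATCH_THRESHOLD for f in blacklist)

def all_consented(faces, whitelist):
    """Phase 2: allow a video only if every face matches a consent registry."""
    return all(
        any(cosine_similarity(face, f) >= MATCH_THRESHOLD for f in whitelist)
        for face in faces
    )
```

Phase 1 needs only the blacklist check on each detected face; phase 2 inverts the default, so anyone absent from the consent registry blocks publication.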

3. Anonymize the videos with deepfakes

Our company generates people (as reported in another New York Times article). Generating faces similar to, but distinct from, a person’s own could be very helpful here.

A picture of Sasha Grey anonymized with generated photos. These non-existent people loosely resemble her face but make her unrecognizable.

I believe combining our anonymization engine with deepfakes could resolve the problem once and for all.

Combined with the first two measures, we could replace the faces of people who didn’t consent. This way, the vast majority of content stays up, harming neither the industry nor its customers, which makes the measure easier to adopt.
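Deepfake generation itself is far beyond a short example, but the replace-the-region flow is the same whatever fills the region. As a simpler stand-in, here is a hypothetical sketch that pixelates a face’s bounding box in a grayscale frame represented as nested lists; a production system would swap in a generated face instead of averaging pixels:

```python
def pixelate_region(image, box, block=8):
    """Pixelate a rectangular region (x, y, w, h) of a frame in place.

    `image` is a grayscale frame as a list of rows of ints. Each
    block-by-block tile inside the box is replaced with its average
    value. A deepfake pipeline would paste a generated face into the
    same region; only the replacement content differs.
    """
    x, y, w, h = box
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            rows = range(by, min(by + block, y + h))
            cols = range(bx, min(bx + block, x + w))
            values = [image[r][c] for r in rows for c in cols]
            average = sum(values) // len(values)
            for r in rows:
                for c in cols:
                    image[r][c] = average
```

Running this over every non-consenting face per frame (with boxes from the phase-1 detector) would anonymize a video without removing it.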

CEO Column by Ivan Braun, the founder of Icons8 and Generated Photos

Title image from Moose Photo Stock
