An excellent article in the New York Times highlights a significant problem: PornHub is infested with rape videos. Here, we offer a solution.
The author of the article mentions:
It monetizes child rapes, revenge pornography, spycam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.
The article is short on solutions, though. As a company dealing with stock photography and AI processing of the photos, we have a couple of things to add.
I think there could be a combination of administrative and tech solutions.
For years, the stock photography industry has handled consent with a standardized contract accepted by multiple (if not all) stock photo websites.
My colleagues and I used to sign it with our models (for our stock photography, not porn). It requires a brief conversation explaining what we do: “Your pictures could appear anywhere our clients want to use them. The policy only prohibits use for anything pornographic,” then answering follow-up questions, sometimes reading the contract aloud, and signing. There are apps for this, where people go through several steps before signing, and they get a copy.
Usually, we sign it before shooting (a good time is while we do a model’s hair). A real story: once a model was late, the whole crew was waiting, and we started quickly. I didn’t bring the contract up before the shooting session ended — that was my first mistake. It turned out I hadn’t warned her that we were shooting stock photography — that was my second mistake. In the end, we had to discard the material while paying her the full fee.
Summary: model release forms are straightforward, well planned, and effective. They work in stock photography and stock videos. They could work with other photos and videos.
It’s easy and works well already.
Phase 1 would be blocking content featuring people who don’t want to be published. I see it as an app called “Remove me from porn,” which verifies your identity and stores a fingerprint of your face in a blacklist. All pornographic websites would then check their videos against this blacklist.
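The matching step could be a nearest-neighbor check of face embeddings against the opt-out list. Here is a minimal sketch, assuming a face-recognition model (not shown) has already converted each detected face into a numeric embedding vector; the threshold value and the function names are illustrative, not a real implementation:

```python
import math

# Assumed similarity cutoff; a real system would tune this on labeled data.
THRESHOLD = 0.6

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_blacklisted(face_embedding, blacklist):
    """Return True if the face matches anyone on the opt-out list.

    Both arguments hold embedding vectors produced by some face-recognition
    model (hypothetical here); the check is a simple nearest-neighbor
    comparison against every stored fingerprint.
    """
    return any(cosine_similarity(face_embedding, entry) >= THRESHOLD
               for entry in blacklist)

# Toy example with 3-dimensional "embeddings" for readability:
blacklist = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
print(is_blacklisted([0.9, 0.1, 0.0], blacklist))  # close to the first entry -> True
print(is_blacklisted([0.0, 0.0, 1.0], blacklist))  # close to nobody -> False
```

At production scale, a linear scan over the blacklist would be replaced by an approximate-nearest-neighbor index, but the principle stays the same: opted-out faces become vectors, and every uploaded video is screened against them.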
Phase 2 would be allowing only people who explicitly consented. Again, stock photography works like this, and it’s all good.
Our company generates people (as reported in another New York Times article). I feel that generating faces similar to the original person’s face would be super helpful.
I believe combining our anonymization engine with deepfakes could resolve the problem once and for all.
Combined with the first two measures, we could replace the faces of people who didn’t consent with generated ones. This way, we harm neither the industry nor its customers by removing the vast majority of content, which makes the measure easier to implement.
CEO Column by Ivan Braun, the founder of Icons8 and Generated Photos
Title image from Moose Photo Stock