Why does it need to be revenge porn? Pretty sure regular old porn has a large market there, where people can specify exactly what they want to see instead of trying to find it, if it even exists.

Not every place has LEGO incest porn… or whatever the kids are into these days.

reply
I'm not deeply immersed in the AI porn space but here's what I see from the ads when I surf without a blocker:

1. There's an AI-based virtual girlfriend industry that mixes text and images

2. There's an AI-based virtual boyfriend industry that is essentially all text (and not always distinguishable from the normal chat models)

3. There's a much shadier AI-based "undress this specific woman" industry

reply
People make revenge porn to humiliate people. Regular old porn can't achieve that goal.
reply
And yet, regular porn is highly monetizable, which was the actual question.
reply
Surprisingly no; it's pretty much a money sink where everybody goes bankrupt after a couple of years. It's why it's attractive to money launderers.
reply
I'm not sure that's true for OnlyFans, which seems to have been highly profitable until the sudden death of its founder.
reply
Excellent point: I'm talking about pornography 1.0, as it were.
reply
1.0 should be attributed to pornography _before_ online distribution, and I suspect that was pretty profitable
reply
Isn't 1.0 before _photography_ rather?
reply
Drawings then?
reply
Live action
reply
and now we're back to livecams, time is just a flat circle man...
reply
If anyone can fake it, is revenge porn even effective? Doesn't making it easy for anyone to fake also make all of it plausibly deniable?
reply
maybe try to view this topic a bit more critically. i just quickly googled some keywords and am pasting the very first search result so you get an idea:

https://www.cbsnews.com/news/sextortion-generative-ai-scam-e...

revenge porn or deepfakes in general are hugely harmful to people.

in the german-speaking world there's a scandal right now about a husband creating deepfakes of his wife, https://www.hollywoodreporter.com/movies/movie-news/christia...

> One fake video, which she claims was sent to 21 men, depicted her being gang-raped

i think you're taking this topic lightly because you just assume it's not a big deal. try to keep in mind that people's mental health, and with it their lives, are at stake.

as with lots of things, the problem is not the tech itself, but the existence of men. it's not all men, but it's usually men. not sure how we'll solve this issue.

reply
The answers to those questions have been clear for a while; it approaches concern trolling to keep on pretending to ask them in wide-eyed innocence.

Yes, revenge porn is very effective at causing harm, even though it can be generated.

No, because 'plausibly deniable' has never worked for social consequences and shame.

reply
I think it can be effective, but it's the wrong term for it if it's fake. It's a mixture of other things, like libel and fabricating indecent images, and the same underlying blackmail.
reply
Yes. You can go speak to some high school (or even middle school) girls who have had AI-generated porn made of their likeness and shared with their classmates. Even though everybody knows it is fake, it is still humiliating, especially for a young person who is likely already self-conscious about their body and sexuality.
reply
It's also used in the piles and piles of fake video flooding YouTube/TikTok/etc., driving clicks and engagement.
reply
> There's only one highly monetizable use for AI video generation

Yeah, marketing. Which is a huge market...

reply
There are others! They're just all horrible and generally revolve around weaponized misinformation - personalized scams, for instance.
reply
Oh right. There's a bunch of panicky news stories in India about that right now: fake video calls from your nephew in the UK or wherever, needing money for an emergency.
reply
I for one can't wait for ChatGPT-style sexting to become a thing.

It's not just dirty talk. It's a whole new paradigm in verbal filth.

On the topic of sora, though: current models are astounding. I watched a clip of Leonidas, Aragorn, William Wallace, Gandalf etc. all casually riding into a generic medieval town together, and if you showed that to me a few years ago, it would have seemed like magic. We're not far off from concerts featuring only dead artists, and all video and image testimony becoming unreliable. Maybe Sora was a victim of timing or mismanagement, because I don't see how this isn't still a seismic shift in the entertainment industry.

reply
> all video and image testimony becoming unreliable

This is a "seismic shift" in the sense of the Big One hitting California. The knock-on effects of trust erosion caused by AI are going to be huge and potentially unrecoverable.

reply
I mean, you just outlined why it won't be a seismic shift: the only way the videos reliably stay on-model is if that model violates someone's copyright. And then, when the movie is made, the output itself isn't copyrightable (the overall arrangement may be, but no individual frame is).
reply