Someone who works on a “sugar dating” app advocating for synthetic child porn? That’s… uncomfortable?
reply
To say the least. Great catch! 'O brave new world, that has such people in 't.'
reply
Has the availability of deepfake porn generation reduced the demand for deepfake porn featuring real people? When deepfake generators are capable of creating convincing imagery of flawless, ideal fake humans, why do you suppose there are so many real humans who report being non-consensual subjects of deepfake porn?
reply
> Has the availability of deepfake porn generation reduced the demand for deepfake porn featuring real people?

yes

> When deepfake generators are capable of creating convincing imagery of flawless ideal fake humans, why do you suppose there’s so many real humans who report being non-consensual subjects of deepfake porn?

?

reply
One obvious argument is what it was trained on.
reply
Doesn't have to be. You can train it on normal pictures of children and nude images of adults.
reply
> Doesn't have to be. You can train it on normal pictures of children and nude images of adults.

You say this so casually, as though it were a normal thing to know, or as if a normal person would know it. Does that actually seem true where you live right now?

And how do you know that, anyway, Harsh? I mean, all those "unblocked" games you stole to give away and that you also put on Github, that's one thing. But this...

reply
Come on, it's not hard to come up with this idea. And it's not even true: a model trained on clothed children and nude adults wouldn't know what children's genitals look like.
reply
This conversation just keeps getting more and more normal, huh.

Maybe you can explain why it is that, lately, whenever I'm less than perfectly accurate about the technical requirements of using AI to generate kiddie porn, an entire legion of creepy anons comes pouring out of the woodwork to well-actually and bikeshed and bullshit about it? Are you really so anxious to prove your empiricism superior in this?

reply