People gave Altman shit for enabling NSFW in ChatGPT, but I see that as a step in the right direction. The right direction being: the one that leads to less corporate censorship.

>In the same way, using gpt5 is now very unbearable to me as it almost always starts all responses of a conversation by things like: "Great question"

User preference data is toxic. Doing RLHF on it gives LLMs sycophancy brainrot. And by now, all major LLMs have it.

At least it's not 4o levels of bad - hope they learned that fucking lesson.

reply
I have seen a few normally progressive types act quite conservative and puritan over the NSFW ChatGPT thing. It seems there are quite a lot of people who consider things to be uniformly good or bad, and their opinion of the whole colours their opinion of the parts.

OpenAI are in a difficult position when it comes to global standards. It's probably easier to see from outside of the United States, because the degree to which historical puritanism has influenced everything there is remarkable. I remember the release of the Watchmen film and being amazed at how pervasive the preoccupation with a penis was in the media coverage.

reply
People in the US went ballistic over Mass Effect showing an outline of a butt, in the dark, for 1 second.
reply
Name me one piece of enterprise software that lets you do NSFW things. The way people jump straight to 1984 with no thought is double plus bad. ChatGPT is a piece of enterprise software: they are trying to sell it to large companies at large prices. This is not a rhetorical question: do you think corporations would buy it if you could generate nude images of celebrities or pictures of extreme violence? Having been a director at a Fortune 500 company that bought software, I can tell you with 100% certainty the answer is "no".
reply
Technically? Microsoft Word certainly lets one write smut, and Photoshop certainly allows one to draw pornography. They won't, like, produce NSFW things automatically, of course.
reply
Exactly. Programs that don't let you do things based on the content should be thought of as weird/broken.

Imagine if we woke up tomorrow morning and grep refused to process a file because there was "morally objectionable" content in it (objectionable as defined by the authors of grep). We would rightly call that a bug and someone would have a patch ready by noon. Imagine if vi refused to save if you wrote something political. Same thing. Yet, for some reason, we're OK with this behavior from "certain" software?
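The expectation that classic tools are content-agnostic can be sketched in a couple of lines of shell (hypothetical file path and contents, purely for illustration): grep matches patterns against opaque bytes and never inspects what the text "means".

```shell
# grep treats its input as opaque bytes: it matches the pattern and
# does not (and should not) judge the content of the file.
printf 'line one\nline two\n' > /tmp/demo.txt
grep -c 'line' /tmp/demo.txt   # prints 2
```

If grep ever returned a nonzero exit status because of what a line *said* rather than whether it matched, that would be reported and fixed as a bug.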

reply
There is more than one way we could generalize the precedent previously set, imo.

None of the templates included with e.g. Word were for smut.

Word allowed you to type in smut, but it didn’t produce smut that wasn’t written by the user. For previous enterprise software, that wasn’t really a relevant question.

So… I don’t think it is obvious that “Word lets you type in smut” implies “ChatGPT should produce smut if you ask it for smut.”

I guess precedent might imply “if you write some smut and ask it to fix the grammar, it shouldn’t refuse on the basis of what you wrote being smut”?

reply
deleted
reply
Companies like PH use full enterprise stacks from AWS to Oracle. Hell, CloudFlare actively takes flak for serving much worse websites like 8chan, the Daily Stormer, etc., and they are as enterprise-focused as it gets.
reply
> Name me one piece of enterprise software that lets you do NSFW things.

Photoshop, MS Word.

reply
I can't think of any that restrict it. Sharepoint refusing an NSFW photo or Oracle refusing to store video isn't a thing.
reply
Seemingly they don't have tests to see whether their model gets better or worse in certain areas.
reply
Try some of the Chinese models. Much less restrictive. With some obvious exceptions.
reply