This is an argument for open models, where you can run your model with your system prompt on your hardware, which prevents the provider from arbitrarily injecting system prompts.

This is an argument for open source tooling (like opencode) and open models (like deepseek).

Grok is not an open model, Elon does not get any credit for anything here.

reply
Counterpoint: generated CSAM on his platform.
reply
That doesn't seem like a good counterargument to me. By that logic, no online service should permit users to upload photos, because someone might use it to share CSAM at some point. Rather than nerfing the tools, implement a sensible detection and reporting pipeline.
reply
>That doesn't seem like a good counterargument to me.

It does to me, especially since he did not implement a sensible detection or reporting pipeline ahead of launching a CSAM generation tool.

reply
Failing to do X doesn't make Y a good idea. You haven't engaged with the argument I made, choosing instead to repeat a politically charged misrepresentation.
reply
I think it's an okay counterargument. You can't have both "AI should do the user's bidding" and "implement a sensible detection and reporting pipeline."

I mean, that is what Anthropic tried here.

reply
"Meh I'm okay with it" is by definition not a counterargument but rather a nonconstructive dismissal of whatever it is a response to.

You can in fact have both. You can have a tool that is fully functional, and separately you can have a strategy for reporting suspected violations and responding to those reports. Reports can be automated, assuming you can tolerate the false positive/negative rate. Particularly in the case of a subscription service such as Claude, there is little reason not to implement this other than sheer greed or laziness.

In the case of Claude in particular, an unacceptably high false positive or negative rate also poses a serious problem for the current way they do things. The notable difference is that with false positives, it currently runs up a bill for the customer rather than the service provider.

reply
...or even afterwards. His response was to put it behind a paywall (i.e., start selling it).

And all the world's payment processors and almost all governments and child rights advocates are still on there.

Stunning :)

reply
Additional counterpoint: the "MechaHitler" chatbot. For those who have forgotten: https://www.forbes.com/sites/tylerroush/2025/07/09/elon-musk...
reply
“Think of the children”
reply
grok, why are there slurs in my code?
reply
If the user explicitly requested that, is it really a problem with the tool at that point?
reply
Yes
reply
I suppose you also think users shouldn't be able to type slurs into a Word document? Or are you admitting that you're inconsistent?
reply