GPT started to ‘wire in’ stuff around 5.2 or 5.3 and clearly Opus, ahem, picked it up. I remember being a tiny bit shocked when I saw ‘wired’ for the first time in an Anthropic model.
reply
Anthropic distills GPT?
reply
Everybody training models on large amounts of lightly filtered internet text is partially distilling every other model that had its output posted verbatim to the internet.
reply
And OpenAI probably distills Anthropic, who wouldn't?

It's all one big incestuous mess. In a couple of years we'll be talking about AI brainrot.

reply