Hacker News
by baq 9 hours ago
by Barbing 7 hours ago

Anthropic distills GPT?
by yorwba 6 hours ago

Everybody training models on large amounts of lightly filtered internet text is partially distilling every other model that had its output posted verbatim to the internet.
by beAbU 4 hours ago

And OpenAI probably distills Anthropic; who wouldn't?

It's all one big incestuous mess. In a couple of years we'll be talking about AI brainrot.