It's amazing how quickly Anthropic is turning into the "bad" guys.

First we couldn't use our Claude subscription with anything but Claude Code, then the limits seemed to change every week without any communication, then they banned a bunch of people (including some prominent names). Then they complained about Chinese labs distilling via their API (which I'm partly sympathetic to, but let's not pretend that Anthropic invented its training data from scratch).

Then there's this half-baked offer. Sure, it looks nice on paper, but given how incredibly valuable open source has been for them, and given their budget, it does seem a bit tight.

reply
Six months is so low. From the title I thought it'd be unlimited, tbh, especially considering they'll continue to crawl the content six months into the future.
reply
Uncharitably, I think this is a strategy to gorge on more data, especially if they select for higher-quality open source. They're embracing the best projects so they can train on the iteration patterns of the best developers, with a semi-self-correcting mechanism against slop.

Charitably, this will be great for open source software... so long as they never moat up and lock down.

reply
Can't they just keep scraping these repositories for new data anyway? Or has that changed?
reply