> i'm not sure what gap people are trying to close building themselves some proverbial great library here, but i would encourage people to just sit back and trust that their brain is still one of the greatest technologies at their disposal.
Culturally, I think this is going to fuck things up significantly. If I take the time to read all of the latest papers in the LLM space, I'm damn well not going to summarize or document what I've learned for anyone. (Maybe this is why there aren't many high-quality books aggregating all of this information, all the latest papers and advancements. The people doing this work would rather (smartly) milk the cash cow and maintain the information asymmetry.)
Or think about open source: this will kill it for anyone trying to make money off a product while keeping it open source, because someone could spin up a competitor overnight.
AI is going to make information cheaper to acquire. But it's going to absolutely destroy the incentive structure and trust required for an open exchange of information. It was already bad enough, since industry isn't incentivized to produce quality educational literature the way academia is. After this, it'll be a complete shit show.
>this clearly has visible impacts on how we engage with each other
> there's something there that I'm noticing and don't have the words for.
Welcome to ASI takeoff!
This is actually a good idea: it's a very cheap way to build your own industrial-strength search engine. We've forgotten how cool search engines are because Google's is so shit now.
(Although you don't need Claude, you can self-host this with minimal effort now.)