To be clear, I mean AI is going to be the downfall of ad-supported content. But let's face it: link farms and spam factories exist because of the ad-supported content market. I think this will ultimately serve users well, because it puts a premium on content quality. Quality content is something someone will pay a direct licensing fee to scrape for their AI bots, as opposed to tricking somebody into clicking a link and serving an impression for something they won't buy.
reply
So if, at the end of the day, instead of crawling EVERY single link in the repository they just check it out and parse it locally... I would consider that a win.
reply
> This is ultimately just going to give them training material for how to avoid this crap.

> The arms race just took another step, and if you're spending money creating or hosting this kind of content, it's not going to make up for the money you're losing by your other content getting scraped.

So we should all just do nothing and accept the inevitable?

reply
> So we should all just do nothing and accept the inevitable?

I daresay rate-limiting will produce better outcomes than well-poisoning with hidden links, which is against search engines' policies.

The well-poisoning approach has lots of potential for collateral damage, including to your own websites' reputations and search visibility. Rate-limiting, by contrast, is cheap to deploy; see the sketch below.
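
A minimal nginx sketch of what I mean (the zone name, rate, and burst values are placeholder assumptions, not recommendations):

    # Track per-client-IP request rates in a 10 MB shared memory zone.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=2r/s;

    server {
        location / {
            # Permit short bursts, then return 429 instead of serving pages.
            limit_req zone=perip burst=10 nodelay;
            limit_req_status 429;
        }
    }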

reply
The README.md specifically explains how to let well-behaved robots proceed unhindered. The people behind these efforts, I would imagine, don't particularly care about their sites' reputations in cases where people use LLMs for search.
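
I haven't verified exactly which mechanism that README describes, but the conventional way to wave good bots through is a robots.txt allow-list, roughly:

    # Hypothetical allow-list: named crawlers may fetch everything,
    # everyone else is told to stay out.
    User-agent: Googlebot
    Allow: /

    User-agent: *
    Disallow: /

Compliant crawlers honor the Disallow; the scrapers that ignore it are the ones that wander into the tarpit.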
reply
To be honest, who cares about Google Search anymore? It's pretty useless these days.
reply
The small non-profit I volunteer with finds Google ads to be surprisingly effective, and much more cost-effective than FB for what they do, so there's at least some Google search usage in the demographic that they serve.
reply
Tech is just a series of arms races
reply