Alternatively, from the nginx config file for git.ardour.org:

    location ~ commit/* {
        return 404;
    }
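
Worth noting: with `location ~`, the pattern is a PCRE regular expression, not a glob, so `commit/*` actually means the literal text `commit` followed by zero or more `/` characters, matched anywhere in the URI. A more explicitly anchored variant might look like this (a sketch, not taken from the actual ardour config):

    location ~ /commit($|/) {
        return 404;
    }

Both forms end up blocking the commit-view URLs that scrapers hammer, but the anchored regex makes the intent clearer.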
This and another comment gave me an interesting idea: what if we combined this with ssh-based git clients/websites, alongside the normal web access?

Maybe something like https://ssheasy.com/ or similar could be used? Or even a gotty/xterm instance that automatically opens an ssh session and presents a TUI-like interface.

I feel as if this would be enough to keep out most scrapers?

I'm working on something similar: instead of a web-based ssh client, it's a web-based git client UI - you can "checkout" repos, browse commits and trees, and read individual files, with no server-side code at all; git objects are fetched and parsed on the client side. The first target is the dumb-http git protocol, so people can host git repos on static websites, and visitors don't need a local git client to clone the repo just to peek in.

https://bandie91.github.io/dumb-http-git-browser-js-app/ui.h...
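
For context on what client-side parsing involves: under the dumb-http protocol the server just exposes the `.git` directory as static files, so a client fetches refs (e.g. `info/refs`) and loose objects (`objects/xx/yyyy...`) with plain GETs and decodes everything itself. A minimal sketch of the loose-object decoding step in Python (assuming loose, zlib-compressed objects only; a real client also needs packfile support):

```python
import hashlib
import zlib


def parse_loose_object(data: bytes) -> tuple[str, bytes]:
    """Decode a zlib-compressed loose object, as served at
    objects/xx/yyyy... - the stored form is "<type> <size>\\0<body>"."""
    raw = zlib.decompress(data)
    header, _, body = raw.partition(b"\x00")
    obj_type, size = header.split(b" ")
    if int(size) != len(body):
        raise ValueError("size in header does not match body length")
    return obj_type.decode(), body


def object_id(obj_type: str, body: bytes) -> str:
    """Recompute the object id: SHA-1 over the uncompressed stored form."""
    raw = f"{obj_type} {len(body)}".encode() + b"\x00" + body
    return hashlib.sha1(raw).hexdigest()


# Round-trip: build a loose blob the way git stores it, then parse it back.
stored = zlib.compress(b"blob 5\x00hello")
obj_type, body = parse_loose_object(stored)
print(obj_type, body, object_id(obj_type, body))
```

The same decode-and-hash logic ports to browser JavaScript with `DecompressionStream` and `crypto.subtle`, which is presumably what a no-server UI like the one above relies on.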

Lately it seems like every time I start reading through a comment with a bunch of incoherent word and idea salad, I look up and there's your username.