I'm thinking of LazyVim, for example, which has [1]:
```
git clone https://github.com/LazyVim/starter ~/.config/nvim
```
After that, once you do a sync or update, there's a whole lot more cloning going on.

The other projects I was going to mention have apparently all switched away from using git for their package management (Homebrew, Go, Cargo, ...). I can't help but wonder to what extent that might have been influenced by the default slowness of doing a full git clone.
Of course these could all add `--depth 1` to their instructions or internal package-management tooling, and of course both options need to stay available. I'm just pondering aloud that, in my observation, `--depth 1` is the option I want more often than not, but YMMV.
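As a concrete sketch, here's the same LazyVim clone from above done shallowly (`--depth 1` and `--unshallow` are standard git flags; the URL is just the command already quoted):

```
# Fetch only the latest commit instead of the full history
git clone --depth 1 https://github.com/LazyVim/starter ~/.config/nvim

# If you later decide you want the full history after all:
cd ~/.config/nvim && git fetch --unshallow
```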
A) You can update them, because you can `git pull` to fetch changes.
B) If you want to apply patches on top, it's better to have version control so you can keep track of what you changed; that's especially useful if you want to rebase (see the sketch after this list).
C) See A.
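A minimal sketch of the rebase workflow from B, assuming your local patches live on a hypothetical `my-patches` branch and upstream is `origin/main` (both names are illustrative, not from any particular project):

```
# Grab the latest upstream changes
git fetch origin

# Replay your local patches on top of the new upstream tip
git rebase origin/main my-patches
```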
I use OpenBSD, and before that I was on Alpine, Debian, and Arch. If it was software I wanted to try, I downloaded the tarball. If it was something I wanted to keep for longer, I created a port or a custom package.
Juggling multiple directories and tarballs is a pastime from a bygone era. It takes even more commands if you want to reuse the existing directory!
Downloading a tarball, running `./configure` or `make`, editing a config file here or there, then running `make install` is the most common flow. Nowadays I find myself frequently editing the Dockerfile to tailor it to my liking. With a git repo, the owners have already excluded all the local files, build caches, etc., and you can keep pulling to get updates, stashing and reapplying your local changes. With tarballs, you have to figure it all out again every time: you lose your build cache (language dependent, maybe), lose a change you made here or there, and so on.
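A minimal sketch of that pull-and-reapply loop, assuming your local edits are sitting uncommitted in the working tree (all standard git commands):

```
git stash        # shelve your uncommitted local changes
git pull         # fetch and merge upstream updates
git stash pop    # reapply your changes; resolve any conflicts
```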