106 github.com
15 codeberg.org
7 gitlab.com
7 git.codemadness.org
4 bitreich.org
Where does git.mills.io sit in your rank? 🤣
However, since it's so easy to add new ones, I mostly end up with repositories that aren't likely to disappear but still carry a lot of value. For example, 143 MiB on my hard drive for the complete history of FFmpeg is a no-brainer for me.
Portal64 looks interesting; I hadn't heard of it. I might need to get an N64 emulator going.
I wonder if I could push to a Git remote with my current setup. That would be the simplest way to do public distribution *and* remote backups.
Also, Portal 64 kept freezing on me so I played F-Zero X instead.
My goals are a little different.
1. Create and maintain very high-quality archives.
2. Make them resilient against attacks from the inside, including (but not limited to) force-pushing an empty history and maliciously deleting branches on the remote.
3. Minimize resource usage of the local machine and that of the remote, including running
git gc --aggressive and not fetching refs for Dependabot, pull requests, etc.
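For the third goal, here's a rough sketch of what "not fetching refs for Dependabot, pull requests, etc." can look like in practice. The paths (`archive`, `../src`) are hypothetical placeholders, not my actual setup:

```shell
# Sketch: a bare archive that only ever fetches branches and tags.
# 'archive' is the local path and '../src' stands in for the remote URL;
# both are hypothetical names for this example.
git init --bare archive
git -C archive remote add origin ../src
# Narrow the fetch refspecs: a plain `git fetch` will now ask only for
# branches and tags, so refs/pull/* (pull requests on GitHub), Dependabot
# refs, and similar are never requested or downloaded.
git -C archive config --replace-all remote.origin.fetch 'refs/heads/*:refs/heads/*'
git -C archive config --add remote.origin.fetch 'refs/tags/*:refs/tags/*'
```

The key point is that fetching is opt-in per ref namespace: anything not matched by a refspec never crosses the wire.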
However, simple clones are inefficient in their disk usage, and a simple
git fetch
will happily obliterate its history if the remote says so.

My goals are as follows.
1. Create high-quality archives of a large number of repositories and keep them up to date.
2. Make them resilient against attacks from the inside, including (but not limited to) force-pushing an empty history and maliciously deleting branches on the remote.
3. Minimize storage and bandwidth usage, including (but not limited to) running
git gc --aggressive
when cloning, and not fetching unnecessary refs, e.g. those for Dependabot and pull requests.
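To make the second goal concrete, here's a sketch of what one update step could look like. The function name and its argument are my own hypothetical choices; the important detail is that the refspecs have no leading '+', which makes them non-forcing, so git fetch refuses non-fast-forward updates and a force-pushed (rewritten or emptied) history upstream cannot overwrite refs already archived:

```shell
# Sketch of one archive-update step; 'update_archive' and its argument
# are hypothetical names. The archive is assumed to be a bare repository
# with a configured 'origin' remote.
update_archive() {
    # Refspecs without a leading '+' are non-forcing: git fetch rejects
    # non-fast-forward updates, so a force-push on the remote cannot
    # rewrite or delete history we already hold. --no-prune keeps local
    # copies of branches that were deleted upstream.
    git -C "$1" fetch --no-prune origin \
        'refs/heads/*:refs/heads/*' \
        'refs/tags/*:refs/tags/*'
    # Repack as tightly as possible to keep disk usage down.
    git -C "$1" gc --aggressive --quiet
}
```

When the remote tries a non-fast-forward update, fetch reports the ref as rejected and exits nonzero, while the archived ref stays exactly where it was, which is what an archive wants.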
Eventually, I'll make the script public so anyone can easily maintain archives. There's still a lot I want to do before that, though.