Warner Losh wrote in
<CANCZdfrKFTjWnNGcZBTBfZ9e379VwjhNJK3Z8mLJR9dKLOj4SA(a)mail.gmail.com>:
|On Mon, Dec 27, 2021 at 11:44 AM Steffen Nurpmeso <steffen(a)sdaoden.eu>
|wrote:
|> Warner Losh wrote in
|> <CANCZdfoP6bkJCMTD96p=iEH8YP9cq1vX9TfXDASu0egmPYGVfQ(a)mail.gmail.com>:
|>|On Sun, Dec 26, 2021, 2:18 PM Theodore Ts'o <tytso(a)mit.edu> wrote:
|>|> I keep a private git repo on one of my machines, so when I get a HOST
|>|> account, I run a command like this:
|>|>
|>|> % git clone ssh://tytso@example.com/home/tytso/repos/dotfiles .
|>|
|>|I have symlinks to all my files. I also have special hooks that I run per
|>|os and per host to pull in different configs when needed. Though in
|>|recent years I've not needed it much. I used to do a lot for work like
|>|this, but these days work envs are close to my home env, so there is
|> little
|>|point.
|>|
|>|I've been doing this since RCS days across 5 different SCMs... git makes
|>|oopses so rare that the paranoia below seems overkill. Though for other
|>
|> Oh yes, i could not agree more. I never tried bitkeeper ;), but
|> even after eleven years of git (~/calendar (symlink) just told me
...
|> 12/24 Resolve to manage public projects with GIT (2010)
...
|The first years of git were interesting times to be using it. After that
|it's been rock solid, especially relative to all the other tools out there.
"rebase --onto" never really worked for me until it then did, but
it took long. They reversed the output of rev-parse at some time,
i test for version 1.8 for the switch. The garbage collect memory
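A minimal sketch of the kind of version switch i mean, assuming a
POSIX shell; the 1.8 cutoff is the one named above, from memory:

  # pick a code path depending on the installed git version
  gv=$(git version | sed 's/^git version //')
  case "$gv" in
  1.[0-7]*) old_rev_parse=1 ;;   # pre-1.8 output order
  *)        old_rev_parse=0 ;;
  esac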
The memory window limit for garbage collection maybe works by now;
in earlier times gc took longer but required less memory.  I still
see that on the OpenCSW.org cluster, which runs git 2.4, 1.7.10.3
and whatnot, but the plan to build an old git there just for gc i
never put into practice, and whether that would still work today i
do not know either; it is all still plain SHA-1 here, no other hash.
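For the record, the knobs i mean are roughly these; a sketch only,
with purely illustrative values for a memory-starved host, not
recommendations:

  # bound the memory that repack/gc may use for delta search and cache
  git config pack.windowMemory 256m
  git config pack.deltaCacheSize 256m
  git config pack.threads 1      # each extra thread costs more memory
  git gc --aggressive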
I hated that it bailed out on broken network connections; i do not
know whether they ever fixed that.  I am now behind a datagram-based
VPN, and that practically always "heals" the problem for me, luckily.
With a bad internet connection it is a pain when a huge download
breaks after, say, hundreds of megabytes and cannot be resumed from
where it bailed.  On the other hand i never tried to fix it, not
even locally.
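One workaround that should help on such links (a sketch only, not
something i actually run; the URL is made up, and --deepen needs a
newer git than the versions named above):

  # take a shallow history first, then grow it in pieces small enough
  # for a flaky link; repeat the fetch, or --unshallow at the very end
  git clone --depth 1 ssh://user@example.com/repo.git repo
  cd repo
  git fetch --deepen=1000
  git fetch --unshallow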
No breakage here, never. Wow.
| I have something similar to the .local stuff Ted does. In fact, I used to
|use exactly that pattern. However, I've taken to doing that via symlinks
|to the host name (so foo.host with multiple ones symlinked to the
|master if it comes to that). That way I could keep my local changes
|in version control... One too many client machines crashing and losing
|stuff in my past...
Yeah. No. :) (But for one lost backup encryption key that almost
broke me and anyway ate some really good work i was/am proud of.)
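P.S.: for readers of the archive, the per-host symlink layout quoted
above, as i read it; all file names are invented:

  # one config file per host, committed to the dotfiles repository;
  # hosts without local changes are just symlinks to the shared master
  cd ~/dotfiles
  ln -s shellrc.master shellrc.$(hostname -s)
  ln -sf ~/dotfiles/shellrc.$(hostname -s) ~/.shellrc
  git add shellrc.$(hostname -s)
  git commit -m "shell config for $(hostname -s)"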
--steffen
|
|Der Kragenbaer, The moon bear,
|der holt sich munter he cheerfully and one by one
|einen nach dem anderen runter wa.ks himself off
|(By Robert Gernhardt)