lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Mon Jun 10, 2024 7:18 am Post subject: Backups
Hi folks
I've just kind-of finished my first install of Gentoo on real hardware. It's a laptop from 2012, and it's working fine.
But this installation took about four days, because a huge amount was built from source. I suspect that if I hadn't "USE'd away" so many components, to get a smaller system, more of the installation could have been from binaries. But, in the end, the installation required building Firefox, Thunderbird, LibreOffice, GTK, the X server, most of the Xfce4 desktop, and a bunch of other stuff.
I imagine that my part in this process might get less time-consuming, if I learn more about Gentoo. But I'm not sure the 48+ hours of compiler time can be reduced -- not on this laptop, anyway.
Incidentally, this laptop has a 200W PSU and, after two days of constant running at full capacity, the hot air exhausted from its cooling system has softened the varnish on my desk. That's not a problem that Linux has given me in the past.
I presume that other people are in a similar situation -- at least, there are plenty of folks on this forum bragging about how long it takes Firefox to build on their Pentium IIs. So how do we protect ourselves against a catastrophic administrator error?
This isn't a problem with Fedora, because the worst that can happen is that I have to re-install from scratch, which usually takes less than an hour. But rebuilding from scratch with Gentoo isn't really an option, if it's on a computer that you use every day.
Perhaps if you're very familiar with Gentoo, there isn't a problem you can't fix by booting from an external drive and tinkering with things. I'm a long way from being at that point.
What steps do other people take, to make it possible to recover from a total bork without reinstalling?
BR, Lars.
kgdrenefort Guru
Joined: 19 Sep 2023 Posts: 312 Location: Somewhere in the 77
Posted: Mon Jun 10, 2024 7:28 am Post subject:
Hello,
Indeed backups are very important.
Also, the new binhost system is great for cutting build times on hardware like that.
For Firefox, you could use the jumbo-build USE flag to roughly halve compilation time, but it needs more RAM; compiling could crash the system if you haven't paid attention to your -jX value.
If you use a swapfile or swap partition, be aware that by default the swappiness value is 60; I prefer to set it to 1. RAM is of course much quicker than disk, especially if you use an HDD.
Another way could be to add zram and zswap, I guess.
About backups: every few days I run two rsync scripts, one that keeps old files and one that removes whatever is no longer on the source -- a «live» backup.
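Such a two-script scheme might look roughly like this (a sketch; all the paths here are illustrative assumptions, not from the post):

```shell
#!/bin/sh
# Sketch of the two rsync runs described above. SRC, ARCHIVE and MIRROR
# are placeholders -- point them at your own data and backup media.
SRC=/home/
ARCHIVE=/mnt/backup/archive/   # run 1: keeps files deleted from the source
MIRROR=/mnt/backup/mirror/     # run 2: «live» copy, mirrors deletions too

# -a: archive mode (permissions, times, symlinks); -H: preserve hard links
rsync -aH "$SRC" "$ARCHIVE"
rsync -aH --delete "$SRC" "$MIRROR"
```

The archive run only ever accumulates, so it protects against accidental deletions; the mirror run gives an exact current copy.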
I'd also say the best way to avoid breaking your Gentoo is to avoid doing goofy stuff with your system, and to follow the devs' recommendations (which are mostly given via the news system).
You could use firefox-bin or libreoffice-bin for the packages that are too huge for your CPU.
I hope my answer helps you a bit.
Regards,
GASPARD DE RENEFORT Kévin
_________________
Wiki translation, to participate. Custom logos/biz card/website.
lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Mon Jun 10, 2024 8:48 am Post subject:
Hi
kgdrenefort wrote:
You could use firefox-bin or libreoffice-bin for the packages that are too huge for your CPU.
Thanks. The problem is that, so far as I know, I can't use these binaries. The binaries in the repository are built with support for things I'm not installing, like pulseaudio, evolution, gstreamer...
I can understand why the binary repositories only host one version of a specific package, but it seems to me that it's not a minimal version. Again, I can understand why.
But if you're building a minimal system for an under-resourced (by modern standards) machine, I think you have to resign yourself to a whole lot of compilation. And it's exactly that kind of machine that's going to struggle with the compilation.
I don't mind doing that once; what I'm trying to avoid is doing it repeatedly.
BR, Lars.
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54578 Location: 56N 3W
Posted: Mon Jun 10, 2024 10:21 am Post subject:
lars_the_bear,
If you have a more powerful system, a VM is fine: make your own binary packages there, or build in a chroot.
As you describe, flexibility has a price.
There is also distcc, but that's not as useful as it appears at first sight.
_________________
Regards,
NeddySeagoon
Computer users fall into two groups:-
those that do backups
those that have never had a hard drive fail.
kgdrenefort Guru
Joined: 19 Sep 2023 Posts: 312 Location: Somewhere in the 77
Posted: Mon Jun 10, 2024 10:24 am Post subject:
lars_the_bear wrote:
Hi
kgdrenefort wrote:
You could use firefox-bin or libreoffice-bin for the packages that are too huge for your CPU.
Thanks. The problem is that, so far as I know, I can't use these binaries. The binaries in the repository are built with support for things I'm not installing, like pulseaudio, evolution, gstreamer...
I can understand why the binary repositories only host one version of a specific package, but it seems to me that it's not a minimal version. Again, I can understand why.
But if you're building a minimal system for an under-resourced (by modern standards) machine, I think you have to resign yourself to a whole lot of compilation. And it's exactly that kind of machine that's going to struggle with the compilation.
I don't mind doing that once; what I'm trying to avoid is doing it repeatedly.
BR, Lars.
Sure, that can lead to long compilations.
The most infamous time-sinks are:
- Chromium and tools built on it or on WebKit, such as webkit-gtk or qtwebengine. These are really painful.
- Firefox, which takes much less time than Chromium but is still a long build, as is Thunderbird.
- LibreOffice, which is almost as awful as Chromium.
On older systems, Gentoo can be a bit painful. Especially since it's best to update your system at least every week; it can be done less often, but that can make future updates more complicated.
Regards,
GASPARD DE RENEFORT Kévin
lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Mon Jun 10, 2024 1:31 pm Post subject:
Banana wrote:
Do I understand that you want to avoid the long compile times at the time you need a backup, or what is your goal exactly?
My goal is to avoid a four-day downtime, if I carelessly misconfigure something, or delete something that is not easy to reconstruct, and I have to reinstall. My personal and working data is already secure -- it's the long time delay to install Gentoo that's the potential problem. I'm wondering if there's a way to back up key parts of the Gentoo system installation, in order to make it less likely that I will ever have to reinstall from scratch.
I don't worry about this on my Fedora/Ubuntu machines, because reinstalling from scratch only takes a short time, as nothing has to be compiled.
BR, Lars
lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Mon Jun 10, 2024 1:43 pm Post subject:
kgdrenefort wrote:
On older systems, Gentoo can be a bit painful. Especially since it's best to update your system at least every week; it can be done less often, but that can make future updates more complicated.
Sure, but here's the problem: it's the possibility of supporting under-resourced systems (in particular, in due course, ARM boards) that draws me to Gentoo in the first place. And, to be fair, my 2012 laptop absolutely flies, with everything compiled for the CPU and all the cruft removed. I couldn't really run Fedora on this thing -- well, not without going out for a meal while it boots to desktop. But it's a perfectly serviceable computer, for day-to-day work, with Gentoo.
I imagine that the same would be true for a Raspberry Pi, or something of that nature.
"Backing up" a Raspberry Pi system partition is trivially easy: I just take the SD card out, put it in a desktop computer, and 'dd' the entire partition to a file. It's a big file, but it compresses well. I suppose I could do something similar on a desktop system -- I could dd the whole Gentoo system partition onto a file on a removable storage device of some kind.
Seems a lot of hassle, though, which is why I was wondering what other folks have done to avoid the potential problem. Or, in fact, whether there actually is a problem, for other people. If you're in a position to use binaries for everything, there likely isn't even a problem.
BR, Lars.
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54578 Location: 56N 3W
Posted: Mon Jun 10, 2024 1:43 pm Post subject:
lars_the_bear,
Turn on FEATURES="buildpkg" to save binary package copies of everything you build.
You install from the binaries when things go wrong, with either -K or -k.
It's all there once you know what to look for.
You will build a collection like my public Pi4 binhost.
The quickpkg tool can make binary packages from your install too, but it cannot save the original unmodified config files.
This has security implications: do you really want to package your password hashes from /etc/shadow?
As long as you are aware of the risks, you can make an informed choice.
Full disk backup for solid state media is a bad thing at restore time. It backs up and restores all the unallocated space too.
The restore incurs a full media write.
Research a stage4. That's like a stage3, but it includes your install too.
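Putting that suggestion together, a sketch (the FEATURES line belongs in /etc/portage/make.conf; the package atom is only an example):

```shell
# /etc/portage/make.conf -- keep a binary package of everything you build:
FEATURES="buildpkg"

# Later, when something breaks, reinstall from the saved packages in $PKGDIR:
emerge -K app-editors/vim   # -K: binary packages only; abort if none exists
emerge -k @world            # -k: prefer binaries, compile whatever is missing
```

With buildpkg enabled from day one, a borked package can usually be rolled back in seconds instead of recompiled.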
lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Mon Jun 10, 2024 1:50 pm Post subject:
NeddySeagoon wrote:
Turn on FEATURES="buildpkg" to save binary package copies of everything you build.
You install from the binaries when things go wrong, with either -K or -k.
Argh -- why did I not know this last week, before I built everything?
OK, don't answer that. The answer is obvious -- RTFM.
BR, Lars.
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54578 Location: 56N 3W
Posted: Mon Jun 10, 2024 2:15 pm Post subject:
lars_the_bear,
The friendly manual is only useful if you know what you want.
It's not a HOWTO; it's a "this is what you can do", and it will make your eyes glaze over if you try to read it from top to bottom.
Chiitoo Administrator
Joined: 28 Feb 2010 Posts: 2730 Location: Here and Away Again
Posted: Mon Jun 10, 2024 7:36 pm Post subject:
lars_the_bear wrote:
NeddySeagoon wrote:
Turn on FEATURES="buildpkg" to save binary package copies of everything you build.
You install from the binaries when things go wrong, with either -K or -k.
Argh -- why did I not know this last week, before I built everything?
OK, don't answer that. The answer is obvious -- RTFM ;)
BR, Lars.
The 'quickpkg' command can be used to create binary packages from already installed packages. :]
_________________
Kindest of regardses.
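For instance (a sketch; the package atom is illustrative, and the config-file behaviour is the caveat raised earlier in the thread):

```shell
# Build a binary package from the already-installed copy of a package;
# the result lands in $PKGDIR:
quickpkg app-editors/nano

# quickpkg normally leaves protected config files out of the package;
# forcing them in with --include-config=y bundles your /etc files too,
# with the security implications mentioned above:
quickpkg --include-config=y app-editors/nano
```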
pjp Administrator
Joined: 16 Apr 2002 Posts: 20485
Posted: Mon Jun 10, 2024 10:42 pm Post subject:
As with any backup solution, remember to test the recovery process.
That's always good to remember, but in particular I bring it up because of references to quickpkg.
In 2019, quickpkg had some odd behavior (I haven't checked to see if any of the linked bugs have been resolved):
buildpkg, downgrade-backup & unmerge-backup conflict?
_________________
Quis separabit? Quo animo?
kgdrenefort Guru
Joined: 19 Sep 2023 Posts: 312 Location: Somewhere in the 77
Posted: Tue Jun 11, 2024 11:29 am Post subject:
pjp wrote:
As with any backup solution, remember to test the recovery process.
That's always good to remember, but in particular I bring it up because of references to quickpkg.
In 2019, quickpkg had some odd behavior (I haven't checked to see if any of the linked bugs have been resolved):
buildpkg, downgrade-backup & unmerge-backup conflict?
Out of curiosity, how would you recommend testing that?
Remove some non-critical files from the source and then apply the backup?
Regards,
GASPARD DE RENEFORT Kévin
Hu Administrator
Joined: 06 Mar 2007 Posts: 22657
Posted: Tue Jun 11, 2024 12:09 pm Post subject:
I would unpack some or all of the backup to an initially empty temporary area, then check that the restored files have the expected permissions, names, and contents. Deleting the live files is risky, since by definition we do not know yet whether you can get them back from the backup, so if the backup is broken, removing the live files will remove your only copy.
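For a tar-based backup, that check can be scripted along these lines (a sketch; archive and directory names are hypothetical):

```shell
#!/bin/sh
# Restore the archive into an empty scratch directory and compare it
# against the live tree, without ever touching the live files themselves.

verify_backup() {  # verify_backup <backup.tar> <live-dir>
    scratch=$(mktemp -d) || return 1
    tar -xf "$1" -C "$scratch" || { rm -rf "$scratch"; return 1; }
    # -r: recurse; -q: report only which files differ or are missing.
    # (GNU tar's --compare can additionally check modes and ownership.)
    diff -rq "$2" "$scratch"
    status=$?
    rm -rf "$scratch"
    return $status
}
```

A zero exit status means the backup round-trips cleanly; a nonzero one lists exactly which files to worry about.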
kgdrenefort Guru
Joined: 19 Sep 2023 Posts: 312 Location: Somewhere in the 77
Posted: Tue Jun 11, 2024 3:06 pm Post subject:
Hu wrote:
I would unpack some or all of the backup to an initially empty temporary area, then check that the restored files have the expected permissions, names, and contents. Deleting the live files is risky, since by definition we do not know yet whether you can get them back from the backup, so if the backup is broken, removing the live files will remove your only copy.
Very good point (about being sure to retrieve the file).
I think I'll test that inside an nspawn container and mess it up a little; worst case, I break a fake system.
lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Tue Jun 11, 2024 4:47 pm Post subject:
From what I see, running `quickpkg` generates files in /var/cache/binpkgs. How do I install these packages if I need them again? Can I just do
emerge (unknown).tar
or do I have to set up an HTTP server with a specific structure? That's what the documentation alludes to, but I'm not sure whether it's assuming you actually want to distribute the files.
Incidentally, do I need to keep all the .tar files in /var/cache/binpkgs? They currently take up nearly 1 GB.
BR, Lars.
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54578 Location: 56N 3W
Posted: Tue Jun 11, 2024 5:33 pm Post subject:
lars_the_bear,
man emerge:
--usepkg [ y | n ], -k
Tells emerge to use binary packages (from $PKGDIR) if they are available, thus possibly avoiding some time-consuming compiles. This option is useful for CD installs; you can export PKGDIR=/mnt/cdrom/packages and then use this option to have emerge "pull" binary packages from the CD in order to satisfy dependencies. Note this option implies --with-bdeps=n. To include build time dependencies, --with-bdeps=y must be specified explicitly.
--usepkgonly [ y | n ], -K
Tells emerge to only use binary packages (from $PKGDIR). All the binary packages must be available at the time of dependency calculation or emerge will simply abort. Portage does not use ebuild repositories when calculating dependency information so all masking information is ignored. Like -k above, this option implies --with-bdeps=n. To include build time dependencies, --with-bdeps=y must be specified explicitly.
You will need emerge -K =<category/package>-<ver> to force a specific slot or version.
It's -g or -G for remote binary repos.
lars_the_bear Guru
Joined: 05 Jun 2024 Posts: 522
Posted: Tue Jun 11, 2024 7:57 pm Post subject:
Hi
Thanks. I hope I don't screw up so badly that I have to rely on this.
BR, Lars.
figueroa Advocate
Joined: 14 Aug 2005 Posts: 3005 Location: Edge of marsh USA
Posted: Fri Jun 14, 2024 7:23 pm Post subject:
Follow the link (here in these forums) to where I share my well-tested stage4 script: https://forums.gentoo.org/viewtopic-t-1132899-highlight-stage4.html Once there, you can search the page for "stage4" to easily locate the post.
This is part of my regular and regularly tested backup routine. My current stage4 is 6.0 GB, whereas it is usually closer to 5.0 GB, but I currently have three gentoo-sources kernels installed, and I've also reduced zstd compression from 9 to 3.
_________________
Andy Figueroa
hp pavilion hpe h8-1260t/2AB5; spinning rust x3
i7-2600 @ 3.40GHz; 16 gb; Radeon HD 7570
amd64/23.0/split-usr/desktop (stable), OpenRC, -systemd -pulseaudio -uefi
mortonP Tux's lil' helper
Joined: 22 Dec 2015 Posts: 88
Posted: Sun Jun 16, 2024 7:28 am Post subject:
NeddySeagoon wrote:
There is also distcc, but that's not as useful as it appears at first sight.
I ran the full profile rebuild on an Intel 10th-gen quad-core laptop -> ~1300 packages.
Chromium takes the longest, with ~3h.
Firefox: ~30m.
Nodejs: ~6.5m.
distcc very much helps with an old box. Nodejs seems to parallelize the best, but the bottleneck is now the laptop distributing the work, not the worker cores available on the LAN :-)
mortonP Tux's lil' helper
Joined: 22 Dec 2015 Posts: 88
Posted: Sun Jun 16, 2024 7:36 am Post subject: Re: Backups
lars_the_bear wrote:
What steps do other people take, to make it possible to recover from a total bork without reinstalling?
Periodic borg backups/snapshots of the root partition -- differential, compressed backups don't take up much space.
Any borg snapshot can then be mounted, and files recovered if needed.
Saved me several times already when I accidentally mangled important system files and needed the original back.
Whole packages... hmm... fortunately I've never had that problem.
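That routine, sketched as commands (the repository path, mount point, and archive names are assumptions; the flags are standard borg options):

```shell
# One-time repository setup:
borg init --encryption=repokey /mnt/backup/borg-repo

# Periodic snapshot of the root filesystem, deduplicated and compressed:
borg create --compression zstd,3 --one-file-system \
    /mnt/backup/borg-repo::root-{now} /

# Mount any snapshot read-only and copy individual files back out:
borg mount /mnt/backup/borg-repo::root-2024-06-16 /mnt/restore
```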
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54578 Location: 56N 3W
Posted: Sun Jun 16, 2024 8:42 am Post subject:
mortonP,
With whole binary packages, it's possible to extract single files.
The HOWTO is left as an exercise for the reader :)
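One answer to that exercise, under the assumption that the package is a modern two-layer GPKG (an outer tar wrapping an inner image tarball; the member and file names here are illustrative):

```shell
#!/bin/sh
# Stream the inner image tarball out of the outer archive and extract a
# single path from it. An uncompressed inner tar is assumed here; a real
# gpkg's image is usually image.tar.xz or .zst, so add -J or --zstd on
# the inner tar accordingly.

extract_one() {  # extract_one <outer.tar> <inner-member> <path-in-image> <dest-dir>
    tar -xOf "$1" "$2" | tar -xf - -C "$4" "$3"
}
```

Nothing gets installed; the file simply lands in the destination directory, ready to be copied back into place.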
szatox Advocate
Joined: 27 Aug 2013 Posts: 3432
Posted: Sun Jun 16, 2024 10:31 am Post subject:
Yeah, it is possible to pick a single file from a binary package, but why would you even want to do that?
The title of this thread is "backup", yet all you guys have been talking about is speeding up the installation process.
Faster installation is not a backup. It doesn't cover user data, and if you have a good backup, you don't need a separate solution for reinstalling your system.
Instead of figuring out how to build and extract binary packages, and infrastructure for throwing more CPUs at the problem, just copy _everything_ with rsync, serialize it with tar, or feed it to borg. Or pick any of a few dozen other options. They all get the job done, though depending on your particular setup and use case (requirements and constraints, AKA your metric of choice), one will always do it slightly better than another.
_________________
Make Computing Fun Again