mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Fri Sep 13, 2024 1:05 am Post subject: Chromium compile doesn't seem to be using ccache |
Code: | $ ccache -s
Cacheable calls: 2085603 / 3286807 (63.45%)
Hits: 482121 / 2085603 (23.12%)
Direct: 240773 / 482121 (49.94%)
Preprocessed: 241348 / 482121 (50.06%)
Misses: 1603482 / 2085603 (76.88%)
Uncacheable calls: 499694 / 3286807 (15.20%)
Errors: 701510 / 3286807 (21.34%)
Local storage:
Cache size (GB): 30.0 / 30.0 (99.98%)
Cleanups: 384
Hits: 482121 / 2085603 (23.12%)
Misses: 1603482 / 2085603 (76.88%)
$ sudo emerge -1 chromium
[ebuild U ] www-client/chromium-128.0.6613.137 [128.0.6613.119]
Would you like to merge these packages? [Yes/No]
>>> Verifying ebuild manifests
>>> Running pre-merge checks for www-client/chromium-128.0.6613.137
* Checking for at least 4 GiB RAM ... [ ok ]
* Checking for at least 25 GiB disk space at "/var/tmp/build/portage/www-client/chromium-128.0.6613.137/temp" ... [ ok ]
>>> Emerging (1 of 1) www-client/chromium-128.0.6613.137::gentoo
>>> Installing (1 of 1) www-client/chromium-128.0.6613.137::gentoo
>>> Completed (1 of 1) www-client/chromium-128.0.6613.137::gentoo
* Messages for package www-client/chromium-128.0.6613.137:
* PYTHONPATH='/home/user/.local/lib/Python'
* VA-API is disabled by default at runtime. You have to enable it
* by adding --enable-features=VaapiVideoDecoder to CHROMIUM_FLAGS
* in /etc/chromium/default.
$ ccache -s
Cacheable calls: 2085603 / 3326157 (62.70%)
Hits: 482121 / 2085603 (23.12%)
Direct: 240773 / 482121 (49.94%)
Preprocessed: 241348 / 482121 (50.06%)
Misses: 1603482 / 2085603 (76.88%)
Uncacheable calls: 499760 / 3326157 (15.03%)
Errors: 740794 / 3326157 (22.27%)
Local storage:
Cache size (GB): 30.0 / 30.0 (99.98%)
Cleanups: 384
Hits: 482121 / 2085603 (23.12%)
Misses: 1603482 / 2085603 (76.88%)
|
The way I interpret this, with the “cacheable calls” number remaining identical at 2085603 before and after, is that not a single call was “cacheable”, which seems weird. I won't profess to know exactly what all this means, but I assume it means that it somehow can't use ccache? Does this have something to do with the custom build system it uses?
Also, for what it's worth:
Code: |
$ ls -l /usr/lib/ccache/bin/
total 104
lrwxrwxrwx 1 root root 15 Sep 7 04:48 c++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 c99 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 cc -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 clang -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 clang++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 clang++-18 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 clang-18 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 g++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 g++-14 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 gcc -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 gcc-14 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 i686-pc-linux-gnu-clang -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 i686-pc-linux-gnu-clang++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 i686-pc-linux-gnu-clang++-18 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 i686-pc-linux-gnu-clang-18 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-c++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-cc -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-clang -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-clang++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-clang++-18 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-clang-18 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-g++ -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-g++-14 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-gcc -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 x86_64-pc-linux-gnu-gcc-14 -> /usr/bin/ccache
lrwxrwxrwx 1 root root 15 Sep 7 04:48 zvbi-ntsc-cc -> /usr/bin/ccache
|
_________________ execctl --path exec filectl --current-directory list |
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Fri Sep 13, 2024 4:07 pm Post subject: |
In case anyone later stumbles upon this thread: I solved the issue, more or less. “ccache -sv” revealed that the error was “compiler check failed”, so I removed the “compiler_check” configuration option from ccache.conf. This apparently does mean that re-installing the same compiler version will invalidate the cache, but that has never happened to me anyway.
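For later readers, the option in question is ccache's compiler_check. A sketch of the relevant config (the default value is per the ccache manual; the commented-out line is the kind of value that was failing here):

```ini
# /etc/ccache.conf (ccache also reads ~/.config/ccache/ccache.conf)
# Default: hash the compiler's mtime and size; cheap and usually safe:
compiler_check = mtime
# A custom check that runs the compiler itself on every call, which is
# what was erroring out in this thread:
# compiler_check = %compiler% -v
```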
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Fri Sep 13, 2024 5:12 pm Post subject: |
I spent a lot of time on that back when I had a dual core CPU with a maximum limit of 8 GB RAM, which I didn't even have. Chromium doesn't benefit from ccache either way unless you recompile one and the same version.
Best Regards,
Georgi |
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sat Sep 14, 2024 1:56 am Post subject: |
logrusx wrote: | I spent a lot of time on that back when I had a dual core CPU with a maximum limit of 8 GB RAM, which I didn't even have. Chromium doesn't benefit from ccache either way unless you recompile one and the same version.
Best Regards,
Georgi | I remember Chromium benefiting immensely from ccache when it still worked for me, cutting the compilation times from 8 hours down to 2.5 hours:
Code: |
Mon Sep 4 05:40:19 2023 >>> www-client/chromium-116.0.5845.140
merge time: 7 hours, 54 minutes and 13 seconds.
Wed Sep 13 09:39:21 2023 >>> www-client/chromium-116.0.5845.187
merge time: 2 hours, 19 minutes and 58 seconds.
Wed Oct 4 07:36:08 2023 >>> www-client/chromium-117.0.5938.132
merge time: 7 hours, 53 minutes and 23 seconds.
Fri Oct 13 05:29:07 2023 >>> www-client/chromium-117.0.5938.149
merge time: 2 hours, 47 minutes and 43 seconds.
|
I read a lot of things on the internet about ccache supposedly not doing much and only slowing things down. None of those posts gave any numbers, only theoretical reasons, such as that cache misses slow things down, which should be true, but that overhead should also be completely trivial compared to cache hits, because hashing is incredibly cheap compared to actual compilation. The numbers on my system at least speak for themselves. I might do an actual test later with a smaller package than Chromium, which is of course immense, but the numbers above, with a compiler update happening twice in between, are hard to argue with. In fact, ccache makes things so much faster that one could argue it's worth not updating compilers for a considerable time merely for the sheer compilation speed benefit it brings.
Anyway, I found the ultimate cause now. The compiler_check was set to “%compiler% -v”, but on clang that doesn't return the version like it does on GCC; it's the verbose output flag, and it errors out when not given anything to compile, of course. I wonder when that change happened. I honestly had been thinking for a while that Chromium had actually grown so ridiculously big that it took 9 hours even with ccache, not realizing that it wasn't using ccache at all but still had to do a compiler check for every file, which only slowed things down.
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Sat Sep 14, 2024 3:23 am Post subject: |
I haven't compiled chromium for a long time; it still used GCC when I did. Back then it was written in a way that ccache wasn't doing much. It used to take around two and a half hours to compile, but with ccache it was taking maybe 15 minutes more when the same version was recompiled, even though it had very high hit rates.
I then switched to ungoogled-chromium-bin, but ultimately I switched to Firefox.
Best Regards,
Georgi |
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sat Sep 14, 2024 4:56 am Post subject: |
logrusx wrote: | I haven't compiled chromium for a long time; it still used GCC when I did. Back then it was written in a way that ccache wasn't doing much. It used to take around two and a half hours to compile, but with ccache it was taking maybe 15 minutes more when the same version was recompiled, even though it had very high hit rates. | How does this work? Surely the number of cache hits is the only thing that matters for time reduction?
We'll see what difference it brings with the next update though.
And yes, I still remember when Chromium took a mere two hours without ccache. I really wonder what happened. I've been considering switching to Firefox purely for the compilation speed advantage, but from what I read it's less snappy, and I would need to redo all my settings. The Chromium distfiles are also 7 GiB by now.
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Sat Sep 14, 2024 6:59 am Post subject: |
mi_unixbird wrote: | logrusx wrote: | I haven't compiled chromium for a long time; it still used GCC when I did. Back then it was written in a way that ccache wasn't doing much. It used to take around two and a half hours to compile, but with ccache it was taking maybe 15 minutes more when the same version was recompiled, even though it had very high hit rates. | How does this work? Surely the number of cache hits is the only thing that matters for time reduction? |
Disk IO. Waiting for an SSD is still an endless wait for the CPU. Ironically enough, it was faster to compile the files than to look them up from the on-disk cache.
mi_unixbird wrote: | And yes, I still remember when Chromium took a mere two hours without ccache. |
Don't get me wrong: it took 2 and a half hours on a Ryzen 7 5800H with 32 GB RAM in a cold room, 3 and a half years ago. On my old dual-core computer it took more like 7 hours with the jumbo build back when they still supported it, and maybe 40 after they stopped supporting it more than 4 years ago. I used to use ccache as a means of resuming the build process, because the build was fragile back then and kept breaking when resumed with the keepwork feature enabled. This way I could compile for several consecutive nights while still being able to use my computer through the day. Later I discovered that a fix, not guaranteed to work, was to delete .setuped in the temp directory, so that portage was forced to recreate the environment and successfully resume. I think they fixed that later too.
I would test it now to see how it works, but it wants to recompile nodejs as well, and I don't want to go down that route and compile for at least 6-7 hours just to test.
Best Regards,
Georgi |
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sat Sep 14, 2024 8:57 am Post subject: |
logrusx wrote: | mi_unixbird wrote: | logrusx wrote: | I haven't compiled chromium for a long time, it still used GCC when I did. Back then it was written in a way ccache wasn't doing much. It used to take around two and a half hours to compile, but with ccache it was taking maybe 15 minutes more when the same version is recompiled and it had very high hit rates. | How does this work? Surely the number of cache hits is really the only thing that matters for time reduction? |
Disk IO. Waiting for SSD is still an endless wait for the CPU. Ironically enough, it was faster to compile the files than looking them up from the cache on the disk. |
This doesn't make any sense to me. New compilations have to write the files to the drive somehow, which is far slower, while ccache only compares hashes of the compiler, the compilation options, and the inputs stored for each cached file. I would also assume that when it copies files, it's smart enough to use reflink=auto, which is practically free on a copy-on-write filesystem.
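To make the cost argument concrete, a cache lookup is essentially a hash comparison. A heavily simplified toy model of a ccache key (the real key also covers the compiler identity, all included headers, and more):

```shell
# Hypothetical sketch: a cache key as a digest of source plus options.
# Recomputing it on a hit costs one hash, not one compilation.
src='int main(void){return 0;}'
opts='-O2 -pipe'
key=$(printf '%s|%s' "$src" "$opts" | sha256sum | cut -d' ' -f1)
echo "$key"   # identical inputs always reproduce this key; any change misses
```

Hashing a few kilobytes like this takes microseconds, which is why misses add only a small constant on top of the compile they fail to avoid.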
Quote: | Don't get me wrong: it took 2 and a half hours on a Ryzen 7 5800H with 32 GB RAM in a cold room, 3 and a half years ago. On my old dual-core computer it took more like 7 hours with the jumbo build back when they still supported it, and maybe 40 after they stopped supporting it more than 4 years ago. I used to use ccache as a means of resuming the build process, because the build was fragile back then and kept breaking when resumed with the keepwork feature enabled. This way I could compile for several consecutive nights while still being able to use my computer through the day. Later I discovered that a fix, not guaranteed to work, was to delete .setuped in the temp directory, so that portage was forced to recreate the environment and successfully resume. I think they fixed that later too. | Two hours for me was when I first installed it, which was about a decade ago I'd say. On my machine now it's about 8-9 hours without the cache, while still suffering the penalty of it at least trying. With the next update I will know how much time ccache cuts again, but the numbers from even early this year look very promising, with 2.5-hour compile times being the norm as opposed to the 8 hours they take now.
Chiitoo Administrator
Joined: 28 Feb 2010 Posts: 2720 Location: Here and Away Again
Posted: Sat Sep 14, 2024 12:44 pm Post subject: |
I believe the last version of Chromium to have USE="jumbo-build" was 78.0.3904.108, which was removed back in 2019 [1].
That flag, when enabled, would usually cut the build time in half, while using more RAM.
Aside from that, I think Chromium just got bigger.
(Qt still maintains 'jumbo-build' for their fork of Chromium, so one can still somewhat test the difference using that.)
Code: | $ genlop -t chromium
* www-client/chromium
Thu May 8 01:39:04 2014 >>> www-client/chromium-35.0.1916.86
merge time: 42 minutes and 38 seconds.
Mon Mar 16 09:00:10 2015 >>> www-client/chromium-41.0.2272.76
merge time: 1 hour, 27 minutes and 29 seconds.
Sat Jan 28 02:31:53 2023 >>> www-client/chromium-110.0.5481.38
merge time: 6 hours, 44 minutes and 37 seconds.
Sat May 6 16:19:52 2023 >>> www-client/chromium-113.0.5672.63
merge time: 1 hour, 22 minutes and 12 seconds.
Mon May 15 10:20:34 2023 >>> www-client/chromium-113.0.5672.63
merge time: 11 seconds.
Wed May 17 09:34:44 2023 >>> www-client/chromium-113.0.5672.92
merge time: 1 hour, 33 minutes and 59 seconds.
Fri May 26 10:58:24 2023 >>> www-client/chromium-113.0.5672.126
merge time: 1 hour, 39 minutes and 19 seconds.
Sat May 27 14:49:57 2023 >>> www-client/chromium-113.0.5672.126
merge time: 8 minutes and 24 seconds.
Sat May 27 21:07:17 2023 >>> www-client/chromium-113.0.5672.126
merge time: 1 hour, 19 minutes and 29 seconds.
Sun Jun 11 22:36:50 2023 >>> www-client/chromium-114.0.5735.110
merge time: 1 hour, 41 minutes and 40 seconds. |
The builds before 2023 used an AMD Phenom II X6 1090T, while the later builds used an AMD Ryzen 9 7950X (don't mind the 6-hour one; something special was definitely going on there).
I never really tested it with Chromium, but perhaps 'ccache' is still useful when moving between versions such as 116.0.5845.140 and 116.0.5845.187; I've definitely seen, though, that it has never been of much use, if any, between major versions.
1. https://gitweb.gentoo.org/repo/gentoo.git/commit/www-client/chromium?id=0cd82be6bd5b1b1de7a3a830dc5ef3517e346f15 _________________ Kindest of regardses. |
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Sat Sep 14, 2024 1:23 pm Post subject: |
Chiitoo wrote: |
I never really tested it with Chromium, but perhaps 'ccache' is still useful when moving between versions such as 116.0.5845.140 and 116.0.5845.187, but yeah, definitely seen that it has never been much, if any use between major versions.
|
It hasn't been much use even with minor versions. They used to make changes that would result in untouched files giving a completely different result after being preprocessed, so ccache was just a waste. I hesitate to argue it still is, because although I don't think anything has changed in the development of Chromium, I have seen lucky runs in the past which decreased the time significantly but were one-offs. Who knows, they may really have improved their development process.
Best Regards,
Georgi |
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sun Sep 15, 2024 10:30 am Post subject: |
Well, we'll see what changes with the next compile, but those three-hour build times of Chromium that I still had earlier this year, before ccache failed, certainly look attractive. We'll see if it goes back to that, but it's hard to argue with theory against the statistics: in my genlop -t logs of chromium there are 3-hour build times [ccache succeeded], 8-hour build times [ccache didn't succeed, due to say a compiler update or the recent problem], and pretty much nothing in between.
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sat Sep 21, 2024 11:53 am Post subject: |
First cached update:
Code: |
Wed Sep 18 08:50:15 2024 >>> www-client/chromium-129.0.6668.42
merge time: 10 hours, 29 minutes and 55 seconds.
Sat Sep 21 13:32:19 2024 >>> www-client/chromium-129.0.6668.58
merge time: 25 minutes and 58 seconds.
|
I would assume this was a very good case though, since the version differences are very minor. I assume a major version update won't be quite this spectacular since, as said, I remember 3-hour builds, and I assume changing major dependencies in between also ruins quite a bit, but at the very least the “best case” is absolutely spectacular.
In the meantime I looked at some statistics for ccache usage on big projects, and they diverge from my 100% cache-miss case: a first population increases compile time to 101%-115% depending on the mode used, while 100% cache hits, as in a direct rebuild, reduce it to 0.5%-5% for most big projects. In general, the more preprocessor-intensive a project is, the lower the benefits of ccache, but it seems to me that even with 10% hits it should speed things up on average.
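Those published ratios give a quick back-of-envelope model for when ccache pays off. The 2% hit cost and 103% miss cost below are assumed round figures inside the ranges mentioned above, not measurements:

```shell
# Estimated time = full * (hit_ratio * hit_cost + miss_ratio * miss_cost)
awk 'BEGIN {
  full = 600              # minutes for a cold build (illustrative)
  hit  = 0.10             # assumed 10% hit ratio
  est  = full * (hit * 0.02 + (1 - hit) * 1.03)
  printf "%.0f minutes\n", est
}'
```

Under those assumptions even a 10% hit ratio shaves roughly 7% off, and the break-even point sits around a 3% hit ratio.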
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Fri Sep 27, 2024 12:18 pm Post subject: |
I still can't believe the chromium build makes good use of ccache. There's a new 129 version available, and also 130. I'm curious what will happen with one of those.
Best Regards,
Georgi |
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sun Sep 29, 2024 5:55 pm Post subject: |
logrusx wrote: | I still can't believe the chromium build makes good use of ccache. There's a new 129 version available, and also 130. I'm curious what will happen with one of those.
Best Regards,
Georgi | I just had it updated. This one was disastrous:
Code: | Sun Sep 29 19:32:00 2024 >>> www-client/chromium-129.0.6668.70
merge time: 9 hours, 42 minutes and 35 seconds.
|
Code: | Cacheable calls: 39428 / 39474 (99.88%)
Hits: 1962 / 39428 ( 4.98%)
Direct: 1962 / 1962 (100.0%)
Preprocessed: 0 / 1962 ( 0.00%)
Misses: 37466 / 39428 (95.02%)
Uncacheable calls: 46 / 39474 ( 0.12%)
|
I actually zeroed the statistics before compiling this time, and the cache hit ratio was abysmal. I don't think a compiler update happened, but clearly something happened that made the ratio this low. Maybe it has something to do with my switching to depend mode in the meantime.
It's still slightly faster than when all calls were uncacheable, which was around 10-10:30 hours, but it's not a pretty picture. We'll see what the next update brings. I have been mucking around a bit, tuning some things, and switched to a per-package cache, originally just forked off the main cache by making a reflink copy of it, so maybe that did something. Come to think of it, I didn't use “cp -a” to do this; perhaps ccache really doesn't like it when file creation dates are updated. I definitely should have done that.
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Fri Oct 04, 2024 10:58 am Post subject: |
After getting a similarly abysmal result I did some research, and it turns out that depend mode is entirely useless for upgrades: it by necessity needs to use the file path in the hash, not simply the content of the file, and since portage includes the version in the file path in the build dir, it is of course entirely useless. I read somewhere that depend mode was generally faster, but it seems that only applies to rebuilding the same identical thing twice.
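The effect is easy to see: with the absolute source path feeding the key, two releases of the same untouched file can never collide. The paths below are illustrative portage build-dir paths, not taken from any real cache:

```shell
# Same hypothetical file content, two versioned build directories.
old=/var/tmp/portage/www-client/chromium-129.0.6668.42/work/src/foo.cc
new=/var/tmp/portage/www-client/chromium-129.0.6668.58/work/src/foo.cc
# The version-bearing path is part of the hashed input, so the keys
# differ and every lookup across an upgrade is a guaranteed miss:
printf '%s' "$old" | sha256sum | cut -d' ' -f1
printf '%s' "$new" | sha256sum | cut -d' ' -f1
```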
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Fri Oct 04, 2024 4:56 pm Post subject: |
mi_unixbird wrote: | I did some research and it turns out that depend mode is entirely useless for upgrades because it by necessity needs to use the file path in the hash, not simply the content of the file, and since portage includes the version in the file path in the build dir it's of course entirely useless. |
Good job. However, the common hash information includes the path for all modes, so they are all useless.
Common hashed information wrote: |
The following information is always included in the hash:
the extension used by the compiler for a file with preprocessor output (normally .i for C code and .ii for C++ code)
the compiler’s size and modification time (or other compiler-specific information specified by compiler_check)
the name of the compiler
the current directory (if hash_dir is enabled)
|
So you may try with hash_dir disabled. If that works, you will have given hundreds, if not thousands, of Gentoo users new hope for compiling Chromium and its derivatives with ccache.
Alternatively, you could find a way to cut the part of the path that contains the version information, but I think that is not trivial and may need support in the ebuild itself.
Best Regards,
Georgi |
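In ccache.conf syntax, the suggestion amounts to the following one-liner (a sketch; hash_dir is a documented ccache option, and whether it helps for Chromium is exactly what is being debated here):

```ini
# /etc/ccache.conf
hash_dir = false   # leave the current working directory out of the hash
```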
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Fri Oct 04, 2024 9:35 pm Post subject: |
logrusx wrote: | mi_unixbird wrote: | I did some research and it turns out that depend mode is entirely useless for upgrades because it by necessity needs to use the file path in the hash, not simply the content of the file, and since portage includes the version in the file path in the build dir it's of course entirely useless. |
Good job. However, the common hash information includes the path for all modes, so they are all useless.
Common hashed information wrote: |
The following information is always included in the hash:
the extension used by the compiler for a file with preprocessor output (normally .i for C code and .ii for C++ code)
the compiler’s size and modification time (or other compiler-specific information specified by compiler_check)
the name of the compiler
the current directory (if hash_dir is enabled)
|
So you may try with hash_dir disabled. If that works, you will have given hundreds, if not thousands, of Gentoo users new hope for compiling Chromium and its derivatives with ccache.
Alternatively, you could find a way to cut the part of the path that contains the version information, but I think that is not trivial and may need support in the ebuild itself.
Best Regards,
Georgi | I already had hash_dir disabled, of course.
It occurs to me that the people who had bad results, though they said it was despite getting high hit ratios, might actually have had it enabled. I thought it was common knowledge to disable it, and I honestly don't understand why it's enabled by default. I would obviously not be getting those 3-hour and even 30-minute compile times if I had it enabled.
In fact, reading on about how it works, it might actually be worthwhile to completely disable direct mode so it doesn't even try for updates, since it will almost always fail anyway, and have it always run in preprocessor mode. I'm not sure what the reason for including the current working directory is, but the current file names are used because a file may contain a “__FILE__” macro and would thus give a different result with a different filename; perhaps this is also the reason the current working directory is included, for relative filenames, but I don't see how that would still be relevant for files after preprocessing.
Quote: | Or if you find a way to cut that part of the path that contains the version information, but that I think is not trivial and may need support in the ebuild itself. |
Honestly, this made me realize that the preprocessed output for some files will still be different, as it will contain debug information with the full path they are located in. I wonder if one could set something up with namespaces to make everything that compiles believe its workdir is actually just under the same default directory while it isn't. This seems like a useful feature for more than one reason.
But even then many things will still contain version numbers, I suppose, which will be different in the debug output. Also, I have distcc set up to rewrite all paths to relative, but even the relative paths often still contain version numbers, which I suppose is needed to provide accurate debugging.
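The idea of skipping direct mode entirely maps to a documented ccache.conf switch (a sketch; preprocessor mode then handles every call, at the cost of running the preprocessor even on hits):

```ini
# /etc/ccache.conf
direct_mode = false   # skip the path-sensitive manifest lookup and
                      # always hash the preprocessed output instead
```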
mi_unixbird Tux's lil' helper
Joined: 24 Jul 2015 Posts: 130
Posted: Sun Oct 06, 2024 10:34 pm Post subject: |
After some further digging and hacking I found that ccache is heavily influenced by the line-marker metadata emitted by the preprocessor, which, depending on the build system, may include a version number in the path. I now use a compiler wrapper which passes “-P” to the preprocessor to strip this. This led mesa, a package I hadn't used ccache on because hits were abysmal, as in the 5% range, to actually reach over a 90% hit ratio on an upgrade, cutting build time down from 9 minutes to 1.
As in: “prefix_command_cpp=/usr/local/libexec/ccache_pp_wrapper”, with the wrapper simply executing “exec "$@" -P” and nothing more. It seems this change lets a lot of packages which previously didn't work well across upgrades compile with a high hit ratio.
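For anyone wanting to replicate this, the wrapper described above can be sketched like so. The path and filename come from the post (with /tmp standing in here so the sketch is runnable anywhere); -P is the standard GCC/Clang preprocessor flag that inhibits linemarker generation:

```shell
# Create the wrapper that ccache invokes via prefix_command_cpp: ccache
# passes the whole preprocessor command line as "$@", and the wrapper
# re-executes it with -P appended, so versioned build paths never reach
# the preprocessed output that gets hashed.
mkdir -p /tmp/libexec            # stand-in for /usr/local/libexec
cat > /tmp/libexec/ccache_pp_wrapper <<'EOF'
#!/bin/sh
exec "$@" -P
EOF
chmod +x /tmp/libexec/ccache_pp_wrapper
```

Then point ccache at it with prefix_command_cpp = /usr/local/libexec/ccache_pp_wrapper in ccache.conf, as described in the post.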
logrusx Advocate
Joined: 22 Feb 2018 Posts: 2380
Posted: Sat Nov 02, 2024 7:00 pm Post subject: |
Any news about that?
Best Regards,
Georgi |