I used Gentoo for 3 years. In hindsight I wasted so many CPU cycles just because I thought -march=native would make things faster.
nope.
you know what made things faster? switching to arch 😂
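For anyone who never ran Gentoo: the flag in question is set via CFLAGS in /etc/portage/make.conf. A minimal sketch (the exact flag values and job count here are illustrative, not a recommendation):

```shell
# /etc/portage/make.conf — illustrative snippet
# -march=native tells GCC to target the build machine's own CPU
COMMON_FLAGS="-march=native -O2 -pipe"
CFLAGS="${COMMON_FLAGS}"
CXXFLAGS="${COMMON_FLAGS}"
# Parallel build jobs; adjust to your core count
MAKEOPTS="-j8"
```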
You know what was even faster? Switching to something easier like Fedora, Linux Mint, or Debian.
When CPUs were a lot slower you could genuinely get noticeable performance improvements by compiling packages yourself, but nowadays the overhead from running pre-compiled binaries is negligible.
Hell, even Gentoo optionally offers binary packages now.
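For the curious: opting into Gentoo's binary packages is roughly the following. This is a sketch — the binhost URL is a placeholder, and the right one depends on your architecture and profile (check the Gentoo binary package guide):

```shell
# /etc/portage/binrepos.conf/gentoobinhost.conf — sketch; URL is a placeholder
[binhost]
priority = 9999
sync-uri = https://example.org/gentoo/binpackages/amd64

# Then tell emerge to prefer prebuilt packages where available:
# emerge --getbinpkg --update --deep @world
```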
@atmur i had no clue they were doing that, very interesting
Yes, I tried it around 2002/2003, back when the recommended way was from stage1. I think I had a P4 with HT. It was noticeably faster than Red Hat or Mandrake (yes, I was distro hopping a lot). Emerging gnome-mono was an overnight run; OpenOffice took about 24 hours.
Lots of wasted time, but I did learn how to set up some things manually.
Once there was a bug with the dependencies of transcode and some other package (mplayer, I think): it would ask to downgrade one and upgrade the other. Then, several hours of compiling later, it would agree to upgrade both. Then, several more hours of compiling later, it would want to downgrade one again.
I think there was a groove worn in my hard drive from this
Oh yeah, I remember those. My solution was to not emerge anything for 24 hours; by the next day they had usually fixed the issue.
So even after 24h of compiling you're not done! You need to dispatch-conf through so many config files…
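For the uninitiated, that post-emerge config dance looks roughly like this (both tools are part of Portage; dispatch-conf walks you through an interactive prompt per changed file):

```shell
# Review and merge pending /etc config-file updates after an emerge run
dispatch-conf
# or the older, simpler tool:
etc-update
```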
Most of the reason to build your own packages is a form of runtime assurance: knowing that what your computer is running is 100% what you intend.
At least as a guix user that’s what I tell myself.
Compiling your own packages only ensures that, well, you’re running packages that you compiled. This definitely does not mean that your computer is running what you intend at all.
Half the time I don’t know what my CPU is executing, and that’s code that I wrote myself.
This is true of all programming
I like to imagine that the early heroes who programmed with punch cards and basically raw machine code knew exactly what the computer was running, but who knows…
I did jump onto the Gentoo ship chasing performance, but I stayed because of USE flags.
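For readers who haven't used Gentoo: USE flags toggle optional features at build time, set globally in make.conf or per-package under package.use. A sketch with made-up flag choices:

```shell
# /etc/portage/make.conf — global USE flags (illustrative choices)
# A leading "-" disables a feature everywhere
USE="wayland pipewire -systemd -gnome"

# /etc/portage/package.use/mpv — per-package override (example)
# media-video/mpv vulkan -alsa
```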