On Linux, most programs are available pre-compiled through your package manager. However, if you want a package that isn't already available for your specific family of distros, you usually have to compile it yourself. It's not that hard to do, but it does take a while to install and again on every update after.
Flatpak (as well as Snap and AppImage) fixes this by being more universal: a Flatpak program can be installed on any distro that Flatpak is available for.
There are downsides to Flatpaks though, and not everyone likes them. But if you want an app that's not available through conventional means and you don't want to use Flatpak/Snap/etc., compiling from source may be the only choice left.
There are a few advantages:
you can use the latest and greatest upstream versions (even versions under active development)
you can let the compiler optimize for your system (especially useful with PGO, but honestly only worth it for a few applications, e.g. video encoders; a rough sketch of a PGO build follows below)
you have full flexibility over optional features
And it's a good learning experience; sooner or later things won't just work, and you'll have to learn about compilers, linkers, various build systems, scripting languages, etc.
But yeah, for most people it’s not worth the effort
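To make the optimization point concrete, here is a rough sketch of what a PGO build looks like with GCC. The flags (-O2, -march=native, -fprofile-generate, -fprofile-use) are the standard GCC ones; the file name and the hot loop are made up just to have something to profile.

```c
/* hot.c - a made-up hot loop, just to have something to profile.
 *
 * A typical GCC PGO build (flags are real, the program is hypothetical):
 *   gcc -O2 -march=native -fprofile-generate hot.c -o hot
 *   ./hot                                  # training run, writes *.gcda data
 *   gcc -O2 -march=native -fprofile-use hot.c -o hot
 */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t sum = 0;
    /* The recorded profile tells the compiler this loop is hot and the
     * branch is almost always taken, so it can tune layout, inlining
     * and vectorization for the case that actually happens. */
    for (uint64_t i = 0; i < 100000000ULL; i++) {
        if (i % 1000 != 0)   /* taken ~99.9% of the time */
            sum += i;
    }
    printf("%llu\n", (unsigned long long)sum);
    return 0;
}
```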
Only 2 things I can think of (off the top of my head):
You can be sure that the application wasn't tampered with (and does exactly what the source defines) without having to rely on signing/encryption by any other parties.
You might be able to increase performance or just ensure correct functionality by tweaking some things (compile arguments, settings). This might be useful if you have special hardware or a system that isn't supported otherwise.
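As a rough illustration of the special-hardware point, here is a minimal, hypothetical example: building on your own machine with -march=native lets the compiler target exactly the CPU you have. The __AVX2__ macro and the -march=native flag are the real GCC/Clang ones; the program itself is made up.

```c
/* simd_check.c - hypothetical example of a hardware-specific code path.
 *
 * Build it yourself with:
 *   gcc -O2 -march=native simd_check.c -o simd_check
 * and the compiler defines macros like __AVX2__ only if your CPU
 * supports those instructions; a generic distro binary is usually
 * built for a baseline target and never enables this path.
 */
#include <stdio.h>

int main(void)
{
#ifdef __AVX2__
    puts("compiled with AVX2 code paths enabled");
#else
    puts("compiled for a generic baseline CPU");
#endif
    return 0;
}
```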
A set of installed programs forms a web of packages and their dependencies; what you want to install imposes constraints on which versions of each package and their shared dependencies can all be linked together.
This means the combination of programs you want to install has to be taken into account when everything is compiled. If you rely on the distribution to do the compilation and send you the binaries, they have to have accounted for that combination, or they're just hoping everything works.
This is why on Debian it is advisable not to mix stable and testing repos, and on Arch you keep up or get left behind. On Gentoo you are limited only by the compatibility of the individual packages themselves, not of specific compiled binaries, and nearly everyone mixes stable with testing or git head all the time. Plus some people like to procrastinate…
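As a contrived sketch of why a binary compiled against one version of a library can't just be dropped into a system with a different version, imagine a struct gaining a field between two hypothetical releases of libfoo; every name here is made up, only the general ABI problem is real.

```c
/* abi_break.c - contrived illustration of an ABI break (all names made up).
 *
 * Imagine libfoo v1 declared:   struct foo { int id; };
 * and libfoo v2 changed it to:  struct foo { int flags; int id; };
 *
 * A binary compiled against the v1 header still reads "id" at offset 0,
 * where "flags" now lives, so it silently gets the wrong value.
 * Recompiling against the installed headers (what a source-based distro
 * always does) is what keeps the whole web of packages consistent.
 */
#include <stdio.h>
#include <string.h>

/* the v2 layout, as the installed library would define it */
struct foo_v2 { int flags; int id; };

int main(void)
{
    struct foo_v2 f = { .flags = 7, .id = 42 };

    /* what a v1-compiled binary effectively does: read "id" at offset 0 */
    int stale_id;
    memcpy(&stale_id, (const char *)&f, sizeof stale_id);

    printf("library's id: %d, what the old binary would read: %d\n",
           f.id, stale_id);
    return 0;
}
```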
Omg that sounds awful. Also, isn't compiling it a waste of time? What benefits does it get you? I'm confused, not smart.
That’s kinda why I love Nix. Everything builds no problem, and most of the time it’s already cached so you don’t even have to build it.
The other day I wanted to patch Picom; it was as simple as creating an overlay to change the source files. Seriously, it's all so freaking easy.
Many programs have compile-time options, and the pre-built binaries may not have them enabled.
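As a rough sketch of what a compile-time option looks like under the hood (the program and the ENABLE_EXTRAS macro are made up; -D is the standard C compiler flag for defining one):

```c
/* options.c - made-up example of a compile-time option.
 *
 * A distro might ship it built without the extra feature:
 *   gcc -O2 options.c -o options
 * while building from source lets you turn it on:
 *   gcc -O2 -DENABLE_EXTRAS options.c -o options
 * Real projects expose the same idea through ./configure --enable-...,
 * meson or cmake options, which end up as macros like this one.
 */
#include <stdio.h>

int main(void)
{
#ifdef ENABLE_EXTRAS
    puts("built with the optional feature compiled in");
#else
    puts("built without the optional feature");
#endif
    return 0;
}
```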
The only benefit is that sometimes the generic compiled code may not work as intended; that is the only time I compile from source!