The reality is that reliable backports of security fixes are expensive (partly because backports are hard in general). The older a distribution version is, the more work is generally required. To generalize somewhat, this work does not get done for free; someone has to pay for it.

People using Linux distributions have for years been in the fortunate position that companies with money were willing to fund a lot of painstaking work and then make the result available for free. One of the artifacts of this was free distributions with long support periods. My view is that this supply of corporate money is in the process of drying up, and with it will go that free long term support. This won't be a pleasant process.

[–] lemmyng@beehaw.org 7 points 1 year ago (2 children)

The rationale for using LTS distros is being eroded by the widespread adoption of containers and approaches like Flatpak and Nix. Applications and services are becoming less dependent on any single distro; instead they just require a skeleton core system that is easier to keep up to date. Coupled with the increased cost of maintaining security backports, we are getting to a point where it's less risky for companies to run bleeding edge than stable.
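To make that decoupling concrete, here is a minimal sketch, assuming Docker and Python 3.10+ on the host; the image name is an illustrative choice, not something from the comment. The host's distro is irrelevant to the application, because the application pins its own userland:

```python
# Sketch: host distro and application userland are independent.
# Assumes Docker is installed; platform.freedesktop_os_release needs Python 3.10+.
import platform
import subprocess

# Whatever the host runs (bleeding edge or LTS) is invisible to the app.
host = platform.freedesktop_os_release().get("PRETTY_NAME", "unknown")

# The app pins its own userland inside the image (debian:12-slim is
# just an illustrative choice).
container = subprocess.run(
    ["docker", "run", "--rm", "debian:12-slim", "cat", "/etc/os-release"],
    capture_output=True, text=True, check=True,
).stdout

print("host userland:     ", host)
print("container userland:", container.splitlines()[0])
```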

[–] tetha@feddit.de 6 points 1 year ago

And that skeleton of a system becomes easier to test.

I don't need to test ~80-100 different in-house applications on however many different versions of Java, Python, .NET and so on.

Instead I end up with 12 different classes of systems. My integration tests on a build server can check these thoroughly every night against multiple versions of the OS, and if the integration tests are green, I can be 95-99% sure things will work right. The dev and testing environments will catch the rest if something wonky is going on with Docker and new kernels.
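A rough sketch of what such a nightly matrix can look like: one integration suite executed inside several OS userlands via containers. The registry, image names, suite path, and use of pytest are all assumptions for illustration, not details from the comment:

```python
# Sketch of a nightly OS matrix: one integration suite, several userlands.
# Assumes Docker on the build server and pre-built test images that already
# contain the suite's dependencies; every name here is hypothetical.
import subprocess

OS_MATRIX = [
    "registry.example/it-base:debian-11",
    "registry.example/it-base:debian-12",
    "registry.example/it-base:ubuntu-22.04",
]
TEST_CMD = ["python3", "-m", "pytest", "-q", "/suite"]

results = {}
for image in OS_MATRIX:
    # Mount the suite read-only and execute it inside the target userland.
    proc = subprocess.run(
        ["docker", "run", "--rm", "-v", "./integration:/suite:ro",
         image, *TEST_CMD],
        capture_output=True, text=True,
    )
    results[image] = "green" if proc.returncode == 0 else "red"

for image, status in results.items():
    print(f"{image}: {status}")
```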

[–] lemmyvore@feddit.nl 5 points 1 year ago (1 children)

widespread adoption of containers and approaches like Flatpak and Nix

And it's about flippin time. Despite predating app stores by decades, the Linux package systems have been surprisingly conservative in their approach.

The outdated, hardcoded filesystem hierarchy combined with rigid package management systems has ossified to a ridiculous degree.

It's actually telling that Linux packaging systems had to be circumvented with third-party approaches like Snap, Flatpak, AppImage etc., because for the longest time they couldn't handle things like having two versions of the same package installed at once, resolving old dependencies, downgrading the system, or recovering from a broken state.

Linux has had advanced tech like union filesystems (what overlayfs provides today) for some 20 years, but never used any of it for packages. Yet we have 20 different solutions for init.
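For illustration, the mechanism that lets newer package stores hold two versions of the same package side by side is that install paths are derived from a hash of the package's inputs, so versions never collide. A toy sketch of the idea, not Nix's actual store-path algorithm:

```python
# Toy model of a content-addressed package store: each (name, version, deps)
# combination gets its own hash-keyed prefix, so two versions of the same
# package coexist, unlike a fixed FHS path such as /usr/lib/libssl.so.
# Conceptual sketch only; not Nix's real hashing scheme.
import hashlib

def store_path(name: str, version: str, deps: tuple[str, ...] = ()) -> str:
    digest = hashlib.sha256(f"{name}:{version}:{deps}".encode()).hexdigest()[:16]
    return f"/store/{digest}-{name}-{version}"

# Two versions of the same library, both installed, no conflict:
print(store_path("openssl", "1.1.1w"))
print(store_path("openssl", "3.0.13"))
```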

[–] Manbart@beehaw.org 2 points 1 year ago

Like everything, it's a trade-off. Windows allows different versions of the same libraries to coexist, but at the cost of an ever-growing WinSxS folder and slow updates.