• 5 Posts
  • 138 Comments
Joined 2 years ago
Cake day: June 9th, 2023

  • It actually happened to me today on Arch.

    I updated the system, including the kernel. Everything went smoothly, with no errors or warnings, but when I rebooted, it said the ZSTD image created by mkinitcpio was corrupt and the system failed to boot.

    I booted the Arch install ISO, chrooted into my installation, reinstalled the linux package, rebooted, and it worked again.

    I have no explanation. This is a perfectly working laptop with a high-end SSD, no errors in memtest, no overclocking, and I’ve been using this Arch install for over a year.

    The chances of the package being corrupt when I downloaded it while its hash still checked out are astronomically low, and the chances of a cosmic ray hitting the RAM at just the right moment are probably just as low. The fact that mkinitcpio doesn’t verify the images it creates is shocking. The whole thing would have been avoided on an immutable distro with A/B partitions.
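
    For what it’s worth, that verification is easy to bolt on yourself. Here’s a minimal sketch in Python, assuming the default /boot/initramfs-linux.img path, an installed zstd CLI, and a plain zstd-compressed image with no early-microcode cpio prepended (all of those are assumptions; adjust for your setup). It test-decompresses the image right after mkinitcpio runs, so corruption is caught before the reboot instead of during it:

        #!/usr/bin/env python3
        # Sketch: sanity-check the initramfs right after mkinitcpio runs.
        # Assumptions: default Arch image path, zstd CLI installed, and a
        # plain zstd-compressed image (no early-microcode cpio prepended).
        import subprocess
        import sys

        IMAGE = "/boot/initramfs-linux.img"  # default mkinitcpio output on Arch

        # "zstd -t" test-decompresses the file without writing anything out;
        # a non-zero exit code means the image is truncated or corrupt.
        result = subprocess.run(["zstd", "-t", IMAGE], capture_output=True, text=True)
        if result.returncode != 0:
            print(f"initramfs verification FAILED: {result.stderr.strip()}", file=sys.stderr)
            sys.exit(1)
        print("initramfs decompresses cleanly")

    (lsinitcpio from the mkinitcpio package can also list the image’s contents, which forces a full decompression and would surface the same corruption.)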

  • Generally speaking, Linux needs better binary compatibility.

    Currently, if you compile something, it’s usually dynamically linked against dozens of libraries that happen to be present on your system, but if you give the executable to someone on a different distro, they may not have those libraries, or their versions may be too old or incompatible (the sketch at the end of this comment shows exactly this failure mode).

    Statically linking programs is often impossible and generally discouraged, which makes software distribution a nightmare. Flatpak and similar systems made things easier, but it’s such a crap solution: it basically means installing an entire separate OS in parallel, with its own problems, like shipping a version of Mesa that’s too old for a new GPU. Applications must be able to ship with everything they need; there is no reason for dynamic linking to be so central on Linux these days.

    I’m not in favor of proprietary software, but better binary compatibility is a necessity for Linux to succeed, and I’m saying this as someone who’s been using Linux for over a decade and who refuses to install any proprietary software. Sometimes I find myself running apps and games in Wine even when a native version is available, just to avoid the hassle of having to find and probably compile libobsoletecrap-5.so.
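
    To make the failure mode concrete, here’s a rough sketch (the binary path is just a placeholder) that runs ldd on an executable and reports which of its dynamically linked libraries the current system can’t resolve:

        #!/usr/bin/env python3
        # Sketch: report which shared libraries a binary needs but this
        # system cannot resolve (the cross-distro breakage described above).
        # The default path is a placeholder; point it at any dynamic ELF.
        import subprocess
        import sys

        binary = sys.argv[1] if len(sys.argv) > 1 else "./some-app"

        # ldd prints one line per dependency; unresolvable ones say "not found".
        out = subprocess.run(["ldd", binary], capture_output=True, text=True)
        missing = [line.split("=>")[0].strip()
                   for line in out.stdout.splitlines()
                   if "not found" in line]

        if missing:
            print(f"{binary} won’t start here; unresolved libraries:")
            for lib in missing:
                print(f"  {lib}")
        else:
            print(f"all dynamic dependencies of {binary} resolve on this system")

    Anything that shows up in that list is something the user has to hunt down a package for or compile themselves, which is exactly the hassle I mean.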

    I’d say I’m a “time-strapped” user, since I have a full-time job and would rather spend my free time gaming than fixing a broken OS. Nevertheless, I have 2 PCs with Arch Linux (one for personal stuff and one for work) and a server with NixOS.

    When things break on Arch (which is rare these days, but it can happen, especially if you play around with things from the AUR), I just roll back with timeshift (it takes just a few seconds with btrfs; there’s a sketch of why at the end of this comment) and try the update again a few days later. Minor issues I can just ignore or work around and take care of when I feel like it, but they usually get fixed by updates within a few days. The only time I felt Arch was actively wasting my time was when Plasma 6 came out a few months ago and a lot of little things broke, especially on Wayland, but they were fixed rather quickly with 6.1, so I can’t complain too much.

    NixOS, on the other hand, has been nothing but trouble and a waste of time ever since I installed it. It took me a week to configure, some packages are kinda old, most have incomplete declarative config, and I had to write some units myself. When things break, it drives me crazy, because even basic troubleshooting of services can be a pain in the ass: I have to find out where stuff lives, know which config files are going to be overwritten, launch the correct nix-shell, … it’s all so tiresome… so I just revert to an older config and hope for the best. To make things worse, major updates often require manual changes to the config or even to application files themselves (looking at you, Nextcloud), and you’ll excuse me if I can’t be bothered to do that on a DECLARATIVE DISTRO. Even Debian doesn’t need that, come on! I don’t care what people say about NixOS: this OS is not ready yet. I don’t have time for this shit when I’m working, and that server will be going back to Debian next summer.
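
    For anyone wondering why the Arch rollbacks I mentioned only take seconds: it’s btrfs copy-on-write snapshots under the hood. Here’s a minimal sketch of the idea; the paths are assumptions (subvolume layouts vary), and timeshift’s own CLI wraps this with proper bookkeeping, so treat it as an illustration rather than a replacement:

        #!/usr/bin/env python3
        # Sketch: the btrfs snapshot step behind those few-second rollbacks.
        # Assumptions: root is a btrfs subvolume and /.snapshots exists;
        # layouts vary, and timeshift wraps this with proper bookkeeping.
        import subprocess
        from datetime import datetime

        SOURCE = "/"  # assumed: the root filesystem is a btrfs subvolume
        DEST = f"/.snapshots/pre-update-{datetime.now():%Y%m%d-%H%M%S}"

        # A read-only snapshot is near-instant because btrfs is copy-on-write:
        # nothing is duplicated until files diverge from the snapshot.
        subprocess.run(["btrfs", "subvolume", "snapshot", "-r", SOURCE, DEST],
                       check=True)
        print(f"snapshot created at {DEST}")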

  • Arch Linux. Everyone said it was hard to use, unstable, etc., but my experience with it has been the exact opposite.

    Yes, the install process is needlessly complicated (although it got a lot simpler now that we have archinstall), but the OS itself is rock solid and rarely has issues that require more than a reboot or a package reinstall to solve. The AUR is a godsend too if you don’t want to, or don’t know how to, compile stuff from source.

  • The first time I heard about programming becoming obsolete was when I was taught UML in university. That was almost 15 years ago, and it didn’t happen; if anything, programmers now also had to know UML, which isn’t all that bad, but it definitely didn’t replace anything. It’s just useful for designing and documenting projects.

    I also heard from colleagues that in the 80s and 90s, people said SQL was supposed to be used directly by end users, making (some) programming obsolete.

    Now AI bullshit claims to be making programming obsolete. I won’t hold my breath.