from Churbleyimyam@lemm.ee to linux@lemmy.ml on 24 Mar 10:15
https://lemm.ee/post/59322609
Those who don’t have the time or appetite to tweak/modify/troubleshoot their computers: What is your setup for a reliable and low-maintenance system?
Context:
I switched to Linux a couple of years ago (Debian 11/12). It took me a little while to learn new software and get things set up how I wanted, which I did and was fine.
I’ve had to replace my laptop though and install a distro (Fedora 41) with a newer kernel to make it work but even so, have had to fix a number of issues. This has also coincided with me having a lot less free time and being less interested in crafting my system and more interested in using it efficiently for tasks and creativity. I believe Debian 13 will have a new enough kernel to support my hardware out of the box and although it will still be a hassle for me to reinstall my OS again, I like the idea of getting it over with, starting again with something thoroughly tested and then not having to really touch anything for a couple of years. I don’t need the latest software at all times.
I know there are others here who have similar priorities, whether due to time constraints, age etc.
Do you have any other recommendations?
Desktop:
Server:
Zero maintenance for any of them. Not just low maintenance, but zero.
This is the way. The uBlue derivatives benefit from the most shared knowledge and problem-solving skills being delivered directly to users.
Between that, and using a declarative distrobox config, I get an actually reliable system with packages from any distro I want.
Doesn’t ucore also have to restart to apply updates?
Unexpected, frequent restarts aren’t ideal for a server as far as maintenance and uptime go, compared to in-place updates, unless one’s startup is completely automated and drives are decrypted with an on-device keyfile. That probably fits some threat models for security, though.
The desktop versions are great!
They won’t apply unexpectedly, so you can reboot at a time that suits. Unless there’s a specific security risk there’s no need to apply them frequently. Total downtime is the length of a restart, which is also nice and easy.
It won’t fit every use-case, but if you’re looking for a zero-maintenance containerized-workload option, it can’t be beat.
This is such a weird take given that 99.9% of people here are just running this on their home servers which aren’t dictated by a SLA, so it’s not like people need to worry about reboots. Just reboot once a month unless there’s some odd CVE you need to hit sooner than later.
That is very fair!!
But on the other hand, 99.9% of users don’t read all of the change notes for their packages and don’t have notifications for CVEs. In that case, in my opinion just doing updates as they come would be easier and safer.
So why would somebody run that on their homeserver compared to tried and true staples with tons of documentation? 🍿
It’s just Fedora CoreOS with some small quality-of-life packages added to the build.
There’s tons of documentation for CoreOS and it’s been around for more than a decade.
If you’re running a container workload, it can’t be beat in my opinion. All the security and configuration issues are handled for you, which is especially ideal for a home user who is generally not a security expert.
You’re right, they should be running Windows Server as God intended 😆
Run k3s on top and run your stateless services on a lightweight kubernetes, then you won’t care you have to reboot your hosts to apply updates?
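For anyone wanting to try this, the official k3s install script is a one-liner; a minimal sketch (single-node, all defaults):

```shell
# Install a single-node k3s cluster with the official convenience script
curl -sfL https://get.k3s.io | sh -

# k3s bundles kubectl; check that the node registered
sudo k3s kubectl get nodes
```

Stateless workloads survive host reboots this way because the kubelet simply restarts them when the node comes back up.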
.
Yeah, sure. I was running Bluefin-DX. One day the image maintainers decided to replace something and things broke. uBlue is an amazing project and the team is trying hard, but it’s definitely not zero maintenance. I fear they are spreading themselves thin chasing so many uBlue flavours, most recently an LTS one based on CoreOS.
🤷 I’ve been running Aurora and uCore for over a year and have yet to do any maintenance.
You can roll back to the previous working build by simply restarting, it’s pretty much the easiest fix ever and still zero maintenance (since you didn’t have to reconfigure or troubleshoot anything, just restart).
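For anyone unfamiliar with how that rollback works on these image-based systems: each update is a separate deployment and the previous one stays on disk. A sketch, assuming an rpm-ostree based image like uCore or Aurora:

```shell
# List the current and previous deployments
rpm-ostree status

# Make the previous deployment the default and reboot into it
sudo rpm-ostree rollback --reboot
```

Picking the older entry in the boot menu achieves the same thing as a one-off, which is why “just restart” works.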
If you depend on third-party modules you’ll end up with third-party maintenance. We didn’t purposely decide to break this; we don’t work at Nvidia.
Jorge, OP asked about “not having to really touch anything for a couple of years”. I am just sharing my experience. Big fan of containers and really appreciate your efforts of pulling containers tech into Linux desktop. Thank you!
I don’t understand the answer though. Maybe I am missing something here. There’s an official Bluefin-DX-Nvidia iso. Nvidia-containers-toolkit was part of that iso.
On a separate note, I liked the idea of the GTS edition. A few weeks ago the ISO became unavailable pending some fix. At the same time I see loads of buzz about the new LTS edition, but it’s still in alpha. I feel confused.
The answer is: if you’re depending on software that is closed and out of your control (i.e. you have an Nvidia card), then you should set your support expectations around that hardware and Linux accordingly.
There are no GTS ISOs because we don’t have a reliable way to make ISOs (the ones we have now are workarounds) but that should be finished soon.
Thanks for clarifying, Jorge. I wish I lived in a perfect world where all hardware and software follow FOSS principles. Until then I will have to rely on the other distros that embrace an imperfect reality. I cannot reconcile how Bluefin targets developers and NVidia, unfortunately is not something many of those developers can afford to ignore. Good luck with your project!
It’s like a saving throw in a video game, most times you can make it, but every once in a while you don’t lol.
Running exotic niche server images out in the wild…
It’s just Fedora CoreOS with some QoL packages added at build time. Not niche at all. The very minor changes made are all transparent on GitHub.
Choose CoreOS if you prefer, it’s equally zero maintenance.
I am a longtime fan of Debian Stable, for exactly that reason. I installed the XFCE version using the custom installer about 8 years ago and have had very few issues.
Initially my GPU wasn’t well supported so I had to use the installer from Nvidia, forcing me to manually reinstall the driver after every kernel update. That issue has been fixed in recent years so now I can just use the driver from the Debian repos.
I installed the unattended-upgrades package about 2 years ago and it has been smooth sailing since.
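For anyone wanting the same setup, this is roughly all it takes on Debian (package and command names as documented in the Debian wiki):

```shell
# Install and enable automatic (security) updates
sudo apt install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades

# Optional: confirm the periodic apt timers are active
systemctl list-timers 'apt-daily*'
```

The dpkg-reconfigure step just flips the debconf switch that turns periodic upgrades on; everything after that is hands-off.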
For as much hate as it gets, Ubuntu (or Kubuntu for the KDE version) will feel very familiar in usage and will have a newer kernel. It’s my default “it just needs to work” distro if regular Debian is an issue due to drivers or something similar.
As others have mentioned, Debian stable and Xubuntu are my default recommendations for anyone who wants a simple “just works” kind of system. Debian if they want it to be as clean as possible, Xubuntu if they want some creature comfort right out of the box.
Same here, I default to Debian on servers and Xubuntu on laptops.
I’ve been distro hopping for decades. I got exhausted with things constantly breaking. I’ve been using mint for the past six months with zero issues. It’s so refreshing that everything just works.
I second Mint. I’ve installed it on my laptop with zero issues, although that thing is pretty old so your mileage may vary on newer hardware. But mint comes with pretty up to date kernels these days so it’s definitely worth a try.
Same here. I got to a point I wanted to use the OS rather than play with and fix it. Went back to Mint and stayed there.
Every time I stray from Mint I am reminded why I go back to it.
Get a big mainstream distro and stop tinkering with it.
This really is the answer. The more services you add, the more of your attention they will require. Granted, for most services already in the distro’s repos, the added admin overhead will likely be minimal, but it can add up. That’s not to say the overhead can’t be managed - that’s why scripting and cron jobs, among other utilities, exist!
i think its more about modifying the system behavior, esp on desktop oses. i have many local services running on my server, and if set up right, its almost no maintenance.
i want to try another distro than ubuntu, but the damn thing isnt giving me a single excuse to format my system. it doesnt break if you don’t fuck with it.
Such a bad comment, what does tinkering mean? Not use any software besides the default one? So only browsing and text apps? facepalm
Tinkering, by my personal definition, would mean installing third-party repositories for the package manager (or something like the AUR on Arch) or making configuration changes at the system level… Staying away as much as possible from the root user (including su/sudo) is generally good advice, I would say.
Keeping away from sudo, got it.
If you want to take that from my text then feel free.
I’ve posted something similar a couple of days ago after my Endeavour OS took a dump to no return and I needed a reinstall. I, too, want a system where I set it and forget it. I’ve researched so much and now I have two things I’m experimenting with. I’m currently running Nobara OS (because I play games here and there) as an experiment to see how long it lasts without breaking. I have backed up everything.
Its users swore up and down that it never breaks if you’re not a “tinkerer”. Even its creator said that the distro isn’t for those who like to tinker. His goal was to have a distro that is as stable as an immutable, but not immutable itself.
So far, I like how it tries so hard to keep you away from the terminal. There is a GUI app for everything. Even their updating process is different than Fedora (which is what it’s based on). The developers are even planning on making something for upgrading between major releases that is a press of a button like they do with their updates through an app. So far so good.
My next experiment after this (if it fails) will be to run an immutable distro. Most likely Bazzite. They’re not my cup of tea, but I’ll sacrifice that for my sanity and for the sake of getting shit done.
The thing with Debian is that yes, it’s the most stable distro family, but stable != “just works”, especially when talking about a PC and not a server (as a PC is more likely to need additional hardware drivers). Furthermore, when the time comes that you DO want to upgrade Debian to a newer version, it’s one of the more painful distros to do so.
I think fedora is a good compromise there. It’s unstable compared to RHEL, but it’s generally well-vetted and won’t cause a serious headache once every few years like Debian.
What makes Debian 12 a painful distro to upgrade?
I don’t understand that comment either. I’ve been using Debian for years on my server, and it just keeps up with the times (well with Debian times, not necessarily current times).
It’s way easier than Kubuntu was for me, for example, which required reinstalling practically every time I wanted to upgrade. A few times the upgrade actually worked, but most of the time I had to reinstall.
Debian as a server is fine and probably the best ! However as a daily drive OS I don’t think it’s the best choice.
I have always seen Debian as server distro and that’s probably what they meant ?
I have debian as my server distro since the beginning of my Linux journey (NEVER failed me !) However I can’t see how Debian as daily drive is a good idea. Sure they try to catch up with testing repo for those who wan’t a more up to date distro, but it’s seems harder to keep up when something breaks along the way.
That’s where Arch and derivatives shine, if something goes wrong it’s fixed in a few days.
I’ve been daily driving it on my desktop and laptop for several months now, seems fine. But I don’t need the bleeding edge either.
But that’s not what the comment was about… The top level comment said Debian was hard to upgrade, and I have not had that experience.
Specifically upgrading major versions. See the official documentation for upgrading Debian 11 to 12. It’s far more involved than minor version upgrades.
www.debian.org/releases/…/ch-upgrading.html
This is what I’ve always done. It has worked fine for me every time.
Even then, there’s a warning that the upgrade process can take several hours. Even if it’s largely hands off, that’s not exactly my image of an easy upgrade.
How quickly do you think an OS upgrade of this type should finish?
The problem is when it comes time for a major version upgrade. Debian 12.10.0 to 12.11.0 probably won’t be a big deal. But upgrading from Debian 11 to 12 was a pain. Debian 12 to 13 will probably be a pain as well.
In what way? I haven’t upgraded between major releases on Debian before.
Here’s the official documentation for upgrading from Debian 11 to 12. The TL;DR is that it takes 8 chapters to describe the process.
www.debian.org/releases/…/ch-upgrading.html
Ubuntu. It’s boring but it all works.
Ubuntu is literally just Debian unstable with a bunch of patches. Literally every time I’ve been forced to use it, it’s been broken in at least a few obvious places.
So, you are saying Debian is the better choice, right?
Absolutely. I’ve been running Debian for literally decades both personally & professionally (on servers) and it’s rock-solid.
On the desktop, it’s also very stable, but holy-fuck is it old. I’m happy to accept the occasional bug in exchange for modern software though, so I use Arch (btw) on the desktop.
Ubuntu comes with non-free drivers which can make it easier to set up and use. I use Debian on my server and Ubuntu on my laptops. They have both been pretty reliable for me. LTS versions of Ubuntu are pretty bug free but have older versions of software. I’d guess that Daniel was using a non-LTS release which are a bit more bleeding edge. The LTS ones strike a good balance between modernity and stability.
I am currently using a live USB of a recent Ubuntu version for backups, and a “serious” error window pops up every time I boot it. Same experience with Ubuntu installations. For me at least, Ubuntu isn’t anything close to stable.
Ubuntu. Or, get a Mac - which is even more “boring”.
As someone who just had to bandaid an unexplained battery draw on his wife’s MacBook - no, Mac OS no longer “just works”. Apple buries some of the most basic settings inside a command-line-only tool called pmset, and even then those can be arbitrarily overridden by other processes.
And even after a fresh reinstall and new battery, it still drains the battery faster in hibernation mode than my Thinkpad T14 G1 running LMDE does while sleeping. Yeah, that was a fun discovery.
That Thinkpad is by far one of my most dependable machines.
If you have battery drain, make sure you’ve disabled the option to regularly wake up and do background processing (checking for emails, syncing photos, etc.): Settings → Battery → Options… → Wake for network access. (Or search for “Power Nap” in the System Settings dialog.)
No need to use pmset for that.
So here’s the thing - if you can think of it, I’ve already tried it 😅 I spent a week and a half sifting through countless forum posts on Apple’s own support center, Macrumors, reddit, and a host of other forums.
The “Wake for network access” setting was the first thing I disabled after I wiped and reinstalled the OS. Among a number of other settings, including “Power Nap”. Still got the fucking “EC.DarkPME (Maintenance)” process firing off every ~45 seconds, no matter what I did, causing excessive insomnia and draining the battery within 12 hours.
What I ended up doing was using a little tool called “FluTooth” to automatically disable wifi/Bluetooth on sleep (the built-in OS settings did fuck-all), set hibernatemode to 25, and a few other tweaks with pmset that currently escape me (edit: disabled networkoversleep, womp, ttyskeepawake and powernap - which was still set to 1 even though the setting in System Settings was disabled 🤨 - and a couple others I can’t remember as it’s not here in front of me). I put several full charge cycles on the brand new battery before it finally calmed the fuck down.
I feel you. I still use an intel macbook with tweaks i cannot remember plus 3rd party utils like Turbo Boost switcher. That experience alone has kept me from upgrading to newer models.
In retrospect my powerbook g4 (Ti) and os 9 was peak computing.
My Thinkpad T14 running Linux Mint (LMDE) gets better battery life on “Suspend” than that damn MBP does when hibernated. It’s the 2017 A1706, too - out of ALL the variants it had to be that one 😂
Oh no. Maybe some Incense to cleanse the demons? (⊙_⊙)
Edit: I just remembered I had a similar problem after changing the battery on my 2015. This thread at Macrumors helped me tremendously, especially the last entry (did it on three separate days before it had an effect), but I’m sure you already tried all of that. Just on the off chance.
these Intel Macs were such a bad experience.
That thread was a godsend. Turning off tcpkeepalive was the other one that I couldn’t remember, but that seemed to help out as well.
My wife has had multiple MacBooks over the years (I set up her old 2009-era A1278 with Linux Mint for the kids to do homework), and after I “fixed” it and talked about the longer wake-up process, she told me that’s what she was used to already and the “super fast wake up” was a very new thing for her when she bought it. So no complaints from her, and the battery performs better. Win/win.
You’re not going to believe this, but I’ve found Arch is it. My desktop install was in December 2018: Sway with Gnome apps. Save for Gnome rolling dice on every major update, it’s been perfectly boring and dependable.
There are two camps of Arch users:
The fact that you’re even saying such things as “time constraints” or “to learn new software” suggests an attitude to computing shared by about 0.01% of the population. It cannot be re-stressed enough to the (sadly shrinking) bubble that frequents this community: the vast majority of people in the world have never touched a laptop let alone a desktop computer. Literally everything now happens on mobile, where FOSS is vanishingly insignificant, and soon AI is going to add a whole new layer of dystopia. But that is slightly offtopic.
It’s a good question IMO. Choosing software freedom - to the small extent that you still can - should not just be about the freedom to tinker, it should also just be easy.
The answer is Ubuntu or Mint or Fedora.
Use timeshift, It saved my ass like 3 times
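For reference, Timeshift also has a CLI, so you can snapshot right before anything risky; a quick sketch:

```shell
# Take an on-demand snapshot before a risky change
sudo timeshift --create --comments "before kernel update"

# Later: list snapshots and restore one interactively
sudo timeshift --list
sudo timeshift --restore
```

Pairing this with scheduled snapshots (Timeshift’s GUI default) is what makes “set it and forget it” actually safe.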
Debian stable + XFCE for me. Missing newer packages though. I’m interested in what problems you had with Fedora
This! Debian with Gnome or others is the answer. Take an afternoon to make it yours, then forget it. You can use backported kernels on Debian to support newer hardware. Try that, or upgrade to Debian 13 right now by changing the sources file to trixie instead of bookworm. Note: if you use Gnome, let gnome-software handle updates for you (there’s an equivalent for KDE). Otherwise, configure unattended-upgrades for automatic updates.
I had problems with waking from sleep/hibernate, audio issues (total dropouts as well as distortion in screen-recording apps), choppy video playback and refusal to enter fullscreen, wonky cursor scaling, apps not working as expected or not running at all. I’ve managed to fix most of these or find temporary workarounds (grateful for flatpaks for once!) or alternative applications. But the experience was not fun, particularly as there was only a 2 week return window for the laptop and I needed to be sure the problems weren’t hardware design/choice related. And I’m finding it 50/50 whether an app actually works when I install it from the repo. There’s a lot less documentation for manually installing things as well and DNF is slow compared to apt…
I don’t want to say for certain that Fedora as a distro is to blame but I suspect that it is. I miss my Debian days.
That’s how I run my system right now. Fedora KDE + pretty much everything as Flatpak.
Gives me a recent enough kernel and KDE version so I don’t have to worry when I get new hardware or new features drop but also restricts major updates to new Fedora versions so I can hold those back for a few weeks.
I made a similar switch as you but from Ubuntu to Fedora because of outdated firmware and kernel.
What graphics do you have? Don’t expect that to go away with Nvidia. No such issues on AMD though, and Intel should be fine too.
Intel Arc integrated graphics.
Let’s hope Debian fits you. I had to change to an Intel WiFi card but everything else worked OOTB for me on my laptop
My desktop has been running debian for 5 years no problem including 2 major debian version upgrades, and a new(er) GPU.
I had an old laptop that ran the same debian install for 8 years. All upgrades in place, no reinstalls.
boring, and works. Stable + backports should cover the majority of people with new hardware support needs.
If you like debian and just need a newer kernel you could just add backports to your debian install then install the kernel during the install process.
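Roughly like this on Debian 12 (bookworm); the repo line follows the official backports instructions, amd64 assumed:

```shell
# Enable the backports repo
echo 'deb http://deb.debian.org/debian bookworm-backports main' | \
  sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update

# Backports are never installed by default; pull the kernel in explicitly
sudo apt install -t bookworm-backports linux-image-amd64
```

Everything else stays on stable; only the packages you explicitly request with -t come from backports.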
Nixos?
NixOS was troubleshoot central for me. Not all programs behaved as expected with Nix’s unique design.
Once you get it setup tho, it works the same forever.
in my app the post starts with this sentence:
Yeah just use the default setup. Some minor tweaks at first, then it stays the same forever.
A minor tweak on another system, like an obscure driver, can be a huge headache on nix
PopOS is very stable as a desktop. It also keeps up to date with packages better than base Ubuntu in my opinion.
I’ve been running Manjaro for the last 4 months and it’s been incredibly reliable and smooth. I haven’t done any serious tweaking beyond installing a realtime audio kernel. I run updates every few days and I haven’t had a single issue so far.
Edit: what’s up with the down voting? If there’s something incorrect with recommending Manjaro in this context, I’d love to know why, since I’m still relatively new to Linux.
Are you using the liquorix kernel?
I can only see one downvote and four upvotes from here - I think you’re good!
You simply don’t do any maintenance whatsoever.
t. Got an Arch Linux install on which I (rarely) run “sudo pacman -Syu --noconfirm” and it works like a champ.
same
All you have to do is to install “Common sense antivirus”, pretty much.
I used to lose my keys all the time. I don’t want to spend so much time looking for my keys, nowadays I mostly just leave them in the front door, I rarely lock it and it works like a champ.
Comparing PC maintenance to leaving the keys in the front door is too dramatic, to say the least…
…unless you work at NASA and/or your PC is holding something too valuable/sensitive/high-priority for others to want to hack it “that badly” – which I (highly) doubt it.
No it is
pandasecurity.com/…/consequences-not-applying-pat…
And:
And:
From reddit.com/…/for_individuals_what_are_the_actual_…
Nice cherry picking/moving the goalposts, but that is not how refuting works. A PC at NASA has a much higher “threat level” than my Orange Pi Zero 3 just chilling in the background. Which means a potential “security hole” may prove harmful for those PCs… but it’ll definitely not hurt me in the slightest.
And before you parrot other links and/or excuses… I’m not denying these issues exist. I’m just saying they are there… but, well… “who cares”? If anything, it’s much faster to set my distro back up “like nothing ever happened” than to perform any “maintenance” whatsoever. Again, “common sense antivirus” reigns supreme here - know what you are doing, and none of these things will matter.
You keep using the word “maintenance”. All I’m worried about is not installing any security patches for months.
The problem that I tried to highlight with my “cherry picking” is:
So unless you have separated this Orange Pi into its own VLAN or done some other advanced router magic, the Orange Pi can reach, and thus more easily attack all your other devices on the network.
Unless you treat your entire home network as untrusted and have everything shut off on the computers where you do keep private data, the Orange Pi will still be a security risk to your entire home network, regardless of what can be found on the little machine itself.
Depends on the environment surrounding the door, as well as the environment surrounding the computer.
Some people simply care less about their computer security. The debate stops there. Security operates on a foundation of what you want to secure.
By comparing two environments of someone’s life you know little about, you are commenting from ignorance.
If they don’t keep any private data on any computer that trusts their home network/wifi and don’t do taxes or banking on those, there’s no problem.
But if they do, I maintain that the analogy is correct: their unpatched machine is an easy way to digitally get access to their home, just like an unlocked door is to a physical home.
Wait your previous comment was not sarcastic? 😱
Same with Fedora. Just run the upgrade once in a while and it works.
fedora has been this for myself. maybe tweaking every now and then to fix whatever edge cases I’ve run into but it’s the least painful distro I’ve used so far
My Arch Linux setup on my desktop and my servers are low-maintenance. I do updates on my servers every month or so (unless some security issue was announced, that will be patched right away) and my desktop a few times a week.
Nearly anything can be low-maintenance with the proper care and consideration.
For your constraints I would use just use Debian, Alma Linux or Linux Mint and stick with the official packages, flathub and default configuration on the system level. Those are low-maintenance out of the box in general.
Xubuntu LTS. I’ve been meaning to switch to Debian Stable when something breaks, but it’s my third LTS on the desktop and 5th on the laptop and there was just no opportunity. I also learned to avoid PPAs and other 3rd party repos, and just use appimages when possible.
You can have a kernel from Testing or even Sid, I believe, but yeah, it’s what we want to avoid - tweaking.
LTS is released every 2 years, for reference.
Peppermint, based on Debian (also available as a Devuan flavor). “Everything you need and nothing you don’t.”
Debian XFCE or Xubuntu LTS.
xfce is stubbornly slow at introducing new features, but it is absolutely rock-solid. Hell I don’t think they’ve changed their icon set in some 20 years.
Debian and *buntu LTS are also likewise slow feature updaters that focus on stability.
fedora with gnome for me.
Linux Mint Debian Edition (LMDE) is my pick.
I’ve got two study laptops and apart from Tailscale giving me some grief very recently with DNS resolution, I literally haven’t had any problems with either machine. Both have been going for 1.5 years.
I like the LMDE route for the DE already having pretty decent defaults and not requiring much tweaking from the get-go. Xfce (as it ships by default in Debian) absolutely works, but I end up spending an hour theming it and adding panel applets and rearranging everything so that it… ends up looking similar to Cinnamon anyway, because default Xfce looks horrible in my opinion
avoid nixos
I use fedora and Ansible to fix things I want to be different all the time. After I install the OS I run Ansible pull and it makes all the changes I want
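For anyone curious, ansible-pull inverts the usual push model: the machine fetches a playbook from a repo and applies it to itself. A minimal sketch (the repo URL and playbook name are placeholders):

```shell
# Fedora: install ansible, then pull and apply your own playbook
sudo dnf install ansible-core
ansible-pull -U https://github.com/yourname/dotfiles.git local.yml
```

If no playbook is named, ansible-pull falls back to one matching the hostname and then local.yml, so a cron job or systemd timer can re-run it unattended.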
Debian. Unattended upgrades. Maybe flatpaks if your (GUI) stuff isn’t on debian
Debian
I use Pop!_OS. Works out of the box.
Debian stable is as hassle-free as you’ll get.
It sounds like your issue is more with having to migrate to a new laptop. Firstly - buy laptops that are more linux compatible and you’ll have fewer niggles like with sound, suspend and drivers.
Secondly - use “dpkg --get-selections” and “dpkg --set-selections” to transfer your list of installed software to the new laptop. Combined with transferring your /home directory, user migration can be sped up considerably.
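Concretely, the migration described above looks something like this:

```shell
# Old laptop: export the list of installed packages
dpkg --get-selections > packages.list

# New laptop: mark the same packages for installation, then install them
sudo dpkg --set-selections < packages.list
sudo apt-get dselect-upgrade
```

The dselect-upgrade step installs everything marked “install” in the imported selections, pulling in dependencies as usual.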
This is the thing: The laptop is from Starlabs, supposedly made for Linux…
Every system is only as stable as its user. Anybody can break Debian or any other “stable” distro of renown the second they go tinkering, adding PPAs or anything else.