I'm one of these people. Although it was a plan several years in the making for me, which involved learning new software (especially moving from 3DS Max to Blender). I finally decided to bite the bullet around 8 weeks ago. Backed up everything, wiped Windows, installed Ubuntu.
I wouldn't say it's been smooth sailing. I've had a ton of technical issues, not all of which are resolved (I still can't get my laptop to wake from sleep; I have to do a hard reset if it sleeps), but at least everything else is working now.
A lot of minor display-related issues (mostly involving using an external monitor as my main screen) were solved when I switched to GNOME on X11 instead of Wayland.
The most important lesson I'm taking from this is: if you want to use Linux, you need to buy certified hardware. Don't assume your laptop will work. Don't recommend that other people install Linux on random laptops; they'll probably have loads of minor and major issues that you wouldn't have if you'd bought hardware specifically with the goal of running Linux.
> Don't assume your laptop will work.
Agreed. Note though that the Ubuntu live installer has a "Try" mode that gives you a nearly normal desktop experience without installing, and is an effective test environment to discover what will work and what won't. Wifi, graphics and suspend are worth trying out this way.
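If it helps anyone, here are a few commands one might run from that live session to exercise exactly those three areas; a rough sketch, not an exhaustive checklist:

    # List nearby networks to confirm the wireless card is detected and working
    nmcli device wifi list

    # Show the GPU and which kernel driver has bound to it
    lspci -k | grep -EA3 'VGA|3D'

    # Trigger a suspend, then after waking check the log for resume errors
    systemctl suspend
    journalctl -b | grep -iE 'suspend|resume' | tail -n 20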
This is good advice and I should have done that. However, most of the issues I had only showed up after some time of using it. For example, a big one was wifi, and it was very intermittent and hard to debug. I think it was going into low power mode in some circumstances, and my wifi speed would randomly drop to 100 kb/s or less. Sometimes turning wifi off and on would fix it, sometimes a restart was required. Thankfully I did fix it eventually, but I have no idea which of the many things I tried ended up working.
One thing I didn't have major issues with was my Nvidia graphics card, which was a surprise.
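For what it's worth, the low-power theory above is a common culprit: NetworkManager's wifi power saving can cause exactly this kind of intermittent throughput collapse on some chipsets. A sketch of the usual way to rule it out (the file name is arbitrary, and this assumes NetworkManager is managing wifi):

    # /etc/NetworkManager/conf.d/wifi-powersave-off.conf
    # wifi.powersave: 2 = disable power saving, 3 = enable
    [connection]
    wifi.powersave = 2

Then restart NetworkManager (sudo systemctl restart NetworkManager) and watch whether the speed still drops.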
Ah, the issues you mentioned are the top 3 nvidia issues :P
- wake from suspend
- laptop power consumption
- poor wayland support
- sudden desktop restarts
- hidpi, external monitors, virtualization
Nvidia's proprietary crap has always worked badly: they re-implement everything and put all their development effort into NIH syndrome instead of playing along with the already existing open source stack.
> - wake from suspend
> - laptop power consumption
> - poor wayland support
> - sudden desktop restarts
> - hidpi, external monitors, virtualization
Now, I'm a desktop user, so the first two don't really impact me, and the rest has been working fine for the ~2 years I've been on Wayland+Arch+GNOME across 3 different NVIDIA GPUs. Not sure why virtualization is there, but it also seems to work, although via WSL2 it's slightly buggy; I blame that on Microsoft rather than Linux/NVIDIA.
Yup, I'm very fond of Dell's Ubuntu laptops. Ubuntu just works, and I rarely need to mess with anything. (I have had occasional hardware problems over the years, but they have an on-site support option that works well even in rural areas.) I honestly have more weird software issues with my work Mac.
Random Windows laptops have been slightly more frustrating. Hibernation may not work out of the box, and sometimes one other piece of random hardware won't be usable. For a laptop that basically lives on a desk, you might get away with it.
Desktops are usually easier. They don't have as much built-in hardware, hibernation to disk may not be necessary, and it's easier to replace a webcam or something if you need to. I'd still check the graphics card, especially if it's an expensive one.
I'd wager that a lot of your issues could boil down to GNOME and the ancient software stack that Ubuntu is infamous for. I would recommend giving a more sensible distro like Fedora with KDE a try, or even CachyOS if you're not afraid of the terminal.
GNOME is just as maintained and up to date as KDE.
In my experience the Linux way is finding things that don't work and, after not being able to fix them, deciding that you don't really need them after all.
Personally, I focused on reducing the boot time when suspend never worked properly.
> if you want to use Linux, you need to buy certified hardware
Or test yours with a live distro beforehand
And if it doesn't work, check again with a distro that uses a newer Kernel.
That can often make all the difference, but it's not intuitive for Windows people who are used to installing hardware drivers.
Especially for hardware like new Bluetooth or Wifi chips, fingerprint readers, but also when there is a new Intel or AMD CPU generation and chipset.
E.g. instead of Ubuntu or Mint, try Fedora or CachyOS. Or even Nobara or Bazzite for gaming-specific optimizations.
Or use hardware a year or two old. If your hardware is bleeding edge you’re going to have a hard time. Or if you depend on some peripheral that’s not “mainstream”. The thing windows always, always had going for it was driver support. Linux never had that so it’s always been a game of catch up.
> I still can't get my laptop to wake from sleep, I have to do a hard reset if it sleeps
This is the weakest point of modern Linux - suspend/resuming not working properly on many machines. And the systemd-logind / inhibitor architecture makes it incredibly hard for non-technical users to opt out of suspend.
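For anyone technical enough to want the blunt workaround, masking the systemd sleep targets disables suspend machine-wide, inhibitors and all; a sketch, easily reversed with unmask:

    # Prevent systemd from ever suspending/hibernating this machine
    sudo systemctl mask sleep.target suspend.target hibernate.target hybrid-sleep.target

    # To restore the default behaviour later
    sudo systemctl unmask sleep.target suspend.target hibernate.target hybrid-sleep.target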
I had issues as well when I was attempting to transition over 15 years ago. I made a few false starts trying to use Ubuntu. What eventually did it for me was switching to Zorin OS for a few months, and then Kubuntu, which I've been on ever since without issue (back then, and I suspect even now, Ubuntu had limits primarily stemming from software licensing principles, such as no mp3 support).
Honestly, nowadays I would not advise newcomers to use Ubuntu anymore. It is just a shadow of itself, more often than not inconvenient and broken for users in terms of UX.
I would suggest Linux Mint or Pop!_OS, for example.
Some days ago I mentioned that Linux Mint is a great entry point for people, and I quickly got replies about how that's an outdated distro still on X11 instead of Wayland... as if the tech stack behind a product mattered at all to new people. And here we are, replying to someone who had to drop Wayland because it was simply not working for them. Oh, the irony!
Why? The only complaint I hear about and sort of agree with is that they use snaps, and that's not always a good idea for some packages. But otherwise, for regular users, I'd say Ubuntu has far fewer footguns than even Mint. Everything just works, and it's easier to find support and help online whenever something doesn't.
Pop!_OS is in a weird state right now too, with the upcoming migration to their new GUI framework.
To fix the wake-from-sleep issue you will likely need to change a setting in GRUB, but what that is depends on the Ubuntu version and your hardware.
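The usual shape of that fix, as a sketch; mem_sleep_default=deep is just one commonly suggested parameter (it forces S3 sleep instead of s2idle on machines that support it) and may not be the right one for this laptop:

    # /etc/default/grub: append the parameter to the default kernel command line
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash mem_sleep_default=deep"

    # Then regenerate the config and reboot
    sudo update-grub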
Wifi never seems to work 'out of the box', so be ready to connect to the internet using ethernet to download the latest drivers and thus get wifi working.
Wifi has worked out of the box most of the time for many years now. That was different two decades ago, but that's a long time.
But when it doesn't work, that is of course highly annoying.
Ah I think what is happening is that a brand new laptop may not have had its drivers added to the latest download/image version of your chosen distro. Hence the need to connect and update to get those drivers.
For me, it was a ThinkPad X1 Gen 11 and Mint 21.2 (MATE) about a year ago.
Yeah, absolutely. And with very new hardware it very much depends on the distro as well, i.e. how new its kernel is.
> Linux Mint 21.2 features a Linux kernel 5.15 and an Ubuntu Jammy package base.
That's from Oct. 2021, so it wasn't new a year ago.
Ah indeed - actually 2 years ago. I believe 21.2 (Victoria) was released in 2023:
https://tuxcare.com/blog/is-linux-mint-based-on-ubuntu/
Doesn't time fly!
IMO the best way for Linux to win is to advocate it to your kids. People usually stick to whatever they learned and played with during childhood, and if we can make a whole generation Linux-aware, not as a server but as a desktop, then the era of the Linux desktop is foreseeable.
Instead of preaching to the adults, preach to the kids. Tell them stories of hackers, etc.
We can also volunteer in schools for a few sessions of introduction to Linux.
Agreed. It drives me nuts that at school they give them tablets rather than PCs. All the work happens on PCs, not tablets.
My kids have Kubuntu on some old laptops and will have Linux everywhere. Hopefully it is OK to maintain that; I'm afraid of the school forcing incompatible software.
This is even more important now as younger generations are just given iPads and other locked down devices, to the point where they have to learn how to navigate a file system in their introductory university courses.
> to the point where they have to learn how to navigate a file system in their introductory university courses.
What happened to IT lessons at school? I had IT starting at the age of 13 onwards, and that was in the early 2000s. By the time you graduated from school, you were guaranteed to have at least basic IT skills.
I think those classes/lessons were added when the availability of computers was kind of so-so: not every kid had a computer at home, but most adults at the time understood that computers would be used in the kids' future one way or another, so they figured everyone should learn them.
At one point it shifted from "everyone has a computer at home" to "everyone has a computer in their pocket, and learns typing by themselves", and the classes seem to have disappeared.
My IT lessons in the 90s were “here’s how to use spreadsheets and Microsoft office”. It wasn’t until later that I was taught programming and other more foundational concepts that would take me beyond being an office worker that knew how to format and print a letter. Fortunately, I had computers at home and my parents encouraged me to learn how they worked.
Or if you want to make them tech capable, another possibility is to give them a cheap Mac (apparently Apple is going to release a MacBook with an A-series SoC soon). You can open a terminal and show how it works, they can explore the filesystem with Finder, but when they need to do school work, they also have access to good DTP applications (Swift Publisher is really affordable and nice for kids), Affinity Designer/Photo, etc.
Our daughter initially had a Linux machine (also for the reasons you outlined). But she quickly ran into issues when doing work for school, collaborating with other kids, etc. So we gave her a Mac Mini M1. It was not very expensive, she can run many common apps (MS Office is a fact of life) and it's more secure (app sandboxing, sealed system volume, etc). But many of her skills would be transferable to Linux if needed (in contrast to using an iPad or Android tablet).
My kid used an ancient HP laptop that found new life as a Hackintosh. Was rock solid for his early HS needs. Side bonus was the peer admiration he got around it.
That’s so awesome! I started using Macs after my brother and I were dabbling with Hackintosh in 2007 (mostly just trying it for fun). Unfortunately, it didn’t work on my laptop, but after seeing Hackintosh on his, I decided to get a Mac Mini.
So cool that your kid was able to use a Hackintosh in high school!
> People usually stick to whatever they learned and played with during childhood
Any source for this? I grew up on Atari, DOS and Windows. Most kids today are on mobile or consoles.
Linux does not need advertising; it is by far the best option in terms of hardware selection, development, gaming and ease of use.
Unless you've already got an Nvidia GPU, in which case you might encounter repeating pains.
Thanks to Nvidia's Linux driver, my desktop environment breaks and hangs when it comes back from sleep, and every few days my screen starts flashing black until I reboot. I've been told the trigger may be the combination of DisplayPort and AdaptiveSync.
I think nerds like us are more likely to experiment and make occasional drastic changes to our computing setup. And in my case I grew up during a time when there was a lot of change in the desktop OS landscape, so sticking with one OS from my childhood to adulthood wasn’t really possible (otherwise I’d still be on Microsoft BASIC and OS-9).
Familiarity and “just-workitude” really are by far the most important factors for people picking OSs, and these days it’s possible to stick with one platform for decades.
> Linux does not need advertising, it is by far the best option from point of hardware selection, development, gaming and ease of use.
Gaming? No. Hardware selection? Lol absolutely not.
Give them a dual-boot machine; tell them Linux is the secret superpower partition, and the Windows partition is so you can move amongst the normies.
Demonstrate how Linux faithfully executes your commands when Windows tells you it's too dangerous or insecure.
Lots of "obsolete" Pc's are perfectly able to run W11, only missing the blessings from Redmond.
I wish I could also live on a planet where throwing away perfectly good PCs is fine and there's no looming climate-change-induced destruction...
You can donate them: https://labdoo.org/. This organization distributes donated laptops with Linux all over the world.
I wish AI were used to solve real-world problems instead of creating AI slop. Thanks, OpenAI/Sora.
Usually they get recycled. I wouldn't stress about it.
Where is this dreamland? They get dumped alongside other mixed trash and lie there for decades. Same with old, perfectly fine iPads with Retina displays.
Still, I wouldn't stress about throwing things out. Stress about initial purchases instead.
So, if you have money, you can throw away as much as you want, right? I bought loads of Apple tech during the 2010s and all of it is trash. Every iPhone (e.g. models 4 through 7) costs tens of dollars today, down from a thousand. Not only that, they're basically useless. I have a few iPad 3s (the first with a Retina display); the display is perfectly fine, but VLC is the only app that still works on them. You can also watch YouTube via the browser, but it's not a pleasant experience. At least my MacBook Pro (pre-Retina) works great with Linux, and I may keep it around as a Linux server till it breaks. I cannot do anything like that with any of the iOS devices I have. Why?
When I purchased them, I had no idea that would be the case. My perfectly good iPhone 12 mini is already on its last legs (I give it a couple of years), thanks to iOS 26 and the liquid ass. Why is it so? Why should I stress about initial purchases? I'm not that poor and young, sorry; I'm worried about the footprint. And I hope one day you will be too.
That is what I mean, stress about the impact at the time of purchase, not at disposal time.
After that, don't stress at all. As you say, you had no idea at the time of purchase, so toss it in the dump and move on.
Around 22% is "recycled", sometimes in disgusting ways:
https://www.youtube.com/watch?v=G5r4hFY_oh8
Same story with clothes, paper and plastic recycling, it's just talk and green-washing.
The question is, how much guilt-free landfill waste should each person get to create in their lifetime? Don't say zero.
Old machines use a lot of power, so not good for the climate. My rule of thumb is if it's less powerful than a $90 raspberry pi 400, which uses 10 watts max, it's probably worth e-wasting.
>Old machines use a lot of power,
So does producing a new one:
https://www.networkworld.com/article/752694/computer-factori...
I've got one. I just played a triple-A video game on it, but my motherboard doesn't support Win11, so no-go. I've got another year to figure out what's next, but this will be my last Windows computer. While the developer-focused side of the company has treated me pretty well over the years, you no longer need a Windows box to work in their tech, and the consumer side of the business seems to actively hate people.
Unfortunately, the Linux desktop is in the worst state it has been in for two decades: containerized apps/frameworks/browsers update whenever they feel like it, with the related unsolvable permission and layering problems on top of the usual WLAN, touchpad, and power management issues that have gone unsolved for nearly 30 years, while a growing number of second-system-effect frameworks bring new issues with ungooglable solutions. So Linux users themselves flock to macOS.
That is highly debatable, largely subjective and probably hardware dependent to an extent.
I'm running Linux on five different machines, two of them portables: a ThinkPad Z13 and a GPD Win Mini 2024, and I have none of the issues you mentioned. In fact there are literally zero issues at all; I couldn't be happier, to be honest.
> Unfortunately the Linux desktop is in its worst state it has been since two decades
I've primarily used Linux in various ways since Ubuntu 6.06, and every year it gets a little more useful and stable, from my point of view at least. But I also moved to Arch some years ago, and to CachyOS this year, so that might cloud my view of how well things work; I also stopped using laptops, which makes Linux life a lot easier.
A year after Windows 7 reached EOL, we began running Debian with XFCE on our internet-facing boxes. They all run uninterrupted with zero issues and are orders of magnitude more stable than Windows 11. The biggest bonus to me is that Linux/BSD sets up in minutes and you're done; literally done and ready for work/pleasure. Properly setting up Windows using group policy and other privacy-controlling mechanisms takes five-plus hours. I still use Windows 7, which was the last user-friendly OS put out by Redmond; it is still a great OS when kept off the internet.
> setting up Windows using group policy and other privacy controlling mechanisms takes five+ hours
I went through this not too long ago. And the effort required feels like a very deliberate dark pattern move.
Well, my first Ubuntu (circa 2006-07) wasn't working well for me (with wired internet, no Bluetooth, and an average Intel PC). These days, only one obsolete GPU hasn't worked for me. Basically, a default Fedora installation works well with everything I throw at it. In my view, the situation has improved tremendously.
If people don't want to buy a new computer, I'd imagine the majority will stick with Win10 for a period, until their current computer dies. As long as Chrome still supports Win10, they'll think it's OK. Win11 is so radically different from Win10 that most would see less change going to Mint, but they won't.
The day EA games run on Linux will be the day Windows dies, IMHO. That would mean full support for all the broken graphics APIs and stable drivers from e.g. Nvidia, which currently feel like a rollercoaster ride: every other app is stable on a different Nvidia driver.
As a W10 user and W11 denier, I'm waiting for that day!
You’re going to be waiting a long time, sorry to say. The question is; are you going to let these corporations rule your life in this way?
I have a 7-year-old mini 'NUC' PC that ran Windows 10. I installed Windows 11 and used a popular debloat program [1]; this produced a very nice operating system that suits me and performs well.
I also tried to install various Linux distros on a partition, but all of the installations failed towards the end of the process, causing various boot loader and other problems that required a lot of uncomfortable fixing in terminals and the BIOS.
I would have liked to be using Linux but as it turned out a de-bloated Windows 11 experience is very good for me.
[1] https://github.com/Raphire/Win11Debloat
Please remember to go gently and slowly/appropriately with people. Help people who want to move, show off how you're using stuff, but don't push too hard, and make sure the setup is right for the person.
Is anyone here even still using Windows in a professional way?
Nearly everyone who wants to get work done and is not in the IT bubble; for example:
https://www.autodesk.com/products
https://en.wikipedia.org/wiki/List_of_computer-aided_manufac...
As the Stack Overflow developer survey shows, a majority of developers use Windows:
https://survey.stackoverflow.co/2025/technology#1-computer-o...
HN is atypical of the world, particularly anything outside of the USA.
It may also be that Windows users are just not as vocal.
I ditched windows and switched to Linux back in the Windows 98 days. Have not looked back since.
I agree that HN is atypical of the world, but so is any survey by Stack Overflow; it's really heavily Windows- and C#-biased across the board, so take whatever results they get with a grain of salt.
Why do you think that? Is it really biased, considering the majority of the world uses Windows? Or is it just that you don't like it?
By the way, I'm not using Windows (but love .NET) and prefer Linux, but I'm not fooling myself that it's a popular choice on the desktop by any metric.
I think there are many reasons. Joel Spolsky and Jeff Atwood were both hugely into .NET/C# and Windows tech (one of them even worked at Microsoft, maybe?). Then the entire startup was built with Microsoft tech, because that's what they knew best, so when they talked about their architecture they obviously drew a particular crowd among others. It felt like a lot of the early community came from the founders' blogs and Microsoft circles, and I was also asking mostly C# questions on SO in the beginning.
Probably someone remembers the history better than me and has more clarity and/or corrections.
I cannot work productively with Linux because there are no Linux alternatives to the following Windows tools (and the 'alternatives' people tell me about are a sad joke):
* AutoHotkey
* Voidtools Everything
* Agent Ransack
* Irfanview
With WSL2, Windows 11 could easily have been the best Linux distribution on the planet, supporting all Windows apps while also giving you the full power of Linux at your terminal.
Unfortunately, Microsoft decided to stuff it with user-hostile spyware instead.
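To be fair, the interop is genuinely pleasant when it cooperates; a quick sketch of what "the full power of Linux at your terminal" looks like in practice, assuming a stock Ubuntu distro installed via wsl --install:

    REM From PowerShell or cmd: install a distro, then run Linux commands inline
    wsl --install -d Ubuntu
    wsl grep -ri "todo" ./src

    # From inside the WSL shell: Windows drives are mounted under /mnt,
    # and Windows executables can be launched directly
    ls /mnt/c/Users
    explorer.exe .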
Eh, no. WSL2 is kind of bad compared to a proper Linux environment; you'll encounter so many tiny cuts that eventually you'll yearn for a proper environment where stuff just works, instead of having to fight the built-in "interoperability" stuff Microsoft has added that just gets in the way and introduces new issues.
Saying this as someone who's used both Linux and Windows as a main OS for decades, and who would dump Windows 100% if I could get Ableton to run properly on Linux. I still use WSL2 from time to time, but for things I need to be productive with, I prefer my Linux environment 99% of the time.
I like WSL2 as a polished way to launch VMs in Windows. It's been good for testing out Linux software in an odd environment, but I haven't figured out how to enable SELinux or the systemd firewall on the Red Hat variants.
My trick is to use git bash as my primary shell on Windows and Podman for Windows (not Desktop, just the CLI app). It's still not nearly as nice as using Arch, but a reasonable approximation in a pinch.
Well, that sounds worse, as everything has to be a container then? I'll just continue with the slightly borked Arch-inside-WSL2 setup I have for that instead, as it kind of works, and continue to mostly run Arch bare metal when I can.
No, only containerized things need to be in a container. For everything else you are just using Windows, except it feels more Linux-like with git bash and has most basic Linux tools. If you need to run a container, it's a lot more like it is on Linux: podman container run ... right in your Windows shell (git bash in my case). This does things behind the scenes with a podman WSL image, but it is mostly seamless.
I do actually start up Arch / Ubuntu / whatever in WSL for some things, but mostly I just use the above setup, which is the most Linux-like without having to shell into WSL all the time. That being said, I use actual Linux / Arch whenever I can - yeah, I use Arch btw.
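In case anyone wants to replicate the setup, here is the rough shape of it; podman machine manages the behind-the-scenes WSL image mentioned above:

    # One-time setup: create and start the WSL-backed podman VM
    podman machine init
    podman machine start

    # After that, containers run from the Windows shell (git bash here)
    # much like they would on native Linux
    podman container run --rm -it docker.io/library/alpine sh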
> No, only containerized things need to be in a container. For everything else you are just using Windows, except it feels more Linux-like with git bash and has most basic Linux tools.
But developer-focused software is trash on Windows; if I want to remain productive I need a Linux environment so I can just run stuff without having to fuck around configuring it for Windows and whatnot.
Maybe but curious as to what you mean by developer focused software.
> WSL2 is kind of bad compared to a proper Linux environment, you'll encounter so many tiny cuts
I used WSL2 for years and never had any issues with it.
I've also used WSL2, and I keep hitting sharp edges. The last issue involved not being able to properly use the GPU through Docker running in WSL2 on an Arch installation, and Windows "helpfully" aliasing .exe's available on the host OS into the guest, confusing a lot of the stack.
A year ago or so, changing the size of the WSL disk via some Powershell command also corrupted the disk after it was done; I had to start from scratch, which was a bit annoying.
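For the .exe-aliasing problem specifically, WSL does let you switch the interop features off per distro; a sketch, assuming you can live without calling Windows binaries from inside the guest:

    # /etc/wsl.conf inside the distro; apply with: wsl --shutdown, then relaunch
    [interop]
    enabled = false
    appendWindowsPath = false

With appendWindowsPath off, the host's .exe files stop showing up on the guest's $PATH.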
At least 90% of our internal administration, and my boss, mostly for Excel and Exchange. But it is quite present in offices at public institutions too.
Mostly people who don't have any choice.
Of course there is, lots of software doesn't work on Linux. I've switched to using Linux on my personal laptop, but I can't recommend it to anyone else in my small company (we're not a tech company, if I did get anyone else running it that would immediately make me their tech support, and I'm not doing that), and I can't use it on any of our branch machines as the POS software only runs on Windows or Android (might switch to touchscreen Android kiosks in the future though).
Absolutely. MacOS treats the user like a child (Windows isn’t much better, but it is better), and Linux doesn’t have all the domain integration features and breadth of software that Windows brings to the table. I’m extremely comfortable on Linux at home, but I use Windows at work.
> MacOS treats the user like a child
Sorry, but this is just nonsense. I used Linux on the desktop from 1994 to 2007 (lucky enough to start using Linux when I was 12). I have used macOS as my main desktop since 2007, but I also have a ThinkPad with NixOS and a Linux workstation (currently used headless though). I feel like I'm able to make a fair comparison.
macOS for me feels like a much more mature version of the Linux desktop, on more mature hardware, with applications that are not available on Linux. macOS rarely gets in my way. The primary reason I also use Linux (and contribute in various ways) is that I find FLOSS morally preferable and hope for that reason that the Linux desktop wins in the long term.
Though having been around for many "Linux is soon ready for the desktop" moments since around 2000 (anyone remember Corel Linux?), I don't hold high hopes. As long as basic things like waking from sleep result in a kernel panic because waking your Thunderbolt display puts the kernel in a weird state (plus the 999 other paper cuts), Linux on the desktop won't happen.
Yes. Audio and video. Tried Linux several times. Nope, not yet.