"More powerful than Linux" is silly. It's a VM. The most useful thing is that it does a bunch of convenience features for you. I am not suggesting that it is not extremely convenient, but it's not somehow more powerful than just using Linux.
You know what's even more convenient than a VM? Not needing a VM and still having the exact same functionality. And you don't need a bunch of janky wrapper scripts; there's more than one tool that gives you essentially the same thing. I have used both Distrobox and toolbx to quickly drop into an Ubuntu or Fedora shell. It's pretty handy on NixOS if I want to test building some software in a more typical Linux environment. As a bonus, you get working hardware acceleration, graphical applications work out of the box, there is no I/O tax for going over a 9p bridge because there is no 9p bridge, and there are no weird memory-balloon issues to deal with because there is no VM and there is no guest kernel.
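For anyone who hasn't tried it, the Distrobox flow is roughly this (image name is illustrative, and the sketch is guarded so it's a no-op on machines without distrobox installed):

```shell
# Create a throwaway Ubuntu container that shares your $HOME, then run a
# command inside it. Interactively you'd just `distrobox enter ubuntu-test`
# to get a full Ubuntu shell.
if command -v distrobox >/dev/null 2>&1; then
    distrobox create --name ubuntu-test --image ubuntu:24.04 \
      && distrobox enter ubuntu-test -- cat /etc/os-release \
      && status="ran"
    status="${status:-failed}"
else
    status="distrobox not installed; skipping"
fi
echo "$status"
```

Because the container shares your home directory and talks to the host's kernel directly, there's no filesystem bridge or guest kernel involved, which is the whole point being made above.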
I get that WSL is revolutionary for Windows users, but I'm sorry, the reason why there's no WSL is because on Linux we don't need to use VMs to use Linux. It's that simple...
Yeah, if you are working with Linux only, it's better to go full Linux.
WSL2 is really handy when you want to run other software, though. For example, I use Solidworks, so I need to run Windows. FORScan for Ford vehicles also has to run under Windows. Having WSL2 means that I can just have one laptop and run any software that I want.
My development is mainly Windows, and I prefer a Linux host with Windows VM guests. The experience is more stable, and I can revert to a snapshot when a Windows or Microsoft product update breaks something, or a new test configuration does. It also lets me back up and retain multiple QA environments that are rarely used, like a client's Oracle DB. It is nice being able to save the VM state at the end of the week and shut it all down so you can start the next one right where you left off. You cannot do that when your development environment is the bare-metal OS. Windows also has known issues with waking a sleeping laptop.
I too think a Linux host with Windows VM guests would definitely be more stable, but I can see the other way around being more convenient for getting commercial support. Though with the VMware licensing changes, I think what is easier to get commercial support for by default may be changing too.
I'm on a Lenovo Yoga 6, Gentoo, 6.12 kernel, Xfce 4.20. Sleep works perfectly. Same on my Asus+AMD desktop. I've not had sleep-related issues for years. And last time I did, it was an out-of-tree Wifi driver causing the whole mess.
I discovered over the weekend that only 1 monitor works over HDMI, DisplayPort not working, tried different drivers.
Suspend takes a good 5 minutes, and on resume, the UI is either torn or things barely display.
I might buy a Windows license, especially if I can't get multi-screen to work.
This has been a pain point for us and our development process… not all versions of Nvidia drivers are the same… even released ones. You have to find a "good" version and keep to it, and then selectively upgrade… at least this has been the case for the last 5 years; folks, shout out if you have had different experiences.
Side note: our main use case is using cuda for image processing.
"Works on my machine!" is stupid when it comes to software running under an OS, because a userland program that is correct shouldn't work any differently from box to box. (Exceptions you already know notwithstanding.) It is very different when it comes to an operating system.
I know people here hate this, but if you want a good Linux experience, you need to start by picking the right hardware. Hardware support is far and away the number one issue with having a good Linux experience these days. It's, unfortunately, very possible to set out to pick good hardware and still get burnt for various reasons, like people misrepresenting how well a given device works, or very similar SKUs having vastly different hardware/support.

Still, I'm not saying you have to buy something from a vendor like System76 that specifically caters to Linux. You could also choose a machine that just happens to have good Linux support by happenstance, or a vendor that explicitly supports Linux as an option.

I'm running a Framework Laptop 16 and it works just fine, no sleep issues. As far as I know, the sole errata that exists for this laptop is that Panel Self Refresh is broken in the AMDGPU driver. It sorta works, but it's a bit buggy, causing occasional screen artifacts. NixOS with nixos-hardware disables it for me using the kernel cmdline argument amdgpu.dcdebugmask=0x10. That's about it. The fingerprint reader is a little fidgety, and Linux could do a better job at laptop audio out of the box, but generally speaking the hardware works day in and day out. It's not held together with duct tape.
I don't usually bother checking to see if a given motherboard will work under Linux before buying it, since desktop motherboards tend to be much better about actually running Linux well. For laptops, Arch wiki often has useful information for a given laptop. For example, here's the Arch wiki regarding the Framework 16:
It's fair to blame Linux for the faults it actually has, which are definitely numerous. But let's be fair here, if you just pick a given random device, there is a good chance it will have some issues.
I recall having a sleep issue with Linux 15 years ago; I think it's been fixed long ago. Except maybe on some very new hardware, or if you install the wrong Linux on an M-series Mac, you could have issues with sleep.
The less coupled software is to hardware, the less likely it is tested in that hardware and the higher likelihood of bugs. Linux can run fine but arbitrary Linux distros may not. This is not the fault of hardware makers.
> The less coupled software is to hardware, the less likely it is tested in that hardware and the higher likelihood of bugs.
Yes, exactly! There are whole teams inside Dell etc. dealing with this. The term is "system integration." If you're doing this on your own, without support or chip info, you are going to (potentially) have a very, very bad time.
> This is not the fault of hardware makers.
It is if they ship Linux on their hardware.
This is why you have to buy a computer that was built for Linux, that ships with Linux, and with support that you can call.
Hardware support is more than just kernel support. Additionally, not every kernel release works well for every piece of hardware. Each distro is unique and ensuring the correct software is used together to support the hardware can be difficult when you are not involved in the distro. This is why vertical integration between the distro and hardware leads to higher quality.
ChromeOS, where sleep presumably worked, is also Linux. You just exchanged a working Linux for a distro with more bugs. The fact that you're able to do that is pretty cool.
That's not to detract from the larger point here though. It's pretty funny that all of the replies in this thread identify different causes and suggest different fixes for the same symptom. Matches my experience learning Linux very well.
Can you share more details of how you make that work well? What hypervisor, what backup/replication, for instance? I can only imagine that being a world of irritation.
It's been a few years since I used it, but Virtualbox (free) had perfectly good suspend/restore functionality, and the suspended VM state was just a file.
In the same spirit of "it depends", there are other options that may work for people with different Linux/Windows balance points:
* Wine is surprisingly good these days for a lot of software. If you only have an app or two that need Windows it is probably worth trying Wine to see if it meets your needs.
* Similarly, if gaming is your thing Valve has made enormous strides in getting the majority of games to work flawlessly on Linux.
* If neither of the above are good enough, dual booting is nearly painless these days, with easy setup and fast boot times across both OSes. I have grub set to boot Linux by default but give me a few seconds to pick Windows instead if I need to do one of the few things that I actually use Windows for.
Which you go for really depends on your ratio of Linux to Windows usage and whether you regularly need to mix the two.
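The grub default-plus-timeout setup described above is just a couple of lines in /etc/default/grub. A sketch (the entry index and timeout value are illustrative; adjust for your own menu):

```shell
# /etc/default/grub -- boot Linux by default, but leave a short window
# to pick Windows from the menu instead
GRUB_DEFAULT=0          # first menu entry (the default Linux install)
GRUB_TIMEOUT=5          # seconds to interrupt and choose Windows
GRUB_TIMEOUT_STYLE=menu # show the menu rather than hiding it

# after editing, regenerate the config, e.g.:
#   sudo grub-mkconfig -o /boot/grub/grub.cfg
```

With os-prober enabled, grub-mkconfig picks up the Windows bootloader automatically and adds it as a menu entry.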
I'm struggling to find an option for running x86 Windows software on MacOS/Apple Silicon performantly. (LiDAR point cloud processing.)
The possibilities seem endless and kind of confusing, with Windows on ARM vs Rosetta and Wine; I think there are some other options which use MacOS's included virtualization frameworks.
(Edit: just so you know, the UI is a bit weird, there is a bit of a learning curve. But the app behaves in a very sane manner, with every step the previous state is maintained and a new node is created. It takes time to get used to it, but you'll learn to appreciate it.
May your cloud have oriented normals, and your samples be uniformly distributed. Godspeed!)
Have you tried to install Windows 11 ARM under UTM on Mac? UTM is a kind of open source Parallels. Then you'll run x86 software using Windows' variant of Rosetta. Probably slower than Rosetta but perhaps good enough.
I wanted to play around with Windows 11 for a while now. It boots in UTM just to the degree that I can confirm my suspicions that Windows 11 sucks compared to Windows 10, but is not otherwise usable. (MacBook Air M3, slightly outdated macOS)
The thing about WINE is that it's not necessarily solid enough to rely on at work. You never know when the next software upgrade will break something that used to work.
That's always true, of course. But, compared to other options, relying on WINE increases the chances of it happening by an amount that someone could be forgiven for thinking isn't acceptable.
In my mind, I almost feel like the opposite is true. Wine is getting better and better, especially with the amount of resources that Valve is putting into it.
If you want a stable, repeatable way to wrangle a Windows tool: Wine is it. It's easy to deploy and repeat, requires no licenses, and has consistent behavior every time (unless you upgrade your Wine version or something). Great integration with Linux. No Windows Updates are going to come in and wreck your systems. No licensing, no IT issues, no active directory requirements, no forced reboots.
You can fix this issue by using a Wine "bottle manager" like... Bottles. This allows you to easily manage multiple instances of Wine installations (like having multiple Windows installations) with better, easier-to-use tooling around it. More importantly, it also allows you to select across many system-agnostic versions of Wine that won't be upgraded automatically, thus reducing the possibility of something you rely on breaking.
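Under the hood, what Bottles manages is just per-app Wine prefixes plus pinned Wine builds. The bare mechanism looks like this (app name and path are made up):

```shell
# A Wine prefix is just a directory; keeping one per app is what stops
# upgrading one tool from breaking another.
export WINEPREFIX="$HOME/.local/share/wineprefixes/oldtool"
mkdir -p "$WINEPREFIX"
echo "prefix at: $WINEPREFIX"

# Everything for this app then runs inside that prefix:
#   wine OldTool-setup.exe
#   wine "$WINEPREFIX/drive_c/Program Files/OldTool/OldTool.exe"
```

Bottles layers a GUI, per-bottle Wine runner versions, and dependency installers on top of exactly this.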
I used to a long time ago, but even back then I was getting more value out of q4wine (a defunct project now) than from CodeWeavers' stuff. Granted, I was perhaps too "enthusiast", using git versions of Wine with staging patches and my own patches rolled in, so q4wine's (and I guess now Bottles') more DIY approach won me over.
That all said, I haven't tried CodeWeavers in almost 10 years so it might have improved a lot.
Wine is fantastic, but it is fantastic in the sense of being an amazing piece of technology. It's really lacking bits that would make it a great product.
It's possible to see what Wine as a great product would look like. No offense to crossover because they do good work, but Valve's Steam Play shows what you can really do with Wine if you focus on delivering a product using Wine.
Steam offers two main things:
- It pins the version of Wine, providing a unified stable runtime. Apps don't just break with Wine updates, they're tested with specific Proton versions. You can manually override this and 9 times out of 10 it's totally fine. Often times it's better. But, if you want it to work 10 out of 10 times, you have to do what Valve does here.
- It manages the wineserver (the lifecycle of the running Wine instance) and wine prefix for you.
The latter is an interesting bit to me. I think desktop environments should in fact integrate with Wine. I think they should show a tray icon or something when a Wineserver is running and offer options like killing the wineserver or spawning task manager. (I actually experimented with a standalone program to do this.[1]) Wine processes should show up nested under a wineserver in system process views, with an option to go to the wineprefix, and there should be graphical tools to manage wine prefixes.
To be fair, some of that has existed forever in some forms, but it never really felt that great. I think to feel good, it needs to feel like it's all a part of the desktop system, like Wine can really integrate into GNOME and KDE as a first-class thing. Really it'd be nice if Wine could optionally expose a D-Bus interface to make it so that desktop environments could nicely integrate with it without needing to do very nasty things, but Wine really likes to just be as C/POSIX/XDG as possible so I have no idea if something like that would have a snowball's chance in hell of working either on the Wine or desktop environment side.
Still, it bums me out a bit.
One pet peeve of mine regarding using Wine on Linux is that EXE icons didn't work out of the box on Dolphin in NixOS; I found that the old EXE thumb creator in kio-extras was a bit gnarly and involved shelling out to an old weird C program that wasn't all that fast and parsing the command line output. NixOS was missing the runtime dependency, but I decided it'd be better to just write a new EXE parser to extract the icon, and thankfully KDE accepted this approach, so now KDE has its own PE/NE parser. Thumb creators are not sandboxed on KDE yet, so enable it at your own risk; it should be disabled by default but available if you have kio-extras installed. (Sidenote: I don't know anything about icons in OS/2 LX executables, but I think it'd be cool to make those work, too.) The next pet peeve I had is that over network shares, most EXE files I had wouldn't get icons... It's because of the file size limit for remote thumbnails. If you bump the limit up really high, you'll get EXE thumbnails, but at the cost of downloading every single EXE, every single time you browse a remote folder. Yes, no caching, due to another bug. The next KDE frameworks version fixes most of this: other people sorted out multiple PreviewJob issues with caching on remote files, and I finally merged an MR that makes KIO use kio-fuse when available to spawn thumb creators instead of always copying to a temporary file. With these improvements combined, not just EXE thumbnails, but also video thumbnails work great on remote shares provided you have kio-fuse running. There's still no mechanism to bypass the file size limit even if both the thumbcreator and kio-fuse remote can handle reading only a small portion of the file, but maybe some day. (This would require more work. Some kio slaves, like for example the mpt one, could support partially reading files but don't because it's complicated. Others can't but there's no way for a kio-fuse client to know that. 
Meanwhile thumb creators may sometimes be able to produce a thumbnail without reading most of the file and sometimes not, so it feels like you would need a way to bail out if it turns out you need to read a lot of data. Complicated...)
I could've left most of that detail out, but I want to keep the giant textwall. To me this little bit of polish actually matters. If you browse an SMB share on Linux you should see icons for the EXE files just like on Windows, without any need to configure anything. If you don't have that, then right from the very first double-click the first experience is a bad one. That sucks.
Linux has thousands of these papercuts everywhere and easily hundreds for Wine alone. They seem small, but when you try to fix them it's not actually that easy; you can make a quick hack, but what if we want to do things right, and make a robust integration? Not as easy. But if you don't do that work, you get where we're at today, where users just expect and somewhat tolerate mediocre user experience. I think we can do better, but it takes a lot more people doing some ultimately very boring groundwork. And the payoff is not something that feels amazing, it's the opposite: it's something boring, where the user never really has any hesitation because they already know it will work and never even think about the idea that it might not. Once you can get users into that mode you know you've done something right.
Thanks for coming to my TED talk. Next time you have a minor pet peeve on Linux, please try to file a bug. The maintainers may not care, and maybe there won't be anyone to work on it, and maybe it would be hard to coordinate a fix across multiple projects. But honestly, I think a huge component of the problem is literally complacency. Most of us Linux users have dealt with desktop Linux forever and don't even register the workarounds we do (any more than Windows or Mac users do, although they probably have a lot fewer of them). To get to a better state, we've gotta confront those workarounds and attack them at the source.
If you (or whoever is reading this) want(s) a more refined Wine, I highly recommend CodeWeavers. Their work gets folded back into open source WINE, no less.
> To get to a better state, we've gotta confront those workarounds and attack them at the source.
To my eye, the biggest problem with Linux is that so few are willing to pony up for its support. From hardware to software.
Buy Linux computers and donate to the projects you use!
That's true, but even when money is donated, it needs to be directed somewhere. And one big problem, IMO, is that polish and UX issues are not usually the highest priority to sort out; many would rather focus on higher impact. That's all well and good and there's plenty of high impact work that needs to be done (we need more funding on accessibility, for example.) But if there's always bigger fires to put out, it's going to be rather hard to ever find time to do anything about the random smaller issues. I think the best thing anyone can do about the smaller issues is having more individual people reporting and working on them.
If you're at work, it's probably a Windows shop. Use Windows. At home you can chance a bad update, and probably also have access to Windows. You can always use a VM; Wine is great in some cases, as is WSL, but neither meets every use case.
why bring wine into a vm discussion? just run windows in a vm too. problem solved without entering the whining about wine not being better than windows itself
I work in embedded systems. In that space, it's pretty common to need some vendor-provided tool that's Windows-only. I often need to automate that tool, maybe as part of a CI/CD pipeline or something.
If I were to do it with a Windows VM, I'd need to:
1. Create the VM image and figure out how to build/deploy it.
2. Sort out the Windows licensing concerns.
3. Figure out how to launch my tool (maybe put an SSH server into the VM).
4. Figure out how to share the filesystem (maybe rsync-on-SSH? Or an SMB fileshare?).
If I do it with Wine instead, all I need to do is:
1. Install some pinned version of Wine.
2. Install my tool into Wine.
3. Run it directly.
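For comparison, the whole Wine-based pipeline step can be a few lines of shell. The tool path and flags here are hypothetical, and the sketch is guarded so it exits cleanly on runners without Wine:

```shell
# CI sketch: run a (hypothetical) vendor CLI tool under a pinned Wine
# and capture its exit status.
export WINEPREFIX="$PWD/.wineprefix"   # per-pipeline prefix; the tool from step 2 lives here
export WINEDEBUG=-all                  # keep Wine's own logging out of CI output

if command -v wine >/dev/null 2>&1; then
    wine ./vendor/flasher.exe --input build/fw.bin --batch
    tool_status=$?
else
    echo "wine unavailable on this runner; skipping"
    tool_status=0
fi
echo "tool exit status: $tool_status"
```

No VM image to build, no Windows license to track, and the filesystem is just the filesystem.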
> For example, I use Solidworks, so I need to run windows.
Right. One of the things a lot of people don't get is the extent to which multidisciplinary workflows require Windows. This is particularly true of web-centric software engineers who simply do not have any exposure to the rest of the engineering universe.
Years ago this was the reason we had to drop using Raspberry Pi's little embedded microcontroller. The company is Linux-centric to such an extent that they simply could not comprehend how telling someone "Just switch to Linux" is in a range between impossible and nonsensical. They were, effectively, asking people to upend their PLM process just for the sake of using a little $0.50 part. You would have to do things like store entire OS images and configurations just to be able to reconstruct and maintain a design iteration from a few years ago.
WSL2 is pretty good. We still haven't fully integrated this into PLM workflows though. That said, what we've done on our machines was to install a separate SSD for WSL2. With that in place, backing-up and maintaining Linux distributions or distributions created in support of a project is much, much easier. This, effectively, in some ways, isolates WSL2 distributions from Windows. I can clone that drive and move it from a Windows 10 machine to a Windows 11 machine and life is good.
For AI workflows with NVIDIA GPUs, WSL2 is less than ideal. I don't know if things have changed in this domain since I last looked. Our conclusion from a while back was that if you have to do AI with the usual toolchains, you need to be on a machine running Linux natively rather than a VM running under Windows. It would be fantastic if this changed and one could run AI workflows on WSL2 without CUDA and other issues. Like I said, I have not checked in probably a year; maybe things are better now?
EDIT: The other reality is that one can have a nice powerful Linux machine next to the Windows box and simply SSH into it to work. Most good IDE's these days support remote development as well. If you are doing something serious, this is probably the best setup. This is what we do.
I'm sure with enough tinkering I could get Solidworks to run. The thing is, I don't want to spend time tinkering, I want to spend time doing. WSL2 gives me the optimal solution for all of that + dev.
I really want to like Windows 11, and I enjoy using WSL, but Microsoft treats me too much like an adversary for me to tolerate it as a daily driver. Only a complete scumbag of a product manager would think pushing Candy Crush ads is a good idea.
I’ve got an airgapped Toughbook that I use for the few Windows apps I really need to talk to strange hardware.
You don't need LTSC, you just need Windows Pro versions.
Lots of people bitch and moan about Windows problems that only exist because they buy the cheaper "Home" or whatever license and complain that Microsoft made different product decisions for average users than for people who have bought the explicitly labeled "power user" version.
Remember, the average computer user IS a hostile entity to Microsoft. They will delete System32 and then cry that Windows is so bad! They will turn off all antivirus software and bitch about Windows being insecure. They refuse to update and then get pwned and complain. They blame Microsoft for all the BSODs that were caused by Nvidia's drivers during the Vista era. They will follow a step by step procedure in some random forum from ten years ago that tells them to turn off their entire swap file despite running with lots of RAM and spinning rust and then bitch that Windows is slow.
Don't expect Microsoft to not deal with morons using their software. Buy the Pro versions if you don't want the version meant for morons.
I shouldn’t need to spend this much time and energy turning off AI rubbish, bypassing cloud features, or knobbling telemetry and ads because some shitbag at Microsoft decided this was a good way of getting a promotion.
My computer is supposed to work for me, not the other way around.
My coworkers stubbornly try to use WSL instead of Linux directly. They constantly run into corner cases and waste time working around them compared to just using Linux. Some tooling detects that it is running on Windows, and some detects that it is running on Linux. In practice, it's the worst of both worlds.
What might your workload be? The only things that aren't working on Linux on day 1 are GPUs, and that's mostly because of kernel/distro timing (we haven't had a GPU release without mainline kernel support in years).
I am into small and portable, decently powerful, high-DPI laptops (battery be damned), ideally with touch support. And this category just gets no love in the Linux world.
I was holding out hope for the Framework 12", but they cheaped out on the screen to target the student market, with no upgrade option at this point.
Or a way worse touchpad experience. No swiping gestures. No smooth scrolling. Fn keys not working. Or any of a million other issues. I have never been able to install Linux on a laptop and get things to work within a weekend. And then I revert, because I need my computer.
If you're thinking of Apple… as a former Apple owner and current ThinkPad owner… the build quality of Apple is severely overrated. Please come back with comments that are not just shilling.
That was kind of my point: we're still at a stage where checking a list of supported laptops and vendors is pretty much mandatory.
This is totally laptop vendors' fault, but that doesn't change the fact of the matter.
PS: it would be fine if there were a few good options in all categories. Right now I see nothing comparable to an Asus Z13 but with first-class Linux support, for instance.
What modern hardware isn't supported by Linux? I haven't had driver problems in probably over a decade. I don't even target Linux for my builds, it just works. Same with the pile of random laptops I've installed it on. Wifi out of the box etc.
Fingerprint sensors and IR login cameras that are pre-installed on many laptops, and have Windows-only drivers.
As an end-user (yes, I'm an engineer too, but from the perspective of the OS and driver developers I am an end-user) I don't care who is in charge of getting the device to work on an OS—I only care that it works or not. And these devices don't, on Linux. So, they are broken.
Why would your primary work device be running an OS not supported by the device vendor? That's just bizarre.
I use Linux as my primary OS, and while Proton/Steam are pretty good now I'm still rebooting into (unactivated) Windows for some games. It's fine. It's also the only thing I use Windows for.
On an unrelated note, I'm frankly confused about who wants Apple's janky OS, because I've been forced to use it for work and it is very annoying.
Yesterday, they tried to get a Python library that built a native library using Meson to work. They were working under WSL, but somehow, Meson was attempting to use the MSVC toolchain and failing.
And they were using pip/uv or whatever from Linux, the Linux version.
One of the most common issues is calling a Windows executable from within WSL… it's a "convenience" feature that takes about 2 seconds to disable in the WSL config but causes these kinds of weird bugs.
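For anyone who wants the concrete setting: it lives in /etc/wsl.conf inside the distro, followed by a `wsl --shutdown` from the Windows side for it to take effect.

```ini
# /etc/wsl.conf -- stop WSL from mixing Windows executables into the
# Linux environment, which is what lets build tools find MSVC by accident
[interop]
enabled = false            # don't run Windows .exes from inside WSL
appendWindowsPath = false  # keep Windows directories off $PATH
```

With `appendWindowsPath = false` alone, Windows toolchains stop shadowing Linux ones on $PATH; `enabled = false` goes further and blocks launching Windows binaries entirely.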
For me, the best part of running Linux as the base OS is not having to deal with Windows.
No ridiculous start menu spam; a sane, non-bloated operating system (imagine being able to update user space libraries without a reboot, due to being able to delete files that other processes still have opened!); being able to back up my data at the file level without relying on weird block-level imaging shenanigans and so much more.
How is inverting the host/guest relationship an improvement on that?
> For me, the best part of running Linux as the base OS is not having to deal with Windows.
This is correct, but let's not pretend that linux is perfect. 99% of linux _for me_ is my terminal environment. WSL delivers on that _for me_.
I don't see any start menu spam because I rarely use it, when I do I type what I'm looking for before my eyes even move to look at that start menu.
oh, I can play destiny 2 and other games without shenanigans. Also don't need to figure out why Slack wants to open links in chromium, but discord in firefox (I have to deal with edge asking to be a default browser, but IMO it's less annoying).
Oh and multi-monitor with multiple DPI values works out of the box without looking up how to handle it in one of the frameworks this app uses.
> when I do I type what I'm looking for before my eyes even move to look at that start menu.
That's a /s, right? When I start typing immediately after the windows button, the initial letters are lost, the results are bad either way, and most turn into just web suggestions rather than things named exactly like the input.
> That's a /s, right? When I start typing immediately after the windows button, the initial letters are lost, the results are bad either way, and most turn into just web suggestions rather than things named exactly like the input.
No, I rarely have issues with search in start menu.
> imagine being able to update user space libraries without a reboot
That's... a very weird criticism to level at Windows, considering that the advice I've seen for Linux is to reboot if you update glibc (which is very much a user space library).
Why? It directly results in almost every Windows update requiring a reboot to apply, compared to usually only an application restart or at most desktop logout/login on Linux.
Having to constantly reboot my computer, or risk missing important security patches, was very annoying to me on Windows.
I've never had to reboot after updating glibc in years of using Linux, as far as I can remember.
Running programs will continue to use the libc version that was on disk when they started. They won't even know glibc was upgraded. If something is broken before rebooting, it'll stay broken after.
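A quick way to see this on any Linux box, using tail as a stand-in for a long-running program and a text file as the "library":

```shell
# A running process keeps using the file it opened even after that file
# is replaced on disk -- the same mechanism as a glibc upgrade.
echo "old version" > lib.txt
tail -f lib.txt > /dev/null &     # "running program" holding the file open
pid=$!
sleep 0.2                         # give tail a moment to open the file

rm lib.txt                        # "package manager" unlinks the old file
echo "new version" > lib.txt      # ...and installs a new one at the same path

# The old inode is still open in the running process:
deleted=$(ls -l /proc/$pid/fd 2>/dev/null | grep -c deleted)
echo "open-but-deleted files held by tail: $deleted"
kill $pid
```

The process keeps reading the old, now-unlinked inode until it exits; only newly started processes see the new file.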
This is not true. Different programs on the same system that interoperate and use different versions of the same shared library can absolutely cause issues.
For a trivial change to glibc, it won't cause issues. But there's a lot of shared libraries and lots of different kinds of changes in different kinds of libraries that can happen.
I still haven't nailed down whether it was due to a shared library update, but just the other day, after running upgrades, I was unable to su or sudo / authenticate as a user until after rebooting.
It does happen, but it's pretty rare compared to Windows in my experience, where inconvenience is essentially guaranteed.
Firefox on Linux did not really enjoy being updated while running, as far as I remember; Chrome was fine with it, but only since it does some extra work to bypass the problem via its "zygote process": https://chromium.googlesource.com/chromium/src/+/main/docs/l...
I responded "This is not true" to a sibling comment about this same topic, but about "shared libraries", which is the opposite problem (multiple programs could load the same shared library and try to interact).
This is absolutely not true for Linux kernel updating. While you won't be using the new kernel before rebooting, there's 0 risk in not rebooting, because there's exactly 1 version of the kernel running on the machine -- it's loaded into memory when your computer starts.
There's of course rare exceptions, like when a dynamically linked library you just installed depends on a minimum specific version of the Linux kernel you also just installed, but this is extremely rare in Linux land, as backwards compatibility of programs with older kernels is generally a given. "We do not break userspace"
One problem with not rebooting after a kernel update is drivers. They aren't all built in.
Most distros leave the current running kernel and boot into the new one next time.
Some, like Arch, overwrite the kernel on an update, so modules can’t be loaded. It is a shock the first time you plug in a USB drive and nothing happens.
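You can check whether you're in that state before plugging anything in; a tiny sketch:

```shell
# If the module directory for the *running* kernel is gone, the upgrade
# replaced it and module loading (USB drivers, etc.) will fail until reboot.
running="$(uname -r)"
if [ -d "/lib/modules/$running" ]; then
    state="present"
else
    state="missing"
fi
echo "modules for running kernel $running: $state"
```

On distros that keep the old kernel around, `state` stays "present" after an upgrade; on Arch-style in-place upgrades it flips to "missing" until you reboot into the new kernel.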
Windows at its core just does not seem like a serious operating system to me. Whenever there are two ways to do something, its developers seem to have picked the non-reasonable one compared to Unix – and doing that for decades adds up.
But yes, first impressions undoubtedly matter too.
I have no idea what Windows does with the various network services but my Pi-Hole gets rate-limited* when it connects to the network--there's just constant DNS lookups to countless MS domains, far beyond what could reasonably be expected for a barebones install.
This isn't even a corpo-sloptop with Qualys and Zscaler and crap running, just a basic Windows box I rarely boot. It's deeply offensive to me.
When you compare things at the API level, NT is generally superior to POSIX - just look at what a mess fork() is, for one example, or fd reuse, or async I/O.
It is not the standard in Windows land to run processes by handing them fifty commandline arguments. Simple as that. Win32 apps have strong support for selecting multiple files to pass to the app from within the file select dialog, as long as you follow the documentation.
It's like complaining that Unix is hard to use because I can't just drop a dll into a folder to hook functionality like I can on Windows. It's a radically different design following different ideologies, and you can't magically expect everything to transfer over perfectly. If you want to do that in Linux land, you learn about LD_PRELOAD or hook system calls.
If you want to build powerful, interoperable modules that can pipe into each other and compose on the command line, PowerShell has existed since 2006. IMO, passing well-formed objects from module to module is RADICALLY better than passing around text strings that you have to parse or mangle or fuck with if you want actual composability. PowerShell's equivalent of ls doesn't have to go looking at whether it is being called by an actual terminal or by an app pipe, for example, in order to support weird quirks. PowerShell's support for Windows internals and functionality is also just radically better than mucking around in "everything is a file" pseudo folders that are a hacky way to represent important parts of the operating system, or calling ioctls.
I also think the way Windows OS handles scheduled tasks and operations is better than cron.
I also think Windows Event logging is better than something like dmesg, but that's preference.
Also, EVERYTHING in Windows land is designed around remote administration. Both the scheduled tasks and Event Logging systems are transparently and magically functional from other machines if you have your AD set up right. Is there anything in Linux land like AD?
> Win32 apps have strong support for selecting multiple files to pass to the app from within the file select dialog
The problem is when you want to click a file on your file manager and you want it to open in the associated application. Because the file manager can only hope the associated application parses the escapes the same way it generates them. Otherwise it's file not found :)
I'm not going to bother to reply point by point since you completely missed the point in the first few words.
So, basically yesterday, and not default like how it is with execve, and you can never know if the command you're trying to call implements it the same way or does a different escaping.
Care to explain how fork "breaks" threaded apps? You can't mix them for doing multiprocessing, but it's fine if you use one model or the other.
Win10 has been around for literally a decade now. So much so that it's going out of support.
fork() breaks threaded apps by forking the state of all threads, including any locks (such as e.g. the global heap lock!) that any given thread might hold at that moment. In practice this means that you have to choose either fork or threads for your process. And this extends to libraries - if the library that you need happens to spawn a background thread for any reason, no more fork for you. On macOS this means that many system APIs are unusable. Nor is any of this hypothetical - it's a footgun that people run into regularly (just google for "fork deadlock") even in higher level languages such as Python.
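The failure mode is easy to reproduce. Here's a minimal Python sketch of the copied-lock problem (POSIX only; the function name is mine): a background thread holds a lock at the moment of fork, so the child inherits a lock that no thread in the child will ever release.

```python
import os
import threading
import time

def forked_child_can_lock(hold_seconds=2.0, timeout=0.5):
    """Fork while a background thread holds a lock, then report whether
    the child can ever acquire the copied (still-held) lock."""
    lock = threading.Lock()
    held = threading.Event()

    def hold():
        # Stand-in for e.g. an allocator or logging lock inside a library.
        with lock:
            held.set()
            time.sleep(hold_seconds)

    t = threading.Thread(target=hold)
    t.start()
    held.wait()  # make sure the lock is held at fork time

    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child: only the forking thread survives, but the lock was
        # copied in its held state -- nobody here will ever release it.
        got = lock.acquire(timeout=timeout)
        os.write(w, b"1" if got else b"0")
        os._exit(0)
    os.close(w)
    result = os.read(r, 1)
    os.close(r)
    os.waitpid(pid, 0)
    t.join()
    return result == b"1"
```

Calling `forked_child_can_lock()` returns False: without the timeout, the child would hang forever, which is exactly the deadlock described above.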
How long has fork() existed? Is it less than 10 years? Is it much, much more?
> just google for "fork deadlock"
I did, results were completely unrelated to what you're talking about.
Anyway libraries spawning hidden threads… I bet they don't even bother to use reentrant functions? I mean… ok they are written by clueless developers. There's lots and lots of them, they exist on windows too. What's your point?
I have used Windows for years, and I loved it. I never understood why Linux and Mac users kept bashing on it. I just didn't know any better.
These days I'm avoiding booting into Windows unless I really have no choice. The ridiculousness of it is simply limitless. I would open a folder with a bunch of files in it and Explorer shows me a progress bar for nearly a minute. Why? What the heck is it doing? I just want to see the list of files, I'm not even doing anything crazy. Why the heck does not a single other file navigator do that — not on Linux, not on Mac, darn — even the specialized apps built for Windows work fine, but the built-in thing just doesn't. What gives? I would close the window and re-open the exact same folder not even three minutes later, and it shows the progress bar again. "WTF? Can't you fucker just cache it? Da fuk you doing?"
Or I would install an app. And seconds after installing it I would try to search for it in the Start menu, and guess what? Windows instead opens Edge and searches the web for it. wat? Why the heck I can't remove that Edge BS once and for all? Nope, not really possible. wat?
Or like why can't I ever rebind Win+L? I can disable it, but can't rebind it; there's just no way. Is it trying to operate my computer, or does the 'S' in 'OS' stand for "soul"?
Or for whatever reason it can't even get the time right. Every single time I boot into it, my clock time is wrong. I have to manually re-sync it. It just doesn't do it, even with the location enabled. Stupid ass bitch.
And don't even let me rant about those pesky updates.
I dunno, I just cannot not hate Windows anymore. Even when I need to boot in it "for just a few minutes", it always ends up taking more time for some absolute fiddlesticks made of bullcrap. Screw Windows! Especially the 11 one.
> Or for whatever reason it can't even get the time right. Every single time I boot into it, my clock time is wrong.
Dual booting will do that, because Linux and Windows treat the system clock differently. From what I recall, one of them sets it directly to the local time and the other always sets it to UTC and then applies the offset.
The most reliable fix is to get Windows to use UTC for the hardware clock, which is usually the default on Linux. (It's more reliable because it means the hardware clock doesn't need to be adjusted when DST begins or ends, so there's no need for the OSs to cooperate on that.)
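For reference, the usual way to flip that is the RealTimeIsUniversal registry value, widely circulated though reportedly flaky on some Windows builds, e.g. as a .reg file:

```ini
Windows Registry Editor Version 5.00

; Tell Windows the hardware clock is UTC, matching the Linux default.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation]
"RealTimeIsUniversal"=dword:00000001
```

The Linux side of the same fix is `timedatectl set-local-rtc 0`, which is usually already the default.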
That flag has been broken for at least several Windows versions, unfortunately. A shame, given that that's the only sane way of using the RTC in the presence of DST or time zone shifts...
That's exactly the type of Windows-ism I'm talking about. Two options (use UTC or the local time), and Windows chose to pick the nonsensical one.
Yeah, well, I use NTFS in Linux. It somehow knows how to treat the partitions, even though it can't fix the issues when they arise (which almost never happens) — there's no chkdsk for Linux. So I just don't understand why Windows can't automatically sync the clock (as it's explicitly set to do) when it boots. Why does one have to get creative to fix the darn clock? If I can't even trust the OS to manage the time correctly, what can I trust it with, if anything at all?
I loved Windows XP and Windows 7. They were a bit brittle regarding malware, but I was using a lot of pirated software at the time, so that may have been me. Win 8 was bad UX-wise, but 8.1 resolved a lot of the issues. But since then, I barely touched Windows.
I want an OS, not an entertainment center, meaning I want to launch programs, organize my files, and connect to other computers. Anything that hinders those is bad. I moved from macOS for the same reason, as they are trying to make those difficult too.
Exactomundo! I'm a software developer, not a florist. I don't care about all those animations, transitions, dancing emojis, styled sliding notifications, windings and dingleberries. If I want to rebind a fucking key I should be able to. If I want to replace the entire desktop with a tiling manager of my choosing — that should be possible. And definitely, absolutely, in no way, should just about any kind of app, especially a web-browser, be shoved in my face. "Edge is not that bad", they would say. And would be completely missing the whole point.
Are you one of those guys that fiddles with registry settings and decrapifiers? To me, it sounds like you turned off file indexing. I turn it off when doing audio recording and yeah, that slows down file browsing.
The reason varies by the decade. Microsoft has a tendency to fix one thing, then break another.
That said, a distaste for advertising goes beyond OCD. Advertisers frequently have questionable ethics, ranging from intruding upon people's privacy (in the many senses of the word) to manipulating people. It is simply something that many of us would rather do without.
Advertising triggers a lot more than OCD in me outside of my start menu. On my machine, where I spend most of my waking hours, it was certainly the last straw for me.
But there's also the thing where Microsoft stops supporting older machines, creating a massive pile of insecure boxes and normie-generated e-waste; and the thing where it dials home constantly; and the thing where they try and force their browser on you, and the expensive and predatory software ecosystem, and the insane bloat, and the requiring a Microsoft account just to use my own computer. Oh yeah, and I gotta pay for this crap?!
I went full Linux back when Windows 11 came out and will only use it if a job requires. Utterly disgusting software.
What makes you think I’m not chill already? You engaged in a slightly rude trope, and I provided a very mild push back, at least from my point of view the stakes are all correctly low.
But you still get the worst of the Windows world, which is more than many are willing to deal with. I was using Windows for years as my main gaming OS, but after they announced W11 as the only way forward, switching to Linux on the desktop was like a breath of fresh air. I'll leave it at that.
If I were to run an OS on a VM it's gonna be windows, not Linux
You obviously don't. Maybe WSL is the best compromise for people who need both Windows and Linux.
But it's ridiculous to think that WSL is better than just Linux for people who don't need Windows at all. And that's kind of what the author of this thread seems to imply.
I think that case could be made. For example for people who have a laptop that is not well supported by linux. With WSL they get linux and can use all of their hardware.
If it’s impossible to massage Linux into working well with your laptop – sure. But you’re missing out so much, like, well, not having to deal with Windows.
Similarly powerful would be totally fine. More powerful really is silly. Personally I couldn't make a lot of my workflows work very well with WSL2. Some of the stuff I run is very memory intensive and the behavior is pretty bad for this in WSL2. Their Wayland compositor is also pretty buggy and unpolished last I used it, and I was never able to get hardware acceleration working right even with the special drivers installed, but hopefully they've made some progress on that front.
Having Windows and Linux in the same desktop the way that WSL2 does obviously means that it does add a lot of value, but what you get in the box isn't exactly the same as the thing running natively. Rather than a strict superset or strict subset, it's a bit more like a Venn diagram of strengths.
By default WSL2 grabs half of the memory, but that's adjustable. The biggest pain point I have is running servers inside WSL that serve to non-localhost clients (localhost works auto-magically).
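For what it's worth, both knobs live in %UserProfile%\.wslconfig; a sketch (the values are arbitrary examples):

```ini
[wsl2]
memory=8GB               # cap the VM instead of the default ~50% of host RAM
processors=4
networkingMode=mirrored  # makes services reachable beyond localhost
```

Run `wsl --shutdown` afterwards for the settings to take effect; `networkingMode=mirrored` needs a fairly recent Windows 11 / WSL build.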
I am surprised you had such problems with wsl2 graphics acceleration. That just worked for me, including CUDA accelerated workloads on the linux side.
As everyone said, WSL2 is actually virtual machines and it is what most people are actually using now. That said, I feel the need to chime in and say I actually love WSL1 and I love Windows NT the kernel. It bums me out all the time that we probably won't get major portions of the NT kernel, even an out-of-date version, in some open source form.
I like Linux, and I use Linux as my daily desktop, but it's not because I think Linux or even UNIX is really that elegant. If I had to pick a favorite design it would be Windows NT for sure, even with all its warts. That said, the company behind Windows NT really likes to pile a lot of shit I hate on top of that pretty neat OS design, and now it's full of dubious practices. Automatic "malware submission" on by default, sending apps you download and compile yourself to Microsoft and even executing them in a VM. Forced updates with versions that expire. Unbelievable volumes of network traffic, exfiltrating untold amounts of data from your local machine to Microsoft. Ads and unwanted news all over the UI. Increasing insistence in using a Microsoft account. I could go on and on.
From a technical standpoint I do not think the Linux OS design is superior. I think Linux has some amazing tools and APIs. dmabufs are sweet. Namespaces and cgroups are cool. BPF and its various integrations are borderline insane. But at its core... it's kinda ugly. These things don't all compose nicely, and the kernel is an enormous hard-to-tame beast. Windows NT has its design warts too, all over, like the amount of involvement the kernel has in the GUI for historical reasons, and the enormous syscall surface area, and untold amounts of legacy cruft. But all in all, I think the core of what they made is really cool, the subsystems concept is super cool, and it is an OS design that has stood up well to time. I also think the PE format is better than ELF, and that it is literally better for the capabilities it doesn't have w.r.t. symbols. Sure it's ugly, in part due to the COFF lineage, but it's functionally very well done IMO.
I feel the need to say this because I think I probably came off as a hater, and tbh I'm not even a hater of WSL2. It's not as cool as WSL1 and subsystems and pico processes, but it's very practical and the 9p bridge works way better than it has any right to.
Turns out that it's easier to emulate a CPU than syscalls. The CPU churns a lot less, too, which means that once things start working things tend to keep working.
You're thinking of the POSIX personality of Windows NT of old. This was based on Interix and has been deprecated about two decades ago and is now buried so deep that it couldn't be revived.
The new WSL1 uses kernel call translation, like Wine in reverse, while WSL2 runs a full-blown Linux kernel in a Hyper-V VM. To my knowledge, neither of these shares anything with the aforementioned POSIX subsystem.
I mean... Wine does the same for Windows apps on Linux, but Microsoft refuses to release docs for all of its internal APIs. They release WSL by relying on Linux's openness, while refusing the same for themselves.
A big one of those reasons was Docker. Docker was still fairly niche when WSL was released in 2016, but demand for it grew rapidly, and I don't think there was any realistic way they could have made it work on the NT kernel.
I think the two fairly deep integrations are Windows' ability to navigate WSL's filesystem and wslg's fairly good ability to serve up GUIs.
The filesystem navigation is something that AFAIK can't easily be replicated. wslg, however, is something that other VMs have and can do. It's a bit of a pain, but doable.
What makes WSL nice is the fact that it feels pretty close to being a native terminal that can launch native application.
I do wish that WSL1 was taken further. My biggest gripe with WSL is the fact that it is a VM and thus has a large memory footprint. It'd be nice if the WSL1 approach had panned out and we instead had a nice clean compatibility wrapper over winapi for Linux applications.
> The filesystem navigation is something that AFAIK can't easily be replicated.
The filesystem navigation getting partially open sourced is one of the more interesting parts being open sourced per this announcement. The Plan9 file server that serves files from Windows into Linux is included in the new open source dump. (The Windows filesystem driver that runs a Plan9 client on the Windows side to get files from Linux is not in the open source expansion.)
It's still fascinating that the whole thing is Plan9-based, given that the OS never really succeeded, but apparently its network file system makes a really good interoperability layer for file sharing between Linux and Windows.
> I do wish that WSL1 was taken further.
WSL1 survives and there's still a chance it will see more work eventually, as the tides shift. I think the biggest thing that blocked WSL1 from more success was the lack of partners and user interest in the Windows Subsystem for Android. That still remains a potentially good idea for Windows, had it been allowed "real" access to Google Play Services and app store, rather than a second-rate copy of Amazon's copy of Google Play Services and the Fire App Store. An actual Google partnership seems doomed, given that one of the reasons to get the Windows Subsystem for Android competitive was fear of ChromeOS, but Google still loves to talk about how "open" Android is despite the Google Play Services moat, and that still sounds like something a court with enough fortitude could challenge (even if it is probably unlikely to happen).
> The integration between Windows and the WSL VM is far deeper than a typical VM hypervisor.
Sure, but I never claimed otherwise.
> You cannot claim with a straight face that Virtualbox is easier to use.
I also didn't claim that. I wasn't comparing WSL to other virtualization solutions.
WSL2 is cool. Linux doesn't have a tool like WSL2 that manages Linux virtual machines.
The catch-22 is that it doesn't need one. If you want to drop into a shell in a virtual environment, Linux can do that six ways to Sunday with no hardware VM in sight, using the myriad namespacing technologies available.
So while you don't have WSL2 on Linux, you don't need it. If you just want a ubuntu2204 shell or something, and you want it to magically work, you don't need a huge thing with tons of integration like WSL2. A standalone program can provide all of the functionality.
I have a feeling people might actually be legitimately skeptical. Let me prove this out. I am on NixOS, on a machine that does not have distrobox. It's not even installed, and I don't really have to install it since it's just a simple standalone program. I will do:
$ nix run nixpkgs#distrobox enter
Here's what happened:
$ nix run nixpkgs#distrobox enter
Error: no such container my-distrobox
Create it now, out of image registry.fedoraproject.org/fedora-toolbox:latest? [Y/n]: Y
Creating the container my-distrobox
Trying to pull registry.fedoraproject.org/fedora-toolbox:latest...
...
0f3de909e96d48bd294d138b1a525a6a22621f38cb775a991974313eda1a4119
Creating 'my-distrobox' using image registry.fedoraproject.org/fedora-toolbox:latest [ OK ]
Distrobox 'my-distrobox' successfully created.
To enter, run:
distrobox enter my-distrobox
Starting container... [ OK ]
Installing basic packages... [ OK ]
Setting up devpts mounts... [ OK ]
Setting up read-only mounts... [ OK ]
Setting up read-write mounts... [ OK ]
Setting up host's sockets integration... [ OK ]
Integrating host's themes, icons, fonts... [ OK ]
Setting up distrobox profile... [ OK ]
Setting up sudo... [ OK ]
Setting up user groups... [ OK ]
Setting up user's group list... [ OK ]
Setting up existing user... [ OK ]
Ensuring user's access... [ OK ]
Container Setup Complete!
[john@my-distrobox]~% sudo yum install glxgears
...
Complete!
[john@my-distrobox]~% glxgears
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
302 frames in 5.0 seconds = 60.261 FPS
^C
No steps omitted. I can install software, including desktop software, including things that need hardware acceleration (yep, even on NixOS where everything is weird) and just run them. There's nothing to configure at all.
That's just Fedora. WSL can run a lot of distros, including Ubuntu. Of course, you can do the same thing with Distrobox. Is it hard? Let's find out by using Ubuntu 22.04 instead, with console output omitted:
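(The omitted session boils down to something like the following; the container name is mine, and the image is whatever Ubuntu tag your container runtime can pull.)

```shell
$ distrobox create --name ubuntu-22 --image ubuntu:22.04
$ distrobox enter ubuntu-22
```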
To be completely, 100% fair: running an old version of Ubuntu like this does actually have one downside: it triggers OpenGL software rendering for me, because the OpenGL drivers in Ubuntu 22.04 are too old to support my relatively new RX 9070 XT. You'd need to install or copy in newer drivers to make it work. There are in fact ways to do that (Ubuntu has no shortage of repos just for getting more up-to-date drivers, and they work inside Distrobox pretty much the same way they work on real hardware). Amusingly, this problem doesn't impact NVIDIA, since you can just tell distrobox to copy in the NVIDIA driver verbatim with the --nvidia flag. (One of the few major points in favor of proprietary drivers, I suppose.)
On the other hand, even trying pretty hard (and using special drivers) I could never get hardware acceleration for OpenGL working inside of WSL2, so it could be worse.
That aside, everything works. More complex applications (e.g. file browsers, Krita, Blender) work just fine and you get your normal home folder mapped in just like you'd expect.
> I get that WSL is revolutionary for Windows users
It is... I'm working these days on bringing a legacy Windows-only application to the 21st century.
We are throwing a WSL container behind it and relying on the huge ecosystem of server software available for Linux to add functionality.
Yes that stuff could run directly on windows, but you'd be a lot more limited in what's supported. Even for some restricted values of supported. And you'd have to reinvent the wheel for a few parts.
With WSL you can use “Linux the good parts” (command line tools, efficient-enough paradigms for fork() servers) and completely avoid X Windows, the Wayland death spiral, 100 revisions of Gnome and KDE that not so much reinvent the wheel but instead show us why the wheel is not square or triangular…
It's all opinion of course, but IMO Windows is the most clumsy and unintuitive desktop experience out there. We're all just so used to the jank upon jank that we think it's intuitive.
KDE is much more cohesive, stable, and has significantly more features.
/s indeed because there are actually no plans at all to replace Wayland!
I think the infamous cascade of attention-deficit teenagers (CADT) has slowed down quite a bit in the desktop space because... well, most developers there are over 30 now.
It blows my mind that people can complain about the direction KDE is going when trying to paint a picture about how it's so much nicer to use Windows. I know the boiling frog experiment is fake, but just checking: are you sure the water isn't getting a little uncomfortably warm in the Windows pool right now?
Agreed. I used tiling WMs for a long while (ion3, XMonad) and it was such a productivity boost.
Then I was forced to use a Mac for work, so I was using a floating WM again. On my personal machine, ion3 went away and I never fully got around to migrate to i3.
By the time I got enough free time to really work on my personal setup, it had accumulated two huge monitors and was a different machine. I found I was pretty happy just scattering windows around everywhere. Especially with a trackball's cursor throw. This was pretty surprising to me at first.
Anyway this is just my little personal anecdote. If I go back to a Linux install I'll definitely have to check out i3 again. Thanks for reminding me :)
Compiling and testing cross-platform software for Linux lately (Ubuntu and similar)... You can't even launch an application or script without the CLI. Bad UX, IMO. For these decisions there are always reasons, a justification, something about security. I don't buy it.
I compile my program using WSL, or Linux native. It won't launch; not an executable. So, into the CLI: chmod +x. OK. It's a compiled binary program, so semantically I don't see the purpose of this. Probably another use case bleeding into this. (I think there's a GUI way too.) Still can't double-click it. Nothing to launch from the right-click menu. After doing some research, it appears you used to be able to do it (Ubuntu/Gnome[?]), but it was removed at some point. Can launch from the CLI.
I make a .desktop file and a shell script to move it to the right place. Double-click the shell file: it opens a text editor. Search the right-click menu; still no way. To the CLI we go; chmod +x, and launch it from the CLI. Then, after adding the desktop icon, I can launch it.
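For anyone hitting the same wall, the minimal .desktop file looks something like this (the name and path are made up), dropped into ~/.local/share/applications/:

```ini
[Desktop Entry]
Type=Application
Name=My App
Exec=/home/you/bin/myapp
Terminal=false
```

Most desktops pick it up automatically; `chmod +x` on the binary is still required, since the executable bit lives on the file, not in the launcher.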
On Windows, you just double-click the executable file, identified through its file extension. This, like most things in Linux, implies the UX is designed for workflows I don't use as a PC user. Likely servers?
This sounds very weird to me. Any sane build toolchain should produce a runnable executable that already has +x. What did you use to compile it?
Removing double-click to run an executable binary certainly sounds like something either Gnome or Ubuntu would do, but thankfully that's not the only option in town. In KDE I believe the same exact Windows workflow would just work.
Yeah, the typical way programs are run is by using a .desktop file that's installed. The reason nobody cares is that running random executables that have a GUI is a pretty rare use case for Linux desktops. We don't have wizards or .msi installers; we just install using the package manager. And then it shows up where it needs to.
If you're on KDE, you can right-click the start menu and add the application. Also, right-click menu should give you a run option.
This is very much YMMV thing. There is no objectively best platform. There are different users and requirements.
I’ve been a software developer for 20 years and in _my_ opinion Windows is the best platform for professional software development. I only drop down to Linux when I need some of the excellent POSIX tools, but my whole work ergonomics is based on Windows shortcuts and Visual Studio.
I’ve been forced to use Mac for the past 1.5y but would prefer not to.
Why would Windows be superior for me? Because that’s where the users are (for the work stuff I did before this latest gig). I started in real time graphics and then spent over a decade in CAD for AEC (developing components for various offerings including SketchUp). The most critical thing for the stuff I did was the need to develop on the same platform as users run the software - C++ is only theoretically platform independent.
Windows APIs are shit, for sure, for the most part.
But still, from this pov, WSL was and will be the best Linux for me as well.
I fully agree with you - "YMMV" is the one true take. Visual Studio has never been particularly attractive to me, my whole workflow is filled with POSIX tools, and my code mostly runs on Docker and Linux servers. Windows is just another thing to worry about for me, be it having to deal with the subtle quirks of WSL not running on raw metal or having to deal with running UNIX-first tooling (or finding alternatives) on Windows. If it wasn't for our work provided machines being Windows by default, and at home, being into VR gaming and audio production (mostly commercial plugins), I'd completely ditch Windows in a heartbeat.
Just FYI, you may also enjoy machinectl with systemd-nspawn. It's essentially the same thing as toolbx but it handles the system bus much more sanely, and you can see everything running inside the guest from the host's systemctl.
If Windows provided easier access to hardware, especially USB, from WSL it would be nice. In fact, if WSL enumerated devices and dealt with them as native Linux does, even better.
Windows has a lot of useful software that is not available on Linux.
So, for me Windows + WSL is more productive than just using Linux.
The UI is still better on Windows (basic utilities like File Explorer and Config Management are better on Windows). No remoting software beats RDP. When I remote to a Windows workstation through RDP, I can't tell the difference. VNC is always janky. Of course, there is Word/Excel/Illustrator, which is simply not available on Linux.
File Explorer is better on Windows? How? I tried Windows 11 for the first time a month ago and it takes several seconds for file explorer to open, it's asynchronously loading like 3 different UI frameworks as random elements pop in with no consistency, there's two different rightclick menus because they couldn't figure out how to make the new one have all the functionality of the old one so they decided to just keep the old one behind "Show More Options", and it's constantly pushing OneDrive in your face. I'm offended that this is what they thought is good enough to ship to a billion users.
The File Explorer on Windows 11 is the worst experience ever. Windows 7 was snappy as hell, but I don't know what they did to damage it that badly. I use XYplorer, which is written in Visual Basic (so a 32-bit application), but is so much faster than the native Explorer (and is full of features).
> No Remoting Software beats RDP. When I remote to a Windows workstation through RDP, I can't tell the difference. VNC is always janky
Any recent distro running Gnome or KDE has built-in support for connecting and hosting an RDP session. This used to be a pain point, you don't need to use VNC anymore.
It's actually worse on windows since you need to pony up for a pro license to get RDP hosting support...
> The UI is still better on Windows (basic utilities like File Explorer and Config Management are better on Windows).
5 years ago, we would be comparing old GNOME 3 or KDE Plasma 5 on X11 and Windows 10. I would be forced to agree. The Windows UI was better in many ways at that point.
Today we have KDE Plasma 6.3 on Wayland and Windows 11. This is an entirely different ball game. It's hard to explain. Wayland feels like it has taken an eternity to lift off, like well over a decade, but now things change dramatically on the scale of months. A few months ago HDR basically didn't work anywhere. Right now it's right in front of me and it works great. You can configure color profiles, SDR applications don't break ever, and you even get emulated brightness. Display scaling? Multiple monitors with different scale factors? What about one monitor at 150% and another at 175% scale factor? What about seamlessly dragging windows between displays with different scale factors? Yes, Yes, Yes, and Yes. No `xrandr` commands. You configure it in the GUI. I am dead serious.
File Explorer? That's the application that has two context menus, right? I think at this point Windows users might actually be better off installing KDE's Dolphin file manager in Windows for the sake of their own productivity. If I had the option to use Windows File Explorer on KDE I would impolitely decline. I have not encountered any advertising built into my file explorer. I do not have an annoying OneDrive item in the menu on the left. I have a file tree, a list of Places, and some remote file shares. When I right click it does not freeze, instead it tends to show the context menu right away. And no, I'm not impressed by Tabs and Dark Mode, because we've had that on Linux file managers for so long that some people reading this were probably born after it was already supported.
Windows still has the edge in some areas, but it just isn't what it used to be. The Linux UI is no longer a toy.
> When I remote to a Windows workstation through RDP, I can't tell the difference. VNC is always janky.
I don't really blame you if you don't believe me, but I, just now, went into System Settings, went to the Remote Desktop setting, and clicked a toggle box, at which point an RDP server spawned. Yes, RDP, not VNC, not something else. I just logged into it using Remmina.
Not everything on Linux is seamless and simple like this, but in this case it really is. I'm not omitting a bunch of confusing troubleshooting steps here, you really can do this on a modern Linux setup, with your mouse cursor. Only one hand required.
> Of course there is Word/Excel/Illustrator which is simply not available on Linux
True, but if you want to use Linux and you're held back by needing some specific software, maybe it's not the end of the world. You have many options today. You can install VirtualBox and run your spreadsheets in there. You can use Office 365 in a browser. You can run Crossover[1] and emulate it. You can use an office alternative, like possibly WPS Office. You can dual boot. You can go the crazy route and set up a KVM GPU passthrough virtual machine, for actually native performance without needing to reboot.
The point I'm making here is not "Look, Linux is better now! Everyone go use it and get disappointed ASAP!" If you are happy with Windows, there's literally no point in going and setting yourself up for disappointment. Most people who use Linux do so because they are very much not happy with Windows. I'm sure you can tell that I am not. However, in trying to temper the unending optimism of Linux nerds, sometimes people go too far the other way and represent Linux as being in far worse of a state than it actually is. It really isn't that bad.
The worst thing about modern Linux is, IMO, getting it to work well on your hardware. Once you have that part figured out, I think modern Linux is a pretty good experience, and I highly recommend people give it a shot if they're curious. I think Bazzite is a really nice distro to throw on a random spare computer just to see what modern Linux is actually capable of. It's not the absolute most cutting edge, but it gives you a nice blend of fairly up-to-date software and a fairly modern rpm-ostree base system for better stability and robustness, and it's pretty user-friendly. And if you don't like it, you can easily get a full refund!
> You can use an office alternative, like possibly WPS Office.
Or ONLYOFFICE, which is FOSS (and what I use personally). Or LibreOffice (also free/libre software, of course). I don’t miss MS Office one bit, the compatibility is nothing short of excellent nowadays, and the speed and UX both surpass it.
There are specialized software packages that are Windows-only, of course, but at least office programs ain’t it.
IDK how many VMs you've used, but there has been a lot of work specifically with x86 to make VMs nearly as fast as native. If you interact with cloud services everything you do is likely on a VM.
It's handy if you have other services that are Windows-based, though. And, being a VM, it's fairly convenient to have multiple versions and to back up.
Linux doesn't need VMs, people need VMs. If you spend most of your time in Windows-exclusive apps and use WSL2 on occasion, then you already know what you want, why are you worried about arguing about it on the Internet?
For many software engineers, a lot of our work is Linux, and it wouldn't be atypical to spend most of the time doing Linux development. I work on Linux and deploy to Linux, it's just a no-brainer to run Linux, too, aside from the fact that I simply loathe using modern Windows to begin with.
(Outside of that, frankly, most people period live inside of the web browser, Slack, Discord, and/or Steam, none of which are Windows-exclusive.)
My point isn't that Linux is better than Windows, it's that WSL2 isn't better than literally running Linux. If you need to do Linux things, it is worse than Linux at basically all of them.
You still have to go and make sure that what you want is there and works, but it's not a bad bet. A few major omissions aside, there is a pretty big library of supported games.
> For anything that is PvP multiplayer, this is very much not a given because of how pervasive kernel-level anti-cheat solutions are today.
To be fair, though, you probably still have a better shot of being able to play the games you want to under Linux than macOS and that doesn't seem to be that bad of an issue for Mac users. (I mean, I'm sure many of them game on PC anyways, but even that considered macOS has greater marketshare than Linux, so that's a lot of people either able to deal with it or have two computers.)
Speaking as a Mac user, it's really bad. Much worse than Linux/SteamOS, actually. Not only are most games simply not there, but many games advertised as Mac-compatible are actually broken because they haven't been updated in a long time, and macOS is not particularly ABI-stable when it comes to GUI. Sometimes they just don't support hi-DPI, so you can play, but forget about 4K. And sometimes they straight up won't start.
I do indeed have two computers with a KVM setup largely for this reason, with a secondary Windows box relegated to gaming console role.
Fair point. I know it was rough when Apple made the break-away with 32-bit.
Still, the point is that you can make it work if you want to make it work. Off the top of my head:
- Two computers, completely separate. Maybe a desktop and a laptop.
- Two computers, one desk and a KVM like you suggest.
- Two computers, one desk. No proper KVM, just set up remote desktop and game streaming.
- (on Linux) KVM with GPU passthrough, or GPU passthrough with frame relay. One computer, one desk.
- Game streaming services, for more casual and occasional uses.
- Ordinary virtualization with emulated GPU. Not usually great for multimedia, but still.
- And of course, Steam Play/Heroic Launcher/WINE. Not as applicable on macOS, but I know CodeWeavers does a lot to keep macOS well-supported with Crossover. With the aforementioned limitations, of course.
Obviously two computers have a downside: managing two boxen is harder than managing one, and you will pay more for the privilege. On the other hand, it gives you "the real thing" whenever you need it. With some monitors having basic KVM functionality built in, especially over USB-C, and a variety of mini PCs with enough muscle to game, it's not really the least practical approach.
I suspect for a lot of us here there is a reasonable option if we really don't want to compromise on our choice of primary desktop OS.
The important bit though is that Docker containers are not VMs or sandboxes, they're "just" a combination of technologies that give you an isolated userland using mostly Linux namespaces. If you're running a Linux host you already have namespaces, so you can just use them directly. Distrobox gives you basically the same sort of experience as WSL2 except it doesn't have any of the weird parts of running a VM because it's not VMs.
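To make the "just namespaces" point concrete, here is a minimal Python sketch that lists the namespaces the current process already lives in. It only reads the standard Linux `/proc/self/ns` directory (nothing hypothetical); these are the same kernel primitives that Docker and Distrobox compose:

```python
import os

# Every Linux process already runs inside a set of kernel namespaces;
# a "container" is just a process placed into fresh ones. /proc/self/ns
# exposes one symlink per namespace the current process belongs to.
ns_dir = "/proc/self/ns"
namespaces = sorted(os.listdir(ns_dir))
print(namespaces)  # e.g. ['cgroup', 'ipc', 'mnt', 'net', 'pid', ...]

for name in namespaces:
    # Each link target looks like 'mnt:[4026531841]' -- the inode number
    # identifies the namespace, so two processes sharing a namespace
    # see the same target.
    print(name, "->", os.readlink(os.path.join(ns_dir, name)))
```

Tools like Distrobox ask the kernel for fresh mnt/pid/user namespaces (via `unshare()`/`clone()`) and set up a different distro's userland inside them; there is no hypervisor and no guest kernel anywhere in that picture.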
Windows supports Linux because the latter is open source; that's a lot easier than the reverse.
Linux, on the other hand, barely supports Windows because the latter is closed. And not just closed: Windows ships component updates that specifically check whether they are running under Wine and stop running, being actively hostile to a potential Linux host.
The two are not equivalent, nobody in the Linux kernel team is actively sabotaging WSL, whereas Microsoft is actively sabotaging wine.
Do you have a link to where I can read more about this? My understanding is that Microsoft saw Wine as inconsequential to their business, even offloading the Mono runtime to them [1] when they dropped support for it.
> Until 2020, Microsoft had not made any public statements about Wine. However, the Windows Update online service will block updates to Microsoft applications running in Wine. On 16 February 2005, Ivan Leo Puoti discovered that Microsoft had started checking the Windows Registry for the Wine configuration key and would block the Windows Update for any component.[125] As Puoti noted: "It's also the first time Microsoft acknowledges the existence of Wine."
Microsoft seems to be taking an outside-in, component-at-a-time approach to open sourcing Windows: Terminal, Notepad, Paint, Calculator, the new Edit.com replacement, a lot of WSL now, etc.
This approach has been fascinating so far, but yeah, not "exciting" from the "what crazy things can I do with Windows, like put it in a toaster" side of things.
It would be great to see at least a little bit more "middle-out" from Windows Open Source efforts. A minimal build of the NT Kernel and some core Windows components has been "free as in beer" for a while for hobby projects with small screens if you really want to try a very minimal "toaster build" (there's some interesting RPi out there), but the path to commercialization is rough after that point and the "small screens" thing a bit of a weird line in the sand (though understandable given Microsoft's position of power on the desktop and sort of the tablet but not phone).
The NT Kernel is one of the most interesting microkernels left in active use [0], especially given how many processor architectures it has supported over decades and how many it still supports (even the ones that Windows isn't very commercially successful on today). It could be a wealth of power to research and academia if it were open source, even if Microsoft didn't open source any of the Windows Subsystems. It would be academically interesting to see what sort of cool/weird/strange Subsystems people would build if NT were open source. I suppose Microsoft still fears it would be commercially interesting, too.
[0] Some offense, I suppose to XNU here. Apple's kernel is often called a microkernel for its roots from the Mach kernel, but it has rebuilt some monoliths on top of that over the years (Wikipedia more kindly calls it a "hybrid kernel"), and Mach itself is still so Unix flavored. NT's "object oriented" approach is rather unique today, with its more VMS heritage, a deeply alternate path from POSIX/Unix/Linux(/BSD).
I doubt it would happen, large projects that aren't open source from the onset and are decades old can have licensed or patented code, Microsoft would have to verify line by line that they can open source it.
Wait long enough and it will happen; the question is just "how long". (Microsoft has open-sourced OSes and languages from the 1980s.) Some days it seems like Microsoft is more interested in Azure, Copilot and GAME PASS, and Windows is an afterthought.
I would certainly love it if Microsoft stopped trying to sell Windows and just open sourced it. I think Windows is a much more pleasant desktop operating system than Linux, minus all the ads and mandatory bloat Microsoft has put in lately. But if Windows was open source the community could just take that out.
I really don't see it happening any time in the next decade at least, though. While Windows might not be Microsoft's biggest focus any more it's still a huge income stream for them. They won't just give that up.
I preferred WSL to running linux directly even though I had no need for any windows only software. Not having to spend time configuring my computer to make basic things work like suspend/wake on lid down/up, battery life, hardware acceleration for video playback on the browser, display scaling on external monitor and so on was reason enough.
That was certainly not the case ~2 years ago, the last time I installed linux on a laptop.
It also doesn't appear to be the case even now. I searched for laptops available in my country that fit my budget, and for each laptop searched "<laptop name> linux reddit" on Google and filtered for results <1 year old. Every laptop's reports included some bug or other.
The laptop with the best reported linux support seemed to be Thinkpad P14s but even there users reported tweaking some config to get fans to run silently and to make the speakers sound acceptable.
You are going to find issues for any computer for any OS by looking things up like this.
And yeah, it's best to wait a bit for new models, as support is sorted out, if the manufacturer doesn't support Linux itself. Or pick a manufacturer that sells laptops with Linux preinstalled. That makes the comparison with a laptop with Windows preinstalled fair.
> You are going to find issues for any computer for any OS by looking things up like this
I wasn't cherry-picking things. I literally searched for laptops available in my budget in my country and looked up what the linux support for those laptops was like, as reported by people on reddit.
> Or pick a manufacturer that sells laptops with Linux preinstalled
I suppose you are talking about System76, Tuxedo etc. These manufacturers don't ship to my country. Even if I am able to get it shipped, how am I supposed to get warranty?
You weren't cherry picking but the search query you used would lead to issue reports.
HP, Dell and Lenovo also sell Linux laptops on which Linux runs well.
I sympathize with the more limited availability and budget restrictions, but comparisons must be fair: compare a preinstalled Windows and a preinstalled linux, or at least a linux installed on hardware whose manufacturer bothered to work on Linux support.
When the manufacturer did their homework, Linux doesn't have the issues listed earlier. I've seen several laptops of these three brands work flawlessly on Linux and it's been like this for a decade.
I certainly choose my laptops with Linux in mind, and I know just picking random models would probably lead me to little issues here and there, and I don't want to deal with that. Although I have installed Linux on random laptops for other people and fortunately haven't run into issues.
As a buyer, how am I supposed to know which manufacturer did their homework and on which laptops?
> it's been like this for a decade
Again, depends on the definition of "flawlessly". Afaik, support for hardware-accelerated video playback in browsers was broken across the board only three years ago.
> As a buyer, how am I supposed to know which manufacturer did their homework and on which laptops?
Your first option is to buy a laptop with linux preinstalled from one of the many manufacturers that provide this. This requires no particular knowledge or time. Admittedly, this may lead you to more expensive options; entry-level laptops won't be an option.
Your second best bet is to read tech reviews. Admittedly this requires time and knowledge, but often enough people turn to their tech literate acquaintance for advice when they want to buy hardware.
> Afaik, support for hardware-accelerated video playback in browsers was broken across the board only three years ago.
Yes indeed, that's something we didn't have. I agree it sucks. Now, all OSes have flaws that others don't, and it's not like the videos didn't play; in practice it was an issue if you wanted to watch 4K videos for hours on battery. Playing regular videos worked, and you can always lower the quality if your situation doesn't allow higher qualities. Often enough, you could also download the video and play it outside the browser. I know, not ideal, but also way less annoying than the laptop not suspending when you close the lid because of a glitch or something like that.
> You first option is to buy a laptop with linux preinstalled
I have earnestly spent >20 minutes trying to find such a laptop from any reputed manufacturer in my country (India) and come up empty-handed. Please suggest any that you can find. Even with Thinkpads, the only options are "Windows" or "No Operating System".
>Your second best bet is to read tech reviews.
Which tech reviews specifically point out linux support?
>Playing regular videos worked, and you can always lower the quality if your situation doesn't allow the higher qualities
The issue was never about whether playing the video worked. CPU video decoding uses much more energy and leads to your laptop running hot and draining battery life.
Can we at least agree to reduce the timeframe for things working flawlessly to "less than two years" instead of "a decade"? Yes you were able to go to the toilet downstairs but the toilet upstairs was definitely broken.
If buying with Linux preinstalled is not an option where you live, you can always buy one of the many models found with this search without an OS and install it yourself. Most Thinkpads should be all right. Most EliteBooks should do. Dell laptops sold with Ubuntu somewhere on the planet should do. I'm afraid I can't help more; you'll have to do your own search. Finding out which laptops are sold with Linux somewhere should not be rocket science. I don't buy laptops very often, I tend to keep my computers for a healthy amount of time, so I can't say what it's like in India in 2025.
> Can we at least agree to reduce the timeframe for things working flawlessly to "less than two years" instead of "a decade"? Yes you were able to go to the toilet downstairs but the toilet upstairs was definitely broken.
No. I understand that it can be a dealbreaker for some, but that's a minor issue for me on laptops, even unplugged, and I do watch a lot of videos (for environmental reasons I tend to avoid watching videos in very high resolutions anyway, so software rendering is a bummer but not a blocker). There are still things that don't work, like Photoshop or MS Office, so you could say that it's still not flawless, still, that doesn't affect me.
Many results, including a US-specific page of the Lenovo website.
>If buying with Linux is not an option at your place, you can always buy one of the many models found with this search without OS and install it yourself.
>Finding out which laptops are sold with Linux somewhere should not be rocket science.
It should not. Given the amount of time I have already spent on trying to find one, it is fair to say that there are none easily available in India, at least in the consumer laptop market.
> I understand that it can be a dealbreaker for some, but that's a minor issue for me on laptops
Stockholm Syndrome was bullshit made up on the spot to cover for the inability of the person making it up to defend their position with facts or logic, and... that fits most metaphorical uses quite well, too, though it's not usually the message the metaphor is intended to communicate.
> Many results, including a US-specific page of the Lenovo website.
Are you failing to see that this US-specific page gives you a long list of models you can consider elsewhere?
> Stockholm syndrome.
Yeah, no. It just appears I have different needs than you and value different tradeoffs. It appears that the incredible comfort Linux brings me offsets the minor inconvenience software rendered browser video playback causes me.
I'm done with this discussion; we've been quite far away from the kind of interesting discussions I come to HN for for a few comments now.
On Windows, I don't have to pick my hardware accordingly.
I have to onboard a lot of students to work on our research. The software is all linux (of course), and mostly distribution-agnostic. Can't be too old, that's it.
If a student comes with a random laptop, I install WSL on it, mostly Ubuntu, then apt install <curated list of packages>. Done. Linux laptops are OK too, I think, but so far I've only had one student with one. macOS used to be easy, but gets harder with every release, and every new OS version breaks something (mainly CERN ROOT) and people have to wait until it's fixed.
> On Windows, I don't have to pick my hardware accordingly.
Fair enough. I think the best way to run Linux, if you want to be sure you won't have to tweak stuff, is to buy hardware with linux preinstalled. That your choice is more limited is another matter than "linux can't suspend".
Comparing a preinstalled Windows with a linux installed on a random laptop whose manufacturer can't be bothered to support it is a bit unfair.
Linux on a laptop where the manufacturer did their work runs well.
Yes, machines with Linux preinstalled normally work quite well. But it's still a downside of choosing Linux that the choice of laptops is so much smaller. Similar to the downside of Mac OS that you are locked in to pricey-but-well-built laptops, or the downside of Windows that "it runs Windows" doesn't mean the hardware is not bottom-of-the-barrel crap from a vendor who doesn't care about Linux compatibility. WSL lets you run a sane development environment even then :)
I use Windows with wsl for work, and Linux and MacOS at home. Windows is a mess, it blows my mind that people pay for it. Sleep has worked less reliably on my work machine than my Fedora Thinkpad, and my Fedora machine is more responsive in pretty much every way despite having modest specs in comparison. Things just randomly stop working on Windows in a way that just doesn't happen on other OSes. It's garbage.
> You can use Wine/Crosseover, which is cool, but even now the number of software products it supports is tiny. Steam has a lot of games.
This isn't really the case, and hasn't been for some years now, especially since Valve started investing heavily in Wine. The quality of Wine these days is absolutely stunning, to the point that some software runs better under Wine than it does on Win11. Then there's the breadth of support, which has moved the experience from there being a slight chance of something running on Wine to it now being surprising when something doesn't.
I am experimenting with Bottles for an article right now, as it happens -- but no, it does not seem to me that Bottles is at all comparable to WinApps.
WinApps runs the apps on real native Windows in a VM, but integrates their UI with the host OS.
WINE does this anyway, and it's an inherent property of WINE because there is no guest OS. WINE does some fakery and indirection to make Unix filesystems appear on drive letters and such, but the app is still executing on the host OS, just via a translation layer.
Bottles runs the apps on top of WINE, but maintains separate WINE instances for each app and allows different ones to have different auxiliary tools, such as games-compatibility libraries, different versions of WINE, etc.
Sigh, I knew that. I just posted this in the wider context of running Windows applications under Linux, no matter which way and how, because at the end of the day that's all that counts. Yeah, well. Maybe not, because the WINE approach could be seen as less resource-intensive, while a full VM feels rather bloated, though more stable.
And because it bubbled up instantly, having read about it before, right here on HN, just ...uhhhmmm...hours ago.
FWIW, I tested Bottles on 2 machines here, one with Ubuntu 22.04 and one with Ubuntu 24.04.
I could not get any app to install in Bottles that wouldn't run under bare WINE. Apart from a friendly GUI -- although it looks awful on any other desktop, like most Gtk 4 apps -- I can't see any benefit to it, TBH.