I think you mean here.
In my experience, it’s here.
Finally some real fucking food
That is generally what I use in my homelab. Though I’ve found that Fedora works a bit better for a general purpose daily workstation OS.
Well aKsHuLiY I used your method. It's just not beginner friendly.
Did you even read the wiki? It’s so easy! Totally beginner friendly provided a basic level of literacy.
/s, hopefully obviously. Arch is a fragile house of cards.
Well, at least for normies who don't bother to learn something new and/or think the CLI is very scary
ohyou dot jaypeg
nah, the real GOAT is https://get.opensuse.org/tumbleweed/
The entire reason I switched to Linux – back in January I asked myself, "if I have to fight my operating system to make it work right for me anyway, why pay for the privilege?"
Like, sure, updates break things on Linux too occasionally, but at least they don't reinstall spyware I had to spend a day ripping out after the last update.
I’ll have you know I’ve never paid for Windows in my life!
Once for 7 Pro. Still running the same license all these years later.
Also, I use Kubuntu, but I go with a minimal install to avoid snap fuckery, btw.

why not kde neon?
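For anyone wanting the same snap-free setup on a stock Kubuntu/Ubuntu base, a minimal sketch (the filename is my assumption, not an official recipe): after `sudo apt purge snapd`, an apt pin keeps snapd from being pulled back in as a dependency:

```
# /etc/apt/preferences.d/nosnap.pref  (hypothetical filename)
# A negative pin priority tells apt never to install this package,
# so a later `apt install` can't silently drag snapd back in.
Package: snapd
Pin: release a=*
Pin-Priority: -10
```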
Also, Windows is catching up on the breaking of things, while Linux has improved dramatically. At least some distros are incredibly stable.
Has a Linux update ever broken something on my computer? Yes. Have I ever needed to revert versions? Yes.
Has a Linux update ever broken my computer so badly that a hardware component on the motherboard permanently stopped working, even after reinstalling the firmware? No, but a Windows update did once. I had to dual-boot Mint just so I could use WiFi.
So, wait, you are claiming that a Windows update broke your hardware so badly you had to reinstall the firmware, but it magically worked on a Linux distro? First of all, that means it wasn't "permanently stopped [from] working". Second, I hate to break it to you, but it sounds like Windows might have fucked up a setting, and then you user-errored your way into breaking things. I've never had something break that can't be fixed with a full system restore or reinstall, and it sounds like you had a problem just like that. If it worked on Linux, you could have gotten it working on Windows, too, because it's clearly a software error at that point.
Don’t know what to tell you. All I know is that WiFi worked before the update, and then didn’t after. Updating the firmware didn’t fix it. Reinstalling the OS didn’t fix it. Taking it to the PC repair shop didn’t fix it. Replacing the network card didn’t fix it. But dual-booting Linux mint did fix it, on the mint partition, at least.
Every. Single. Time. that Linux has broken on me, it's been my fault. I've tried to go against an automated process to make what I wanted happen. Or I've removed an annoying apt update warning about some unused pub key. And I've totally shit-bricked countless installs. Probably in the mid double digits.
I've lost valuable pictures, documents, and data. I've wasted weekends reinstalling and reconfiguring Linux. BUT, I did that. Not Microsoft. No one held my hand, and I certainly learned from it and never repeated most of those mistakes again.
Most importantly, Linux let me do those things. Linux let me be a better end user and admin because I respected my environments more.
If you switch to Linux you don’t have to be an admin or go nuts…but Linux isn’t going to stop you if you want to.
This is so true. Most of the tools justifying the use of WSL aren’t even supported. Either because of technical limitations or because of security concerns.
Why do people use WSL? The only reason I can think of is to take advantage of Bash and the shell environment. But if WSL runs in its own container separate from Windows, what's the point?
When WSL first came out, all the documentation I read from Microsoft led me to believe it was intended to help developers who are cross-developing software for both Linux and Windows to more easily test features and compatibility, and to ensure software behaves consistently. It never seemed like they intended it to be used to run Linux programs fully and integrate them into the Windows environment. It always seemed like it was just there for convenience, so a smaller-budget developer could develop on one machine and not need to be constantly rebooting or running VMs.
Maybe I'm not aware of similar configurations you can do, but it's only sorta its own container. VS Code can actually connect to it automatically, so you can develop in the host OS but run directly against the container. Additionally, this means some visualization/GUI interfaces can be visible on the host side (this is a gift and a curse).
So you basically have system-integrated containers/VMs. It's not perfect, but it is definitely leagues better than what Windows development was prior, and it may have some advantages over Linux-only deployments (not sure if the system integrations are feasible on Linux hosts).
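The workflow described above can be sketched roughly like this (a sketch under assumptions: commands run on a Windows machine, and VS Code's "WSL" extension is installed — not a complete setup guide):

```shell
# On Windows (PowerShell/cmd): install WSL with a default Ubuntu distro
wsl --install -d Ubuntu

# Then, inside the WSL shell, opening a project folder with `code`
# launches VS Code on the Windows side, connected remotely into the distro:
cd ~/my-project
code .
```

From there the editor runs on the host, while terminals, debuggers, and language tooling run against the Linux environment.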
MSYS2 can be used for that, btw
Wine would like a word
Aka “Windows subsystem for Linux”
Uno reverse be like
Glad to know I'm not the only one peeved by the fact that the name is unequivocally wrong
It's a dumb name, but it's far from being unequivocally wrong. It's a Windows subsystem, which is used for Linux.
Windows submissive for Linux
I was just thinking this. If you're a dev, being on pure Linux makes a ton of sense. But if you're a gamer, Windows is still your best option.
As someone who just last week cut Windows out, it does feel like this can be true. I am annoyed that Xorg doesn't handle multiple monitors with varied refresh rates, so you use Wayland instead, but Wayland doesn't play nice with remote-access programs, and then you find out that nothing will allow you to use G-Sync/FreeSync unless all of your monitors can use it. Also, as someone who uses AutoCAD daily, I have to figure out a GPU-passthrough VM at some point.
So I am losing some of my "ideal" setup in my move to Linux, but I'm still happy to lose a smidgen of things I want for the far greater advantage of not worrying about Windows updates reinstalling bloatware, or harvesting data that goes above and beyond what anyone else has been doing up to this point. I'm unwilling to have my OS monitoring me, and I can read the writing on the wall: it will only get worse.
Anybody who thinks running Windows is easier hasn’t tried to get Tensorflow working on Windows with GPU support.
In theory, it could run on a straight Windows build of Python, but nobody seems to have given that serious consideration. It must go through WSL, but that means passing through the GPU to WSL. When you Google how to do it, you’ll find three different approaches that have been taken over the years, only one of which is valid on modern setups. If you take one of the old approaches, you will likely twist your system in knots and need a complete reinstall to fix.
On Linux, you install the GPU drivers, compile Tensorflow with the GPU flags, and you’re done.
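FWIW, on recent TensorFlow versions you don't even need the source build: a sketch of the pip route (assumes an NVIDIA driver is already installed; the `and-cuda` extra pulls in matching CUDA/cuDNN wheels — check the current install docs for your version):

```shell
# Isolate TensorFlow in a virtual environment
python3 -m venv tf-env && . tf-env/bin/activate

# CUDA/cuDNN libraries ship as pip wheels via the extra
pip install "tensorflow[and-cuda]"

# Sanity check: should list at least one GPU device if passthrough worked
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```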
Linux using wine and bottles:👁👄👁
Should use Linux Subsystem for Windows instead.
Why would anyone want this? I don't get it. I haven't needed Linux for anything in years. I actually run Docker for Linux stuff (mostly for fun) and it works if you're competent (same as anything Linux-related, it requires work).
Oh sorry, I forgot where I am. Windows bad! BAD WINDOWS.
I could ask the same. Why would anyone want Windows? I haven't needed Windows for anything in years. I run Wine for Windows stuff (mostly for fun) and it works if you're competent.
Seriously tho, I care about my privacy somewhat, and a lot of my hardware just doesn't run Windows because it's too heavy. I also dislike the lack of control I have over Windows, and I can't work with the Windows DE in general.
MinGW-w64 and Cygwin