I was thinking about the recent story about DB (Deutsche Bahn) looking for a Windows 3.1 administrator.
A classic issue I’ve seen working in heavy industry is that hardware lasts longer than Windows versions. Say 10 years ago you bought a component for the product you design, or a full machine for your factory, which only comes with a Windows XP driver.
10 years later, Windows XP is obsolete; upgrading to a more recent Windows might be an option but would cost a shitload of money.
I therefore have the impression that Linux would offer the professional user more control in terms of product lifecycle and patch deployment: you can freeze a known-good kernel and package set for as long as you need. However, there is always that stupid HW which doesn’t have a Linux driver.
Market share. If you look at the server side, though, you find the total opposite.
One thing to also remember is that 15 years ago there was a lot of anti-Linux marketing. To be fair, Linux sucked back then.
> Linux sucked back then
Linux already ran the vast majority of the web and internet services back then. I don’t think qualifying it as “it sucked” is particularly accurate. Remember, fifteen years ago Vista was still very current, with Windows 7 having just been released.
Linux was a terrible experience for the standard home user and for non-programmer professional use. You could make it work for some stuff, but today I’d feel comfortable telling anyone with basic computer and troubleshooting skills that they can make Linux work for them. Meanwhile, 7 or so years ago I was an engineering student who tried Ubuntu and found it couldn’t do what I wanted and took too much work to do what it could.
Yeah, especially the desktop and interface. That is changing though; I already saw a lathe running Ubuntu MATE.
It’s happening slowly. A year ago my employer ran everything on PLCs, and now we’re starting on Linux + Raspberry Pis. For a taste of what that migration looks like, see the sketch below.
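A lot of what a small PLC does is simple scan-loop logic, which maps pretty directly onto a Pi. Here’s a minimal sketch in Python with gpiozero; the pin numbers and the tank/pump wiring are made up for illustration, not from a real deployment:

```python
# Minimal PLC-style scan loop on a Raspberry Pi (sketch only).
# Assumes gpiozero is installed; GPIO pins 17/27 and the float-switch/
# pump-relay wiring are made-up examples.
from time import sleep
from gpiozero import Button, OutputDevice

level_switch = Button(17)        # float switch: pressed = tank full
pump_relay = OutputDevice(27)    # relay board channel driving the pump

while True:
    # One ladder-logic "rung": run the pump only while the tank isn't full.
    if level_switch.is_pressed:
        pump_relay.off()
    else:
        pump_relay.on()
    sleep(0.1)                   # ~100 ms scan cycle
```

Obviously a real PLC buys you deterministic timing, watchdogs and rugged I/O on top of that, which is why the migration is slow rather than instant.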
PLCs just have so many issues and none of them are being resolved.
It does, but in the ’90s/’00s a computer typically meant Windows.
The ops staff would all be ‘Microsoft Certified Engineers’, the project managers had heard Microsoft’s FUD about open source, and every graduate would have been taught programming via Visual Studio.
Then you have regulatory hurdles. For example, in 2010 I was working on an ‘embedded’ product on a first-generation Intel Atom platform. Due to power constraints I suggested we use Linux. It worked brilliantly.
Government regulations required antivirus from an approved list and an OS that had been accredited by a specific body.
The only accredited OSes were Windows, and the approved antivirus products only supported Windows. Which is how I got to spend 3 months learning how to cut XP Embedded down to nothing.
The simple answer is Microsoft.
They have so much backwards compatibility built into their operating system that it’s becoming a problem.
Linux systems are also more varied; having one target is easy, so you know it works.
Another reason is that Windows is the largest desktop operating system, so if some random person has to use it, they’ve probably seen it before.
You said it - money. They want you to have to buy new.
Yep, a place I was working at had a plotter that only had drivers for Windows ME. Not 98, not 2000. Only Windows ME. I gave up after 30 minutes of scouring the internets and left the piece alone. But it was mind-boggling to me that they had a precious and fragile desktop that was the only thing that could talk to that plotter. That was around 2003-2004; VMs were not around much. I hope that the guy who worked there after me cloned the machine into an image.
2 main reasons in my view:
- Windows is the de facto standard for desktop and user management, so each corp has at least one guy used to the interface to do first-level debugging.
- Windows comes with support, Linux doesn’t, so corps don’t want to employ a Linux admin “just in case”. That’s the main reason I keep hearing from sysadmins I know.
Removed by mod
There’s plenty of support for Linux.
This is the way tech is. Once someone gets an idea in their head, it doesn’t go away. The IT guy at my job told me this about two years ago. OK, yeah buddy, it’s not like I used to sysadmin RHEL systems for years. No support contracts, or IRC rooms, or websites, or books, or man pages, no nothing.
I just stopped arguing with people about this stuff. I get a contract and I give them the best design I can come up with. They tell me they want to use some ancient thing and I give it to them.
Removed by mod
Oh, I believe it. I have a folder full of schematics named “for morons” on my work computer. It achieves most of the basic functionality of a modern design, except every part of it looks like it was made in 1994 or so. Your tax dollars at work, btw.
Did your buddy make that grass hill thing as well as the background?
Removed by mod
Others have given good reasons. I just wanted to point out that you are generally supposed to run this stuff on a dedicated computer that is firewalled or, preferably, air-gapped. And never patch anything unless absolutely required.
Purchasing decisions are often made by people who are not IT experts. They are heavily advertised to and lobbied by Microsoft. Also, the people who make purchasing decisions now won’t have to live with the consequences twenty years down the line. So they go with the easy option.
Removed by mod
The main reason is that there is no single Linux operating system. Linux is basically just the kernel. Everything else around that kernel, like tools, applications and libraries, is highly customisable and exists in the form of many different Linux distributions. The fact that these distributions differ so much from each other makes it almost impossible to certify industrial products for “the Linux” operating system. There are just too many variations of it.
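Even just asking “which Linux am I on?” is convention-dependent. A quick Python illustration, assuming a distro that ships the freedesktop /etc/os-release file (most modern ones do; older or minimal ones may not):

```python
# Sketch: identify the distro via /etc/os-release (a freedesktop/systemd
# convention; not guaranteed to exist on older or minimal distros).
import platform

def distro_info(path="/etc/os-release"):
    info = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    info[key] = value.strip('"')
    except FileNotFoundError:
        pass  # pre-systemd or minimal systems may not ship the file
    return info

info = distro_info()
print(info.get("NAME", "unknown distro"), info.get("VERSION_ID", ""),
      "on kernel", platform.release())
```

Two boxes that both “run Linux” can report completely different names, versions, libcs and package managers here, which is exactly what a certification body doesn’t want to deal with.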
At least the whole automobile industry runs on Linux (and now more and more on Android).
I don’t think it’s the marketing crap; business people like to have a warranty and someone to blame when something goes wrong.
Now, the GPL and MIT licenses explicitly state that there is no warranty, not even an implied one.
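For reference, the operative sentence in the MIT license (caps in the original): “THE SOFTWARE IS PROVIDED ‘AS IS’, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.” The GPL has an equivalent “Disclaimer of Warranty” section. If a business wants someone on the hook, they have to buy that separately, which is basically the Red Hat / SUSE business model.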
Linux is in a shitton of products nowadays, but people often don’t recognise it as such.
As for the DB - Windows 3.1 story, the explanation is simple: it’s age. Windows 3.1 is old as fuck, and while Linux is in theory one year older, it was a hobby project back then and, more importantly, did not provide any graphical interface at the time - which is what Deutsche Bahn used Windows 3.1 for, as a graphical interface. While Unix would have been an option, those systems were often hardware-vendor specific (AIX for IBM, HP-UX for HP), and the then-standard supplier Siemens-Nixdorf did not provide its own OS afaik. OS/2 was basically still Microsoft at that point, so there was little reason not to use MS.
The other point is the incredibly long development and usage times of industrial equipment - if you start designing a new high-speed train from scratch, it can easily take 15 years from start to finish - and the decision about which OS to use is made rather early on. That train will then be used for 30-40 years. The whole IT business will change A LOT during that time. And maybe you bring out a newer model but need it to be backwards compatible for some reason. And bam, you’re using Windows 3.1 in 2030.
As a matter of fact, I know of at least one nuclear plant still being controlled by a Digital Equipment PDP-11 and one conventional power plant controlled by a Robotron system.
Which brings us to the old “never change a running system”. If your application works under Windows XP - and usually these things do after so many years - why would you need a new OS? Unlike consumer systems, these systems are often in a walled garden or not connected at all, and there is literally zero reason to change them.
Nowadays things are different - hardware has many more outside connections, and Linux/Unix is in far more products than people think. I would personally even be fairly sure that it’s in more products than MS these days; since they got rid of Windows CE there has been a steep decline in their market share afaik.
Because someone in sales ordered it to be done that way.