4 Myths Propagated by Delusional Linux Enthusiasts

I’ll be up front about it. I’m not a fan of Linux or any other Unix variant. That includes BSD and Debian and Android and all the other little pet names they give the countless distributions of Unix. Over the years, Linux enthusiasts have loved to tout all the reasons their beloved open source operating system is superior to Windows, citing “stability” and “security” as two of the biggest.

Now, from the perspective of a guy who works mostly with Windows but is forced to tolerate Unix variants from time to time, I have to say that the reasons people choose Unix over Windows are completely baseless, especially in these modern times, with just one notable exception: Windows has a price, and open source operating systems are typically free.

In the grand scheme of a company with even a small operating budget, Windows is far from expensive, even for the high-end server editions. It could be said that part of the reason Windows isn’t terribly expensive is that the open source operating systems of the world are there to compete with it… so in that respect, it is good that there is competition.

But I’m a software developer, and frankly, I like to get things “done”. Honestly, Windows is a far better tool for getting things done than any Unix-based OS ever was, and I feel I must defend it against the many baseless attacks that hackers love to throw at it. Here are some of them:

Myth #1: Windows is less advanced than Unix/Linux operating systems.

This is just absolutely false, and was only ever true when the comparison was against Windows 95/98/Me. For a long time, there were two very distinct flavors of Windows. The first was the basis for all the consumer editions of Windows and was, for lack of a better term, a total piece of crap. Windows 95 was a rehash of much of the source code that brought you the delights of Windows 3.0, 3.1, and 3.11, a.k.a. “Windows for Workgroups”. Everyone serious about computers knows and fully admits that the line of Windows products built on top of that consumer-grade kernel was absolutely terrible and unstable. I won’t go into the technical details of “why”, but the bottom line is that Microsoft needed to ship a product that was compatible with games like “Doom” and played well with graphics and sound cards, and forcing everything through a proper Hardware Abstraction Layer (HAL) would have created two problems: 1) the system would have been slower, and 2) 90% of the consumer hardware out there would no longer have been supported.

The other flavor of Windows was “Windows NT”. It took some time for Microsoft to finish the code that would let the NT kernel play games and do all the cheesy things consumers wanted. This was because Windows NT was far more sophisticated than the old Windows and required software developers to go through the Hardware Abstraction Layer (HAL) to access any device in the system. The people who made cameras, sound cards, and gaming graphics cards didn’t initially know how to do all that, and loved to bend and break the rules in ways that Windows 95/98 would allow… making a mess of everything. It was a real chicken-and-egg problem to get consumers away from the old operating system.
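
To make that distinction concrete, here is a minimal C sketch of what “going through the HAL” means in practice on the NT line: user-mode code opens a handle to a device object exposed by a driver, and the kernel mediates everything below. The device path here is purely illustrative (opening it actually requires administrative rights); this is a sketch of the model, not any particular driver’s interface.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* On NT, user-mode code never touches hardware directly. You open a
       device object exposed by its driver and talk to it through a handle;
       the kernel and HAL mediate everything underneath. */
    HANDLE hDevice = CreateFileW(L"\\\\.\\PhysicalDrive0", /* illustrative path */
                                 GENERIC_READ,
                                 FILE_SHARE_READ | FILE_SHARE_WRITE,
                                 NULL, OPEN_EXISTING, 0, NULL);
    if (hDevice == INVALID_HANDLE_VALUE) {
        printf("CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    /* The direct port I/O instructions that DOS-era and Win9x programs
       used to poke sound cards raise a privileged-instruction fault here;
       only kernel-mode drivers may execute them. */
    CloseHandle(hDevice);
    return 0;
}
```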

Why do so many people think Windows XP was the best Windows ever? Well, because Windows XP was the first consumer release of Windows to feature the NT kernel. The kernel is the heart of the operating system… Windows Me was the last release built on the old kernel, and also the most hated Windows. The old kernel just couldn’t keep up with new demands… it had to go.

Regardless of what consumers were getting on their new laptops, Windows NT was available all along for anyone who had a real use for it… though that mostly meant businesses running servers and doing things most people would find boring. Most of the fancy gizmos weren’t widely supported on NT originally, but NT 3.51 was there as a professional companion to Windows 3.11 almost from the beginning, and Windows NT 4.0 was there to complement Windows 95. The better option (NT) was always there, but people loved to complain about the unstable nature of the consumer operating systems, Windows 95/98, and loved to compare them to Linux and tout how superior Linux was to crappy old Windows 95.

Linux, after all, had a few things that Windows 95 did not:

1. Preemptive multi-tasking
2. Multi-CPU support
3. The ability to run your monitor/display over the network.
4. A hardware abstraction layer.
5. User-space/kernel-space isolation (allows you to reliably kill running/hung tasks without crashing the operating system; see the sketch just after this list).
6. Better security for the paranoid and/or careless.
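
Item #5 deserves a quick illustration. On an NT-class kernel, killing a hung process is an ordinary, safe user-mode operation, because each process is walled off from kernel space. Here’s a minimal C sketch against the Win32 API; the PID is a made-up example.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD pid = 4242; /* hypothetical PID of a hung process */

    /* Ask the kernel for a handle with just enough rights to kill it. */
    HANDLE hProcess = OpenProcess(PROCESS_TERMINATE, FALSE, pid);
    if (hProcess == NULL) {
        printf("OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* Forcibly ends the process; the kernel reclaims its resources.
       The OS itself is unaffected -- that's the isolation at work. */
    if (!TerminateProcess(hProcess, 1))
        printf("TerminateProcess failed: %lu\n", GetLastError());

    CloseHandle(hProcess);
    return 0;
}
```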

However, if you compare that list against the features available in Windows NT at the time… well… there was really only one thing Linux had that Windows didn’t: #3, the ability to run your monitor/display over the network.

Microsoft was fully aware that it needed a good remote-display technology, and with the help of a company called Citrix, it implemented the gold standard for the industry, “Remote Desktop”, in the late ’90s. Originally it was called “Terminal Services” and allowed potentially dozens of people to connect to the same computer remotely, each with a private, personal desktop and set of running applications. It has since evolved even further and is one of the greatest features of Windows, in my opinion. It will even carry sound, video, and 3D applications, is incredibly responsive even when I connect over my 3G Android phone, and beats the pants off anything X-Windows, or X-Windows over VNC (a typical configuration for Linux), has to offer. It truly is the gold standard; I run remote desktops to 15+ servers all day long, and everything works, virtually all the time.
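
The Terminal Services machinery is also programmable. A rough C sketch of enumerating the sessions on a machine through the WTS API, the same layer Remote Desktop sits on (error handling kept minimal for brevity):

```c
#include <windows.h>
#include <wtsapi32.h>
#include <stdio.h>
#pragma comment(lib, "wtsapi32.lib")

int main(void)
{
    PWTS_SESSION_INFOW pSessions = NULL;
    DWORD count = 0;

    /* List every session on the local machine: console, RDP, services. */
    if (!WTSEnumerateSessionsW(WTS_CURRENT_SERVER_HANDLE, 0, 1,
                               &pSessions, &count)) {
        printf("WTSEnumerateSessions failed: %lu\n", GetLastError());
        return 1;
    }

    for (DWORD i = 0; i < count; i++)
        wprintf(L"Session %lu: %ls (state %d)\n",
                pSessions[i].SessionId,
                pSessions[i].pWinStationName,
                (int)pSessions[i].State);

    WTSFreeMemory(pSessions);
    return 0;
}
```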

Compared to what Linux has to offer in this category… well… X-Windows is a horrid pile of junk. Ask anyone who has ever had the displeasure of touching the X-Windows APIs… they are horrid, in dire need of replacement, and so awful that most of your Linux guys prefer to do everything on the command line, as if it were 1982 and we were all writing DOS apps. X-Windows typically works a bit better on a local machine, but its remote-display-over-the-network capabilities have fallen dramatically into disrepair, and in my opinion X-Windows is the absolute worst thing about the Unix-based OSes. Apple was smart enough NOT to use it when it built OS X on its BSD-derived base, choosing instead to roll its own GUI and driver architecture from scratch. Android does not use X-Windows either. Using it would have been a big mistake. Yet virtually every other flavor out there still uses X-Windows… including the big popular ones like Ubuntu, Fedora, Red Hat, SUSE, Mandrake, etc.
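
For a taste of the verbosity being complained about, here is roughly the minimum Xlib ceremony required just to get an empty window on screen, assuming an X11 development environment (build with `cc demo.c -lX11`):

```c
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL); /* connect to the X server */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     0, 0, 320, 240, 1,
                                     BlackPixel(dpy, screen),
                                     WhitePixel(dpy, screen));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    /* Even "do nothing" requires hand-rolling an event loop. */
    XEvent ev;
    for (;;) {
        XNextEvent(dpy, &ev);
        if (ev.type == KeyPress)
            break;
    }

    XCloseDisplay(dpy);
    return 0;
}
```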

Myth #2: Windows is less stable than Unix/Linux

Even if I don’t take into account the fact that Windows machines do MUCH MORE than your average Linux machine, and therefore have many more opportunities to crash, Windows is still far, far more stable than any Linux machine I have ever seen. Sure, Windows 98 crashed 10 times a day, and NT 3.51 and 4.0 had some growing pains with hardware vendors, but generally speaking, any problems you have with Windows these days are the result of the companies that supply your hardware and their terrible drivers.

When you think about the daunting task of integrating your operating system with virtually every piece of hardware ever made, from sound interfaces to keyboards, mice, web cameras, network adapters, touch pads, 3D graphics cards, and tablet devices… it is staggering what a good job Microsoft has done at getting all these hardware vendors on the same page. Drivers now typically go through “WHQL certification” (Windows Hardware Quality Labs) and are checked out by a team of Microsoft QA engineers before they end up on your computer. This is a massive operation and a feat that the open source Linux community has no real hope of ever organizing with its rag-tag group of rebel hippy coders. There are just some things that are best left to companies with an incentive to profit.

Getting hardware working with Linux is generally a painful operation. Getting video drivers to run at all can be a two-day ordeal involving many posts and complaints on internet forums… plug and play? Heh… plug and PRAY.

My Windows servers never go down unless I do something stupid like pull the plug accidentally.   This is all despite the fact that they are running software that is 100 times more complex than anything I have ever demanded of my Linux machines.  My Linux servers go down on their own maybe once a month… just because… despite the fact that they do very, very little in comparison to my Windows machines.

I literally run one Linux server for the sole purpose of forwarding HTTP traffic to internal servers on the LAN. That’s all it does… it runs one service that performs virtually the same operation over and over. My websites don’t even get much traffic… but it still crashes once a month. Another Linux flavor I run is IPCop… which does nothing but act as a firewall, keeping out unwanted traffic and forwarding wanted traffic. It also builds an IPSec tunnel between my office and my home, where I run another IPCop instance. All of this runs on virtual machines (so hardware can’t really be the problem), yet they both crash once a month on average, and I get calls from my coworkers complaining “the internet is down!” Whoops! And there haven’t been any updates or patches for IPCop in many months.

Another experimental machine I have is an Ubuntu distribution. It also runs virtually, and does virtually nothing. I have used it only a few times, just to get a feel for the Ubuntu experience… I maybe browsed a few web pages on it… installed Chrome, Chromium, and Lazarus… that’s about it. I guess it didn’t really like sitting idle… because I came to my virtual host one day to find the Ubuntu machine consuming all the CPU resources available to it and robbing performance from all the other virtual machines running on the box. After that happened 3 or 4 times, I shut down Ubuntu permanently.

The list goes on… one of my company’s websites was abandoned when the CentOS instances on our Amazon cloud decided to stop functioning for whatever reason. Despite the fact that we spent over $100,000 building the site… we just let it stay down.

My Windows machines, on the other hand, do all kinds of crazy stuff. In addition to running a full GUI, I have servers doing lots of server-related tasks… hosting and transcoding media, running databases and source control servers, distributing files to offsite office branches, and hosting terabytes of files for multiple users. Additionally, the desktop flavors of Windows run the latest games, like StarCraft II and SimCity, digital audio recording packages like Sonar and Reaper, sophisticated development tools like Visual Studio, Delphi XE5, Atmel Studio 6.1, MonoDevelop, and Unity 3D, web browsers with typically 15 windows open at a given time, Photoshop, and advanced 3D modelling and rendering suites such as 3D Studio Max… or sometimes they’re just playing Netflix while I type up blogs. So on the stability front, I give Windows a good solid “A” and I give Linux an “F-”.

There is zero truth to the myth that Linux is more stable than Windows. Any such claim is absolutely baseless when the comparison is against an NT-based operating system. It is simply not fair to compare Linux to Windows 95… and it is not fair to compare it to machines that DO LESS. Sure, if I turn on my computer and just leave it sitting at the BIOS config screen forever, it’ll probably run until the year 2100… but people actually want to do things with their computers other than just type at command prompts, you know!? I can’t even get stable performance from the Linux equivalent of “notepad” on half my Linux distros.

Myth #3: Linux is more secure. 

It is hard to measure this, but I would definitely say it is a myth. We have to keep in mind that Windows, being the dominant operating system on the market and the one most used by ordinary people, is the target of many attacks. If a hacker wants to compromise your system, he’s going to attack the operating system you’re most likely using… which is Windows. Windows has had its share of security flaws, but then again, so has Unix/Linux. I shouldn’t need to remind anyone of the Morris worm, which spread by exploiting holes in Unix sendmail and fingerd, or of the “Slapper” worm that went after Apache/OpenSSL on Linux servers. Most little piddly internet appliances run some ghetto Unix variant, including the wireless routers in your house (ask yourself how often your wireless router needs to be rebooted… mine needs one quite frequently).

Windows has had a checkered history with security, but Microsoft really aimed to step up its game with Windows Vista. At the core of the Unix security philosophy was the idea that any modification to the system had to be done with special user permissions, requiring a login with administrative privileges. Windows Vista was the first Windows operating system to adopt a similar philosophy. People disliked Vista for this very reason, as the new security features got in the way of many applications running properly (until they were updated to support the new security scheme). I personally still don’t like this “feature”, called “User Account Control” (UAC), and just choose to disable it. UAC makes me feel like I’m locked out of my own computer, as if I’m not welcome there. Since I know what I’m doing, I feel like I don’t need it… and even without UAC, I’ve managed a virus-free existence for the last 10 years or so.

As a programmer, however, I am still forced to play nice with UAC when releasing products targeted at consumers, who are assumed, by default, to have UAC enabled. When the concept of UAC was first imposed on developers, it made a lot of people unhappy, because it forced them to use special per-user folders to store writable data… something they weren’t used to… which also translated into perceived instability of the OS in the eyes of end users. It wasn’t unstable, though… software companies just needed time to adopt the new system. By the time Windows 7 came around, most companies had embraced the idea, and so Windows 7 was met with far less public grief.
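
For the curious, here is a minimal C sketch of the UAC-friendly pattern: asking Windows for a per-user writable location via SHGetKnownFolderPath, instead of scribbling into Program Files the way pre-Vista software did. This shows the general idea, not any particular product’s layout.

```c
#include <windows.h>
#include <shlobj.h>
#include <stdio.h>
#pragma comment(lib, "shell32.lib")
#pragma comment(lib, "ole32.lib")

int main(void)
{
    PWSTR path = NULL;

    /* FOLDERID_LocalAppData resolves to something like
       C:\Users\<name>\AppData\Local -- writable without elevation. */
    HRESULT hr = SHGetKnownFolderPath(&FOLDERID_LocalAppData, 0, NULL, &path);
    if (SUCCEEDED(hr)) {
        wprintf(L"store writable data under: %ls\n", path);
        CoTaskMemFree(path); /* the shell allocates the string for us */
    }
    return 0;
}
```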

UAC helps to keep the trojans away: any application that masquerades as something legit but has sinister motives would have to ask your permission before modifying system settings to do many of its dirty deeds. These days, therefore, most viruses rely on human manipulation. I might send you an email claiming my name is “John Smith”, and luck might have it that since “John Smith” is such a common name, you click on something you shouldn’t because your best buddy in the world is named “John Smith”… and then you’re compromised. It is for these reasons that .EXEs are blocked from email attachments and other such nonsense… they’re security features to prevent people from doing stupid things. I maintain about 35 Windows machines currently, and I have encountered only a single virus in the last decade.
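
A well-behaved program can even ask whether UAC has actually granted it administrative rights, rather than assuming it has them. A short C sketch using the process token (the printed message is just for illustration):

```c
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "advapi32.lib")

int main(void)
{
    HANDLE hToken = NULL;
    TOKEN_ELEVATION elevation = {0};
    DWORD size = 0;
    BOOL elevated = FALSE;

    /* Query our own access token for its elevation state. */
    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &hToken)) {
        if (GetTokenInformation(hToken, TokenElevation, &elevation,
                                sizeof(elevation), &size))
            elevated = (elevation.TokenIsElevated != 0);
        CloseHandle(hToken);
    }

    printf("running elevated: %s\n", elevated ? "yes" : "no");
    return 0;
}
```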

The bottom line is that Windows is just as secure as Linux, and arguably more secure when you consider how much abuse it has to tolerate. Who is more secure: a naked guy being attacked with a small knife, or a guy in a Kevlar vest being attacked with a 9mm pistol?

Myth #4: Linux is faster than Windows

If Linux feels faster than Windows at times, it is only because it is doing LESS. Just because you’re doing less work doesn’t mean you’re doing that work faster. And if you think Windows does too much and therefore wastes computing resources, you should be aware that, to appease the script kiddies out there, Windows can now be installed as “Server Core”, which features none of the graphical gobbledygook you find bloated and useless… although really, the graphical stuff, by today’s standards, isn’t a huge burden on modern servers. Windows has always been designed for next-gen hardware, which is one of the reasons it has stayed ahead of the game while Linux plays catch-up. With that in mind, Windows has a lot of features designed to take advantage of systems that DO have lots of RAM to spare, including improved searching and caching. These features might annoy you if you’re running outdated hardware, because they slow you down and consume RAM that might be precious to you… but they make new machines purr like kittens.

Increased RAM requirements, of course, translate directly into increased datacenter costs, particularly when you’re running virtualized cloud servers.   But again, if you’re running in those scenarios, give Server Core a try.

Conclusion

Windows… is just better. It is a better tool: easier to use, just as secure, more stable, arguably faster, and it has a zillion more features and can be used in a zillion more ways. Plus, it has the most sophisticated development tools on the market, hands down. I really see no reason to use anything else.

