Modern Middle Manager
Primarily my musings on the practical application of technology and management principles at a financial services company.
Yet Another TCO Article (YATCOA)

Saturday, January 04, 2003  

Another article on TCO, similar to ones I've read before. One of the key points in this one is how many servers an admin can handle, aka "span of control." If the span of control is wider for Linux than for Windows, that should be taken into account. So what does a typical systems admin do at the operating system and application level? Hmmm, let's see:

Installation
Backup & Recovery
Patches/Updates
Configuration
Performance Monitoring
Tuning
Troubleshooting

As a small/medium-sized business, we have approximately 20 servers. How do I look at the Windows vs. Linux span of control and TCO issues? I'll evaluate each category:

Installation
Most of our servers are virtualized, so we keep a single install image for Windows 2000 and one for Linux. Plug and play isn't important because the hardware is virtualized, appearing the same no matter what the underlying physical server really has. That being the case, Windows 2000 is at a disadvantage because of the need to generate a unique system identifier (SID) and a new name before putting the clone into the domain. However, we don't install servers often, so the time spent is truly minimal.
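On the Linux side, cloning from the master image really is just a file copy; the only per-guest work is giving the clone a name. A rough sketch of that first-boot rename, with the new and old hostnames made up for illustration:

    #!/bin/bash
    # One-time rename after booting a cloned Linux guest.
    # "app03" and "master" are placeholder names.
    NEWNAME="app03"
    echo "$NEWNAME" > /etc/hostname                  # Debian reads this at boot
    hostname "$NEWNAME"                              # take effect immediately
    perl -pi -e "s/master/$NEWNAME/g" /etc/hosts     # fix the old name in /etc/hosts

No SID to regenerate, no domain to rejoin -- copy, rename, boot.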
Verdict: Minor advantage for Linux.

Backup & Recovery
Our primary method of backup is a grandfather-father-son tape rotation. We use Veritas BackupExec for Windows 2000 to back up our important data, most of which resides on our network attached storage cluster. We also back up registry information from each server. No problem there -- the biggest issue will be the proliferation of Linux servers. How will we back up the configuration of multiple servers with multiple applications? BackupExec has a Linux agent if we decide to go that route; otherwise Linux makes it fairly simple to schedule the creation of a tarball that gets written to the NAS.
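A minimal sketch of that scheduled tarball, assuming the NAS is mounted at /mnt/nas and that /etc plus /usr/local/etc cover the config we care about (all paths are examples):

    #!/bin/bash
    # Nightly config backup: tar up the config directories and write the
    # archive to the NAS, one file per server per day.
    HOST=$(hostname)
    DATE=$(date +%Y%m%d)
    tar czf "/mnt/nas/configs/${HOST}-${DATE}.tar.gz" /etc /usr/local/etc

    # Scheduled from cron, e.g.:
    # 0 2 * * * /usr/local/sbin/config-backup.sh

Push that one script to every Linux server and the "multiple servers, multiple applications" worry becomes a directory of dated tarballs on the NAS.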

Recovery is the tricky part. Bare metal restores are not an issue for us because of the virtualized servers. However, restoring OS or application information is trickier in Windows. Why? That pesky registry. The Windows registry is a definitive solution to the question, "How can we make admins nostalgic for .INI files?"

What about when a registry tweak breaks the server on reboot? Or, on the Linux side, when a kernel change does? Hmm. Under Linux, chances are I've copied the working kernel and created an entry in LILO or GRUB to boot to it. Under Windows, I'd have to create a set of Emergency Repair Disks to fix the problem. Which is quicker: rebooting to a working kernel, or booting from CD and (hopefully) copying back the entire registry from the ERDs?
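For the curious, that fallback is just a second entry in GRUB's menu.lst (kernel versions and device names here are examples; LILO's lilo.conf works the same way):

    # /boot/grub/menu.lst -- keep the known-good kernel as a second menu entry
    title  Debian GNU/Linux (new kernel)
    root   (hd0,0)
    kernel /boot/vmlinuz-2.4.20 root=/dev/hda1 ro

    title  Debian GNU/Linux (known-good kernel)
    root   (hd0,0)
    kernel /boot/vmlinuz-2.4.18 root=/dev/hda1 ro

If the new kernel falls over, pick the second entry at the boot menu and you're back in business in one reboot.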
Verdict: Linux wins. There's no registry to deal with, just file restores and the ability to boot from an old kernel for recovery.

Patches/Updates
Windows 2000 normally gets raked over the coals in this department. With our purchase of St. Bernard's UpdateEXPERT, we've significantly cut down the time it takes to roll out server patches. However, that comes at an extra cost. Using Debian Linux, we can schedule an apt-get upgrade to pull in all of the needed patches.
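The scheduled upgrade amounts to a one-line cron job (the schedule and the decision to apply automatically rather than just download are ours to make; this is a sketch, not a recommendation):

    # /etc/cron.d/apt-upgrade -- refresh package lists and apply pending
    # updates nightly at 3 a.m. A cautious shop might download only
    # (apt-get -d) and apply by hand.
    0 3 * * * root apt-get update && apt-get -y upgrade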
Verdict: Because we needed to purchase a third-party product to perform mass rollouts of patches and updates under Windows, Linux wins. Patches are also available more quickly from the open source community than from Microsoft.

Configuration
Windows' tightly coupled GUI makes the administration of a single server an absolute dream. It makes the administration of 20 servers a pain. Better scripting and command-line tools would help here. The idea that regularly editing the registry to handle unusual configurations is acceptable boggles the mind.
Under Linux, editing a single file in the /etc directory is easy. When applications scatter their config files across multiple directories, making config changes gets harder. However, once you know where all the files are, making the same configuration change on multiple servers is easier with Linux using the scripting language of your choice, be it Perl, Python, or bash.
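For example, here's a bash sketch that pushes one edited file out to a handful of servers and restarts the affected service over ssh (the hostnames, file, and service name are made up):

    #!/bin/bash
    # Copy an updated config file to each server, then restart the service
    # that reads it.
    SERVERS="web01 web02 web03 db01"
    FILE=/etc/exports                      # example: the NFS exports file
    for host in $SERVERS; do
        scp "$FILE" "root@${host}:${FILE}"
        ssh "root@${host}" "/etc/init.d/nfs-kernel-server restart"
    done

The same loop works for any file under /etc. Try making the equivalent registry change on 20 Windows 2000 boxes without buying a third-party tool.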
Verdict: I hate the registry and lack of good scripting tools for server farms under Windows. Linux wins.

Performance Monitoring
Other than running top, I have no idea what performance monitoring tools exist for Linux. Windows has tools built in to monitor the performance of multiple servers. They're not as easy to use as I'd like, but they do exist.
Verdict: Windows 2000.

Tuning
OS tuning seems better under Linux: Windows 2000 offers some GUI options, but they're limited unless you're willing to play with the registry. Tuning applications generally seems easier under Windows on an individual-server basis because of the GUI. However, making the same SQL server change on 5-10 servers will take less time under Linux.
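If we did have a row of database servers to tune (today we don't), the Linux version of "make the same change on 5-10 servers" would look something like this, assuming something like MySQL on the Linux side (the hostnames and the parameter are made up):

    #!/bin/bash
    # Bump one buffer setting in my.cnf on every database server, then
    # restart MySQL to pick it up.
    for host in db01 db02 db03 db04 db05; do
        ssh "root@${host}" "perl -pi -e 's/^key_buffer.*/key_buffer = 64M/' /etc/mysql/my.cnf && /etc/init.d/mysql restart"
    done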
Verdict: In our environment it's a tie. OS tuning belongs to Linux, but because we don't have multiple SQL or web servers, the scripting options aren't as important.

Troubleshooting
The ability to proliferate single-purpose servers cuts down on the number of variables when troubleshooting. Which is easier to troubleshoot: a server performing only print services, or a server performing print, directory, DNS, file, web, SQL and Exchange services with the attendant DLL hell? I can virtualize enough servers to create single-purpose virtual machines. Doing that with Windows 2000 costs me another license for each server; under Linux I face no such licensing cost.
Verdict: Linux by virtue of its licensing scheme and lack of DLL hell.

By proliferating single-purpose servers and managing them via scripts, it appears we can increase the span of control of our systems admin. The additional stability and lower cost of the operating system, plus the advantage in recovering from mistakes, suggest a lower TCO for Linux in our organization.

posted by Henry Jenkins | 1/04/2003 05:29:00 PM
