Modern Middle Manager
Primarily my musings on the practical application of technology and management principles at a financial services company.
2003 Predictions

Tuesday, December 31, 2002  

The end of the year would be incomplete without predictions. Below are my prognostications for 2003, first for my company, then for the IT industry at large.

My Company
1. End-users maintain the belief that computers are supernaturally complex and that training courses would only confuse them.
2. Senior management continues to get its business ideas from four-color glossy brochures.
3. Halfway through the year I will be expected to reduce my budget by 5%. Fortunately, it's the number I already had in mind when I created my budget.
4. Open source will represent 50% of the data center because the word "free" works like magic on management.
5. We finally roll out Windows 2000 and Office XP to the desktop, thus answering the question, "What do we do with all this spare computing power?"
6. A major external intrusion is attempted on our network. Fortunately we took the entire data center down that weekend for dusting and to color-coordinate patch wires.

The Industry
1. Microsoft purchases IDG, META Group and Gartner. All new research studies show that Linux is a carcinogen. Even second-hand.
2. DVD+RW capacities are increased to 25GB by Chinese pirate groups who promptly distribute the entire movie career of Christopher Lee.
3. The wireless Internet will be ubiquitous in major metropolitan areas. It, too, will be listed as a carcinogen by year-end.
4. A class-action lawsuit will be filed against Bill Gates for pain, suffering and mental distress inflicted upon individuals worldwide. Bill will win and file a counterclaim.
5. Scott McNealy will finally get his teeth fixed.
6. Larry Ellison will be chosen as North Korea's new stable, rational leader.
7. Customer relationship management (CRM) software will be merged with enterprise resource planning systems (ERP) and human resources information systems (HRIS) to make the ugliest abbreviation in existence. It will only be linked with heart disease and glaucoma.
8. Software patents will make all new development impossible, except in countries where patents are laughed off (see prediction #2).
9. Tort lawyers will continue to grow rich and will create books along the lines of, "Rich Lawyers, Poor Schlubs".

posted by Henry Jenkins | 12/31/2002 10:37:00 AM
(0) comments
Annual Rituals

Monday, December 30, 2002  

This article reminds me of an annual ritual I participate in. It's called The Whitepaper that Justifies My Job. Once a year for the past three years I write an analysis to senior management on why my department should not be outsourced. My competition is our corporate parent's IT contractor company. They are willing to provide, at no cost, IT services to our division. On paper it would reduce my department's expense by about 1/3. That's obviously a significant level of savings.

So why does my department survive year after year? How come I haven't joined the ranks of unemployed IT department managers in a (relatively) ugly job market? How can I compete against free? My strategy for success lies in the following points:

1. Provide a premium service to the end-users. We are very careful to not alienate our end-user computing base. We remediate their problems quickly, treat them with respect, keep their applications up and make certain we don't disrupt production operations during the day (and notify them ahead of time when nightly or weekend downtime is scheduled). I have no doubt that my user-satisfaction ratings are over 85%. We have individualized service because we know each and every one of our internal clients.

2. Foster relationships with other department heads. I believe in fostering a partner mentality with the business units, not an insular glass house mentality that end-users resent and management doesn't understand. My goal as the department manager is to get the regional vice presidents and department heads to sing our praises to their senior managers. In return, my department makes certain their requests get our full attention. Everybody's happy, everybody wins.

3. Emphasize that we understand the business. Our division has two major lines of business: wealth management and (very niche) commercial banking. We understand the systems, the data, the workflow and the information required by the various departments. We understand the regulations and guidelines for which we are accountable. The IT contractors we compete against do not understand wealth management, they do not understand banking -- they understand the title & escrow processes that drive the majority of our parent company's revenue.

4. Remind management that you get what you pay for. Unlike an IBM or EDS competing in the free market, I am up against a corporate-mandated provider of IT services. I use that to my advantage, pointing out their (numerous) failures and the dissatisfaction among end-users at other divisions. My best leverage is the reputation they have with a sister division we depend upon for investment services. There the IT contractors have made for themselves a reputation for sloppy service, unreliability, lack of business knowledge and rotating contractors out every 3 months, guaranteeing that none of them understands the business for very long, if they get it at all.

5. My newest weapon is benchmarking. I downloaded some research from the META Group that illustrated some great metrics such as % of revenue spent on IT, IT employees as percentage of total employees and IT spending per employee for specific industries. You may want to argue the substance of those benchmarks. However, if you fall on the right side of those metrics and senior management is getting its IT information from blurbs in a four-page newsletter, it may be the deciding factor in your favor.
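
For the curious, the benchmark math itself is trivial. Here's a minimal Python sketch with placeholder figures I made up for illustration (none of these are my company's actual numbers); plug in your own budget and headcount:

```
# Back-of-the-envelope IT benchmark calculator.
# Every figure below is a made-up placeholder, not my company's actuals.

revenue = 40000000.0        # annual division revenue, $
total_employees = 100       # division headcount
it_budget = 1100000.0       # annual IT operating + capital budget, $
it_employees = 5            # IT headcount

it_pct_of_revenue = it_budget / revenue * 100
it_staff_pct = it_employees / float(total_employees) * 100
it_spend_per_employee = it_budget / total_employees

print("IT spend as %% of revenue:   %.1f%%" % it_pct_of_revenue)
print("IT staff as %% of headcount: %.1f%%" % it_staff_pct)
print("IT spend per employee:      $%.0f" % it_spend_per_employee)
```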

I'm not 100% against outsourcing. I think there is a time and place for it, which I may explore in another post at another time. However, I am 100% against short-sighted, cost-cutting measures that will eventually cause major disruptions to a business, especially a small to medium-sized one.

posted by Henry Jenkins | 12/30/2002 05:39:00 PM
(0) comments
Communications Breakdown

Sunday, December 29, 2002  

Nothing is more difficult to fix than communication breakdowns. They are probably the #1 cause of irritation, frustration and aggravation in the entire organization. I believe those breakdowns can be narrowed down to a few basic root causes in the workplace:

1. Personality differences.
2. Impatience.
3. E-mail.

Personality differences speak for themselves. Our perspective informs our reading and listening comprehension. If you and the person you're communicating with aren't having a meeting of the minds, there's no basis for understanding. The most common occurrence I see at work is with humor. It is amazing how easy it is to piss someone off when you think you're being funny and the other person, for whatever reason, is in no mood. I have two team members who cannot communicate at all with each other in a humorous fashion because they don't understand when the other is being funny. I personally believe in cases like that it's best to take a page from the FAA rules: don't try to be funny unless the tower is. Until those basic misunderstandings can be worked through, don't push the issue. There are a number of books and programs in the field of organizational development to help work through personality conflicts and help people understand how they act & react with others.

Impatience is another problem. When people aren't listening, they aren't going to communicate well. The easiest way to work through this is either trying to put them at ease (if the issue needs to be discussed now) or to wait for a more opportune time (if it doesn't). Pretty self-evident.

E-mail is a major offender. I have a simple rule -- if it takes more than three e-mails back and forth to settle an issue, pick up the phone. Too often I see problems linger in e-mail that should have been cleared up by a phone call or meeting. When those result in me having to meet with another department head to clear up an issue between two of our staff members, the miscommunication has gone on far too long.

posted by Henry Jenkins | 12/29/2002 11:04:00 PM
(0) comments
What You Don't Test Won't Work  

The title is a very simple principle I hold dear and expect to be executed in every task and project. Modern systems are far too complex to expect them to perform without flaw when even a simple change is made. Swapped one switch for another? You'd better test. Changed a DHCP attribute? You'd better test. Modified a script you wrote and are dead certain will work the way you think? You damn well better test it thoroughly.

I bring this up because of an article written on another blog that expounds on one previously written here (scroll down if the permalinks aren't working. Sigh). The advantages of the mini-project are:

1. Reducing the span of control. Span of control in project management deals with the number of people on the project. The fewer involved, the higher the quality control and the better the communication between project members.

2. Reducing time to deliver. By producing deliverables quickly, management feels (yes, feels) like the project is moving forward quickly.

3. Reducing variables. Cutting down on complexity means that the team can do more comprehensive testing of each deliverable both before and after being moved into production. Anecdotally, I can say it's been rare that a system tested before and after being moved to production has subsequently failed. However, I have seen failure rates of godawful proportions with systems moved into production and not tested. Hence the mantra I've developed that's featured as this article's title. What you don't test won't work. And what you do test, if it's too complex, probably won't work, either.

The disadvantage to the mini-project is political. As ArmedLiberal mentions, the return on investment (ROI) is much smaller when the grand project is reduced to mini-projects. So if the grand vision will save $3 million over three years, it may be that those savings get pushed further out by breaking the work into mini-projects. Why is that a political problem? Because senior management may actually believe a large-scale project can be managed effectively enough that the big payoff arrives sooner than real results spread out over a longer timeframe. I will postpone a discussion of my company's political landscape and the mines hidden therein until later.

posted by Henry Jenkins | 12/29/2002 01:00:00 AM
(0) comments
Business Value

Saturday, December 28, 2002  

An AP article published on Christmas discusses how Microsoft's latest marketing strategy against Linux is changing to the concept of "business value." About time, I say. Let's examine the article's contents.

Companies including Sun Microsystems and IBM are rolling out products and creating business models based on Linux.

And Microsoft has seized on that development - and points out the technological expertise and labor needed to tailor Linux to companies' needs - in arguing that free isn't really free.

Linux can require costly technical staff, said Rob Enderle, an analyst with Giga Information Group.

``You lose the ability to buy something and plug it in,'' he said. ``It takes you more time to do it. If the (Linux expert) leaves, you could be left with something that's unsupportable.''


I have submitted quotes to reporters and seen things in print that never came out of my mouth. Nevertheless, either the reporter is incredibly biased for Microsoft or Mr. Enderle is a shill. Here's a newsflash for you, Rob -- technical personnel, be they Windows, Linux, Solaris, Cisco or IBM experts, are expensive! And if an IT person is just "plugging in" a Windows server they should be fired and replaced with someone who is competent. I admit that my perspective is from a small business of 100 users. I certainly expect that if I'm cross-training in my organization then enterprise IT departments are as well, which makes this line, um, unsupportable.

The company even commissioned a study, by International Data Corp., concluding that in network infrastructure, file serving, print serving and security workloads, Linux-based servers cost more to run than Microsoft Windows 2000 server software over a five-year period. The report cites the staffing costs as the biggest reason.

I've gone over this in previous posts. This IDC study was a crock and assumes two things that aren't true: 1) that the Windows OS won't be upgraded over that five-year period and 2) that licensing costs don't need to be adjusted upwards to be in line with Microsoft's Licensing 6.0. With the additional software expense and training costs of a Windows OS upgrade, I wouldn't hesitate to say Linux slaps Windows about the head and leaves it unconscious. Skepticism has been voiced by resellers, journalists, professionals and even IDC itself. Of course, the whole concept of TCO is put in its place in this interview with one of the IDC study's authors.

``To make that argument it really needs to be made by practitioners, not by the vendor itself,'' the analyst said. ``To make it stick you really need company (information technology) managers to stand up.''

I had to search long and hard to find any articles that had positive feedback from IT professionals. Here's one that's rather tepid and expects Linux to be more cost-effective over the next five years. Hmm...why aren't there more? I looked through the first 100 articles Google presented when searching for "IDC TCO linux microsoft".
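
To show what I mean about the assumptions driving the answer, here's a toy Python sketch of a five-year, per-server comparison. Every dollar figure is invented for illustration -- the point is what happens once you add a mid-cycle OS upgrade and Licensing 6.0-style costs to the Windows column, not that these particular numbers are right:

```
# Toy five-year, per-server TCO comparison. Every dollar figure is invented for
# illustration -- the point is what happens when a mid-cycle OS upgrade and
# Licensing 6.0-style costs are added to the Windows column, not that these
# particular numbers match the IDC study.

YEARS = 5

def tco(license_cost, upgrade_cost, annual_staff, annual_support):
    # Up-front license + any mid-cycle upgrade + recurring staffing and support.
    return license_cost + upgrade_cost + YEARS * (annual_staff + annual_support)

# The study's implied scenario: one Windows license, untouched for five years.
win_static   = tco(license_cost=4000, upgrade_cost=0,    annual_staff=12000, annual_support=1500)

# The scenario I consider realistic: an OS upgrade mid-cycle, Licensing 6.0-style
# fees, and retraining folded into the staff line.
win_upgraded = tco(license_cost=4000, upgrade_cost=3500, annual_staff=13000, annual_support=1500)

# Linux: negligible license cost, somewhat higher assumed staff cost.
linux        = tco(license_cost=0,    upgrade_cost=0,    annual_staff=13500, annual_support=1000)

for name, cost in [("Windows (no upgrade)", win_static),
                   ("Windows (upgrade + Licensing 6.0)", win_upgraded),
                   ("Linux", linux)]:
    print("%-36s $%d over %d years" % (name, cost, YEARS))
```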

At the end of the day the only "business value" I'm concerned with is for my own company. I'm agnostic in the OS wars, concerned only with what provides the best bang for the buck. Six years ago I led the charge to move away from Novell to Microsoft because they had something vital -- a platform to run the applications we needed to add value to our business. Today I believe that Linux occupies that same position. While I don't see Microsoft fading into obscurity like Novell, I do see a return to a heterogeneous network and "best of breed" for the foreseeable future.

posted by Henry Jenkins | 12/28/2002 10:43:00 PM
(0) comments
Dangerous Minds  

The last post reminded me of something that I rarely see in TCO comparisons -- the cost of training on Microsoft products (including certification) vs. the cost of training on Linux products. Windows knowledge appears to be considered ubiquitous and is therefore never factored in. What's interesting is that everyone who finally does achieve some level of expertise in Windows (be it 2K, XP, et al) has much of that knowledge obsoleted in the next version of Windows. .NET Server appears to be no different. If Windows is expected to change every three years, staff knowledge should be considered to depreciate along the same lines as hardware. That means there's a skill upgrade cost that never gets mentioned. Although Linux can be intimidating for those who haven't been exposed to it, the good news is that once you've learned the basics of the install, the command-line tools and whichever GUI you've standardized on, it doesn't change much. The applications on top of it might change often but the underlying system is not going to undergo radical revision like Windows.
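
If you buy the depreciation analogy, the math is straightforward. A minimal sketch with hypothetical training costs: the three-year Windows cycle comes from the argument above, while the longer Linux horizon is my own assumption.

```
# Treating staff knowledge like a depreciating asset: spread the retraining
# outlay over the years it stays useful. All figures are hypothetical; the
# three-year Windows cycle comes from the argument above, the longer Linux
# horizon is my own assumption.

def annualized_training(cost_per_admin, admins, refresh_years):
    # Straight-line: total retraining cost divided by its useful life in years.
    return cost_per_admin * admins / float(refresh_years)

windows = annualized_training(cost_per_admin=3000, admins=4, refresh_years=3)
linux   = annualized_training(cost_per_admin=4000, admins=4, refresh_years=8)

print("Annualized Windows retraining: $%.0f per year" % windows)   # $4000
print("Annualized Linux retraining:   $%.0f per year" % linux)     # $2000
```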

This is one of the tremendous advantages of using Linux in the back office. Once the staff has learned Linux basics they don't need to relearn it every three years. DNS is a separate application. LDAP is a separate application. Apache is a separate application. SMB file/print sharing...well, you get the point. Yes, these apps will change often. On the other hand, if all you want to do is upgrade to the latest Apache, you don't have to upgrade the whole damned OS and get 1000 new features you didn't need but must master or else an interdependency somewhere may cause some unintended consequence you spend a day troubleshooting. I guess that's one way to enforce learning the features of the OS but I prefer a somewhat less totalitarian method.

posted by Henry Jenkins | 12/28/2002 10:16:00 PM
(0) comments
2003 Initiatives  

Some IS departments feel the need to hit a thousand targets all at once, accomplishing none. I believe in a few initiatives that actually get done. For the first quarter of 2003, the top goals of my department will most likely be:

1. Security -- continuing the work started with the GLB Act. Tighten down access to information, audit more heavily, continue and expand awareness programs.
2. Change Management -- with new servers and applications come many more ways to screw everything up. Better controls than what exist now are necessary.
3. Application Development Infrastructure -- move to open-source application infrastructure for all the reasons in my last post.
4. CRM Upgrade -- move to web-based system from current client/server app. Integrate into portal, modify to provide more comprehensive picture of client.
5. Imaging System -- outsource existing imaging system to web-based platform. I want my team concentrating on creating business value, not wasting time maintaining apps like this.
6. Thin Clients -- instead of upgrading our desktops to Windows 2000 and purchasing new systems where necessary, move to a Linux/Citrix ICA desktop for most of the company, running apps on a farm of Metaframe XP servers.

My next task is to market these ideas to the COO and get them approved. That happens on Monday. Should be exciting.

posted by Henry Jenkins | 12/28/2002 12:26:00 AM
(0) comments
May I Be Excused? My Brain Is Full.  

As I ponder the direction of the IS department over the next year, I look towards the conversion of much of our backoffice to open source. This has two immediate impacts on the staff -- they have to learn Linux and the associated applications we'll be implementing and there will need to be new programming standards put in place for the developers.

The change for the systems personnel means understanding Linux installation, configuration files and troubleshooting. It also means learning the new applications, which will probably include (but not be limited to!) Samba, Apache, PostgreSQL & MySQL (yes, I can see both in use) and Exim. Samba allows Linux to act as an SMB client and server (read: it can create and access CIFS file and printer shares, just like Windows). Apache is a web server. PostgreSQL and MySQL are SQL servers (PostgreSQL is more advanced, supporting transactions, views, stored procedures and subqueries). Exim is an SMTP mail server (mailbox access via POP/IMAP would come from a separate package). Lots to learn. I think they're up to it and I think they're eager to try something new.

The changes I'm proposing for the programming staff are pretty radical as well. I see us moving to PHP, Perl and Python for most web programming. For more advanced projects I see us moving towards a content management system based on XML/XSLT like the Apache Cocoon platform. Mono looks like another interesting platform for development, especially as my staff is already familiar with .NET. Truth is, we'll use whatever platform gets us to where we want to be quickly and with the best cost-benefit. That's why open source will probably be the key to our infrastructure. In addition to the development platform, we will need to change our source code control system to something like CVS or Subversion. This means updating procedures and processes. Defining a standard development environment, such as Eclipse, is the final piece.

Why go through this headache and hassle? Why not just stay with Microsoft, keep the re-training costs low and not chance anything? How difficult will it be staying ahead of the training curve with a mixed Windows-Linux environment? Aren't I asking for more trouble? The answer is: maybe. This is an obvious risk for me. On the one hand, there are bottom-line licensing savings in moving from Windows to Linux for several of our back-office apps. Much of that disappears in the first year because of training and productivity loss. However, in year two and beyond I expect significant savings in licensing, plus gains in reliability and robustness. I think that makes the transition worth it. Best of all, I believe those staff members who aren't already working with Linux (and about half are) very much want to learn something new that benefits the organization. Yes, it will take time to make them experts. However, once they learn the basics the rest should be no different (hell, probably easier) than relearning Windows every time a new OS release comes out. I see 2003 as a year of transition for our company.

And what of other organizations? I remember when OS/2 Warp came out and people hopped on that bandwagon as a way to slay Microsoft. Well, that turned out badly, didn't it? I see a difference here. First of all, Linux has a greater presence right now -- hell, probably 100x the presence -- than OS/2 ever had. Second, Linux is not looking for a reason to exist. All of the applications I listed above satisfy business needs now. And there are hundreds, perhaps thousands, more useful apps in development. Third, Linux is not a pain in the ass to set up. OS/2 Warp was. I used it when it came out and made it my desktop OS at home. It was far more stable than Windows but it was a definite irritant. And the GUI was UGLY. Fourth, the economics are too good to ignore. If my company can provide the same (or better) level of service as it does now at a 25% discount or more, I have to explore that option. I would love to boast that I reduced our IT budget by that kind of margin!

posted by Henry Jenkins | 12/28/2002 12:17:00 AM
(0) comments
I'm Trying to Think but Nothing's Happening.

Friday, December 27, 2002  

Thanks to the dozen of you who have visited this site. If you'd like to contribute feedback to what I've written, please feel free. As it stands right now, it's Friday night and my brain is absolutely devoid of anything interesting. I'll have to try harder...maybe some pizza & beer will knock an idea loose.

posted by Henry Jenkins | 12/27/2002 06:28:00 PM
(0) comments
When Good Technology Goes Bad.

Thursday, December 26, 2002  

The cliches are numerous. Technology is a liberator, a catalyst for change, the fulcrum in re-engineering your business. To do that you must change your mindset, your culture, your paradigm. And it all goes down best if you use consultants for the implementation...

Which brings us to Customer Relationship Management (CRM) -- the technology that answers the question, "What do I do with my staff's time and the organization's net income?" Sarcasm aside, we implemented CRM three years ago. It has gone from "transforming the enterprise" to nearly being thrown in the dust-heap to currently surviving only because of grass-roots efforts to keep it.

The initial impetus behind our CRM implementation was "how do we get a single organized database of all of our clients, prospects and vendors?" It ended with the organization overreaching, attempting to use it to solve workflow and process quality control issues in addition to the original goal, after spending quite a bit of money on (worthless) consultants to implement that grand vision. Our Chief Operating Officer at the time wanted to re-architect the company. The rest of senior management, while publicly assenting as costs went up, privately assailed the project.

A year later that COO left and a number of "simplification" projects were introduced, tearing up most of the workflow and process quality controls. A new CEO and partially turned over management team wanted to throw out the system because they'd "heard negative comments about it," probably from existing senior management. Interestingly enough, the rank-and-file front office people who had the loudest complaints started singing the praises of the system. Largely due to their efforts the system remained in place.

Looking back on what the organization did wrong, I think I can identify our mistakes. Here are the big ones:

1. Don't create one massive project, do mini-projects and stick to a rollout schedule. We changed too much too fast and tried for too many payoffs at once (in sales, client management, operations). That's the number one reason the end-users got irritable and the reputation of the system was trashed almost immediately.

2. We spent too much on consultants. Our organization mixes both outsourced and in-house expertise. Because of the initial overreach we had to rely on the consultants to implement it all instead of growing our own talent. By the time our in-house talent had gotten to the point where they could understand the system, we began to realize that the consultants were performing poorly.

3. Even if you think you have executive backing, line up your ducks. If the executive champion goes away it is possible for a large project to come apart. In this case, the rank-and-file saved the technology from being thrown out the window.

We're now at the stage where small, guerilla projects are enhancing the existing system and will hopefully deliver on the initial promise of CRM, even without an executive champion. This is a case where middle management and staff can define the outcome of an organizational decision and actually make it useful. Where vision is lacking, communication between department managers can create one that brings IT into close alignment with organizational goals -- and actually delivers something that works.

posted by Henry Jenkins | 12/26/2002 11:21:00 AM
(0) comments
Happy Holidays!

Tuesday, December 24, 2002  

Merry Christmas, Happy Kwanzaa, et al. In an effort to keep everyone safe, I present for you The Rules of the Holidays:

1. Tequila is not your friend.
2. Neither is eggnog unless you are wearing Expand-o-Matic pants.
3. Yes, that shirt does look like a tablecloth. Ooh, same material, too. Just be polite to the giver and donate it later at 5x retail value.
4. The box enclosing the rabbit/hamster/guinea pig/mail-order bride should have airholes in it.
5. Never purchase clothes, books, music or DVD's for your friends and family. Either you have no taste or they don't. Gift certificates are good. Cash is best.
6. Corollary to #5: "Personal Gifts" don't exist. Nobody is going to remember fondly that you gave them that three-eyed orange fish ceramic knife holder because they liked the Simpsons.
7. Corollary to #6: In fact, they will begin to develop a nasty, bitter grudge against you.
8. Keep the political and religious discussions out of the mix unless you are drawn to fistfights and broken furniture. In that case ignore rule #1.

Stay safe and leave the automatic weapons at home!

posted by Henry Jenkins | 12/24/2002 03:54:00 PM
(0) comments
I See Tech People...

Monday, December 23, 2002  

Infrastructure is fun. Complicated, but a lot of fun. I think it's because there are so many options available and they can all be molded fairly easily to an IT manager's will. Well, more easily molded than people...

I am the archetypical IT manager. My background is systems administration and programming, with a four-year stint in consulting that resulted in me co-running a small company in Irvine, CA. Without any guidance and with the pressures of consulting, my style became very authoritarian (OK, I was a despot). I brought some of that attitude to my current position once I became head of the department. Recognizing that, my current boss sent me off to Re-education Camp so I would become the Ultimate Team Leader. Sarcasm aside, I have been devouring information on management and leadership because I recognize that I can always improve. So I got a lobotomy, read books like those from John C. Maxwell and Phil Jackson (and his ghostwriter) and Got With the Program.

Perhaps I'm bitter because my skepticism has failed me. My team is working like a team, they are trusting each other and helping each other and rising to the challenge of becoming better professionals. Imagine my chagrin -- I was wrong! Time to grow up, after all.

So what is team-building all about? Partly it's the integrity of the team leader and partly it's the integrity of the team members. From what I've experienced, building a team boils down to:

1. Getting the team members to understand what they are capable of as individuals.
2. Teaching them to respect and trust each other (and by extension, the team leader).
3. Showing them that if they make each other better they benefit themselves as well as the organization.

In my quest to Build the Ultimate IT Department, I am working on all three steps at once. I try to challenge my individual staff members to become better, learn more, apply it and prove they can succeed in their efforts. I hold biweekly topic meetings where I assign research tasks designed to stretch them.

Respect and trust are more difficult. Even though we are a small department and I encourage cross-training, there is a certain amount of specialization (i.e., networking, servers, software development and desktop support). That development of "expert power," combined with respect for each other, is a powerful trust-building tool. Committing them to more open and clear communication, especially dealing with perception and personality issues, is a necessary part of this -- too many times "expert power" becomes more of an ego boost than a support mechanism.

Team members need to be able to rely on each other to back them up or offer help with their projects when they aren't able to find the answer. By helping each other out, they realize the payoff is that they can accomplish their own assigned goals on time and they get more interesting projects to boot. The individuals learn more, the team grows together and plays nice, the organization gets more productive man-hours to accomplish strategic projects.

Please take the electrodes off me now.

posted by Henry Jenkins | 12/23/2002 07:10:00 PM
(0) comments
Defenestration

Saturday, December 21, 2002  

I came across that word while reading some Total Cost of Ownership (TCO) articles on LinuxWorld.com. This led me to a few other articles on TCO published by Paul Murphy, spearheaded by this one. Defenestration means to throw something out a window; in this case the pun is that Windows is being thrown out. It's Mr. Murphy's contention that most Unix-based infrastructures are "smarter" and therefore cheaper than Windows. Let's examine some of his conclusions and see if they make sense, as well as how they might be implemented at my organization.


1. Using smart displays/thin clients rather than desktop PC's eliminates the help desk.
2. Unix/Linux unifies an entire organization while Windows fragments it.
3. Unix/Linux frees the IT department to strategic issues while Windows forces IT to be a firefighting group.


Point #1 is that creating a Unix/Linux environment with smart displays/thin clients pries desktop control from the user and gives it back to the IT department. Using my own experience as a guide, I agree with this point. The end-users spend a lot of time playing with their backgrounds, screen savers, installing ActiveX controls to play casino games online, etc. While I understand that some settings should be customizable (screen resolution, for example) I don't believe that an end-user should be decorating their PC like their home. I also agree that thin clients without any moving parts like hard drives will last longer and probably have fewer failures (and a longer mean time between upgrades) than the average desktop PC. However, thin clients can be run in both Windows and Linux environments, bringing us to point #2.

Point #2 posits that Windows fragments an organization while Unix/Linux unifies it. Fragmentation occurs for two reasons -- because end-users see their equipment as a personal power tool and have no qualms about customizing it to their heart's content, and because Windows itself is so complex and unreliable at both the server and desktop level that it forces an organization to lock it down and police its usage. I would go further and mention that the "DLL Hell" all IT departments know and love also forces an organization to proliferate servers, creating a "one application, one server" environment that obviously drives up TCO as it drives up hardware and software costs. I am in absolute agreement with this. The Windows platform is inherently complex, insecure and unstable if used for more than one or two purposes. The desktops are painful to manage and wresting control of the desktop away from the end-users involves a great deal of hassle, although Windows 2000 made it easier with Group Policy Objects.

The solution, according to the article, is Unix/Linux because the "inherent stability of the system [allows the CIO] to create a trust relationship with the user community." Unix/Linux "from a purely practical perspective...simply makes collaboration easier and cheaper." Long on rhetoric but short on details, I'm afraid. I have managed Unix systems on and off for 12 years and Linux started making its way in my organization over the past year, to my delight. While I will agree that our Linux systems seem very stable, the complexity of the applications running on them is minimal at this time. As we grow, I will be better able to judge that statement. The other statement, that Unix "...simply makes collaboration easier and cheaper" is hardly self-evident. Cheaper and easier than...NetMeeting? Webex? Exchange?

I can see the truth of the fragmentation argument regarding desktop PC's everywhere running programs locally. However, what makes Unix/Linux better than a thin-client Windows 2000 environment? It strikes me that centralizing desktop resources solves this problem. The article's sidebar confirms that.

Point #3 is that IT fights disasters constantly with a Windows infrastructure while Unix/Linux infrastructures are so reliable that IT can "focus on revenue generation and longer-term strategic issues." This means that a Windows-based IT department is "condemned to the role of cost sink, always spending monies earned by others" while a Unix/Linux-based IT department is "an organizational asset."

Ahem. Whereas point #2 involved some rhetoric, this point starts out well and ends up sounding like a cliche ("Neurotics build castles in the sky, psychotics live in them"). Unless IT is billing out its services, all IT is a cost sink. Other than this rhetorical excess, the rest of the point is on target. If your IT organization spends most of its time fighting fires, it will never be able to focus on aligning itself to management's strategic vision. Duh. Windows servers DO have problems with stability when taxed beyond the one-application, one-server model. Duh again. Windows desktops are probably the single greatest waste of time in an organization. Three duhs for Windows. However, the implication is that "Windows is automatically bad" and that "Unix/Linux is automatically good because it's not Windows."

Desktops are indeed a massive point of failure in so many ways in organizations. I believe that getting rid of them in favor of thin clients is a great way to go. As I've mentioned in previous posts, though, replacing the server infrastructure would be a little more difficult. While the Linux/Apache/PostgreSQL/PHP alternative looks good for in-house development, what replaces Exchange and Rightfax? How about Sharepoint Portal Server? Active Directory -- where is the Unix/Linux equivalent? What about an existing Customer Relationship Management app that's integrated with Microsoft SQL Server? I don't see a forklift upgrade in our future. However, we have reduced downtime and fires by pursuing a virtual server and one-app per server environment. And that frees us up for more strategic pursuits.

posted by Henry Jenkins | 12/21/2002 02:19:00 PM
(0) comments
The New IT Crisis  

This article from Marc Andreessen discusses what he believes is the next wave of IT innovation -- driving down costs by making computing resources as automatic and easy as getting a dialtone when you pick up the phone. I sincerely hope we get there. IT should be spending more time on enabling business processes and adding value than spending resources on maintenance.

There are several obstacles to implementing this kind of "utility computing." #1: How do you propose to create a new server on demand? How will it be configured with the right settings and custom software? There is a lot of talk about pushing OS images to spare server appliances on demand, installable software modules, automated patching and software robots that perform monitoring and self-healing. It's a great concept and one I'd love to see in practice. I think it will be a bit before we get there.

What practical steps can be taken NOW to drive down maintenance time and expenses? Is it possible to achieve SOME of these objectives today? Let's examine some of these issues:

1. Creating a new server on demand. As an alternative to SysPrep, we are using virtual servers, which are actually a number of files on a hard disk, loaded and managed by VMware's GSX Server. By creating a basic set of server images (such as a Windows 2000 member server, perhaps one with IIS or similar software -- this doesn't work with Exchange or SQL Server), it's possible to rapidly deploy a server by copying those virtual server files to another system, starting it up without networking, renaming it, turning networking back on, putting it in the domain and getting it functioning. Not quite the same as pushing a fully-functional server image, but it's much faster than simply reinstalling Windows 2K or even SysPrep. For Linux servers, the process is even quicker -- rename, readdress, be done with it.

2. Installable software modules and automated patches. We've tried Microsoft's SMS (harken back to In Living Color...hated it!) and LANdesk Software's LANdesk Manager deployments (hated it!). We may evaluate something closer to the installable software module idea with InstallShield's AdminStudio. Updates to Microsoft servers and desktops are currently done using St. Bernard's UpdateExpert program. Really, who wants a software package that requires hours of specialization from your staff and sets itself up intrusively on every [Microsoft] server? Isn't there another way to push updates and/or full software packages? And Linux is not leading the way here, although we're running Debian, so at least we can schedule apt-get to update and upgrade the systems. It's still pulling software as opposed to pushing it out, though, limiting the IT department's control.

3. Software robots. There is software now that attempts to do self-healing (SMS and LANdesk come to mind again), although I have been unimpressed by LANdesk's ability -- in fact, its self-healing module broke Microsoft Outlook's ability to open attachments and interfered with some of our scanning software. I think the technology still needs work. Using robots to monitor processes in [near] real-time does have some promise, though, and can be done now. I think of the article I read on General Electric's initiative to create a real-time enterprise. We have created a scorecard for several critical processes that we monitor on a near real-time basis as well. While not as comprehensive or as pretty as GE's, it has proven to deliver bad news quickly enough to resolve problems before end-users and clients notify us of service level issues.
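
For what it's worth, that kind of scorecard doesn't have to amount to much more than a polling loop. Here's a rough Python sketch of the idea; the host names and ports are placeholders, and a production version would also log history and alert someone rather than just print:

```
# Minimal "scorecard" poller: checks that a few critical TCP services answer and
# prints a red/green line each pass. Host names and ports are placeholders; a
# production version would also log history and alert someone.

import socket
import time

SERVICES = [
    ("core-banking-db", "10.0.0.21", 1433),   # hypothetical SQL Server host
    ("intranet-web",    "10.0.0.30", 80),
    ("mail-gateway",    "10.0.0.15", 25),
]

def is_up(host, port, timeout=3.0):
    try:
        sock = socket.create_connection((host, port), timeout)
        sock.close()
        return True
    except (socket.error, OSError):
        return False

while True:
    print("--- %s ---" % time.strftime("%Y-%m-%d %H:%M:%S"))
    for name, host, port in SERVICES:
        status = "OK  " if is_up(host, port) else "DOWN"
        print("%s %-16s %s:%d" % (status, name, host, port))
    time.sleep(60)   # one pass a minute is "near real-time" enough for our purposes
```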

I eagerly anticipate the day when computing resources are like Legos and their maintenance is near-automatic. I sincerely hope that Operating System and Server vendors are moving in that direction.

posted by Henry Jenkins | 12/21/2002 02:08:00 PM
(0) comments
The Outlook/Exchange Combination Getting Competition?

Tuesday, December 17, 2002  

It looks like there are finally some open source challenges to Microsoft's Outlook/Exchange combination. On the client side, Ximian Inc. is offering a Linux client with a connector to Exchange that would get rid of Outlook on the desktop. With OpenOffice it's possible to get rid of the Microsoft Office suite (approx. $300 per desktop). On the server side, I have found Bynari, Inc.'s InsightServer, which looks like it's attempting to be a credible alternative to Exchange.

We have a business need that might be able to use the combination of these products. Assuming I can get it put together, look for more information in this blog.

posted by Henry Jenkins | 12/17/2002 07:58:00 PM
(0) comments
I'm a Driver, I'm a Winner/Things are Going to Change, I Can Feel It...  

Just when you think you've got your ducks in a row and everything's starting to work, a monkey wrench flies out of nowhere and staves in your skull. That's what it felt like today when I found out our dear corporate parent decided to transfer ownership of the telecom trunks at our headquarters to us. I had some idea this was coming and was told, "Oh, your usage is only a couple hundred dollars a month." Hey, I can deal with that. Got the bill yesterday and, well, let's just say the coffee spewing across my cherrywood desk signaled my disgust. An extra $2,500 per month. Good freakin' night! This is the kind of utter folly that destroys a good month's worth of hard labor by breaking my budget. Why is that important? In an earlier post I mentioned that budgets are fictional and there's a good chance that the expected revenue for next year will not materialize and that cuts will be expected around mid-year.

What to do, what to do? Every month my boss and I get the budgeted vs. actual spending reports. I'm measured in two categories -- how much I spent or saved for each budget line and the overall "bottom line" for the entire department. Transgressions in the former are forgiven for surpluses in the latter. Knowing that I'm screwed at this point, I have to make some real cuts. However, by timing the cash flow properly and delaying projects, I can show surpluses over most of that period while I'm simultaneously negotiating better deals on some contracts that come up for renewal or canceling them altogether. My options are left open until later in the year.
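
To make the timing game concrete, here's a toy Python sketch of how deferring project spend keeps the monthly reports showing a surplus for a while, even with that telecom hit. The project figures are placeholders, not my real budget:

```
# Toy look at the cash-flow timing game: an unbudgeted $2,500/month telecom hit,
# offset for a while by deferring discretionary project spend by a quarter and
# spreading it over the rest of the year. Project figures are placeholders.

MONTHS = 12
telecom_hit = [2500.0] * MONTHS                   # the surprise recurring expense
budgeted_projects = [8000.0] * MONTHS             # what the budget assumed
actual_projects = [0.0] * 3 + [96000.0 / 9] * 9   # same total, started 3 months late

cumulative = 0.0
for m in range(MONTHS):
    # positive = under budget so far, negative = over budget so far
    cumulative += (budgeted_projects[m] - actual_projects[m]) - telecom_hit[m]
    print("Month %2d: running variance $%9.0f" % (m + 1, cumulative))
```

The hole is still there at year-end, of course -- the deferral just buys time to negotiate the contracts down before the variance shows up in the reports.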

Nowhere is this more useful than with the capital budget. Most of our capital budget this year is for replacements -- PC's, printers and servers. Some of those PC's need to be replaced before upgrading desktops to W2K, some of the printers are killing us on maintenance costs as they approach the million page mark and the servers will need to be upgraded this year if we expect to implement the centralized desktop model. If not, we could skimp (although I really wanted those blade servers).

Of course, it could be argued that with personnel at about 34% of our budget I should let someone go. After all, that's the quickest way to reduce costs. Here's the deal -- it's my fundamental belief that all the time, effort and energy the company spends developing a person into what it needs is wasted when that person is booted out the door. Especially an IT employee. If I let a staff member go I have just lost 20% of the knowledge, creativity and energy my department had. That's just stupid.

At the end of the day I would prefer to keep the company train running on rusted wheels than abandon all the engineers. Yeah, we'll move more slowly. At least the damned wheels won't come off.

posted by Henry Jenkins | 12/17/2002 07:17:00 PM
(0) comments
Another Switch  

The German city of Schwaebisch Hall has decided to replace its Microsoft software with an open source solution. Their network of 400 PC's will be running SuSE Linux and OpenOffice on the desktop and SuSE Linux Enterprise Server in the back-office, with savings around $102,300. I assume that figure means projected savings over 3-5 years summed up today. I've run similar projections for my company. Despite this and Largo, FL's conversion, I'm still wary of Linux at the desktop as a full-featured client.
My concerns in a nutshell:

1. Administering 400 users. Easier with Linux than AD? Doubt it.
2. Print services. I have yet to get this working as easily as Windows does it.
3. Internet Explorer. Looks like there are a number of websites that are very IE friendly and give Mozilla fits, including financial websites we use.
4. Desktop support. User training costs money and time. So does missing the ability to remote control a desktop. Does VNC solve this problem?
5. Vertical apps. We have probably a dozen vertical apps that have no Open Source equivalent.

Don't get me wrong -- I'm excited to see this kind of thing happening. I'm just unconvinced everything's in place for us to convert our desktops as they are. Competition in the OS and app markets is A Good Thing. I hope this continues and we don't have a situation like OS/2 or the Mac, where open source implodes or remains marginalized.

posted by Henry Jenkins | 12/17/2002 10:53:00 AM
(0) comments
Interlude

Sunday, December 15, 2002  

Tonight I spent the evening watching fireworks in Manhattan Beach while drinking mulled wine. Each pyrotechnics display seems more hyperkinetic than the prior one. Drove home with the top down on the convertible and am now watching the Lakers actually win a game. A great way to end a weekend.

posted by Henry Jenkins | 12/15/2002 08:51:00 PM
(0) comments
Slackers.  

A new book from Tom DeMarco of "Peopleware" fame discusses how employees need to be cut some "slack," working at less than 100% efficiency. This, paradoxically it seems, will make them more productive. Although I have yet to read the book I have read the reviews and it makes logical sense to me. The premise is that knowledge workers, especially those in the IT industry, benefit from "non-productive" time. I agree with that. Wasn't it that bald-headed guy, Stephen Covey, who identified "sharpening the saw" as one of those ubiquitous seven habits? Do we really expect after a 40-60 hour work week that employees will take the time to learn new skills or actually stretch their minds? Hell no! They are tired and cranky and sometimes can't see more than two steps ahead of them. This is one of the reasons IT departments get outsourced -- the employees get so busy doing maintenance (do more with less!) that consulting services are necessary to implement new technology. What crap. Train your own people, grow your own talent -- they ought to be the most familiar with your company's way of doing business.

BTW, I noticed in the Amazon review someone complaining that this just gives an underachieving employee another way of wasting time. I have two comments about that: #1 -- actually, I know that many underperformers can be made better. That is a management issue and one that I have dealt with several times. Yes, some people don't want to work hard. Those people eventually wash out, but motivations are not uniform, nor are the rewards people get from their work. It's a manager's job to discover the key. #2 -- I find it really, really ironic that someone claiming to write from The Land of the 35-Hour Work Week is worrying about underperformers. France is on the cutting edge of so many industries...NOT! LOL.

posted by Henry Jenkins | 12/15/2002 09:10:00 AM
(0) comments
The Difference Between a Hallucination and a Vision...

Saturday, December 14, 2002  

...is how many people can see it. This article discusses why change management starts with the hearts and minds of employees, not the directives of management. Duh. Are there any managers who honestly believe you can tell people to change and expect them to love it? Try this on for size:

"Today, senior management has decided that we will organize the client servicing division into teams rather than administrator and assistant pairs. You will receive new titles that sound like the marketing factory made them up, we'll leave your actual roles relatively ambiguous and you'll have to figure out the responsibilities within your team yourself. Oh yeah, and one person that we selected will be team leader. Now go figure it out. Go teams!"

That's pretty close to what happened in our company. Were the employees delighted? Does a bear...well, you get the point. Of course not. Management rearranges the chairs on the deck and forgets that maybe the staff would like that iceberg up ahead dealt with first.

posted by Henry Jenkins | 12/14/2002 12:55:00 AM
(0) comments
Training Wheels  

I came across an article about the current IT trend of staying out of the end-user training business and how it will ultimately cost IT departments control over strategic innovation, leading to wasted capital IT investments. I have a few issues with the article.

Problem #1: IT should not be "innovating" without the support of the business units anyway. If a company wants to abdicate its responsibility for aligning IT with the business strategy then that's a bigger problem than who should be doing training.

Problem #2: Setting up marketing and/or the HR department trainers as strawmen within the article. There is another choice. The business units need to be in charge of their own training to avoid the HR empire builders and because it is in their best interests to bring new users up to speed. This is a great example of alignment. Why shouldn't IT tackle training? Well, the last time I checked allocating "training resources" (aka man-hours) would take away from user support, replacements, upgrades and the time it takes within an IT department to "sharpen the saw" of each IT employee. IT should be working with business units to add value and/or cut expenses through its expertise -- advice, analysis and development in pursuit of business goals. Business units should be accountable for the knowledge transfer -- it is their bottom lines that are affected profoundly by lack of training.

Problem #3: There are really two kinds of training being discussed but not identified in the article -- what we would consider "generalized knowledge" like using a spreadsheet or word processing program, and "specialized knowledge" such as how to create new accounts in the company's core banking system. The latter is what I am addressing by laying the training burden on the business units. The first one is easily solved, actually -- stop hiring computer illiterates. Yes, illiterates is the proper term. They are as much a burden in the 21st century as hiring someone who can barely read and write was in the 20th. There's too much background information that needs to be taught, too much intimidation they feel from the very tools they should have mastered to do their work. You don't hire a CPA who can't use an Excel spreadsheet so why the hell are you hiring administrative assistants or salespeople who write letters but don't know how to use Microsoft Word?

posted by Henry Jenkins | 12/14/2002 12:48:00 AM
(0) comments
In-House Entrepreneurial Activities

Friday, December 13, 2002  

Running an information services department, no matter the size, means dealing with a giant maintenance eyesore, sometimes on a daily basis -- software patches. Especially security patches. Even more so for Microsoft products (God bless Jim Allchin) -- although open source and other popular proprietary systems are hardly bastions of security. As bad as this is, there are far worse problems businesses face that require security procedures and processes in place to keep "in-house entrepreneurial activity" from occurring.

Security is more than rummaging through code and looking for unchecked buffers. Security is about business processes. It is not just a comprehensive trusted computing environment provided by controlling the desktop as tightly as the Department of Justice will allow. Although I think I understand what Microsoft wants to accomplish, I'm not convinced that Microsoft has the security mentality needed to re-evaluate security. How else to explain how so many bugs are created in their software and how an employee can abuse internal procedures to the tune of $9 million? Where are the controls, the double-checking/reconciliation, the dual-custody if necessary? Are they truly looking beyond the desktop to the entire business environment -- apart from attempting to lock businesses into their solution, of course?

It seems to me that open source development, by way of its collaborative nature, is less likely to see security as a whole than a single vendor or consortium of vendors. However, the security mindset is a meme -- get a few people thinking along those lines in major open source projects and the meme spreads, positively infecting the habits of developers and their project managers. If that basic change occurs and includes the additional question, "How will this software be used?" along with the usual, "What should it do?", then information services departments can only benefit and hopefully avoid ridiculous security problems in the future.

posted by Henry Jenkins | 12/13/2002 04:23:00 PM
(0) comments
Putting Desktop Clients on a Diet  

Thin clients -- the concept is grand. No moving parts, no modifiable OS, nothing to crash, centralized profiles, apps & services and end-user computing Nirvana is reached. The grass is greener, the skies are bluer and I think I hear angels singing softly in my ear...

There is a very good chance that our annual revenue projections this year are bogus. And that means sometime in the next 3-6 months, depending on how poorly our salesforce does, the support services (uh, that would be us) will once again be asked to sacrifice. So why not get in front of that train and stay in front, instead of enduring the pain, suffering and inevitable loss of consciousness that a collision generates?

Today I had a meeting with my staff and we're going to attempt to get ahead of that train wreck by rolling out a thin client research project -- to lock down an existing desktop running Linux, set it to autoload Citrix's ICA client in full screen mode and make certain the apps loaded on our Citrix Metaframe XPs 1.0 server don't cause crashes (and we've only got about, oh, a dozen apps that need to be loaded). The guinea pigs, er, end-users in this experiment will win a 17" LCD display for participating. Our branch offices have the least-customized desktop environment so they'll be easy to test (and probably convert). We have 90 days to get the results of real-world tests done before we have to make a go/no-go decision.
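
The lockdown piece is the easy part, at least in sketch form. Something like the following Python snippet could stamp out a bare-bones kiosk session on each test desktop; the wfica invocation and the .ica file path are placeholders, since the actual client command, its full-screen settings and the published desktop all have to come from the Citrix ICA client documentation and the Metaframe farm configuration:

```
# Sketch of the lockdown piece: write a bare-bones kiosk ~/.xsession that does
# nothing but run the Citrix ICA client full screen, restarting it if it exits.
# "wfica /usr/local/etc/branch.ica" is a placeholder invocation.

import os
import stat

XSESSION = """#!/bin/sh
# No desktop, no icons, no toys: just the ICA session, restarted if it dies.
xsetroot -solid grey &
while true; do
    wfica /usr/local/etc/branch.ica   # placeholder: published full-screen desktop
    sleep 2                           # brief pause before reconnecting
done
"""

path = os.path.expanduser("~/.xsession")
with open(path, "w") as f:
    f.write(XSESSION)
os.chmod(path, os.stat(path).st_mode | stat.S_IEXEC)
print("Wrote kiosk session to %s" % path)
```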

posted by Henry Jenkins | 12/13/2002 02:36:00 PM
(0) comments
The Holiday Party  

Once a year my company has a Christmas, er, Holiday party. The last couple of years it was held at the CEO's palatial estate; this year it was moved to the Santa Ana Performing Arts Theater. Smack dab in the heart of downtown Santa Ana, this ex-Masonic temple is a stone block monument to the architect's sublime achievement -- a building that resembles nothing so much as a child's rendering of a short, squat medieval tower. Or, as our parent company's Chairman said to me, "This building has history. Unfortunately, none of it is very interesting." At least he gave me his drink tickets. Oh yes, we were limited to two drink tickets. However, for those of us who are a little more resourceful, more drink tickets could be had. I believe I still had half a dozen unused by the time I staggered from the party. But I digress.

The main event at the Holiday Party, besides the bar and seafood buffet, is to present awards. There's an award for internal customer service, external customer service, leadership, mentoring, sales excellence, department excellence and the Eagle award, a four-foot tall monstrosity capped with a chicken-sized eagle, which nominally goes to the best all-around employee. Frighteningly enough, I did receive a blob of plastic letting me know I had been Mr. Mentor for the year. All of the threats, beatings and psychological warfare I waged on my staff, as well as the bribes used on the selection committee, finally came through. Oh yes, a check was included with the award. I'm not sure what I did with the little plastic award, come to think of it, but the check has already been spent.

Every year the Holiday Party is held on a Thursday, presumably to keep people like me from doing what I did last night while whoring for drink tickets -- drinking scotch all night until I was one of the last stragglers remaining, listening to the jazz band play. Considering what it normally costs for a night out, sitting around drinking free scotch, eating free filet mignon and listening to a free jazz band is probably the most cost-effective after-hours experience I've had in a while. Thank you, clients! That's where those fees go...

posted by Henry Jenkins | 12/13/2002 01:15:00 PM
(0) comments
Training Day

Tuesday, December 10, 2002  

Every two weeks I subject my staff to a "biweekly topic." The goal is to give them purpose when they have downtime in between meetings. Not just any goal will do -- the idea is to stretch their skills, make them learn something new that is beneficial both to the company and to them personally and/or professionally.

One of the goals I've given my staff is to research an open-source database solution, in this case MySQL. MySQL promises to deliver a transactional database for free. Free makes for an excellent return on investment if it is robust, reliable and easily managed. I spent some time today reading a couple of articles about the IT department in Largo, FL, and their investment in open-source. While I believe my company will find more Open Source use in the data center, it appears Largo is making Open Source on the desktop a reality. Though I'm skeptical of zero-administration claims on the desktop (even with thin clients), it does give one pause to see just 1.3% of revenue spent on IT. That would be a reduction of over 80% from our current costs. Now I can afford to put those Boxsters on the lease schedule.
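For the curious, the "over 80%" figure is just the ratio of Largo's number to our own IT-spend-as-a-percentage-of-revenue figure from the budget post further down. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the Largo comparison.
# our_share is our 6.8% of revenue (from the budget metrics post below);
# largo_share is the 1.3% figure quoted in the article.
our_share = 0.068
largo_share = 0.013

reduction = 1 - largo_share / our_share
print("Implied reduction in IT spend: %.0f%%" % (reduction * 100))
# -> roughly 81%, consistent with "a reduction of over 80%"
```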

So where does the city of Largo wring out its IT expenses? From the article it appears they find savings in the centralization of software (and the use of thin clients), the use of open-source to avoid Microsoft's licensing fees whenever possible and a culture that allows for changes in desktop software suites.

Software centralization at Largo begins with Citrix Metaframe on top of Windows 2000 Terminal Services. By using thin clients, all processing power is concentrated in their servers; the thin clients need just enough horsepower to update the screen, keyboard and mouse. Desktop control and migration/replacement costs are reduced significantly, although Largo has to pay for the Citrix server and client licenses, as well as Terminal Services Client Access Licenses for each thin client. This wrings out about $500 per replaced desktop per year compared to an organization that buys newer PC's to replace depreciated ones. I do like this idea; when coupled with virtual terminal servers (see my prior entries on server consolidation), it may stretch the replacement cycle from a new PC every three years to a thin client every five years or more, yielding desktop equipment savings of up to 83% (assuming $1K per replacement PC and $500 per thin client, with PC's capitalized over three years and thin clients fully expensed in the year purchased). A sketch of that arithmetic follows below.
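Here is a minimal sketch of that per-seat hardware arithmetic, using only the numbers in the parentheses above. The savings percentage swings quite a bit with the thin-client replacement cycle you assume, so the loop tries a few cycle lengths rather than asserting one answer.

```python
# Annualized per-seat hardware cost: $1K PC on a 3-year cycle vs. a $500
# thin client on a longer cycle. The cycle lengths below are assumptions.
def annual_cost(unit_price, replacement_years):
    return float(unit_price) / replacement_years

pc = annual_cost(1000, 3)                     # ~$333 per seat per year
for years in (5, 7, 9):
    tc = annual_cost(500, years)
    savings = 1 - tc / pc
    print("%d-year thin client cycle: %.0f%% hardware savings" % (years, savings * 100))
# A 5-year cycle works out to ~70%; stretch the cycle toward 9 years and the
# number approaches the 83% quoted above.
```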

By centralizing, Largo has put itself in a position to move end-users to a platform that obviates paying for Citrix and Microsoft licenses at all. Their solution is to use KDE on Linux and OpenOffice, an open-source alternative to the Microsoft Office suite. Avoiding Microsoft's Office XP licenses saves about $300 per desktop, the Terminal Services CAL another $65, and the Citrix Metaframe piece probably $3K - $6K per server. In my company, we have several vertical-market applications that only work under Windows (that pesky network effect), so it would be difficult for us to switch to an all-Open Source thin client desktop, although the Wine project (a compatibility layer for running Windows applications) has piqued my interest. This will become the heart of another biweekly topic.
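To put those license figures on a per-seat basis, here's a rough sketch. The users-per-Citrix-server number is my own assumption for illustration, not something from the Largo article.

```python
# Rough per-seat licensing avoided by an all-open-source thin client desktop,
# using the figures above. users_per_server is an assumed value.
office_xp = 300          # per desktop
ts_cal = 65              # per thin client
metaframe = 4500         # midpoint of the $3K - $6K range, per Citrix server
users_per_server = 40    # assumption

per_seat = office_xp + ts_cal + metaframe / float(users_per_server)
print("Licensing avoided: about $%.0f per seat" % per_seat)
# -> roughly $478 per seat under these assumptions
```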

Finally, the corporate culture in Largo is obviously very malleable when it comes to replacing desktop software. As a department manager I can only scratch my head in wonder -- I have learned repeatedly over the last six years that trying to get end-users to learn new software, or convincing them to switch and lose existing functionality (especially calendars!), brings forth shrieks of agony.

Is there truly a replacement for Outlook when it's become part of the corporate culture? What about the look and feel of Microsoft Office? Directory services? How difficult is it managing hundreds of users in a Linux environment? What about printing services (I've had some bad Linux printing experiences so far)? Locking down the desktop -- how easy/hard is it? Better than Group Policy Objects? Bottom line -- can the desktop experience be replaced with something else that does not inflict much pain on the end user and does not take away existing functionality while yielding a better bottom line?

posted by Henry Jenkins | 12/10/2002 09:59:00 PM
(0) comments
Interlude

Thursday, December 05, 2002  

Sometimes middle management has a silver lining, like going out on a boat with the boss and a vendor, watching the sunset over the Pacific Ocean as the horizon turned from orange to rose to crimson to gray to indigo to black. Smoking cigars, drinking bourbon and losing $51 at poker. OK, that last part really, really pissed me off -- otherwise it was a fantastic afternoon and evening out with the guys.

posted by Henry Jenkins | 12/05/2002 11:41:00 PM
(0) comments
It's Budget Time

Wednesday, December 04, 2002  

I always get energized budgeting for my department. Why? Because deep down, I am a frustrated writer. It is only during the budgeting season that I can exercise the imagination and vision equal to the famous Russian novelists, bringing forth a piece of dense fiction like no other. I will not lay claim to "epic" status, though I will state for the record that my efforts have yielded the sixth volume in this yearly saga.

Our fiscal year is the same as the calendar year. The fun begins about 6-12 weeks before the end of December. It consists of four parts: salaries, capital expenditures, expenses and revenue, in that order. Department managers create their budgets and then senior management rolls them up into a single budget. The astute among you (and you know you are) may ask the question, "How can you create the expense budget for the company before you know how much revenue you will generate?" That is because you are constrained by such trivialities as the laws of mathematics and logic. We practice what I would dub "postmodernist budgeting," except that we try really, really hard to drop the context as well. Perhaps it is an exercise in existentialism, where the existence of the numbers precedes the essence of their meaning?

Numbers without context are dangerous (duh). If, for example, our CEO reads that "financial services companies are reducing their IT budgets by 50%" without asking the questions "why?" or "how?" or "how much were those budgets growing over the last three years vis-a-vis our company," then he plays a game of data Russian Roulette. It doesn't become useful information until given some background, which of course forces me to start digging up some industry metrics (a difficult task when you can't pay for quality research -- it's a lot like dumpster diving through Google). Fortunately I was able to make some comparisons between my company and the banking industry as a whole with metrics from a META Group report. Comparisons are below:

Metric                                      My Company    Banking Industry
IT Spending as a Percentage of Revenue         6.8%             6.3%
IT Support Personnel / Total Personnel         5.2%             7.3%
IT Spending per Employee                     $15,890          $16,039

These give my department's budget numbers some concrete existence rather than the belief that I pulled them out of some nether orifice or (let's be pretentious) created them ex nihilo. I am a traditionalist in a budget theater of the absurd.

posted by Henry Jenkins | 12/04/2002 01:05:00 PM
(0) comments


Server Consolidation I - Saving Money

Monday, December 02, 2002  

Server consolidation is about saving money. Thanks to the condition known as DLL Hell (for those of us using Microsoft products -- another discussion for another time), it is almost impossible to run more than one major application on a server. So what to do? Dedicate one server per application? Although robust, that's definitely expensive. There is a technology that makes server consolidation cheaper and easier to implement than ever, while insulating IS departments from DLL conflicts -- virtualized servers. The concept of a virtual server is as old as the mainframe: several "virtual servers" that act like independent servers running on their own hardware actually run on a single physical server, with the aid of a program or specialized operating system that doles out resources like memory and hard disk space to each virtual server.

At my company we use VMware's GSX Server running on dual-processor Dell servers. GSX Server is an application running on top of either Windows 2000 or Linux that manages the virtual servers. With this arrangement we've been able to get 2-4 virtual servers on each piece of real hardware. The best servers to virtualize initially are those that are not processor- or RAM-intensive. In our case, that meant some web servers, Sharepoint Portal Server, development servers, Windows 2000 domain controllers, FTP servers, and file and print servers. Cost savings come quickly. Here's a typical scenario (server prices are from when we bought them):

Four dual-processor servers at $7K each (2xPentium III CPU's, 4GB RAM, 72GB Ultra-160 SCSI RAID 5 hard drive space) = $28K.
One dual-processor server at $7K plus a copy of VMWare's GSX server at $3K = $10K.
Assume the same cost for application and OS software licenses.
Cost savings are $18K, or about 64% (a quick sanity check is sketched below).
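A minimal sanity check on that scenario, using only the list prices above (the standby-host figure anticipates the follow-up post on recovery):

```python
# Sanity check on the consolidation scenario above.
dedicated = 4 * 7000                 # four physical servers at $7K each
consolidated = 7000 + 3000           # one physical server plus a GSX Server license

savings = dedicated - consolidated
print("Savings: $%dK (%.0f%%)" % (savings / 1000, 100.0 * savings / dedicated))
# -> $18K, about 64%

# Add a second physical host and GSX copy (roughly another $10K) as a standby
# for recovery -- see the follow-up post -- and the savings drop to about $8K.
standby = consolidated + 10000
print("With standby host: $%dK" % ((dedicated - standby) / 1000))
```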

What about CPU, RAM or hard disk-intensive applications like Exchange or SQL? Well, we've been able to migrate our SQL server and our Exchange 5.5 server (the latter by breaking it into two). We did have to use a dual-processor P4 Xeon server with 6GB RAM, but it did the trick -- our SQL 7 server, with our imaging, CRM and intranet-based apps servicing 100 users, runs very well, and our Outlook users likewise have no complaints. Another tack we took was to migrate the intensive I/O for the SQL database devices and the Exchange information stores to a clustered Network Appliance F820 device. That sped up I/O access considerably compared to GSX's virtualized hard disks. Next year we expect to explore VMware's ESX Server, which is a specialized OS rather than an application running on an existing OS. It's licensed on a per-CPU basis so it may not deliver the same kind of cost savings GSX Server does, but performance is supposed to be better.

Tomorrow I will explore some more benefits of this arrangement in the area of disaster recovery.

posted by Henry Jenkins | 12/02/2002 09:24:00 PM
(0) comments
Server Consolidation II - Recovering from Server Failure  

One of the obvious consequences of server consolidation is that if a physical server dies, several virtual servers might be taken out with it. The simplified version of the solution we've implemented is to purchase a second physical server and place copies of the virtual machines on it. Because a virtual server consists of several files in a directory, migrating a virtual server from one physical piece of hardware to another is simple -- it's just a file copy. This reduces the proposed savings in my earlier post to only $8K, or 29%. Still substantial for most data centers.

Naturally a file-copy recovery strategy is only as good as the last virtual server backup. One improvement is to move rapidly-changing data off of the virtual server to some kind of centralized storage (in our case, a NetApp filer). Frequent backups of the virtual servers become unnecessary. Files that can't be migrated to a centralized storage device should be backed up on a regular schedule, preferably to a NAS or SAN -- again, for quick recovery.
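To make the "it's just a file copy" point concrete, here is a minimal Python sketch of mirroring one virtual machine's directory to the standby host. The paths are placeholders, not our actual layout, and this is not our production backup script.

```python
# Sketch of the file-copy recovery step: mirror a VM's directory to a share
# exported by the standby host. Paths below are hypothetical.
import os
import shutil

VM_DIR = "/vmware/intranet-web"                     # VM's config + virtual disk files
STANDBY = "/mnt/standby-host/vmware/intranet-web"   # mount point on the standby host

def mirror_vm(src, dst):
    """Copy the whole VM directory; the VM should be powered off or suspended
    first so the virtual disk files are in a consistent state."""
    if os.path.exists(dst):
        shutil.rmtree(dst)        # replace the previous copy outright
    shutil.copytree(src, dst)

if __name__ == "__main__":
    mirror_vm(VM_DIR, STANDBY)
```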

posted by Henry Jenkins | 12/02/2002 08:05:00 PM
(0) comments
De Novo

Sunday, December 01, 2002  

My musings on technology and management principles as applied at a tiny little financial company in Orange County, CA. I run the information services department. My hope is to use this blog to explore the practical application of new technology and management ideas in a real-world environment.


posted by Henry Jenkins | 12/01/2002 05:20:00 PM
(0) comments