
Original Link: https://www.anandtech.com/show/821
The State of Corporate IT: A case for Linux
by Paul Sullivan on August 31, 2001 12:23 AM EST - Posted in IT Computing
We are in the midst of a severe corporate downturn, where the focus is more on the bottom line than perhaps ever before. As cost-cutters keep searching for ways to trim expenses, they often find themselves looking at one of the largest non-personnel related areas: IT Infrastructure.
Computer systems are such an integral part of most modern businesses that even the slightest hiccup can cause a serious financial hit. Most workers need these computers to do their jobs and the moment systems go down, productivity hits a wall. Not only do corporations have to be concerned with stability, they are becoming more and more concerned about the ongoing costs of running these systems.
The hardware is only the first step in creating this infrastructure. Once the hardware is in place, you need the software to get it all running in a productive manner. For nearly a decade, the momentum has been towards a Client/Server model based on the Microsoft Windows NT platform, and for many, the move has meant an easier-to-administer, easier-to-implement configuration. But as NT has become more entrenched and companies have become more dependent on it, corporations have had to endure a drastic increase in overall costs.
Part of the increase has been hardware related. As NT has grown and become more capable, it has demanded much faster and more robust systems to work its magic. Another part of the increase has been security related. Corporations and consumers can be a very demanding bunch, and fairly or unfairly, Microsoft has been faced with the task of trying to make NT all things to all people. In doing so, they have not been able to devote as many resources to security as they would have liked, and as a result, NT has proven to be less secure than originally hoped for. Their web server software in particular has been under assault from individual and organized attackers, and efforts have been increased to combat these intruders and to shore up the front line of defense.
But by many accounts, the largest cost of ownership increases that corporations have faced have been licensing related. As NT has become a mainstay, licensing terms have become more specific and more expensive. In addition, the explosive growth of the internet has brought security and reliability to the fore, and shoring up an evolving infrastructure can become prohibitively and increasingly expensive.
One Tale of NT's Journey Into Corporate America
To help make this evolution more understandable, we will use an example based upon the experiences of a corporation with a presence in Washington State. This company currently employs some 7,000 people at its primary site and had made the transition from a combination of Unix/Novell software to Windows NT. The move was cost justified based upon the ease of administration and a reduced cost of ownership, but years into the transition, administration and licensing costs soared and they were faced with some harsh realities, particularly when the market took a downturn and belt tightening became a necessity.
Initially, the company was approached by Microsoft and pitched on the idea of moving over to the NT Server platform. As any good vendor would, Microsoft gave them a hard sell and did an excellent job of convincing them of the potential benefits. One of the biggest parts of the pitch was the generous support that MS pledged to provide to corporate clients, and that support was perhaps the turning point in the decision to make the change. Direct support from IBM, Sun and Novell was becoming more expensive, even as it became harder to obtain. The company worked out a fresh agreement with Microsoft and took the plunge.
Originally, the licensing agreement called for a $20 per seat annual fee for each client that would be accessing NT servers. Two multiprocessor NT servers were put in place, each hosting Mail, Internet, File and Print requests. Costs for each server were based upon the number of clients accessing those servers at any one time. Since the company had their employees in separate shifts, only one half of their total employees would be capable of using those servers at any given time. After an initial analysis, it was determined that at no time did the number of concurrent users exceed 2,000. The drafted licensing agreement called for 2,000 concurrent licenses at $20 per annum, for a total of $40,000.
Over the next few years, as the transition from the old server software to the new became complete, changes to the licensing and service agreements were introduced. Uptime over those first years fell well short of what the service agreement had called for, and the cost to the company was becoming severe. When the time came to renew the service agreement, the company sought more assurances and tighter uptime requirements. In response, Microsoft cited increasing demand on the servers and indicated that in order to ensure proper service and support, each server would have to be limited to one of three primary functions: Internet/Mail, File Serving and Print Serving. NT 4.0 performed best when each of those tasks was handled by a dedicated server, and system integrity could only be guaranteed if the functions were separated.
In addition, the case was made that with the split of responsibility to multiple servers, the number of concurrent users could not be accurately determined and that it would be necessary to pay licensing fees for all of the clients that might use those servers. Employment at the company had increased by some 1,000 workers and network usage had increased along with it. When all was said and done, the company was asked to expand from two to six servers and to pay client access fees for a full 5,000 users on each of the three primary servers.
Under this proposal, annual licensing fees would increase from $40,000 to $300,000, but uptime performance would be guaranteed at a specific rate and there would be rebates should those rates not be met on a consistent basis. The company had invested over a million dollars to make the switch from the old to the new, and at this point, going back was not a viable option. Reluctantly the agreement was made and they moved forward.
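The jump from the original agreement to the revised one is easy to verify with a little arithmetic. The figures below are the ones cited above; the assumption that the $20-per-seat rate carried over unchanged into the revised agreement is ours.

```python
# Back-of-the-envelope comparison of the two licensing structures
# described above. Dollar figures come from the article; the per-seat
# rate applying unchanged to the revised agreement is an assumption.

SEAT_FEE = 20  # USD per client license, per year

# Original agreement: 2,000 concurrent licenses across two servers
original = 2_000 * SEAT_FEE
print(f"Original annual fee: ${original:,}")   # $40,000

# Revised agreement: 5,000 users licensed on each of the three
# primary servers (Internet/Mail, File Serving, Print Serving)
revised = 5_000 * 3 * SEAT_FEE
print(f"Revised annual fee:  ${revised:,}")    # $300,000
print(f"Increase: {revised / original:.1f}x")  # 7.5x
```

The 7.5x multiplier came not from any growth in actual concurrent usage, but from switching the billing basis from concurrent users to total potential users, counted once per server.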
The next couple of years saw a dramatic increase in data storage requirements and internet use as employment rose to nearly 7,000. The server redundancy helped ensure a higher level of uptime, but maintenance costs were going up as the internal IT team spent more hours working on the extra units that did go down, prepping them to go back up again. As redundant servers went in and out of service, data synchronization was becoming more critical and ensuring data integrity became an even costlier proposition.
During further licensing negotiations, Microsoft proposed that the company transition away from other suites and applications to Microsoft Office. In exchange for this move and the earlier commitment to the NT server line, Microsoft would give them a significant break on site licensing for these applications. They would even aid in transitioning their data warehouse from Oracle to SQL Server. At the time, the company took them up on the Office licensing bundle but skipped on the Oracle conversion. They would ride the market with the infrastructure they now had and do some long term evaluations before making any further commitments to expanded licensing agreements.
The Tide Shifts . . .
As tends to happen with fast growing companies with demanding shareholders, there were some major changes in the ranks of upper management. With those changes came a more disciplined approach to cost structures and increased scrutiny of exploding expenses such as IT budgets. It so happened that in the bi-annual review, one of the topics of discussion was the proposed transition to Windows 2000. Microsoft had proposed a very comprehensive package, but the cost conscious team leaders were hit with a fairly serious case of sticker shock. The decision was made to pull together the primary department heads, key IT staff and a team of Microsoft representatives to go over the possibilities.
After extensive meetings, some significant concerns began to surface. The proposed transition to Windows 2000 would be much more than a simple upgrade; it would constitute a paradigm shift in the way domains and assets were handled and managed. Active Directory (AD), a new and ambitious idea, was certainly not well established and, at this point in the discussion, did not really seem ready for prime time. There appeared to be many unanswered questions on the MS side, and their "leap of faith" mantra was falling on skeptical ears.
According to the MS proposal, the transition to AD would eventually involve a complete shift of all internal systems to updated software, in part because Active Directory was not designed to be backward compatible with earlier Windows 9x client software. At the urging of MS, the company had earlier made a very substantial investment in the Primary Domain Controller (PDC) and Backup Domain Controller (BDC) paradigm established by earlier NT iterations. They had purchased the machines and the licenses, had passed dozens of their IT staff through the extensive training outlined by Microsoft and had literally worked years to develop a functioning asset management program utilizing IBM's Tivoli software. A change to the AD model would require another huge cash and resource infusion, and might possibly negate many of the benefits realized from the earlier investments in the PDC/BDC model.
As if all of this was not "Red Flag" enough, there would be some serious changes in the licensing agreement, which would now cover all types of external and internal network access including terminal services, remote dial-in and the use of Virtual Private Networks (VPNs). Further, another proposed modification would take into account the number of accesses and transactions conducted over servers using MS hosting software. When it was all put together, the cost implications were absolutely staggering.
Sink, Swim or Tread Water?
The company in this example was at a critical point in the decision making process. They had made the choice some years back to migrate to the NT platform and invested heavily in the infrastructure. But now they were forcing themselves to take a long, hard look at their decision before they would commit any further. Cost analysis had shown that even though the company had poured a great deal of money into software, hardware and institutional retraining, the benefits were not nearly as apparent as promised.
The fact that the NT platform had not proved itself to be as scalable or reliable as promoted was a major thorn in their side. The redundancy suggested by Microsoft had helped increase uptime, but it also added to the already high maintenance burden of the growing server clusters. While the previous Unix and Novell platforms had handled file, print and mail services on a single server, NT now needed one machine for each service plus a dedicated backup for each. Hardware costs were not the real concern - it was the licensing and maintenance requirements that hit the hardest.
The modern workforce was changing, and remote access was becoming much more important for collaborative efforts and virtual workstations. The excessive costs of purchasing licenses for each and every terminal server, remote dial-in and VPN access were simply not sustainable with the start of the downturn in the tech sector. Though E-commerce was not as explosive as predicted, the web was becoming a very significant tool for internal and external customers alike, not to mention vendors and other third parties. Paying fees for each of these transactions would severely blunt the effectiveness of the entire process, actually making it more cost effective to take a step back and do things the old way.
The company was knee-deep in the mire here, and had to face some tough decisions. Luckily, they had some options. Linux had been gaining a steady groundswell of support over the past few years and had some serious advantages in terms of cost to benefit ratios. Not only was the software free, but it would run on existing hardware and could actually be tweaked and recompiled to maximize performance in key areas such as file and print serving. Plans to advance into a modified agreement with Microsoft were to be put on hold in favor of further exploration into Linux. It would be a decision that they would not regret.
The Advantages Of Linux
Red Hat was key in helping them realize the benefits to the bottom line. Within 60 days of the first overtures, they were on site with a demonstration that completely blew the corporate team away. Red Hat brought a single Pentium-class system for a site visit and, thanks to the early legwork their engineers had done, were able to integrate the box into the network and take over all file and print server requests for one busy segment within four hours. The system ran for the next 10 business days without any downtime, something the NT machines had rarely managed. All issues that did come up were fixed on the spot without a single kernel restart. File and print transactions were stored in queues and processed without incident. Samba allowed the Linux box to integrate seamlessly into the file network and actually increased overall performance. Nightly backups were performed from the master NT server without any sign of incompatibility. Print jobs were also handled seamlessly, with fewer delays and error messages along the way. This limited demonstration was an absolute success and had most of the corporate advance team nodding their heads in approval.
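The heart of a demonstration like this is a small Samba configuration that presents the Linux box to Windows clients as an ordinary NT file and print server. The fragment below is a minimal sketch in the style of the Samba 2.x releases of that era; the workgroup name, share names and paths are hypothetical.

```ini
; Illustrative smb.conf fragment - names and paths are hypothetical
[global]
   workgroup = CORP            ; NT domain/workgroup the box joins
   server string = Linux File/Print Server
   security = user             ; authenticate each connecting user
   printing = lprng            ; era-typical Linux print subsystem

[shared]
   path = /srv/shared          ; directory exported as a file share
   read only = no
   browseable = yes

[printers]
   path = /var/spool/samba     ; spool directory for incoming print jobs
   printable = yes
   browseable = yes
```

With a configuration along these lines, Windows clients browse the shares and submit print jobs exactly as they would against an NT server, which is why the swap could happen mid-day on a live segment.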
Additional demonstrations followed that focused on developers and system administrators. Using a Linux system with a combination of existing products and a newer application called VMware, developers were able to write, compile and test code on a variety of distinct platforms from a single machine. When code did crash, it affected only one environment, which could quickly be killed and restarted without interfering with other processes. Since the host GUI was not tied to the Linux kernel, but was instead run as a separate process, even the most complex and sensitive graphical development would not bring down the machine. In each case, individual processes were simply killed and restarted to a fresh state. Productivity benefits were obvious.
By activating an Apache web server on the same machine, development, administration and testing of the corporate intranet could be handled in real time. Quick fixes could be made in code windows, loaded into the appropriate Apache folders and tested on multiple platforms with multiple browsers within minutes. This level of stability and flexibility was something these developers had been craving for some time. Part of the excitement involved the bottom line as well. With the exception of VMware, all of the Linux software was essentially free and did not have any of the restrictions imposed under NT EULAs. Developers were able to clone and distribute development environments to other machines across the hall or across the country without fear of violating licensing terms.
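The edit-and-reload workflow described above needs little more than a virtual host pointed at the developer's working tree. The fragment below is an illustrative sketch in the style of the Apache 1.3 configurations of the period; the hostname and paths are hypothetical.

```apache
# Illustrative Apache virtual host for intranet development
# (hostname and paths are hypothetical)
<VirtualHost *>
    ServerName intranet-dev.corp.internal
    DocumentRoot /home/dev/intranet     # edit files here, reload in browser
    <Directory /home/dev/intranet>
        Options Indexes FollowSymLinks
        AllowOverride All               # allow per-directory .htaccess tweaks
    </Directory>
</VirtualHost>
```

Because the document root is the working directory itself, a saved change is live on the next browser refresh - no packaging or deployment step between writing a fix and testing it.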
In fact, with Linux, there would be no licensing fees at all. The proposed file servers, print servers and web servers to be hosted under Linux did not require the purchase of any client licenses whatsoever. There would be no E-commerce transaction fees, no distribution limitations and no expensive application bundling requirements. Developers could use a variety of existing applications, from Star Office to SQL databases to C++ IDEs, all without additional fees. Contractors could be provided with all the tools they needed with no additional impact on cost.
During these developer and administration demonstrations, it was found that users already familiar with Unix felt right at home in Linux, so extensive retraining would not be needed. The KDE environment was similar enough to Windows that even novice users got the hang of it fairly quickly. Since Star Office was also free for use on Windows machines, entire project teams could be assembled and provided with some of the same key tools, regardless of platform - all without impacting the bottom line. The possibilities were becoming obvious, as were the advantages. Linux provided more freedom and more flexibility at a lower initial and long-term cost. Linux gave them an alternative - one that could be explored and researched without the need to allocate additional capital funds. It was a pretty easy decision to continue exploring this new platform.
Linux Makes An Impact
In addition to all the benefits and possibilities mentioned earlier, Linux gave this company a bargaining chip in license negotiations with Microsoft. Earlier, they had passed on the Windows 2000 and Office 2000 upgrades, waiting instead to see what the future held after the evaluation. When the move to XP was being touted by Microsoft during subsequent meetings, they were faced with some surprises. The proposed fee structure was radically different and established what amounted to a cost penalty for those who chose to stand pat instead of upgrading their operating systems and office suites to XP versions upon their initial release.
Currently, they had a large mix of Windows 9x and NT 4.x clients successfully running Office 97 software on laptops and desktops. They had stayed on top of bug fixes and system patches and found that for the most part, their existing infrastructure performed well enough as it was. They had maintained their NT 4.x Server infrastructure as well, opting not to move to Active Directory.
During this difficult time, Red Hat had proven to be a helpful ally. Instead of trying to push a whole-scale replacement of the infrastructure, they had worked to supplement it. Over time Linux brought more security, improved load balancing and an overall reduction in the growth rate of IT spending. Point of sale terminals were reliable, easy to manage and did not incur additional transaction costs. Their remote access and VPN configurations handled an ever increasing load with a higher degree of reliability and a lower cost. Their intranet had been transitioned over to Linux, and as a result cost less to maintain. It also eliminated interference with IIS based consumer and vendor systems accessed from outside of the company.
Through a series of such modifications, they had been able to establish and maintain a more stable, more cost effective configuration. Their network was more flexible and more able to meet the needs of a changing marketplace. Projects could be isolated to their own LAN or WAN segment without impacting other services and teams of experienced Unix/Linux workers could be called upon when NT resources were scarce. As a result, overall TTM (Time To Market) was reduced for mission critical consumer applications and customer satisfaction actually rose in the midst of explosive growth.
Linux was not the right tool for every job, but it certainly had proved its mettle as a cost effective alternative and helped give them some breathing room as they worked to bring soaring IT costs under control and reduce TCO (Total Cost of Ownership). It was ironic that only by turning to an alternative operating system were they able to realize some of the cost savings promised them when they initially switched over to NT. Linux had not only given them tangible benefits, it had increased confidence in their ability to manage their own systems.
This was important because over time there had been a growing fissure between what Microsoft had originally promised and the proposals they were making today. The constant tinkering with licensing agreements, the perpetually increasing fees and the imposition of bundling and usage restrictions had generated a lot of bad-will. The Microsoft of old that had come knocking on their door with friendly overtures was no more. In its place was a company that stifled their clients with ever-increasing pressure to upgrade or face the prospect of paying higher fees and receiving reduced levels of support.
So when the time came to make a decision on the transition to XP, they felt they were in a much stronger position. They had found a willing partner in Red Hat, a viable alternative in Linux and a sense of control over their own infrastructure that had previously been lacking. Though they might face higher licensing costs later on, they opted to again bypass the proposed Microsoft solution in favor of standing pat.
Summing it all up
Some months later, with the market still soft and the bottom line increasingly important to shareholders, the team feels they made the right decision. The proposed licensing agreements would have required a complete switch to new versions of the Microsoft XP software, increasing deployment costs. The system requirements of Office and Windows XP would have mandated a substantial investment in the purchase and deployment of new hardware and the transfer of system data. Changes to the system interface would have required worker retraining. Changes to the core OS would have rendered a variety of third party software and utilities unusable, dramatically impacting productivity and further increasing upgrade costs.
In addition, Microsoft was being put under intense pressure from federal regulators, the courts and consumers. They were at odds with Sun over Java, with vendors over the status of desktop icons, and with consumers over security and product activation. Hackers were finding holes in their server software big enough to drive a truck through, and security experts like Steve Gibson were bringing other serious flaws to light. Critics questioned the need to include DirectX and other consumer-friendly services in the server version of their OS. They questioned the extensive integration of IE 6 and Smart Tags. They questioned the usage requirements of their Passport software, a key part of their XP and .Net strategies.
Sometimes, when there's smoke, there's fire and heading in another direction can be the smart thing to do from a client standpoint. Companies have been dropping like flies in this marketplace and only the nimble seem to be able to survive. You have to be willing to challenge existing paradigms.
Change does not always have to be a frightening thing, and it is always a good idea to have alternatives at your disposal in the corporate world. Having "all your eggs in one basket" has been considered a risky proposition for a lot longer than computers have been around, but the principle is still sound and timely. In the case of the company we used as a basis for this example, thinking outside the box paid off.
As a result of their willingness to look beyond, they now have a more cost-effective, more stable and more predictable infrastructure in place. They have been able to benefit from the hard work of the Linux community and the support of companies like Red Hat. They have been able to establish and maintain key relationships with forward thinking companies like Dell, who started bundling and supporting Linux on their server machines early on.
Because they were willing to open their eyes to new ideas and challenge convention, they have been able to hang on where others have not. If other companies facing growing infrastructure costs are to survive these difficult times, it may be a good idea for them to do some evaluations of their own. After all, Linux is a free download and Red Hat is only a phone call away...