Windows in the Glass House

Run important applications on Windows? A few years ago, fears of the blue screen of death would have frightened many companies away from entrusting their revenue-producing activities to a Windows platform. But Microsoft Corp.'s Windows 2000 Datacenter Server—the product and the program—is changing many minds about letting Windows into the glass house.

Long ago, when computers meant IBM, mainframe vendors controlled every aspect of computer systems: hardware, operating system and applications. This helped vendors ensure stability and reliability levels, at least until the courts made them unbundle their systems.

In the PC era, the situation changed. PCs were built for modularity of hardware, software and operating system. With many vendors selling hardware, applications and drivers, who could guarantee stability and reliability? Finger-pointing was the primary response.

However, as CPUs became cheaper and more powerful, vendors began to pack more chips into machines. These servers were more capable than ever but still suffered from a multitude of hardware platforms and a bewildering array of drivers and applications.

Windows 2000 Datacenter Server is Microsoft's attempt to change that, coming full-circle to the mainframe model. No, Microsoft isn't building machines, but it is trying to exercise some control over the hardware, operating system and driver and application bundles. Enterprise users are finding the result compelling, not only in terms of cost but also in terms of stability, reliability and simplicity.

Product and Program

Datacenter is both a product and a program designed for enterprise users. It's the top-level Windows operating system from Microsoft—but the company won't sell it to you. Instead, you buy Datacenter in a package along with certified, supported server hardware from a traditional vendor such as Compaq Computer Corp. or IBM.

As part of the Windows 2000 family, Datacenter shares that operating system's services, including Active Directory and security. Datacenter supports up to 32 processors per server—four times the eight that Windows 2000 Advanced Server can handle—allowing great scalability when paired with appropriate multiprocessing hardware. Also, Datacenter can address up to 64GB of main memory for managing complex applications.

Datacenter includes a Process Control tool to help manage all those processors. The tool lets you oversee workload and performance across all processors. You can dedicate certain processors to certain applications, so those applications don't have to go searching for idle cycles, thus reducing overhead and bottlenecks. It's also possible to change the load balance dynamically, pulling more processors in for certain applications when needed.
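The idea of dedicating processors to applications can be pictured as affinity bitmasks. The sketch below is purely illustrative—the function and application names are hypothetical, not the actual Process Control interface:

```python
# Illustrative sketch of processor dedication via affinity bitmasks.
# Names and structure are hypothetical, not the actual Process Control tool.

TOTAL_CPUS = 32  # Datacenter supports up to 32 processors per server

def affinity_mask(cpus):
    """Build a bitmask with one bit set per processor in `cpus`."""
    mask = 0
    for cpu in cpus:
        if not 0 <= cpu < TOTAL_CPUS:
            raise ValueError(f"no such processor: {cpu}")
        mask |= 1 << cpu
    return mask

# Dedicate processors 0-7 to a database and 8-11 to a web tier,
# so neither has to go searching for idle cycles on the other's CPUs.
assignments = {
    "database": affinity_mask(range(0, 8)),
    "web":      affinity_mask(range(8, 12)),
}

def add_processors(assignments, app, extra_cpus):
    """Dynamically rebalance by pulling more processors into an app's set."""
    assignments[app] |= affinity_mask(extra_cpus)

add_processors(assignments, "web", [12, 13])
print(f"web mask: {assignments['web']:#x}")  # prints "web mask: 0x3f00"
```

Each set bit marks a processor reserved for that application; dynamic rebalancing is just widening the mask.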

Multinode clustering of up to four servers is also part of Datacenter. Four-node clustering is far better than two-node clustering; with the latter, if one server fails, you start praying that whatever took down your primary server doesn't hit your only remaining server. Four nodes give you more breathing space. If one server fails, you still have three as backups.

Cascading fail-over is very desirable for systems that must keep running to keep revenue flowing, such as consumer-oriented e-commerce. With some other operating systems, clustering is available only through a third-party product, rather than as a part of the operating system itself. Here, Datacenter has a distinct advantage.
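Cascading fail-over can be sketched in a few lines. The node names and the simple "first healthy node wins" policy below are illustrative assumptions, not Datacenter's actual cluster service:

```python
# Minimal sketch of cascading fail-over in a four-node cluster.
# Node names and fail-over policy are hypothetical illustrations.

class Cluster:
    def __init__(self, nodes):
        self.nodes = list(nodes)        # fail-over order
        self.healthy = set(self.nodes)  # nodes currently up

    def active_node(self):
        """The first healthy node in the cascade owns the workload."""
        for node in self.nodes:
            if node in self.healthy:
                return node
        raise RuntimeError("total cluster failure: no healthy nodes left")

    def fail(self, node):
        """Mark a node as down; the workload cascades to the next node."""
        self.healthy.discard(node)

cluster = Cluster(["node1", "node2", "node3", "node4"])
cluster.fail("node1")         # primary goes down...
print(cluster.active_node())  # ...node2 takes over
cluster.fail("node2")         # even a second failure...
print(cluster.active_node())  # ...still leaves node3 serving
```

With four nodes, two failures still leave a backup in reserve; a two-node cluster would already be down to its last server after one.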

Get with the Program

Perhaps more important than the software is Microsoft's Datacenter accreditation process. A PC vendor can slap a copy of ordinary Windows on a box and call it a system. Datacenter resellers, however, must agree to a rigorous process for ensuring reliability and stability. This includes a mandatory 14-day test for every hardware-driver-software configuration. Typical drivers include background antivirus and backup products. "Most problems with NT were really due to problems with drivers," says Steve Every, product manager of Microsoft operating systems at Blue Bell, Pa.-based Unisys Corp.

The system runs a battery of demanding tests under heavy loads to find possible glitches. These tests have already exposed some driver problems, which have been fixed. Even after passing the 14-day marathon, whenever any part of the tested configuration changes, a reseller must do a seven-day retest. Testing of the four-node cluster features is a separate requirement. "Customers were doing all this kind of testing themselves," says Robin Hensley, director of the Datacenter program for Compaq's Industry Standard Server Group. "By taking this on ourselves, we lessen the need for customer testing, reducing the time for deployment." When a customer gets its new system installed, the staff knows it's a stable system and that it's already been through the mill.

Support and service are also handled differently: a joint reseller-Microsoft operation that should eliminate finger-pointing and expedite problem resolution. Resellers must offer a list of services to their customers and have programs in place to guarantee them. These services include a guaranteed minimum of 99.9% availability and a maximum of four hours' on-site response to problems. The reseller must also assist customers in planning and designing their systems. This lets the reseller assess the level of availability that's desired and possible for a given customer's situation. For example, customers may not have procedures in place to support high availability. Even their power supplies can affect what's possible for them.
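Availability percentages translate directly into allowed downtime, and the arithmetic is worth checking:

```python
# Convert an availability guarantee into maximum downtime per year.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def downtime_per_year(availability_pct):
    """Hours of downtime permitted per year at a given availability."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

print(f"99.9%   -> {downtime_per_year(99.9):.2f} hours/year")
print(f"99.999% -> {downtime_per_year(99.999) * 60:.1f} minutes/year")
```

So the 99.9% floor still permits nearly nine hours of downtime a year, while "five nines" allows only about five minutes.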

Limiting the number of certified environments also simplifies problem-solving. For instance, it's far easier to maintain a few standard systems on which to rapidly replicate problems.

You might think all these requirements would scare away resellers, but many hardware vendors have signed on to the program. The draw for them is that they get a system and a program that best shows off their top-of-the-line multiprocessor machines and eliminates the cringe factor. Each major server vendor has at least one product line supporting Datacenter.

The Datacenter change control process also minimizes customer impact and maximizes reliability and stability. Changes in the operating system, hardware, drivers and software must all be thoroughly tested with resellers before being certified and offered to customers. Changes will occur as a unified bundle about every six months. Customers will know when changes will be released and will be able to accept or pass on each update package as they see fit. Thus, system changes will be painless and an improvement, not an impediment.

Microsoft is also initiating an application certification process. To be certified for Datacenter, the application must undergo testing by independent lab Veritest (a service of Waltham, Mass.-based Lionbridge Technologies Inc.) for Windows 2000 compliance, stability and the ability to handle Datacenter features like multiprocessing, big memory and clustered environments. Certified applications will be preferred for these environments.

Obviously, customers reap a number of benefits from this combination of product and program by getting a comprehensive, integrated environment, not a crazy quilt of hardware and software awkwardly meeting for the first time. We can expect this to contribute to the overall system stability, reliability and availability.

This stabilizing control over the operating environment isn't limited to a single machine or reseller. "Customers aren't locked into a sole manufacturer," points out Michel Gambier, group product manager of Datacenter Server. The number of notable resellers actually gives the customer a lot of choice.

Furthermore, Microsoft sets only the minimum standards. Resellers can offer more features or better prices. For example, Stratus Computer Inc. in Maynard, Mass., known for its fault-tolerant systems, aims to better the availability mark by offering 99.999% uptime with Datacenter—that's about five minutes of downtime per year. This shows the breadth of choice available to the customer and also the regard these manufacturers have for the Datacenter product as the basis for a stable and reliable system.

Selling the System

Hardware manufacturers are approaching Datacenter from several angles. Compaq has designated its ProLiant 8500 model as its Datacenter machine, with eight-way and 32-way multiprocessing options. The company has also established its own testing lab in Bellevue, Wash., close enough to Microsoft's headquarters to simplify collaboration. Compaq hardware also served as a prime development platform for Microsoft during the development of Windows 2000.

Unisys comes from a glass-house background. Its hardware offering is the ES7000, a 32-way multiprocessor machine with Unisys' Cellular MultiProcessing server architecture.

The variety of Datacenter features makes for an interesting combination of likely customers. For example, large dot-coms can benefit from the around-the-clock availability, cluster-empowered fail-over and scaling capabilities. "Many large e-businesses have been wondering, 'How are we going to make it through Christmas 2000 when we barely made it in 1999?' This scalability to deal with seasonal—and transient—demand is vital to them," Hensley says. Processor-intensive operations such as heavyweight database support—typical for many large enterprises—would benefit from the multiprocessor and memory power. Application service providers and Internet providers, which must guarantee uptime, would benefit from both the increased stability and fail-over functions of certified systems. Such enterprises include financial institutions and e-businesses that must be up to make sales.

More broadly, any enterprise that seeks to consolidate server functions could benefit from Datacenter's multiprocessing. This is especially attractive if an organization is currently supporting servers from multiple vendors.

Companies that wish to reduce the number of operating systems they support would probably welcome the opportunity to move to a single enterprisewide system. Since Microsoft already owns the desktop, that piece of the puzzle must stay in place, but Datacenter makes it possible to move enterprise-level applications from other operating systems, such as Unix, to Windows 2000. This can simplify staffing, since organizations wouldn't require separate staffs for each environment or individuals proficient in multiple systems.

"At Compaq, we talk to both the Intel/Windows and the RISC/Unix community. We see the desire for single, unified systems," says Hensley.

Of course, moving to a new operating system requires capital investment for hardware and software, as well as additional staff training. Companies unable to make this commitment will most likely steer clear of Datacenter as long as their current systems remain adequate. However, the savings that Datacenter can offer in maintenance, management and support may tip the balance.

Dot-Com Votes for Datacenter

One Garden City, N.Y.-based dot-com provides online election services to governments, universities and other organizations worldwide. The company hosted the first legally binding online election—Arizona's Democratic presidential primary—and many absentee ballots in the November presidential election originated online through its service.

The company was one of the first customers of Windows Datacenter. "We did look at other non-Windows solutions," explains Mark Prieto, the company's CIO. "We concluded that Datacenter on the Compaq platform would give us what we needed."

What the company needed was a reliable and scalable operating environment that could ramp up quickly. For the site to handle elections successfully, reliability is clearly a must: the system needs to be ready to handle online voters whenever they decide to vote. And the site must be able to handle an unpredictable number of online voters. "For one student election, we handled over 700,000 voters in a single day," notes Prieto. This is possible only in a highly scalable environment that can add processing power as needed.
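That peak day works out to a sustained load that is easy to estimate (the 10x peak factor below is an assumption for illustration, not a figure from the article):

```python
# Rough throughput implied by 700,000 voters in one day.
voters = 700_000
seconds_per_day = 24 * 60 * 60

average_rate = voters / seconds_per_day
print(f"average: {average_rate:.1f} voters/second")

# Election traffic is bursty; assuming a 10x peak-to-average factor
# (a hypothetical figure), the site must absorb spikes of roughly:
peak_rate = average_rate * 10
print(f"assumed peak: {peak_rate:.0f} voters/second")
```

Even the average of about eight voters per second, every second of the day, explains why the ability to add processors on demand matters.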

Finally, the company must be able to ramp up a new service rapidly. "We have to be able to handle new clients quickly, sometimes at the last minute, so scaling becomes crucial," Prieto says. The company is finding that scalability with Datacenter.

For hardware, the company uses a Compaq ProLiant 8500 cluster, which has been reliable and presented no problems. Since the company's proprietary applications already use Microsoft's SQL Server (which is scaling as the company needs it), the Windows connection is a bonus. And the system met the company's budget. The company says it anticipates smooth sailing, thanks to the Datacenter support program. Staffers say they have found the support and service to be superb and expect to be able to upgrade gracefully when new service bundles become available.

RealTech Gets Real

The operations of Walldorf, Germany-based realTech AG are based on SAP R/3 and the mySAP environment, along with associated technologies such as security and hosting. Datacenter offers realTech the possibility of consolidating several R/3 systems onto one hardware platform with one operating system. For hardware, realTech chose Unisys' ES7000 server.
