SQL Server: The Sequel

It's been a long time since Microsoft Corp. first entered the database world with a relabeled version of the Sybase relational database management system. From its humble beginnings, the Microsoft product has taken over the SQL Server name (Sybase Inc.'s SQL Server is now called Sybase Adaptive Server), lost all traces of Sybase programming, gained features that rival those of Oracle and become an integral part of Microsoft's back-office strategy.

It's not perfect, of course. SQL Server 2000 runs only on Windows NT 4.0 or 2000. It lacks IBM DB2's high level of support for static SQL. It can't run stored procedures as flexibly or as powerfully as Oracle can. Its native programming interface isn't as well-known as Oracle's. It doesn't integrate as easily as Oracle, DB2 or Adaptive Server with Java-based application server environments such as San Jose-based BEA Systems Inc.'s WebLogic or IBM's WebSphere, especially if they're based on Enterprise JavaBeans.

Despite these and other shortcomings, early adopters of SQL Server 2000 say they're pleased with the new version of Microsoft's RDBMS. Microsoft recently shipped the newest upgrade of its premier database, and Computerworld approached four early adopters to see how satisfied they are so far. Their reactions were distinctly positive.

This profile of early SQL Server 2000 use shows why, at least at the large workgroup level, Microsoft appears to be ready to make significant inroads against entrenched competitors like Oracle Corp.

Microsoft Builds on Server Side

"We're at a point when people are waking up to how powerful Windows NT, Windows 2000 and Microsoft's server-based products are. The whole Unix and Java world has been overhyped," says Will Zachman, an analyst at Stamford, Conn.-based Meta Group Inc. "In contrast, Microsoft has been steadily building and improving its server-side products, such as SQL Server 2000, to be able to give customers useful and reliable software."

Zachman describes SQL Server 2000's market potential as almost limitless. "I'm very impressed with what Microsoft is bringing to the table," he says.

Matthew Drooker, vice president of systems development at Turner Broadcasting Sales, says his cable TV advertising sales organization chose SQL Server 2000 because his development team has a lot of experience with earlier versions, the system integrates well with the team's tools and it's an appropriate solution for storing sales data from his approximately 220 users. The database stores information about advertisers, advertising agencies, sales orders and broadcast details in about 300 tables.

Turner Broadcasting Sales is a division of Turner Broadcasting System Inc. (TBS), a subsidiary of New York-based Time Warner Inc. Atlanta-based TBS is a leading cable TV content provider and owns outlets such as TNT, Cable News Network Inc. and the TBS Superstation.

When the sales staff and company executives needed a more efficient way to both create and access sales data, TBS developed a set of applications based on SQL Server 2000.

Drooker characterizes TBS as a longtime user of SQL Server and Windows. He has a small group of about 10 developers who, he says, strongly prefer and have come to rely on Microsoft's Visual Studio 6.0, including Visual InterDev, Visual Basic and Visual C++.

Zachman says he agrees that Visual Studio 6.0 and SQL Server 2000 are an excellent combination. "Microsoft has server-side development tools that blow away anything other vendors have. IBM's VisualAge, for example, is primitive in contrast," he says.

TBS's sales-oriented applications generate proposals, track orders, produce reports and interface with legacy billing systems that run on its Unisys Corp. mainframe. Drooker says SQL Server's Data Transformation Services (DTS) component has been a great time-saver in building and processing interface files. Little programming expertise or effort is required to use the graphical DTS tool to produce programs that convert files from one format to another.

The tool, he says, produces Visual Basic code, which means his team can add business logic quickly and easily.
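The kind of transform DTS automates can be sketched in a few lines. The example below is a minimal illustration in Python, not the Visual Basic code DTS generates; the fixed-width record layout and field names are hypothetical, since the article doesn't describe TBS's actual file formats.

```python
import csv
import io

# Hypothetical fixed-width layout for a mainframe billing extract:
# (field name, start column, end column). The real TBS layouts are not known.
LAYOUT = [("order_id", 0, 8), ("advertiser", 8, 28), ("amount", 28, 38)]

def convert(fixed_width_lines, out_file):
    """Convert fixed-width records to CSV, the sort of job DTS handles graphically."""
    writer = csv.writer(out_file)
    writer.writerow([name for name, _, _ in LAYOUT])
    for line in fixed_width_lines:
        row = [line[start:end].strip() for _, start, end in LAYOUT]
        # Business logic can be slotted in here, much as DTS lets a team
        # extend the code it generates. Here: the amount arrives in cents.
        row[2] = f"{int(row[2]) / 100:.2f}"
        writer.writerow(row)

sample = ["00001234Acme Advertising    0000125000"]
buf = io.StringIO()
convert(sample, buf)
```

The appeal Drooker describes is exactly this shape of work: the conversion scaffolding is generated for you, and only the business-logic line in the middle needs hand coding.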

In addition, Drooker says, the integrated and easy-to-use programming environment surrounding SQL Server lets the development team build effective work-arounds for problems that previously may have required the use of expensive consulting help.

Shopping for a New DBMS

Scott Mitchell, chief technology officer at Home Shopping Network (HSN) in St. Petersburg, Fla., chose SQL Server 2000 primarily for its scalability. He says he likes being able to add horsepower in fine, inexpensive increments by simply buying additional Intel-based servers.

"It increases our agility and improves our ability to change in response to market forces," Mitchell explains. "The Windows programming environment affords me access to a more robust labor market. It also greatly simplifies vendor management, because I'm not tied to a single hardware vendor and a single flavor of Unix."

In May, Mitchell says, HSN decided to migrate the database tier of its Web site's e-commerce application from Solaris-based Oracle running on Sun Microsystems Inc. computers to SQL Server 2000 and Windows 2000 Advanced Server running on Intel-based servers.

Mitchell echoes Drooker in describing the ease of programming in the Windows environment, which has reduced the time it takes to add improvements to the HSN Web site. The database tier migration took less than six weeks. Mitchell says HSN has also seen improved performance, simplified vendor management and reduced hardware, software and support costs.

Like TBS, HSN runs its core business, including inventory, order management and fulfillment, on a series of legacy applications running on a Unisys mainframe. To extend this functionality to the Internet, HSN used to rely on a mixed-platform solution that a third party developed for the company.

The presentation and business-logic tiers of HSN's Web solution consisted of Microsoft Internet Information Server and Windows NT running on six two-processor Intel-based servers. The HSN Web site processes about 18 million page views per month, and that number is growing rapidly.

The data tier, consisting of catalog data, the customer database and community content such as the site's message boards, resided in Oracle8i running on a Sun computer. Information transfer between the Web platform and the mainframe was accomplished through twice-daily bidirectional batch transfers; new orders were sent from the Web platform to the legacy system, while inventory levels and order status were sent in the opposite direction.
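The data flow in that twice-daily exchange can be sketched simply. This is an illustrative Python sketch of the logic described above, with hypothetical in-memory "stores" standing in for the Web platform and the mainframe; none of the record shapes are from the article.

```python
# Minimal sketch of one bidirectional batch cycle: orders flow from the
# Web platform to the legacy system; inventory and order status flow back.
def run_batch_cycle(web_store, mainframe_store):
    # Web platform -> legacy system: orders not yet transferred.
    new_orders = [o for o in web_store["orders"] if not o["transferred"]]
    mainframe_store["pending_orders"].extend(new_orders)
    for order in new_orders:
        order["transferred"] = True

    # Legacy system -> Web platform: current inventory and order status.
    web_store["inventory"] = dict(mainframe_store["inventory"])
    web_store["order_status"] = dict(mainframe_store["order_status"])
```

Running this twice a day, as HSN did, means Web-site inventory figures could be up to half a day stale between cycles, one reason batch bridges like this tend to be transitional architectures.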

Mitchell says HSN was experiencing system performance and availability problems, and the mixed solution made it more difficult and time-consuming for developers to fix problems and optimize performance. Having multiple vendors and support organizations added complexity, and staffing was also more difficult because HSN needed to recruit people with expertise with two very different systems.

In August, HSN replaced the Sun/Oracle data tier with a cluster of two Compaq Computer Corp. 8500 servers, each populated with four processors running Microsoft Windows 2000 Advanced Server and Microsoft SQL Server 2000.

The two-node cluster, which accesses shared external disk storage, uses an active/passive fail-over setup to maximize availability, relying on the fault-tolerant functionality in SQL Server 2000. HSN selected the eight-processor-capable Compaq servers so it could scale up to meet customer demand simply by adding processors, says Mitchell.

Growing Acceptance

Cary Gottesman, a Microsoft consultant at AT&T Growth Markets in North Brunswick, N.J., says AT&T selected SQL Server 2000 primarily because of its online analytical processing (OLAP) capability, its performance and its greater maximum database size. He says SQL Server 2000's increased cube-size limit (over SQL Server 7.0) makes it the right tool for the job.

AT&T Growth Markets, an $8 billion business unit of the AT&T Business Services division, depends on what it calls its Attack Database to provide a wide array of information to field sales personnel. First developed using SQL Server 7.0, this database provides access to just under 1 terabyte (TB) of data that sales staff use to quickly identify and target new telecommunications business opportunities.

Accessed via a browser-based front end, the database provides OLAP and other functionality that lets sales representatives see, for example, what kinds of communications services a business has, the potential revenue a customer may offer and the existing infrastructure that will shape a customer's choices.

Gottesman says database usage is spreading quickly among AT&T's sales force. Moreover, the database's size has grown to include new information from internal sources, such as AT&T billing records, and external information from business and financial profiling services. It currently contains more than 40 million rows in a star/snowflake design that facilitates OLAP views of the data. For example, sales staff can choose to view customer data based on demographic category, industry or region.
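A star design like the one described pairs a central fact table with small dimension tables, so the same data can be rolled up along any dimension. The fragment below is an illustrative sketch using SQLite, not AT&T's schema; every table, column and value is hypothetical.

```python
import sqlite3

# Toy star schema: one fact table keyed to region and industry dimensions,
# so customer data can be sliced the way the article describes.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_industry (industry_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_customer (
    customer_id INTEGER PRIMARY KEY,
    region_id INTEGER REFERENCES dim_region,
    industry_id INTEGER REFERENCES dim_industry,
    potential_revenue REAL
);
INSERT INTO dim_region VALUES (1, 'Northeast'), (2, 'Southwest');
INSERT INTO dim_industry VALUES (1, 'Retail'), (2, 'Finance');
INSERT INTO fact_customer VALUES
    (100, 1, 1, 50000.0), (101, 1, 2, 120000.0), (102, 2, 1, 75000.0);
""")

# One OLAP-style view: potential revenue rolled up by region. Swapping the
# joined dimension gives the industry or demographic view instead.
rows = con.execute("""
    SELECT r.name, SUM(f.potential_revenue)
    FROM fact_customer f JOIN dim_region r USING (region_id)
    GROUP BY r.name ORDER BY r.name
""").fetchall()
```

The point of the star layout is that each new "view" is just a different join against the same fact table, which is what lets sales staff pivot by demographic category, industry or region without restructuring the data.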

Individual customers are represented in 24 million of the 40 million rows. With SQL Server 7.0, Gottesman points out, AT&T Growth Markets was able to support only a few million customers. However, the Analysis Services in SQL Server 2000 can scale up to tens of millions of customers in multidimensional OLAP and more than 100 million in relational OLAP.

Gottesman says the success of the Attack Database has created additional challenges. Keeping the information fresh requires rebuilding about 500GB of the database on a monthly cycle. The entire data warehouse exceeds 1TB when all tables, indexes and OLAP cubes are included.

According to Gottesman, switching from RAID Level 5 to RAID Level 0 gave AT&T its biggest performance gains in the database-rebuilding operation, dropping the time for a single clustered index build from 32 hours to four hours. A faster server further reduced the time to one hour. And switching from SQL Server 7.0 to SQL Server 2000 shrank the index build time even further, to only 15 to 20 minutes. The monthly database-refresh cycle that used to take more than four weeks today takes about a week.
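The cumulative effect of those three changes is worth making explicit. A quick back-of-the-envelope calculation on the figures Gottesman cites (using the 17.5-minute midpoint of his 15-to-20-minute range):

```python
# Speedups implied by the index-build times in the article:
# 32 hours -> 4 hours (RAID 5 to RAID 0), 4 hours -> 1 hour (faster server),
# 1 hour -> roughly 15-20 minutes (SQL Server 7.0 to 2000).
raid_gain = 32 / 4              # 8x from the RAID change
server_gain = 4 / 1             # 4x from faster hardware
sql2000_gain = 60 / 17.5        # ~3.4x from the database upgrade
overall_gain = (32 * 60) / 17.5 # ~110x end to end
```

Notably, the single biggest win came from storage layout rather than software, a reminder that RAID 0 trades away redundancy for write speed, which is acceptable here only because the warehouse is rebuilt from source data each month.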

A Vote for SQL Server

Gregg McGilvray, chief technology officer at Election.com Inc., says he feels that SQL Server 2000 scales well, is less expensive than comparable databases and has better tool support in the developer community.

Election.com is a global Internet election-services company in Garden City, N.Y., that provides public- and private-sector election services for governments, associations, nonprofit organizations, labor unions, school districts, credit unions, pension funds and corporations.

The company helps voters around the world make the transition from paper ballots to online voting. Clients include the Sierra Club, the United Nations Federal Credit Union and the Arizona Democratic Party. As the company expands into different markets, it's taking on much larger elections in the U.S. and abroad on a regional, state and national level. These business opportunities represent enormous variations in the number of votes being counted.

"Depending on the bylaws of a particular organization, we may need to build additional tables, add data types, manage indexes and otherwise adapt to constantly changing elements to meet the needs of our clients," McGilvray says. "SQL Server 2000 helps us to do that with an easy-to-understand interface. It's also a lot more cost-effective than the alternatives."

McGilvray says Election.com chose SQL Server 2000 because it provides both the flexibility to rapidly adjust its offerings to different election-format scenarios and the scalability to assist the company in handling major jurisdictional elections.

One of Election.com's largest projects was the recent Youth-e-Vote 2000, an online election in which U.S. schoolchildren were able to vote for the next U.S. president just prior to the actual election.

"That's an example where we were awarded a project that has very different requirements than a standard election," McGilvray says. "We had to add additional tables and reporting mechanisms at different levels in the system. SQL Server enabled us to do this very quickly and to deploy the online voting system in time to meet the requirements of the project."

Passing the Test

McGilvray's team has successfully stress-tested its voting application at 20 million votes per hour.

"The OLAP function in SQL Server 2000 is already helping us in creating rapid election-reporting services. It's also opening up new ways to interpret voter data and voting patterns," he says, adding that the company also exports SQL Server-based data to Excel spreadsheets and uses Visual Basic to generate reports.

"For example, we can look at how often people tend to vote for the first person on a ballot simply because of that position on the ballot," McGilvray says. "If we uncover a high percentage of this kind of voting behavior, then we can offer to rotate candidates' names on a ballot."
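The rotation McGilvray describes is a simple round-robin: each successive ballot shifts the candidate order by one position, so every candidate appears first equally often. A minimal sketch, with placeholder candidate names:

```python
# Rotate the candidate list by one extra position per ballot, so
# position bias is spread evenly across all candidates.
def rotated_ballot(candidates, ballot_number):
    k = ballot_number % len(candidates)
    return candidates[k:] + candidates[:k]

names = ["Adams", "Baker", "Chen"]
ballots = [rotated_ballot(names, n) for n in range(3)]
```

Over any full cycle of `len(candidates)` ballots, each name holds the top slot exactly once, which neutralizes the first-position advantage the OLAP analysis is designed to detect.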
