Schwab goes live with grid computing technology
The brokerage plans to expand the system to thousands of servers next year
Computerworld - Charles Schwab & Co. this week went live with a grid computing system and said it has already improved performance by an order of magnitude on the first investment management application that's running on the new setup.
David Dibble, executive vice president of technology services at Schwab, said the grid system was jointly developed with IBM and currently connects 12 two-processor servers that are based on Intel chips and are located in the discount brokerage's Phoenix data center. Next year, San Francisco-based Schwab plans to begin rolling out the grid technology across thousands of low-cost servers with spare CPU capacity that could be used to speed up compute-intensive applications.
"We wanted to open up a whole new realm of high-throughput computing for Schwab's business applications," Dibble said. "Things that were not thinkable just a year ago are now proving economical, and therefore we're working at getting more of them into production." He said there is "very little, if anything, off the shelf" in the grid system.
It took a team of about 15 internal IT staffers working with a development team from IBM about a year to build the grid system, Dibble said. Schwab is using Globus Toolkit 2.0, a set of open-source software tools that support grid computing applications. The system links IBM xSeries 330 servers running Red Hat Linux and IBM's DB2 database. Both IBM's WebSphere and BEA Systems Inc.'s WebLogic application server software are being used.
Dibble wouldn't disclose the cost of the project or the throughput that Schwab has achieved on the initial application, citing the performance levels as a competitive advantage for the company. But he said the grid system lets Schwab turn around end-user requests for retirement planning data in seconds instead of days.
The retirement planning tool that's now running on the system calculates real-life portfolio scenarios based on retirement goals, risk tolerance and preferred investments. In the future, Schwab plans to add other applications for investment managers and Web-enable the applications for use by individual investors.
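The article does not describe the tool's internals, but a portfolio-scenario calculation of this kind is commonly built as a Monte Carlo simulation, which is exactly the sort of embarrassingly parallel workload a grid can fan out across spare servers. The sketch below is purely illustrative; the function names and the return and volatility figures are made-up assumptions, not Schwab's model.

```python
# Illustrative sketch only -- not Schwab's actual retirement tool.
# Each scenario grows a portfolio year by year under a random annual
# return; the mean-return and volatility parameters are assumptions.
import random

def simulate_balance(start, annual_contribution, years, rng,
                     mean_return=0.07, volatility=0.15):
    # One scenario: compound the balance with a randomly drawn return,
    # then add the year's contribution.
    balance = start
    for _ in range(years):
        balance *= 1 + rng.gauss(mean_return, volatility)
        balance += annual_contribution
    return balance

def success_rate(goal, n_scenarios=10_000, **kwargs):
    # Fraction of scenarios that reach the retirement goal. Each scenario
    # is independent, so a grid could split the loop across many servers.
    rng = random.Random(42)  # fixed seed so the sketch is reproducible
    hits = sum(simulate_balance(rng=rng, **kwargs) >= goal
               for _ in range(n_scenarios))
    return hits / n_scenarios

rate = success_rate(goal=1_000_000, start=100_000,
                    annual_contribution=10_000, years=25)
print(f"{rate:.1%} of scenarios reach the goal")
```

Because every scenario is independent, the only coordination needed is distributing scenario counts to workers and summing the hit counts afterward, which is why this workload maps so naturally onto a grid.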
In addition to boosting application performance, Schwab's IT team hopes the grid system will help lower total cost of ownership in its tech operations. Like most large brokerages, Schwab built its server infrastructure to handle twice the computing capacity that's needed during peak hours on an average trading day.
"There's a lot of capacity lying around on just average days," Dibble said. "What grid computing does is enable us to go out and recapture unused capacity in a very efficient manner."
The grid system works through a "head node," a master server that breaks up compute-intensive data requests into smaller jobs and sends them out to systems on the grid for processing, said Willy Chiu, a vice president in IBM's software group. The head node then reassembles the different pieces of the transaction and presents the data to Schwab's investment managers.
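The scatter-gather pattern Chiu describes can be sketched in a few lines. Here a local process pool stands in for the grid's worker servers, and the function names (`split_job`, `process_chunk`, `head_node`) are illustrative, not Schwab's or Globus Toolkit's API.

```python
# A minimal scatter-gather sketch of the "head node" pattern: split a
# request into smaller jobs, farm them out, and reassemble the results.
# A multiprocessing pool stands in for the grid's worker servers.
from multiprocessing import Pool

def process_chunk(chunk):
    # Worker node: compute a partial result for its slice of the request
    # (a stand-in computation -- sum of squares).
    return sum(x * x for x in chunk)

def split_job(data, n_parts):
    # Head node, step 1: break the request into smaller jobs.
    size = (len(data) + n_parts - 1) // n_parts
    return [data[i:i + size] for i in range(0, len(data), size)]

def head_node(data, n_workers=4):
    # Head node, steps 2-3: scatter the jobs to workers, then
    # reassemble the partial results into one answer.
    chunks = split_job(data, n_workers)
    with Pool(n_workers) as pool:
        partials = pool.map(process_chunk, chunks)
    return sum(partials)

if __name__ == "__main__":
    print(head_node(list(range(1000))))
```

The reassembly step is trivial here (a sum); in a real system the head node must also track which worker holds which job, handle stragglers, and merge partial results in the right order.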