New chips give Bing a performance boost

Microsoft's Project Catapult uses field-programmable chips to speed up certain tasks

Microsoft plans to use programmable chips to boost the performance of the servers behind its Bing search engine, accelerating certain services on the devices themselves.

The project, called Catapult, uses field-programmable gate arrays, or FPGAs, which, as the name suggests, can be configured by customers in the field after they are manufactured. FPGAs are now seen as powerful computing devices in their own right that can accelerate certain programmable tasks, researchers working on the project said.

The new approach is likely to be important for compute-intensive services like Bing because Microsoft can now selectively speed up large-scale services, such as search-result ranking, by adding FPGA compute boards to existing servers rather than buying more hardware and CPUs to run the workloads.

The rate at which server performance improves has slowed considerably as gains from CPUs flatten, wrote Microsoft researchers and colleagues from Bing, industry and academia in a paper presented Monday at the 41st International Symposium on Computer Architecture. The paper describes an effort to combine programmable hardware and software, using FPGAs to boost ranking throughput by as much as 95 percent.

Adding the FPGA boards increased power consumption by only 10 percent and stayed within the researchers' limit of a 30 percent increase in the total cost of ownership of an individual server, indicating that the new architecture could deliver significant savings and efficiencies, according to the researchers.
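
As a rough illustration of that budget, a back-of-the-envelope check might look like the sketch below. Only the 10 percent power increase and the 30 percent ceiling come from the researchers; the dollar figures, and the assumption that power makes up about 30 percent of a server's cost of ownership, are hypothetical placeholders rather than Microsoft's numbers.

    # Hypothetical back-of-the-envelope TCO check. Only the 10% power
    # increase and the 30% ceiling are from the paper; the dollar figures
    # and the power share of TCO are illustrative assumptions.
    server_tco = 10_000.0       # assumed lifetime cost of one server, USD
    power_share = 0.30          # assumed share of TCO attributable to power
    fpga_board_cost = 1_000.0   # assumed cost of one FPGA compute board, USD

    extra_power_cost = 0.10 * power_share * server_tco   # 10% more power
    tco_increase = (fpga_board_cost + extra_power_cost) / server_tco

    print(f"TCO increase: {tco_increase:.1%}")            # 13.0% under these assumptions
    assert tco_increase < 0.30, "would exceed the 30% TCO ceiling"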

The paper describes testing a reconfigurable fabric of interconnected FPGA nodes, linked by high-bandwidth connections, across 1,632 servers, and measuring how effectively it accelerated the ranking workload of a production search infrastructure. The FPGAs do not do away with CPUs; the servers only offload certain workloads to the configurable chips.
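
Conceptually, that division of labor looks something like the minimal sketch below. It is not code from the paper: the FpgaBoard handle, the feature-extraction stage and the CPU fallback are assumptions about how a server might route one ranking stage to a local accelerator when a board is present.

    # Minimal sketch of offloading one ranking stage to a local FPGA board.
    # The FpgaBoard interface and fallback path are hypothetical; the idea --
    # CPUs keep running the service, FPGAs take selected stages -- follows
    # the architecture described in the article.
    class FpgaBoard:
        """Stand-in for a driver handle to the server's FPGA compute board."""
        def extract_features(self, document: str, query: str) -> list[float]:
            raise NotImplementedError  # would call into the configured FPGA image

    def extract_features_cpu(document: str, query: str) -> list[float]:
        # Software fallback: the same stage computed on the CPU.
        return [float(document.count(term)) for term in query.split()]

    def rank_stage(document: str, query: str, board: FpgaBoard | None) -> list[float]:
        if board is not None:
            return board.extract_features(document, query)  # offload to FPGA
        return extract_features_cpu(document, query)        # keep work on the CPU

    print(rank_stage("fpga fabric fpga", "fpga servers", board=None))  # [2.0, 0.0]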

Instead of moving those functions to custom-designed chips, known as application-specific integrated circuits, the researchers mapped them to reconfigurable FPGAs from a company called Altera, in the process designing what they describe as a cloud enhanced by 'programmable hardware.'

The researchers decided against using GPUs (graphics processing units), even though both FPGAs and GPUs support high parallelism, with hundreds to thousands of arithmetic units available on each chip. The power requirements of current high-end GPUs are too high for conventional data center servers, they wrote, and it was not clear whether some latency-sensitive ranking stages, such as feature extraction, would map well to GPUs.

"Going into production with the new technology will be a watershed moment for Bing search," Microsoft Research head Peter Lee said in a blog post. For the first time ever, the quality of Bing's page ranking will be driven not only by great algorithms but also by highly specialized hardware, he added.

"We conclude that distributed reconfigurable fabrics are a viable path forward as increases in server performance level off, and will be crucial at the end of Moore's Law for continued cost and capability improvements," the researchers wrote. The law is named after Intel's cofounder Gordon Moore who observed that the number of transistors in an integrated circuit would double every two years.

Copyright © 2014 IDG Communications, Inc.
