The test was so successful that it will be rolled out to all Bing servers in 2015, Putnam said. The challenge now is to figure out where next to apply the FPGA technology, he added.
Baidu, which operates tens of thousands of servers in China, used FPGAs to accelerate deep neural networks, the algorithms behind everything from traditional search to speech recognition and image recognition. Baidu used a board built around a Xilinx K7 480t-2l FPGA that could be plugged into any 1U or 2U server. Under various workloads, Baidu found the FPGA boards several times more efficient than either a CPU or a GPU.
All end users care about is the quality of their search results from Bing or Baidu. Improving the efficiency and performance of the search algorithms is good news for everyone.