
Microsoft to speed up Bing with FPGAs next year

Agam Shah | Aug. 13, 2014
It started off as an experiment, but Microsoft now wants to use reconfigurable chips called FPGAs (field-programmable gate arrays) in its data centers to speed up Bing and return more accurate search results.

The FPGAs sit on a specialized network connected by dedicated cables. Search requests are bounced from the CPU to a local FPGA, then rerouted across a separate network of FPGAs covering computer vision, math acceleration and other search services. The FPGA network is scalable and doesn't rely on the network connecting the computers.

"From the perspective of a CPU, its local FPGA can handle what actually took a lot of FPGAs," Putnam said. "You can add a bunch of services to this kind of FPGA network. Every CPU thinks it's attached to all of these services."
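The routing described above can be sketched in a few lines of Python. This is purely illustrative, not Microsoft's code: the stage names and function are hypothetical, standing in for the specialized services (computer vision, math acceleration, scoring) that each run on a different FPGA in the dedicated network, while the CPU only ever talks to its local board.

```python
# Hypothetical service stages hosted on different FPGAs in the dedicated
# FPGA-to-FPGA network (names invented for illustration).
FPGA_PIPELINE = ["computer_vision", "math_acceleration", "scoring"]

def local_fpga_dispatch(request: str) -> dict:
    """Route a request through the FPGA pipeline.

    The CPU hands the request to its local FPGA; in hardware, each
    subsequent hop travels over the dedicated cables between boards,
    not the data center's regular server network.
    """
    result = {"query": request, "stages": []}
    for stage in FPGA_PIPELINE:
        result["stages"].append(stage)  # one hop per specialized FPGA
    result["handled_by"] = "local_fpga"
    return result

# From the CPU's point of view, every service appears to live on its
# single local FPGA, even though the work is spread across many boards.
response = local_fpga_dispatch("example search query")
```

The key design point the sketch mirrors is the indirection: because the FPGA network is separate from the server network, services can be added or scaled without every CPU needing to know where they physically run.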

As the number of FPGAs in a data center piles up, programming them could become a nightmare. Microsoft has ported over filtering, ranking, relevancy and sorting tools so the FPGAs stay useful and are easier to reprogram.

"They are actually going to push [FPGAs] into their data centers, but of course, in order to keep them there, we're going to have to really improve the future of programmability," Putnam said.
