"Microsoft has been deploying FPGAs in every Azure server over the last several years, creating a cloud that can be reconfigured to optimize a diverse set of applications and functions. This configurable cloud provides more efficient execution than CPUs for many scenarios without the inflexibility of fixed-function ASICs at scale. Today, Microsoft is already using FPGAs for Bing search ranking, deep neural network (DNN) evaluation, and software defined networking (SDN) acceleration. Azure’s FPGA-based accelerated networking reduces inter-virtual machine latency by up to 10x while freeing CPUs for other tasks."
The moonshot that succeeded: How Bing and Azure are using an AI supercomputer in the cloud | Next at Microsoft
Programmable chips turning Azure into a supercomputing powerhouse | Ars Technica
EC2 F1 Instances with FPGAs – Now Generally Available | AWS Blog
Mark Russinovich (@markrussinovich) | Twitter
FPGAs for Dummies ebook (fpgas_for_dummies_ebook.pdf) | Altera (Intel) Resource Center, design.altera.com/New2FPGAeBook