With analysts betting on AI as the way of the future, Silicon Valley's tech giants are racing to win it.
Recently, Google started building its own chips to give itself an edge in artificial intelligence. The chip, called the Tensor Processing Unit, is optimized to run its deep learning algorithms.
To keep pace, Microsoft announced Project Brainwave, which aims to run deep learning applications quickly and efficiently across its sprawling data centers.
Microsoft is taking a slightly different approach than Google. Instead of building a chip optimized for a specific set of algorithms, Microsoft is using chips called field-programmable gate arrays (FPGAs), which can be reprogrammed after manufacturing.
The FPGAs, built by Intel-owned Altera, give the company more flexibility than a dedicated chip, according to Microsoft engineer Doug Burger.
“We wanted to build something bigger, more disruptive and more general than a point solution,” Burger said in an interview at the Hot Chips conference, where Project Brainwave was announced.
Another differentiating factor is that Microsoft's system supports multiple deep learning frameworks, including Microsoft's CNTK, Google's TensorFlow and Facebook's Caffe2. This means the FPGAs will not only process data at high speed but will also be able to serve models built in any of these frameworks.
According to Burger, Microsoft has tweaked the FPGAs enough to make them competitive with, and sometimes better than, dedicated chips. Running a deep learning model built from gated recurrent units, Microsoft's hardware sustained nearly 40 teraflops (40 trillion floating-point operations per second) and completed each operation in under one millisecond.
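For readers unfamiliar with the model Burger mentions: a gated recurrent unit (GRU) updates a hidden state at each step using an "update" gate and a "reset" gate. The toy sketch below is not Microsoft's implementation; it uses scalar states and made-up weight values purely to show the arithmetic inside one GRU step.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_cell(x, h, W, U, b):
    """One GRU step with scalar input x and scalar state h (toy sizes).
    W, U, b hold the weights for the 'z' (update), 'r' (reset),
    and 'h' (candidate state) computations; values here are arbitrary."""
    z = sigmoid(W['z'] * x + U['z'] * h + b['z'])                 # update gate
    r = sigmoid(W['r'] * x + U['r'] * h + b['r'])                 # reset gate
    h_tilde = math.tanh(W['h'] * x + U['h'] * (r * h) + b['h'])   # candidate state
    return (1.0 - z) * h + z * h_tilde                            # blend old and new

# Illustration-only weights, then a short input sequence through the cell.
W = {'z': 0.5, 'r': 0.8, 'h': 1.0}
U = {'z': 0.1, 'r': 0.2, 'h': 0.3}
b = {'z': 0.0, 'r': 0.0, 'h': 0.0}
h = 0.0
for x in [1.0, -0.5, 0.25]:
    h = gru_cell(x, h, W, U, b)
print(h)
```

Because each new state is a gated blend of the previous state and a tanh-bounded candidate, the state always stays in (-1, 1); in production, the same recurrence runs over large weight matrices, which is the matrix arithmetic the FPGAs accelerate.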
For now, Project Brainwave is available only to Microsoft's internal AI services, but the company may offer the system to outside customers through its cloud services in the future.
“We’ve already deployed FPGAs at a massive scale,” Burger said. “So, if you think about it, the technology to run deep learning is already deployed worldwide with Azure.”