
Microsoft Sends Graphcore AI Accelerator Chips To Azure

Graphcore Card

Microsoft has taken another big step in its race to catch rivals Amazon and Google in the AI space: it has brought Graphcore AI accelerator chips to the Azure cloud.

This marks the first time a large-scale cloud vendor has made these chips available to the public.

These new Intelligence Processing Units (IPUs), custom designed by the British startup for the age of AI, are built to push artificial intelligence applications to greater heights.

Graphcore, founded in 2016 in Bristol, UK, has attracted considerable attention among AI researchers on the promise that its chips accelerate the computations required to make artificial intelligence work. And early tests have been encouraging.

The chips either matched or exceeded the performance of the top AI chips from NVIDIA and Google using algorithms written for these rival platforms.

Code written specifically for Graphcore hardware may be even more efficient.

Graphcore Logo

As cofounder Nigel Toon said:

“Microsoft and Graphcore have been collaborating closely for over two years. Over this period, the Microsoft team, led by Marc Tremblay, distinguished engineer, has been developing systems for Azure and has been enhancing advanced machine vision and natural language processing models on IPUs. We have been working extensively with a number of leading early-access customers and partners for some time to ensure that [these products are] ready for general release.”

In terms of hardware, we have the C2 card, which features two crosslinked Colossus IPUs, each packing 1,216 cores and 23.6 billion transistors. The cores in a single IPU can each hit over 100 GFLOPS and, paired with 300MB of in-processor memory, can run up to 10,000 programs executing in parallel.
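To put the per-core figure in perspective, here is a back-of-the-envelope calculation, assuming (as the quoted specs suggest, though peak figures rarely hold in practice) that every one of the 1,216 cores sustains the 100 GFLOPS per-core rate:

```python
# Rough aggregate throughput from the figures quoted above.
# Assumption: every core sustains the quoted per-core peak simultaneously.
cores_per_ipu = 1216
gflops_per_core = 100

ipu_tflops = cores_per_ipu * gflops_per_core / 1000  # GFLOPS -> TFLOPS
c2_card_tflops = ipu_tflops * 2                      # the C2 card carries two IPUs

print(f"~{ipu_tflops:.0f} TFLOPS per IPU, ~{c2_card_tflops:.0f} TFLOPS per C2 card")
```

That works out to roughly 120 TFLOPS per IPU, or about twice that per card, which is the ballpark Graphcore has used when comparing the C2 against GPU-based accelerators.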

The C2 is designed to work with Poplar, Graphcore's graph toolchain built for AI and machine learning workloads.

Graphcore Chip

It's worth mentioning here that Microsoft poured its own money into Graphcore last December, as part of a $200 million funding round.

And it has now become the first cloud provider to launch this new hardware, custom designed to help machines recognize faces, understand speech, parse language, drive cars, and train robots: the kinds of tasks a modern AI system has to deal with.

Written by Fahad Ali

Fahad Ali is a professional freelancer, specializing in technology, web design and development and enterprise applications. He is the primary contributor to this website. When he is not typing away on his keyboard, he is relaxing to some soft jazz.

