
Putting an end to the rumour mill about Microsoft working on its own custom chip, the Redmond-based tech giant has introduced its own AI chip at the annual Ignite Conference. The company unveiled two custom-designed chips and integrated systems - the Maia AI Accelerator, a competitor to Nvidia's AI graphics processing units, and the Cobalt CPU.
The Microsoft Azure Maia AI Accelerator is optimized for artificial intelligence (AI) tasks and generative AI, while the Microsoft Azure Cobalt CPU is an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Cloud.
According to Microsoft, the chips represent the final puzzle piece in the company's effort to deliver infrastructure systems – which include everything from silicon choices, software and servers to racks and cooling systems – that have been designed from top to bottom and can be optimized with internal and customer workloads in mind.
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our datacenters to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group. “At the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain and give customers infrastructure choice.”
To be rolled out to Microsoft's datacenters early next year, the chips will initially power the company's services such as Microsoft Copilot and Azure OpenAI Service. Microsoft says they will join an expanding range of products from industry partners to help meet the exploding demand for efficient, scalable and sustainable compute power and the needs of customers eager to take advantage of the latest cloud and AI breakthroughs.
Microsoft has been secretly (or not so secretly, as the rumour mill was abuzz with the news) developing the chips for years. At the company's Redmond campus, a lab full of machines has been meticulously testing the silicon.
It isn't the first time a tech giant has announced a bespoke chip. In 2016, Google announced its tensor processing unit for AI. Amazon Web Services joined the league in 2018 with its Graviton Arm-based chip and Inferentia AI processor, and in 2020 it also announced Trainium for training models.
Copyright©2025 Living Media India Limited. For reprint rights: Syndications Today