Microsoft brings Maia 200 AI accelerator to Azure cloud
Microsoft says new custom chip delivers 30% better performance per dollar than current systems
Microsoft has added its Maia 200 artificial intelligence accelerator to the Azure cloud platform, expanding its in-house hardware offering for advanced AI workloads.
Chief executive Satya Nadella announced the update in a post, stating: “Our newest AI accelerator Maia 200 is now online in Azure.” He said the chip delivers “30% better performance per dollar than current systems.”
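To put that figure in concrete terms, a 30% improvement in performance per dollar means the same workload should cost roughly 23% less to run. The sketch below illustrates the arithmetic only; the dollar amounts are hypothetical and are not Azure pricing.

```python
# Illustrative arithmetic only: what a "30% better performance per dollar"
# claim implies for the cost of a fixed workload. Dollar values are made up.

def cost_after_perf_per_dollar_gain(baseline_cost: float, improvement: float) -> float:
    """Cost to run the same workload after perf-per-dollar improves by `improvement`."""
    return baseline_cost / (1.0 + improvement)

baseline = 100.0  # hypothetical baseline spend, in dollars
new_cost = cost_after_perf_per_dollar_gain(baseline, 0.30)
print(f"${new_cost:.2f}")  # ~$76.92, i.e. roughly 23% less spend for the same work
```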
According to Microsoft, the Maia 200 provides more than 10 petaflops of FP4 throughput, approximately 5 petaflops at FP8 precision, and includes 216GB of HBM3e high-bandwidth memory with 7 terabytes per second of memory bandwidth.
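For context, a back-of-the-envelope sketch using only the figures quoted above shows the compute-to-bandwidth ratio those numbers imply and a rough ceiling on model size that could sit in HBM. These are illustrative calculations, not measured results; real throughput depends on the workload, precision, and software stack.

```python
# Back-of-the-envelope figures derived only from Microsoft's quoted specs.

FP4_PFLOPS = 10.0   # >10 petaflops of FP4 throughput, per Microsoft
FP8_PFLOPS = 5.0    # ~5 petaflops at FP8 precision
HBM_GB = 216        # HBM3e capacity in gigabytes
HBM_TB_S = 7.0      # memory bandwidth in terabytes per second

# Compute-to-bandwidth ratio: FLOPs available per byte moved from HBM.
fp8_flops_per_byte = (FP8_PFLOPS * 1e15) / (HBM_TB_S * 1e12)  # ~714
fp4_flops_per_byte = (FP4_PFLOPS * 1e15) / (HBM_TB_S * 1e12)  # ~1429

# Rough upper bound on model weights resident in HBM at 1 byte per
# parameter (FP8), ignoring activations, KV cache and framework overhead.
max_params_billion = HBM_GB  # ~216 billion parameters

print(fp8_flops_per_byte, fp4_flops_per_byte, max_params_billion)
```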
The Maia 200 joins Microsoft’s existing portfolio of central processing units, graphics processors and custom accelerators, offering customers more hardware options to deploy large-scale AI models efficiently and cost-effectively on Azure.
The Recap
- Microsoft has added the Maia 200 AI accelerator to its Azure platform.
- The company says it delivers 30% better performance per dollar than current systems.
- Maia 200 is now online and available in Azure.