Microsoft's Maia AI chip, specifically the newly launched Maia 200, is a custom-designed accelerator for running large-scale AI inference in Microsoft's Azure cloud. It aims to offer better performance, efficiency, and cost-effectiveness than off-the-shelf alternatives, powering services such as Microsoft 365 Copilot and OpenAI models on Microsoft's own infrastructure. The chip is built on TSMC's 3nm process, optimized for low-precision AI workloads (FP4/FP8), integrates high-bandwidth memory, and uses liquid cooling for efficiency, marking a significant step in Microsoft's long-term vertical integration for AI.
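Low-precision formats like FP8 trade mantissa bits for memory bandwidth and throughput, which is why inference-focused chips target them. As a rough, hypothetical illustration of what FP8 (E4M3-style) rounding does to values, here is a minimal pure-Python sketch; it approximates a 3-bit mantissa with exponent clamping and saturation at 448, and ignores subnormals and NaN encoding. It models nothing about Maia's actual hardware behavior.

```python
import math

def quantize_e4m3(x: float) -> float:
    """Round x to an approximate FP8 E4M3 grid (illustrative sketch only:
    3 mantissa bits, clamped exponent, max magnitude 448; subnormals and
    NaN encoding are ignored)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(abs(x))           # abs(x) == m * 2**e, with m in [0.5, 1)
    m = round(m * 16) / 16              # keep 3 explicit mantissa bits
    e = max(min(e, 9), -5)              # clamp to roughly E4M3's normal exponent range
    val = min(math.ldexp(m, e), 448.0)  # saturate at E4M3's max finite value
    return math.copysign(val, x)

print(quantize_e4m3(3.14159))  # 3.25, the nearby representable value
```

The point of the sketch is the coarseness of the grid: pi rounds to 3.25, and anything above 448 saturates, which is why FP8 inference typically pairs the format with per-tensor or per-channel scaling factors.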
Sources:
- Maia 200: The AI accelerator built for inference - The Official Microsoft Blog
- Microsoft's Maia AI chip ambitions might include an exclusive SK Hynix HBM3e memory deal | TechSpot