Blog - Channel Partner
What powers the infrastructure in the era of AI?

For most of our partners, and especially their customers, the evolution of Microsoft innovations and ever-changing technology has been a driving force of productivity. Lately, this has been shaped by exponential growth in the sophistication of large language models such as OpenAI’s GPT series, with their hundreds of billions of parameters, and by groundbreaking generative AI services such as Bing Chat Enterprise and Microsoft Copilot, used by millions of people globally.
Microsoft Azure’s infrastructure is growing fast, and Microsoft is adding new tools for customers who want to create hybrid, cloud-native, or open-source solutions. In this new era of AI, Microsoft is helping you bring your ideas to production safely and responsibly, with enterprise security in mind.
At this year’s Microsoft Ignite, Microsoft announced further Azure infrastructure innovations that power a growing set of AI capabilities, helping our partners and their enterprise customers manage and operate their cloud environments.
New silicon redefining cloud infrastructure
Microsoft has introduced the Azure Maia series, a new family of custom AI accelerators built to handle cloud-based AI workloads such as OpenAI models, Bing, GitHub Copilot, and ChatGPT. Alongside it comes Azure Cobalt, Azure’s first custom in-house central processing unit (CPU) series. Azure Cobalt is based on Arm architecture and delivers optimal performance per watt for common cloud workloads in the Microsoft Cloud. Microsoft also continues its close collaboration with NVIDIA, offering virtual machines built on the NVIDIA H100 Tensor Core graphics processing unit (GPU).
Azure’s collaboration with AMD will enable our customers to use AI-optimized VMs running on AMD’s new MI300 accelerator starting early next year. This reflects our dedication to giving customers more choice in price, performance, and power for their diverse business needs.
Maia 100 Series
Maia 100, the first generation in the series, packs 105 billion transistors and is one of the largest chips built on 5 nm process technology. It combines innovations in silicon with co-designed software, networking, racks, and cooling. Together, these give the Azure AI infrastructure end-to-end system optimization, so the whole stack is tuned to the requirements of cutting-edge AI workloads such as OpenAI models, Bing, GitHub Copilot, and ChatGPT.
Azure Cobalt
Cobalt 100 is built on Arm architecture for optimal performance per watt, powering common cloud workloads across the Microsoft Cloud. From in-house silicon to complete systems, Microsoft now optimizes and innovates at every layer of the infrastructure stack, enabling faster networking and storage in the cloud. You can now achieve up to 12.5 GB/s of throughput and 650K input/output operations per second (IOPS) in remote storage performance for data-intensive workloads, and up to 200 Gbps of networking bandwidth for network-intensive workloads.
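As an illustration of how a customer could adopt Arm-based Azure VMs today, the sketch below uses the Azure CLI to provision a VM on an existing Arm-based size (the general-purpose Dpsv5 series). The resource group, VM name, and region are placeholder assumptions, and Cobalt-based sizes may carry different SKU names once they are generally available, so treat this as a starting point rather than a definitive recipe.

```shell
# Create a resource group in a region that offers Arm-based VM sizes
# (group name and region below are illustrative placeholders).
az group create --name contoso-arm-rg --location eastus

# Launch an Arm64 Ubuntu 22.04 VM on an existing Arm-based size
# (Dpsv5 series); swap in a Cobalt-based size once those SKUs
# become available in your subscription.
az vm create \
  --resource-group contoso-arm-rg \
  --name contoso-arm-vm \
  --image Canonical:0001-com-ubuntu-server-jammy:22_04-lts-arm64:latest \
  --size Standard_D4ps_v5 \
  --generate-ssh-keys
```

Running these commands requires an authenticated Azure CLI session (`az login`) and a subscription with quota for Arm-based sizes in the chosen region.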
The end goal is an Azure hardware system with maximum flexibility, one that can be optimized for power, performance, sustainability, or cost. If this innovation speaks to your customers’ needs, please contact your Surestep Ambassador team to explore more innovative and secure ways to provide your customers with new silicon-powered, cloud-based solutions.