
Microsoft Unveils Next-Generation AI Chip and General-Purpose Arm Chip

Microsoft made a groundbreaking announcement at its Ignite conference in Seattle, revealing two new chips that are set to redefine the landscape of artificial intelligence (AI) and general computing tasks. The company introduced the Maia 100 AI chip, a powerful contender to Nvidia’s highly sought-after AI graphics processing units. Additionally, Microsoft unveiled the Cobalt 100 Arm chip, designed for general computing tasks and capable of rivaling Intel processors.

As deep-pocketed companies continue to give customers more options for cloud infrastructure, competition in the market is intensifying. Alibaba, Amazon, and Google have led this trend for years. Microsoft, which held approximately $144 billion in cash as of the end of October, captured an estimated 21.5% share of the cloud market in 2022, making it the second-largest player after Amazon.

Virtual machine instances powered by Cobalt chips will be commercially available through Microsoft’s Azure cloud platform in 2024, according to Rani Borkar, Corporate Vice President at Microsoft. However, no specific timeline was provided for the deployment of the Maia 100 chip.

By introducing their own AI chips, cloud service providers can help meet the high demand for AI computing at a time when graphics processors are in short supply. Unlike Nvidia or AMD, Microsoft and its competitors in cloud computing do not plan to sell servers equipped with their chips directly to companies. The company developed its AI computing chip based on customer feedback, Borkar explained.

Microsoft is currently testing the performance of the Maia 100 chip in various applications, including Bing’s AI-powered chatbot, the GitHub Copilot coding assistant, and GPT-3.5-Turbo, a large language model from its partner OpenAI. OpenAI has trained its language models on vast amounts of internet data, enabling them to generate emails, summarize documents, and answer questions based on concise human instructions.

The GPT-3.5-Turbo model powers OpenAI’s ChatGPT assistant, which has gained immense popularity since its launch last year. Other companies swiftly followed suit, incorporating similar conversational capabilities into their software, which in turn spurred demand for graphics processors.
