
4 Takeaways From Microsoft’s Build Developer Conference Keynote


Microsoft (MSFT) CEO Satya Nadella took the stage Tuesday to give a keynote address at Microsoft Build, the company’s annual developer conference, where he unveiled several new artificial intelligence (AI) initiatives. The updates came just a day after Microsoft announced new Copilot+ PCs, and a week after Alphabet’s (GOOGL) Google I/O developer conference. Here are the key takeaways.

Introducing Team Copilot, Copilot Extensions, and More

Nadella announced Team Copilot, an expansion of Microsoft’s Copilot chatbot that extends the AI assistant from individual users to teams of Microsoft 365 users. The new Team Copilot can manage meeting agendas, take notes, and use context to answer questions. Team Copilot is set to be available in preview later this year for some Microsoft 365 customers with Copilot.

The CEO also introduced new agent capabilities in Microsoft Copilot Studio to help developers build their own copilots. The new agent features are available to some customers through Microsoft’s Early Access Program and are set to become more broadly available in preview later this year.

Nadella also unveiled Copilot Extensions, which allow Copilot users to connect the assistant to new data sources and applications to expand its capabilities.

The company introduced the first set of Copilot Extensions for GitHub, Microsoft’s developer platform. Microsoft said developers can further customize their GitHub experience using GitHub Copilot Extensions, which are available in private preview.

Expanding AI Partnerships With Hugging Face and Others

Nadella highlighted Microsoft’s collaborations with companies working to advance AI technology, pointing to its ongoing partnership with Nvidia (NVDA) just ahead of the chipmaker’s highly anticipated earnings report Wednesday. Nadella also called out arguably Microsoft’s most important AI-era partner, ChatGPT maker OpenAI, which recently launched GPT-4o.

The CEO announced that Microsoft has extended its partnership with Hugging Face to bring Hugging Face’s models to Azure AI Studio, along with a new partnership with AI startup Cognition to power its software-development AI agent on Azure.

The company also announced a partnership with Khan Academy focused on using AI to power educational materials. Microsoft is donating Azure infrastructure access, and Khan Academy is working to make its AI teaching assistant, Khanmigo, free for teachers.

New Small Language Model and Real-Time Intelligence

Microsoft announced Phi-3-vision, a new multimodal model with language and vision capabilities. The new model is part of Microsoft’s Phi-3 family of small language models (SLMs).

The company introduced the Phi-3 models as part of its push into SLMs, which are less costly than large language models (LLMs) and are targeted at more specific use cases. The Phi-3 SLMs are available on Azure AI and Hugging Face.

Nadella also announced Real-Time Intelligence in Microsoft Fabric, the company’s AI-powered analytics platform. The feature, now in preview, is designed to support in-the-moment decision-making for organizations.

Updates on Microsoft’s Custom Chip and AI Infrastructure Offerings

The company announced a preview of Cobalt 100, the custom Arm-based (ARM) processor it first unveiled in November. Nadella highlighted the chip’s capabilities for running general-purpose and cloud-native workloads, saying the new custom silicon provides a 40% improvement in performance.

Microsoft also said it is “the first cloud provider to bring AMD’s leading MI300X AI accelerator chip to power customers’ AI training and inferencing needs” through the Azure ND MI300X v5 series. The company said the chip is part of its holistic infrastructure approach to AI.

