Microsoft is reported to be working with AMD to develop AI-capable chips and provide an alternative to Nvidia. Credit: Martyn Williams/IDG

Microsoft is reportedly partnering with AMD to help the chip maker develop advanced processors that support AI workloads.

Microsoft’s increased demand for chips that can support AI applications stems from the number of AI-based products and services it has recently released in collaboration with OpenAI, the creator of ChatGPT. As a result, Microsoft has decided to collaborate with AMD to provide an alternative to Nvidia, which dominates the market for graphics processing units (GPUs) used for AI applications, according to a Bloomberg report.

Microsoft has made significant investments in AI in recent months, most notably through its reported multibillion-dollar investment in OpenAI in January. More recently, the company has integrated its GPT-based Copilot assistant with SharePoint, Viva and Microsoft 365. Microsoft has also announced several updates to GitHub Copilot under the Copilot X program and is expected to open up its ChatGPT-powered Bing Chat to the public soon.

Separately, Microsoft is already said to be developing its own processor for AI workloads under the aegis of Project Athena, according to a report from The Information. Microsoft already has several hundred employees working on the Athena project and, to date, has spent approximately $2 billion on its chip efforts, according to the report.

The recent interest in generative AI requires significant increases in compute performance, AMD CEO Lisa Su said on a call with analysts after the company reported its first-quarter 2023 financial results earlier this week. She added that the company is “very well positioned” to capitalize on this increased demand, thanks to its large portfolio of high-performance compute engines and expanding software capabilities.

“We are very excited about our opportunity in AI. This is our number-one strategic priority,” Su said.
Microsoft’s efforts to develop processors for AI are in line with the strategy of rival hyperscalers, with Amazon Web Services and Google already developing their own chips for AI workloads. However, for many of these companies Nvidia remains the top supplier of chips for generative AI projects, supporting AWS, Google Cloud and even Elon Musk’s new AI business, according to a separate Bloomberg report.

Neither AMD nor Microsoft had immediate comment on the reports.