According to The Information, Microsoft Corp is developing its own artificial intelligence (AI) chip, code-named “Athena,” to power the technology behind AI chatbots like ChatGPT. The report notes that one of the primary reasons for Microsoft’s foray into chip-making is to reduce costs: compared with buying GPUs from Nvidia, the leading supplier of AI accelerators, Athena could reportedly cut costs by about a third.
Athena: Microsoft’s Answer to AI Chip Development
The Athena project has been underway since 2019, and the chips are currently being tested by a small group of Microsoft and OpenAI employees. However, it remains unclear if and when Microsoft plans to launch the chip commercially.
Microsoft’s development of an AI chip aligns with a broader industry trend: companies like Google, Apple, and Amazon have already built their own AI silicon. Google trains its models on supercomputers built around its Tensor Processing Units (TPUs), while Amazon developed its Trainium and Inferentia chips for training and inference, respectively. Against that backdrop, Microsoft’s entry into the AI chip market is not surprising.
Also Read: Google VS Microsoft: The Battle of AI Innovation
Addressing the Costs of AI Training and Deployment
Training large language models (LLMs) like GPT-3 is expensive: the cost of training GPT-3 has been estimated at around USD 4 million, and OpenAI reportedly spends roughly USD 3 million per month to keep ChatGPT running. The GPUs used to run these models also carry a hefty price tag. Nvidia’s flagship data center GPUs sell for around USD 10,000 each, while its newer H100 GPUs have been listed at about USD 40,000 on eBay. Microsoft’s supercomputer reportedly uses tens of thousands of Nvidia’s H100 and A100 data center GPUs.
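For a rough sense of the scale these prices imply, here is a minimal back-of-envelope sketch using only the per-GPU figures cited above. The GPU counts are hypothetical stand-ins for “tens of thousands,” since exact numbers are not public.

```python
# Back-of-envelope GPU hardware cost, using the approximate prices cited above.
A100_PRICE_USD = 10_000   # approximate price of an A100-class data center GPU
H100_PRICE_USD = 40_000   # approximate H100 price seen in eBay listings

def hardware_cost_usd(num_a100: int, num_h100: int) -> int:
    """Approximate total GPU hardware cost for a given (hypothetical) fleet mix."""
    return num_a100 * A100_PRICE_USD + num_h100 * H100_PRICE_USD

# Hypothetical fleet: 20,000 A100s and 5,000 H100s ("tens of thousands" of GPUs)
print(f"${hardware_cost_usd(20_000, 5_000):,}")  # -> $400,000,000
```

Even under these assumed counts, the hardware alone runs into the hundreds of millions of dollars, which helps explain why shaving costs per chip matters to Microsoft.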
Reducing Reliance on Nvidia and Customizing Chip Design
By building chips in-house, Microsoft aims to reduce its reliance on Nvidia. It would also let the company tailor the chips’ design, architecture, and software compatibility to its own needs. Microsoft has reportedly designed Athena to both train and run its artificial intelligence models.
Given Microsoft’s plans to introduce AI-powered features in Bing, Office 365, and GitHub, a transition to in-house AI chips could deliver meaningful cost savings for the company.
Also Read: Elevate Your Workflow: Microsoft’s AI Copilot Boosts Office, GitHub, Bing & Cybersecurity
Accelerating AI Development in the Wake of ChatGPT’s Success
The Information reported that Microsoft has accelerated its AI chip rollout following the success of ChatGPT. The company, an early backer of OpenAI, the maker of ChatGPT, launched its own AI-powered search engine, Bing AI, earlier this year. By capitalizing on its partnership with OpenAI, Microsoft aims to take market share from Google.
Also Read: Microsoft Releases VisualGPT: Combines Language and Visuals
Our Say
Microsoft’s development of its own AI chip, Athena, signals the company’s commitment to staying competitive in the AI market. By tackling the high costs of training and deploying AI models, Microsoft positions itself to build on the success of AI-powered chatbots like ChatGPT and gain an edge in the industry.