Summary
- Microsoft’s new Phi-3 Mini AI model offers capabilities comparable to GPT-3.5 in a much smaller package, making it well suited to handheld devices.
- With 3.8 billion parameters, Phi-3 Mini is part of a series of lightweight AI models Microsoft is debuting for better portability and performance.
- The AI model can run locally on low-power hardware, saving energy and costs compared to cloud-based processing.
Every big-name tech company wants its slice of the AI pie these days. Google has Gemini, OpenAI has ChatGPT, and Samsung has its Galaxy AI — but that’s just the tip of the iceberg. Microsoft was early on the generative AI scene, providing vital early funding for OpenAI’s endeavors, and now it’s starting to reap the benefits of its foresight. After debuting its Copilot Android app powered by ChatGPT, the company is taking the wraps off a much smaller AI model that could also find a home on phones.
The Verge reports that Microsoft has launched the first of three lightweight AI models, Phi-3 Mini. The model features 3.8 billion parameters and is currently available on Ollama, Hugging Face, and Azure. The company plans to release two more models in the series, Phi-3 Small (7 billion parameters) and Phi-3 Medium (14 billion parameters). The trade-off is that more parameters let a model handle more complex instructions, while fewer parameters mean better portability and efficiency. For comparison, GPT-4 is reported to have over a trillion parameters, and Meta’s small-scale Llama 3 model comes with 8 billion.
Small but mighty
According to Eric Boyd, VP of Microsoft’s Azure AI Platform, Phi-3 Mini is as capable as models such as GPT-3.5, “just in a smaller form factor.” As a compact AI model, it should be cheaper to run and should perform better on devices such as laptops and phones. To train it, Microsoft took inspiration from how children learn: it had another large language model generate children’s-book-style lessons from a list of more than 3,000 words. Phi-3 also builds on its predecessors, improving on the coding and reasoning skills developed in earlier Phi models.
Microsoft hasn’t drawn the comparison itself, but Phi-3 Mini could be an answer to Meta’s Llama 3 8B, which is catching up with GPT-4 in some areas. The more powerful an AI model is, the more compute and energy it consumes. The fact that Phi-3 Mini is smaller than competing models is therefore an advantage: it can run locally on low-power hardware rather than offloading its computing to expensive cloud-based processing centers.