Microsoft's new lightweight AI model is as capable as GPT-3.5 despite being small enough to run on a smartphone

ChatGPT on a Google Pixel 7 Pro
Microsoft's new small AI model can compete with larger models that power tools like ChatGPT and Copilot. (Image credit: Ben Wilson | Windows Central)

What you need to know

  • Microsoft has a new lightweight AI model called Phi-3 Mini.
  • Phi-3 Mini is one of three smaller models that Microsoft will release, the other two being Phi-3 Small and Phi-3 Medium.
  • Microsoft trained Phi-3 Mini using a curriculum similar to how children learn from hearing stories.
  • Because there aren't enough children's stories to train an AI model, Microsoft had an LLM generate "children's books" to teach Phi-3 Mini.
  • Microsoft states that Phi-3 Mini is as capable as GPT-3.5 but comes in a much smaller form factor.

A new lightweight AI model is here from Microsoft, and it promises capabilities similar to GPT-3.5 in some areas despite being much smaller. Phi-3 Mini is trained on far less data than GPT-4 or other large language models (LLMs), yet it can outperform larger models such as Llama 2. Its small size also allows it to run on phones and laptops rather than requiring a connection to the web.

Microsoft shared details about Phi-3 in a research paper. The Verge then shared insight on the model and quotes from Microsoft.

Phi-3 Mini is a 3.8 billion parameter language model trained on 3.3 trillion tokens. The research paper explains that one of the keys to the model is its training dataset. Phi-3 Mini is a scaled-up version of Phi-2, which was released in December 2023.
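To see why a 3.8 billion parameter model can fit on a phone while larger LLMs cannot, a back-of-the-envelope memory estimate helps. The parameter count comes from Microsoft's paper; the bit widths below are common quantization levels and are illustrative assumptions, and the figures ignore activation memory and runtime overhead.

```python
# Rough memory footprint of a model's weights at different precisions.
# Only the 3.8B parameter count is from Microsoft's paper; the rest is
# an illustrative sketch, not a statement about any specific runtime.

def model_weight_size_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate size of the model weights alone, in decimal gigabytes."""
    return num_params * bits_per_param / 8 / 1e9

PHI_3_MINI_PARAMS = 3.8e9

for bits in (16, 8, 4):  # fp16, int8, int4
    size = model_weight_size_gb(PHI_3_MINI_PARAMS, bits)
    print(f"{bits}-bit weights: ~{size:.1f} GB")
```

At 4-bit precision the weights alone come to roughly 1.9 GB, which is why aggressively quantized small models are plausible on recent smartphones, while a model ten times the size would not be.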

According to Microsoft, Phi-3 Mini can compete with models 10 times its size.

Lightweight models aren't exclusive to Microsoft; Google, Anthropic, and Meta all have smaller models. One thing that stands out about Phi-3 Mini compared to other models is how it was trained. Microsoft used a "curriculum," Eric Boyd, Vice President of Microsoft Azure AI Platform, told The Verge. Microsoft was inspired by how children learn from hearing bedtime stories, according to the VP.

One limit on Phi-3 Mini's training was the number of children's stories available, so Microsoft had to create some. "There aren’t enough children’s books out there, so we took a list of more than 3,000 words and asked an LLM to make ‘children’s books’ to teach Phi," Boyd told The Verge.

A model like Phi-3 Mini is not meant to replace GPT-4 or other LLMs. Instead, small models can focus on specific tasks and use cases. They are also useful for companies that train on internal data.

Local AI

Microsoft Copilot

Some PCs will be able to run Microsoft Copilot locally rather than through the cloud. (Image credit: Windows Central | Jez Corden)

LLMs aren't going anywhere, but local AI is the next evolution of artificial intelligence. AI PCs will be able to run Microsoft Copilot locally to some extent, and organizations are working on ways to use AI without requiring a connection to the web. Models like Phi-3 Mini are small enough to run on phones, laptops, and other compact devices.

When Intel revealed its next-gen Lunar Lake CPUs, the company confirmed the chips will deliver 100 TOPS (trillion operations per second) of performance for AI tasks, with the NPU accounting for 45 of those TOPS. That figure is significant because Copilot requires at least 40 TOPS of NPU performance to run locally. Qualcomm's Snapdragon X Elite also has 45 TOPS of NPU performance, meaning that processor can power Copilot locally as well.
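The comparison above reduces to a simple threshold check: only the NPU's TOPS figure counts toward the reported 40 TOPS requirement, not the chip's total AI performance. A minimal sketch, using the numbers quoted in this article (the requirement and per-chip figures are as reported, not verified here):

```python
# Threshold check for local Copilot support, using the NPU TOPS figures
# quoted in the article. Values are as reported, not independently verified.
COPILOT_LOCAL_MIN_NPU_TOPS = 40

def can_run_copilot_locally(npu_tops: float) -> bool:
    """Only NPU throughput counts toward the requirement, not total chip TOPS."""
    return npu_tops >= COPILOT_LOCAL_MIN_NPU_TOPS

chips = {
    "Intel Lunar Lake (NPU)": 45,
    "Qualcomm Snapdragon X Elite (NPU)": 45,
}

for name, tops in chips.items():
    print(f"{name}: {can_run_copilot_locally(tops)}")
```

Note that Lunar Lake's 100 total TOPS is irrelevant to this check; a chip with 100 total TOPS but only 35 NPU TOPS would still fall short of the reported requirement.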

Tech giants raced to roll out LLMs and other AI models to the public, but we're just starting to see hardware that can take advantage of AI technology. Smaller models like Phi-3 Mini will play a role in specialized cases and on devices that don't meet the performance requirements to run Copilot and other AI tools locally.

Sean Endicott
News Writer and apps editor

Sean Endicott brings nearly a decade of experience covering Microsoft and Windows news to Windows Central. He joined our team in 2017 as an app reviewer and now heads up our day-to-day news coverage. If you have a news tip or an app to review, hit him up at sean.endicott@futurenet.com.