Microsoft launches Phi-3, its smallest AI model yet
Microsoft launched the next version of its lightweight AI model Phi-3 Mini, the first of three small models the company plans to release.
Phi-3 Mini measures 3.8 billion parameters and is trained on a data set that is small relative to those behind large language models like GPT-4. It is now available on Azure, Hugging Face, and Ollama. Microsoft plans to follow with Phi-3 Small (7 billion parameters) and Phi-3 Medium (14 billion parameters). Parameters are the internal values a model learns during training; a higher count generally means the model can handle more complex instructions.
The company released Phi-2 in December, which performed just as well as bigger models like Llama 2. Microsoft says Phi-3 performs better than its predecessor and can deliver responses close to those of a model 10 times its size.
Eric Boyd, corporate vice president of Microsoft Azure AI Platform, tells The Verge Phi-3 Mini is as capable as LLMs like GPT-3.5 “just in a smaller form factor.”
Compared to their larger counterparts, small AI models are often cheaper to run and perform better on personal devices like phones and laptops. The Information reported earlier this year that Microsoft was building a team focused specifically on lighter-weight AI models. Along with Phi, the company has also built Orca-Math, a model focused on solving math problems.
Boyd says developers trained Phi-3 with a "curriculum." They were inspired by how children learn from bedtime stories — books that use simpler words and sentence structures to talk about larger topics.
“There aren’t enough children’s books out there, so we took a list of more than 3,000 words and asked an LLM to make ‘children’s books’ to teach Phi,” Boyd says.
He added that Phi-3 simply built on what previous iterations learned. While Phi-1 focused on coding and Phi-2 began to learn to reason, Phi-3 is better at both coding and reasoning. The Phi-3 family of models has some general knowledge, but it cannot match GPT-4 or another LLM in breadth — there's a big difference between the answers you can get from an LLM trained on the entirety of the internet and those from a smaller model like Phi-3.
Boyd says that companies often find smaller models like Phi-3 work better for their custom applications, since many companies' internal data sets are relatively small anyway. And because these models use less computing power, they are often far more affordable.