Microsoft is making artificial intelligence (AI) more accessible with the launch of Phi-3-mini, a lightweight and budget-friendly model. It is the first of several small language models (SLMs) Microsoft has planned as it aims to broaden its customer base.
“Phi-3-mini offers a dramatic cost reduction,” said Sébastien Bubeck, a Microsoft research VP. “It’s ten times cheaper than similar models on the market.”
Phi-3-mini boasts 3.8 billion parameters, making it smaller than AI giants like GPT-4. It will be available on popular platforms like Azure, Hugging Face, and Ollama. Microsoft also plans to release Phi-3 Small and Phi-3 Medium in the future, offering a range of options with varying capabilities and costs.
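For readers who want to try the model, here is a minimal sketch of loading Phi-3-mini with the Hugging Face transformers library. The model ID "microsoft/Phi-3-mini-4k-instruct", the trust_remote_code flag, and the generation settings are assumptions for illustration, not details from the announcement.

```python
# Minimal sketch: running Phi-3-mini via Hugging Face transformers.
# Model ID and settings below are assumptions for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # place weights on GPU if available, otherwise CPU
    trust_remote_code=True,  # early Phi-3 releases shipped custom model code
)

# Build a chat-style prompt and generate a short reply.
messages = [
    {"role": "user", "content": "Explain what a small language model is in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same model can also be pulled and run locally through Ollama (for example, with a command along the lines of `ollama run phi3`, though the exact model name there is likewise an assumption).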
Parameters are the internal numerical values a model learns during training; broadly, more parameters let a model handle more complex tasks, but at a higher compute cost. With only 3.8 billion parameters, Phi-3-mini is cheaper to run and light enough to perform well on personal devices, which suits users who do not need the power of a larger model.
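As a rough back-of-the-envelope illustration of why the parameter count matters for personal devices (the bytes-per-parameter figures below are common precision assumptions, not numbers from Microsoft):

```python
# Rough weight-memory estimate for a 3.8-billion-parameter model at two
# common precisions; the bytes-per-parameter values are assumptions.
params = 3.8e9
for precision, bytes_per_param in [("fp16", 2.0), ("int4", 0.5)]:
    gib = params * bytes_per_param / 2**30
    print(f"{precision}: ~{gib:.1f} GiB of weights")
# fp16: ~7.1 GiB, int4: ~1.8 GiB of weights, small enough for many laptops.
```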
“We’re moving towards a future with a variety of AI models,” said Sonali Yadav, a Microsoft product manager. “Customers can choose the best model for their specific needs, considering factors like cost and performance.”
While large language models excel at complex tasks like data analysis and reasoning, smaller models offer advantages of their own: they are cheaper to run and can be deployed where data must stay on-premises, such as in regulated industries.
Microsoft’s Phi-3-mini marks a significant step towards making AI technology more affordable and accessible to a wider range of users. This not only benefits businesses of all sizes but also opens doors for further innovation in the field of artificial intelligence.