Small language models are designed to perform well for simpler tasks, are more accessible and easier to use for organisations with limited resources, and can be more easily fine-tuned to meet specific needs.
Continuing its AI spree, Microsoft has unveiled a new language model called Phi-3. According to Microsoft, small language models are designed to perform well for simpler tasks, are more accessible and easier to use for organisations with limited resources, and can be more easily fine-tuned to meet specific needs. They are also cost-effective and can even be used on smartphones.
“What we’re going to start to see is not a shift from large to small, but a shift from a singular category of models to a portfolio of models where customers get the ability to make a decision on what is the best model for their scenario,” said Sonali Yadav, principal product manager for Generative AI at Microsoft.
According to Microsoft, choosing the right language model depends on an organisation's specific needs, the complexity of the task and the available resources. Small language models, as per Microsoft, are well suited for organisations looking to build applications that can run locally on a device (as opposed to in the cloud), and where a task does not require extensive reasoning or a quick response is needed. In other words, rather than spending a lot of money running LLMs in the cloud, models like Phi-3 can run locally.
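As a rough illustration of what "running locally" can look like, here is a minimal sketch using the open-source Hugging Face transformers library. The model identifier, precision and generation settings are assumptions made for the example rather than a setup prescribed by Microsoft; the point is simply that inference happens on the machine loading the weights, with no cloud endpoint involved.

```python
# Illustrative sketch: running a small language model on local hardware with
# Hugging Face transformers. The model id below is an assumption about how a
# Phi-3 checkpoint might be published; swap in whatever checkpoint is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to fit consumer hardware
    device_map="auto",           # local CPU or GPU; requires `accelerate`
    trust_remote_code=True,      # may be needed on older transformers versions
)

prompt = "Summarise the benefits of running a language model on-device."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Everything below runs on the local machine; no data leaves the device.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The trade-off Microsoft is pointing to is that a model small enough to load this way gives up some reasoning depth in exchange for lower cost, lower latency and the option of working offline.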
Microsoft says that because models like Phi-3 can work offline, more people will be able to use AI in ways that weren't possible with LLMs. Microsoft cited the example of a farmer inspecting crops who finds signs of disease on a leaf or branch. Using an SLM with visual capability, the farmer could take a picture of the affected crop and get immediate recommendations on how to treat the pest or disease.
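To make the crop-inspection scenario concrete, the sketch below shows a vision-capable model answering a question about a photo on-device. The pipeline task, the placeholder model and the file name are illustrative assumptions, not a documented Phi-3 workflow; a vision-capable SLM would slot in where the placeholder model sits.

```python
# Illustrative sketch of the crop-inspection scenario: a local vision model
# answers a question about a field photo. Model name and image path are
# placeholders chosen for the example.
from PIL import Image
from transformers import pipeline

vqa = pipeline(
    "visual-question-answering",
    model="dandelin/vilt-b32-finetuned-vqa",  # placeholder vision model
)

leaf_photo = Image.open("diseased_leaf.jpg")   # hypothetical photo taken in the field
question = "What disease or pest damage is visible on this leaf?"

# Inference happens locally, which matters where connectivity is limited,
# as in the farming example Microsoft describes.
for result in vqa(image=leaf_photo, question=question):
    print(result["answer"], result["score"])
```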
Interestingly, Microsoft’s unveiling of Phi-3 comes as Apple is rumoured to be working on a small language model of its own to enable AI tasks directly on iPhones.