
In the vast expanse of AI innovation, Microsoft has long been a pacesetter among its rivals. From embracing OpenAI and unveiling its own chatbot tools to fostering a culture of innovation within the organization, Microsoft seems to be tapping every new opportunity in the world of AI. But what is the latest journey the tech giant has just embarked on? In this article, we discuss Microsoft's most significant recent revelations in the artificial intelligence space and everything about its latest launch.

Looking Back at Microsoft AI Achievements

The world's first mainstream encounter with generative AI came just over a year ago. Since then, everything around us has changed significantly. The way we work, travel, study, interact and entertain ourselves has been hugely impacted by the AI revolution. Not to forget our relationship with technology, which has also been redefined by AI-powered solutions.

Microsoft Launches Phi-3, Its Smallest AI Model

As a pioneer in AI, Microsoft has tapped new opportunities and experimented with new perspectives and imperatives. With the release of hundreds of upgrades and AI-powered solutions over the past year, the whole Microsoft experience and stack is now AI-powered. This starts with Microsoft 365, a suite of productivity products used daily by hundreds of millions of people globally.

Microsoft has also introduced toolchains to help organizations harness role-based solutions. Its Copilot Studio is a low-code tool that enables enterprises to develop standalone copilots as well as tailor-made copilots for the Microsoft 365 suite.

Microsoft Azure, meanwhile, offers the Azure OpenAI Service, which allows developers to build robust and innovative AI-based solutions without hassle.
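
For a sense of what building on this looks like in practice, here is a minimal sketch of a chat completion request against the Azure OpenAI Service using the official openai Python package; the endpoint, key, API version and deployment name shown are placeholders for illustration, not values from Microsoft.

```python
# A minimal sketch of a chat completion request against the Azure OpenAI
# Service using the official `openai` Python package. The endpoint, API key,
# API version and deployment name below are placeholders, not real values.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com/",  # placeholder
    api_key="YOUR-API-KEY",                                    # placeholder
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="your-chat-deployment",  # name of your model deployment (placeholder)
    messages=[
        {"role": "user", "content": "Suggest three ways a small business could use AI."}
    ],
)
print(response.choices[0].message.content)
```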

With the help of these tools, Microsoft pursues its goal of empowering enterprises, developers and users. The speed, cost-effectiveness and efficiency that come with leveraging AI to build customized solutions help organizations and tech enthusiasts focus on their specific needs and unlock new domains of creativity and innovation.

Tiny But Mighty – Microsoft's Smallest AI Model, Phi-3

Given the billions of dollars the tech giant has invested in AI so far, users were expecting ever more brilliant and powerful artificial intelligence models from Microsoft. The company, however, thinks that a smaller AI model is not necessarily a bad idea.

Unless you have been living under a rock, chances are you are aware that Microsoft unveiled Phi-3, a diminutive, lightweight AI model, on Tuesday, April 23, 2024. Phi-3-mini is the first model in its series of small language models (SLMs), and the company plans to release further Phi-3 versions in the near future.


Microsoft says that Phi-3 performs efficiently with outsized potential, proving that greatness can indeed come in the smallest of packages. It is high-performing and cost-effective – a perfect package for companies with limited resources.

But where did the inspiration come from?

Sometimes highly complex problems are solved by drawing inspiration from ordinary, everyday moments. Microsoft machine learning expert Ronen Eldan was reading bedtime stories with his daughter when he found himself wondering how she had learned these words and how she knew to connect them. Thinking along these lines, he wondered how much an AI model could learn using only words a four-year-old could understand. That question prompted a novel training approach that has produced a new class of smaller, more capable language models, which should make AI accessible to a much wider audience.

Large language models (LLMs) let you get creative with AI and can accomplish complex tasks. But leveraging LLMs means you need large computing resources to run them, along with a significant jump in cost.

Features of Microsoft Phi-3

  • Phi-3-mini has 3.8 billion parameters (the variables a model learns during training)
  • Requires less compute power and energy to deliver good results
  • Well suited to smartphones and other lower-powered devices
  • Cost-effective small language model
  • Benchmarked on language, coding and math capabilities
  • Highly responsive
  • Around 10x cheaper than comparable current AI models
  • Can work offline, making it accessible to more users (see the sketch after this list)
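
To illustrate the offline, on-device angle, here is a minimal sketch that queries a Phi-3 model locally through the Ollama Python client; it assumes the Ollama runtime is installed and a Phi-3 model has already been pulled, and the phi3 model tag is an assumption that may differ on your setup.

```python
# A minimal local-inference sketch using the Ollama Python client. It assumes
# the Ollama runtime is installed and running, and that a Phi-3 model has
# already been pulled (e.g. `ollama pull phi3`); the "phi3" tag is an
# assumption and may differ in your local catalog.
import ollama

response = ollama.chat(
    model="phi3",
    messages=[
        {"role": "user", "content": "Explain in two sentences what a small language model is."}
    ],
)
print(response["message"]["content"])
```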

Applications of Microsoft Phi-3

  • Microsoft claims that Phi-3 can summarize a lengthy document into its main points. It can also help users extract insights and industry trends from market research studies.
  • Phi-3 can generate website copy, social media posts and product descriptions for marketing or sales staff.
  • Businesses can also employ Phi-3 to power a chatbot that gives clients basic information about plans and service upgrades.
  • It is particularly useful for on-device computation, when a project doesn't need to reach out to the cloud.
  • Phi-3-mini is available in the Azure AI model catalog on Microsoft's cloud platform, on the machine learning platform Hugging Face, and through Ollama, a framework for running models locally (see the sketch after this list).
  • Although Phi-3 lacks the breadth of far larger models such as GPT-4 Turbo (reportedly trained on around a trillion parameters), its performance is comparable to GPT-3.5.
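
As a rough illustration of the Hugging Face route, the sketch below loads a Phi-3-mini checkpoint with the transformers library and asks it to summarize a document, echoing the summarization use case above; the checkpoint name microsoft/Phi-3-mini-4k-instruct and the generation settings are assumptions you may need to adjust.

```python
# A rough sketch of running a Phi-3-mini checkpoint with Hugging Face
# transformers. The checkpoint name and generation settings are assumptions;
# loading the 3.8B-parameter model needs a GPU or a generous amount of RAM,
# and older transformers versions may also require trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarize the main points of this report in three bullets:\n<paste report text here>"
input_ids = tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```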

Wrapping Up

No one is claiming that SLMs will replace massive language models. Rather, SLMs are ideally suited to edge computing, on-device computation, and tasks that can be accomplished without going to the cloud. What matters is understanding the advantages and disadvantages of each part of this model portfolio.

And there are still significant benefits to size: the level of intelligence that can be obtained from large models in the cloud remains beyond what tiny language models offer. Perhaps there will always be a gap, because the major models will never stop improving.