Lightweight and cheap AI model Phi-3-mini launched by Microsoft
Tech giant Microsoft has launched a lightweight, low-cost artificial intelligence (AI) model named Phi-3-mini. The release reflects the company's strategy of reaching more clients with affordably priced solutions in the fast-changing AI technology market.

Phi-3-mini is the first of three small language models (SLMs) Microsoft plans to release. Large language models (LLMs) get most of the attention these days, but Phi-3-mini highlights the significance of SLMs.

It may look small next to big models like GPT-4, but it packs 3.8 billion parameters that perform very well. Microsoft trained the model on a carefully curated dataset drawing on ideas from children's books and teaching materials, and tests show it gives answers comparable to GPT-3.5.

These smaller models provide a cheap and effective alternative to their larger, more powerful counterparts, which has made it easier for many companies to adopt AI. They work well for quick answers, localized processing, and offline use.

Phi-3-mini can be used in a wide range of AI apps, from smart speakers to smartphones, without massive computing power.

Sebastien Bubeck, a vice president of GenAI research at Microsoft, discussed the affordability of Phi-3-mini, emphasizing its notable cost advantage over similar models available in the market.

Bubeck stated that Phi-3 is not just slightly cheaper but dramatically so, citing cost savings of up to ten times compared with rivals of comparable capability.

Phi-3-mini is now available on Hugging Face, a machine learning platform; on Azure, Microsoft's cloud service platform; and on Ollama, a framework for running models on local machines.

The Phi-3-mini SLM is also integrated with Nvidia's software tool, Nvidia Inference Microservices (NIM), and optimized for Nvidia's graphics processing units (GPUs) to improve accessibility and performance.

In short, with the lightweight and low-cost Phi-3-mini, Microsoft has given companies without deep resources an AI model suited to handling simpler jobs.
