Microsoft Unveils New Cost-Effective AI Model – Phi-3

Microsoft announced on Tuesday the launch of Phi-3-mini, a new lightweight AI model aimed at providing cost-effective solutions for its users. According to Reuters, this release marks the first in a series of three small language models (SLMs) that the tech giant plans to unveil, signaling a strong commitment to a technology poised to transform global industries and workflows.

The Phi-3-mini is part of Microsoft’s broader strategy to cater to a diverse range of clients, including those with limited resources. By offering a dramatically more affordable option, Microsoft intends to remove financial barriers that often discourage smaller companies from integrating advanced AI technologies.

“Phi-3 is not slightly cheaper, it’s dramatically cheaper, we’re talking about a 10x cost difference compared to other models out there with similar capabilities,” said Sébastien Bubeck, Microsoft’s vice president of GenAI research.

This model is designed to perform simpler tasks efficiently, making it an ideal choice for companies that do not require the complex capabilities of larger AI systems but still seek to leverage AI for enhanced productivity and innovation.

Currently, Phi-3-mini is available on Azure, Microsoft’s cloud service platform, and its inclusion in the AI model catalog highlights the company’s efforts to ensure ease of access. The model has also been listed on Hugging Face, the prominent machine learning model platform, and on Ollama, a framework that facilitates running models on local machines.
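For readers who want to experiment, a minimal sketch of loading the model from Hugging Face with the transformers library might look like the following. The repository id "microsoft/Phi-3-mini-4k-instruct", the trust_remote_code flag, and the generation settings are assumptions based on the listing described above, not details from the announcement itself.

# A hedged sketch: load Phi-3-mini from Hugging Face and generate a short reply.
# The repo id and trust_remote_code flag are assumptions, not details from the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "In one sentence, why might a small language model be useful?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

With Ollama installed, a roughly equivalent local experiment is running "ollama run phi3" from the command line, assuming the model is published under that tag.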

This strategic placement is expected to broaden its usability and appeal, especially among developers and small to medium-sized enterprises (SMEs) looking to experiment with AI without committing significant resources.

Moreover, Phi-3-mini has been optimized for use with Nvidia’s graphics processing units through the Nvidia Inference Microservices (NIM), enhancing its performance and making it an even more attractive option for users with existing Nvidia infrastructure.
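Since NIM services typically expose an OpenAI-compatible HTTP API, a hedged sketch of querying a locally deployed Phi-3-mini microservice could look like the code below. The endpoint URL, port, placeholder API key, and model name are illustrative assumptions rather than values from the announcement.

# A hedged sketch: call a locally running NIM endpoint via the OpenAI-compatible API.
# The base_url, api_key placeholder, and model name are assumptions for illustration.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
response = client.chat.completions.create(
    model="microsoft/phi-3-mini-4k-instruct",
    messages=[{"role": "user", "content": "List three tasks suited to a small language model."}],
    max_tokens=128,
)
print(response.choices[0].message.content)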

In-Person & Virtual Data Science Conference

October 29th-31st, 2024 – Burlingame, CA

Join us for 300+ hours of expert-led content, featuring hands-on, immersive training sessions, workshops, tutorials, and talks on cutting-edge AI tools and techniques, including our first-ever track devoted to AI Robotics!

 

The launch of Phi-3-mini comes on the heels of Microsoft’s recent $1.5 billion investment in UAE-based AI firm G42, further underscoring its commitment to cutting-edge technology and global AI development.

The company’s ongoing partnership with French startup Mistral AI also highlights its strategy to expand its technological footprint and collaborative efforts via its Azure cloud computing platform.

ODSC Team

ODSC gathers the attendees, presenters, and companies that are shaping the present and future of data science and AI. ODSC hosts one of the largest gatherings of professional data scientists, with major conferences in the USA, Europe, and Asia.
