https://www.theverge.com/2024/4/23/24137534/microsoft-phi-3-launch-small-ai-language-model Microsoft just rolled out Phi-3, claiming it's their tiniest AI model yet. I stumbled upon this article about it and got curious. What do you all think about this new tech move? Excited? Skeptical? Honestly, I'm not exactly familiar with AI models, and while I get the possible advantage of a smaller AI model, I'm not sure whether it's actually a huge advantage or development... what do you all think?
Nice move for edge deployment for a specific use case
Or an “edge” case
edge model for edge edge usecase (not a typo)
Smaller = runs on micro environments like your smartphone or your smartwatch
They're just trying to generate hype for AI; it's nothing really remarkable. We are still in an AI bubble
If you think tech is a giant bubble, ya sure
The entire universe is a bubble created at the Big Bang.
🥱
The 3.8B model is on par with ChatGPT… and can run locally on your phone. They are ahead of Google and Apple
What is the minimum configuration required for the phone? Will it work only on high-end phones?
1.8 GB of RAM for the 4-bit model, runs on an iPhone 14 (older A16 chip) at 12 tok/s !!
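That 1.8 GB figure roughly matches the back-of-envelope math for a 3.8B-parameter model quantized to 4 bits per weight. A minimal sketch (the function name and the zero-overhead assumption are mine; real runtimes add some memory for activations and the KV cache):

```python
def model_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate RAM needed to hold just the model weights."""
    bytes_total = n_params * bits_per_weight / 8  # bits -> bytes
    return bytes_total / 1024**3                  # bytes -> GiB

# Phi-3-mini: 3.8B parameters at 4-bit quantization
print(round(model_memory_gb(3.8e9, 4), 2))  # ~1.77, close to the 1.8 GB quoted
```

The same formula explains why the full-precision (16-bit) version wouldn't fit on a phone: it needs about four times as much memory.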
My smallest AI language model is a logistic regression model with 2 parameters that outputs either "yes" or "no". Beat that, microsmall
I can beat your microsmall. No, not like that. This O(1) algorithm checks if a number is prime. 95% accurate https://github.com/mawerty/Is-Prime
ok that is funny meta
Small models mean they can run locally. The question is, what use cases need that?
Local corporate network in legal and tax applications.
They have their data centers; they don't need small models to run on phones and laptops
Definitely not the smallest language model; the previous Phi-2 and Phi-1.5 are smaller. Is it good? We'll see once more evaluation results come out
Why are they so obsessed with language models ???
MicroAI