Apple wants AI to run directly on its hardware instead of in the cloud
iPhone maker wants to catch up to its rivals when it comes to AI.
It’s already possible. A 4-bit quant of Phi-1.5 (1.5B parameters, roughly as capable as a 7B model) takes about 1 GB of RAM. Phi-2 (2.7B, roughly as capable as a 13B model) was recently released and would likely take around 2 GB of RAM with a 4-bit quant (not tried yet). The research-only license on these makes people reluctant to port them to Android, so they focus instead on weaker 3B models or bigger ones (7B+), which heavily limits any potential usability.
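Back-of-the-envelope math behind those numbers, as a rough sketch: weights dominate the footprint, so RAM is roughly parameter count times bits per weight, plus some runtime overhead (the 25% figure below for KV cache and runtime buffers is my assumption, not a measured value).

    # Rough RAM estimate for a quantized model.
    # overhead covers KV cache and runtime buffers (assumed ~25%).
    def quantized_ram_gb(params_billion, bits_per_weight=4, overhead=1.25):
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * overhead / 1e9

    print(quantized_ram_gb(1.5))  # Phi-1.5: ~0.94 GB, in line with "about 1 GB"
    print(quantized_ram_gb(2.7))  # Phi-2:   ~1.69 GB, roughly the 2 GB guess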
Or
I bet on 2.
This has been Apple’s strategy for the last, what, 30 years or so. Yes, they love to buy software that is already proven and integrate it into their ecosystem (like Siri), but they gain a competitive advantage when they build something their competition can’t just replicate. I’m betting on 1! (Actually, I’m just hoping for 1, because it would probably be a better user experience.)