What we know about Apple’s Google Gemini deal for AI


Apple on Tuesday confirmed that it’s working with Google, using Gemini to build the AI Foundation Models used across its platforms. As the joint statement explained it: “Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google’s Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.

“After careful evaluation, Apple determined that Google’s AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users. Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple’s industry-leading privacy standards.”

What is the partnership structure?

The announcement confirms that Apple and Google have entered into a multi-year collaboration in which the next generation of Apple Foundation Models will be based on Google’s Gemini models.

Is it an exclusive deal?

The arrangement allows Apple to work with other AI providers, including OpenAI, if it wants.

What about the money?

We do not know the financial terms of the deal. Bloomberg at one point claimed Apple intended to pay around $1 billion a year to use Gemini, but the actual terms weren’t disclosed, nor are they likely to be. But both companies can already anticipate additional scrutiny on their financial statements in the coming months.

What is the deployment model?

Apple’s plan is to use its own customized version of Gemini, tweaked to make sure queries are handled in Apple’s preferred fashion. AppleInsider reports that Google and Gemini will not be mentioned anywhere in the UI. Essentially, Apple will use Gemini as the base of the Apple Foundation Models it builds for itself, and it can also ask Google to tweak aspects of how the Gemini models work.

What price privacy?

Apple cares about privacy. That is why the AI features will run either on-device or using Apple’s Private Cloud Compute, not on Google’s servers. It will be possible to use third-party cloud services for complex problems, if required. But the use of Apple’s own servers means Google won’t have direct access to user data.

What will happen as a result?

With Gemini as its foundation, Apple can now deliver:

A major Siri overhaul, rolling out this spring.

Even more evolved on-device contextual understanding in the months that follow, as Apple Intelligence becomes able to make sense of your data on your behalf, such as identifying your relatives.

A Siri that’s better at conversational responses and also more likely to at least try to find the correct response, rather than saying it doesn’t understand.

The ability to create documents, and eventually to remember past conversations and make proactive suggestions based on information from your apps.

It is important to note that as Apple builds future Apple Foundation Models on Gemini, Apple developers will likely also gain access to those models for use within their own apps.

Why Gemini?

Apple is relying on Gemini to form the base model for its own AI development. It will then tweak and train those models to create its own agent. Max Weinbach has written an excellent review of how this could work. He estimates that Apple is effectively covering the cost of Google’s future models within the deal.

Why did this happen?

We’ve had months of reporting about how and why Apple seemingly missed a trick with AI, and at least one technical problem has been publicly disclosed. Craig Federighi, Apple’s senior vice president of Software Engineering, told staffers that Apple had intended to merge its existing automation systems with generative AI. “We initially wanted to do a hybrid architecture, but we realized that approach wasn’t going to get us to Apple quality,” he said.

We’ve also heard about internal conflicts, poor leadership, and unexpected challenges in the work. This may not matter now as Apple leans into Gemini.

What analysts say

“For Apple, partnering, rather than building an end-to-end AI proprietary model stack, could compress time-to-market and reduce execution risk by leveraging mature, already-deployed technology.” — Anisha Bhatia, senior technology analyst at GlobalData.

“For the people asking if the Apple-Google deal means there is no differentiation, think about it in F1 terms: multiple teams run the same engine, yet deliver vastly different results based on design and setup. Same here.” — Creative Strategies analyst Carolina Milanesi.

“Apple’s decision to use Google’s Gemini models for Siri shifts OpenAI into a more supporting role, with ChatGPT remaining positioned for complex, opt-in queries rather than the default intelligence layer.” — Parth Talsania, CEO of Equisights Research.

What is the rollout schedule?

Apple will introduce some new features this spring, aiming to keep to earlier promises. More sophisticated features, including proactive AI, will likely be announced at WWDC for introduction later this year, The Information reports.

What next?

Ask Siri in spring.

You can follow me on social media! Join me on BlueSky, LinkedIn, Mastodon, and MeWe.