Every WWDC brings us Apple fans something to be excited about, something that feels like it’s going to change the world forever. But it also brings the naysayers something to complain about: the next “Apple is doomed” story, the next tale of how Steve Jobs would have done it better, and all that nonsense. Last year, the one big thing was the Vision Pro, and this year it is Apple’s spin on how AI should work for all of us, the very cutely named “Apple Intelligence”.
Apple Intelligence is a complex system that represents the best of what Apple can do: features that make sense for everyone, without compromising privacy for anyone. But given how AI has worked in existing systems, there is understandably some confusion about how Apple Intelligence functions and how private user data is handled. Apple did not help themselves by announcing their partnership with OpenAI in the same breath, leading naysayers to believe that Apple has sold out and that all our data is being sent to OpenAI.
With this post, I try my best to break down all of the AI (read: Apple Intelligence) features the company announced yesterday, and to explain which of them rely on Apple’s safe on-device processing and which use external servers or third-party systems like OpenAI’s ChatGPT.
If you’re busy and you want a TL;DR, I have this image to summarise all features in a single table.
There is also Private Cloud Compute, a verifiable, secure server processing system that Apple has developed. But at the moment, it is unclear which features use this system versus what runs on-device. I have left out any feature that I could not verify, so the breakdown below only covers the features that I know work on-device or through OpenAI.
The Features
Text Editing Tools
The very first feature that I want to talk about is the rewriting tools: making your email sound kinder, checking your text for grammar. I know that a lot of us use these very functions on ChatGPT almost every day, and it’s fantastic to see that Apple nails this use case system-wide for any app that uses the default text editing APIs that Apple provides.
Does it use OpenAI? NO!
All rewriting tools work completely on-device using Apple’s own models, so we should expect them to be reliable, safe and super fast. No more waiting on ChatGPT to come back after a server outage to send that important email. You can now do it offline!
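To make the “default text editing APIs” point concrete, here is a minimal sketch of what that means in practice. The class name and layout are purely illustrative; the point is that a plain, standard UITextView should pick up the system Writing Tools automatically on supported devices, with no AI-specific code in the app.

```swift
import UIKit

// A minimal sketch: an ordinary UITextView in a view controller.
// On devices that support Apple Intelligence, the system's Writing Tools
// (rewrite, proofread, summarise) should surface automatically in the
// edit menu for standard text views like this one; no extra
// integration code is expected for the default behaviour.
final class ComposeViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        textView.font = .preferredFont(forTextStyle: .body)
        view.addSubview(textView)
    }
}
```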
I do want to cover one additional use case that comes with these tools: composing new content from scratch. This includes cases like starting a new story or writing the opening paragraph of your essay. This does not work with Apple’s on-device system; it works using ChatGPT.
Image Editing Tools
Apple announced two types of image editing or image generation tools. The first one is called Genmoji, which lets you create your own emoji. If you watched the Keynote, this is the one where you can create an emoji of a T-Rex on a surfboard. The second feature is called Image Playground, which uses a diffusion model to create different types of images based on context or the prompts you provide. Again, if you watched the Keynote, this is the creepy-looking photo of a “super mom” that they showed in Messages.
While Genmoji is something new that not many will have experienced before, I am sure most users with access to GPT-4 or DALL·E will have tried generating images, and the outputs from those and the new Image Playground look fairly similar. It is easy for people to think that Apple is using ChatGPT under the hood here, but the truth is everything here also happens on-device! They specifically call this out in both the Keynote and the State of the Union, where the presenters emphasise that users can try different types of pictures quickly precisely because everything happens on-device.
So, again -
Does it use OpenAI? NO!
System Features
Next, I want to talk about three system features that really augment the iPhone experience.
The first is smarter notifications: iOS will now summarise the content of your notifications so that you can see at a glance what a message or email is trying to tell you. This is again a completely on-device model that does not rely on OpenAI.
The second is context awareness: finding out when your mom’s flight is landing, asking Siri where you planned to eat with her, and so on. These features also work using on-device intelligence. From what I saw, this builds on what Siri has been doing under the hood for many years now. We have seen instances in the past where Siri would surface someone’s birthday because of something you discussed in Messages; this seems like an extension of that capability. Again, it is completely on-device and does not depend on OpenAI.
Third is photo editing: Apple showcased that you can ask Siri to make a photo pop, and it will apply some edits to make your photo look better. These are just intents exposed to the system that Siri can take advantage of, and this is something third-party developers can do as well later this year (see the sketch below). Once again, everything is completely on-device and nothing relies on OpenAI.
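For the curious, here is a rough sketch of what exposing such an editing action through the App Intents framework could look like. The intent name, the photo parameter, and the applyPopEffect helper are hypothetical placeholders of my own; only the basic AppIntent structure reflects the real framework, and Apple may well expose richer, assistant-specific hooks than this.

```swift
import AppIntents

// Hypothetical sketch of a third-party app exposing an editing action
// to the system. Siri (and Shortcuts) can discover and invoke intents
// declared this way.
struct MakePhotoPopIntent: AppIntent {
    static var title: LocalizedStringResource = "Make Photo Pop"
    static var description = IntentDescription("Applies a vibrance boost to a photo.")

    // The photo the user wants to edit, passed in as a file.
    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult & ReturnsValue<IntentFile> {
        let edited = applyPopEffect(to: photo)
        return .result(value: edited)
    }

    // Placeholder for the app's own editing pipeline; here it simply
    // returns the input unchanged.
    private func applyPopEffect(to file: IntentFile) -> IntentFile {
        return file
    }
}
```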
So what uses OpenAI?
After seeing all these amazing use cases that Apple enables on-device, it is natural to wonder why the OpenAI partnership even exists.
From what Apple showcased in the Keynote, it looks like Siri will offer to hand the request over to ChatGPT whenever the user asks for something that is outside of Siri’s knowledge domain.
Some examples include asking Siri for recipe suggestions, or showing it a photo and asking for some context around it. In these cases, Siri will first check with you on whether you want to send this data to ChatGPT, then make the request and show you the response.
This should solve one of Siri’s most irritating responses - “I found these results on the web”. For the first time, Siri should be able to take help from a third party tool and show us actual meaningful information instead of just web results.
This experience alone makes the OpenAI partnership valuable, and users will really be able to take advantage of the best AI experiences right from Siri.
Apple Intelligence is truly a revolutionary announcement: it brings AI to the masses while wrapping all the complexity inside one simple, delightful user experience. The only catch is that it will only be compatible with the iPhone 15 Pro and later, and with iPads and Macs with an M1 chip or later. The real breakthrough in user experience will come when Apple manages to fit these models on our Watches and AirPods, which will truly liberate us to live our best lives. But until that day, a world with Apple Intelligence is a delightful new world to live in!