Since this year’s WWDC in June, everyone has been extremely hyped about Apple’s AI features that are natively baked into the OS: Apple Intelligence. While other companies like Microsoft and Google are working on similar tools, Apple focuses on having most of the AI features, and even the LLMs, run locally on device without an internet connection. After waiting for nearly half a year, Apple Intelligence is finally out, and the hype goes on. But is it actually worth it? Should you really upgrade to the latest iPhone just for the AI features? Let’s take a look.
What can Apple Intelligence do?
The new Siri
The most significant update to the OS brought by Apple Intelligence is the introduction of a new Siri. This Siri is now powered by an on-device LLM that comprehends natural language significantly better than the previous approach that relied on predefined commands. In my personal experience, the new Siri has finally surpassed Google Assistant and Alexa in terms of understanding commands. For instance, when I want to add a new event to my calendar, I can simply tell Siri that there’s a new event or a new reminder, and it’s almost certain that Siri understands it correctly and creates the event for me. Very cool overall.
Writing Tools
I write quite a bit, whether it’s on Medium, for school, or anywhere else. While I’m not a big fan of having an AI generate all my content, I frequently use ChatGPT to find and correct spelling and grammar errors. This is especially helpful since I’m not a native speaker and inevitably make mistakes.
However, I’ve discovered that Writing Tools are incredibly useful. I can select any text, anywhere on my system, and ask an LLM to proofread or refine it. It’s not about having everything done for me; it’s about helping me identify and correct my mistakes, and ultimately improve my writing.
For the same reason, other features like summarizing my writing, creating tables or bullet-point lists, and changing the tone don’t interest me much.
AI Summaries
Apple Intelligence offers a feature to summarize content such as emails, notifications, messages, and websites. While this sounds promising, I’ve encountered issues with such summaries in the past: they often provide overly concise summaries, omitting crucial information. Apple’s implementation doesn’t solve this problem entirely, but it’s significantly better than, for example, Notion’s AI summaries or Google’s AI search. Notably, email and notification summaries are particularly useful in helping me prioritize what’s worth my attention and what can be ignored for the moment. However, since I don’t use Safari as my browser, I find the website summaries less interesting, although they work very well.
Image Cleanup
Affinity Photo, Photoshop, and Google Photos all offer similar features for removing objects from photos. With the advancements in AI, these tools work exceptionally well. Honestly, there’s not much else to discuss about this feature. If you need to remove something from a photo, you can now do it natively in the Photos app. That’s it.
Genmoji
Really, AI-generated emojis and Memojis? Honestly, I rarely use Memojis, and if I do, I don’t feel the need to create a specific one just for the current situation. Genmoji isn’t a bad feature, but it’s not for me.
Limitations
As you’ve likely noticed, I’m not entirely convinced by Apple Intelligence and its features. However, that’s not the main issue I’m addressing. There are just a few Apple-specific things that genuinely annoy me. One of them is the hardware requirements.
As someone who understands computer science quite well, I fully comprehend that AI, particularly LLMs, needs specific hardware capabilities to function effectively. Therefore, Apple’s decision to restrict Apple Intelligence to the iPhone 15 Pro and newer models, as well as all M-series chips, makes sense. However, the fact that other features like the ChatGPT integration, which is supposedly “coming later this year,” are also limited to these new, top-end devices is frustrating. Apple wants to capitalize on the latest iPhone, but why can’t I make an API call to OpenAI from an iPhone 13?
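To illustrate the point: at its core, a ChatGPT integration is just an HTTPS request, which any iPhone with an internet connection can make. Here’s a minimal sketch in Swift against OpenAI’s public chat completions endpoint (the function name, model choice, and error handling are simplified placeholders of my own, not Apple’s actual integration):

```swift
import Foundation

// Minimal sketch: calling OpenAI's chat completions endpoint directly.
// Any iPhone with an internet connection can run this; no on-device
// LLM hardware is involved. API key and model name are placeholders.
func askChatGPT(prompt: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "model": "gpt-4o-mini",
        "messages": [["role": "user", "content": prompt]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the reply text out of the JSON response.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```

Nothing in this sketch depends on the device’s chip; all the heavy lifting happens on OpenAI’s servers, which is exactly why limiting it to the newest hardware feels arbitrary.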
The much more significant problem is the limited availability and chaotic rollout of Apple Intelligence. As of now, it’s only available in English (US), not even English (UK), let alone any other language. Moreover, it requires both the system and Siri language settings to be set to American English. Personally, I’ve configured my devices this way anyway, but since I reside in Germany, I frequently encounter German content. And, what a surprise: Apple Intelligence refuses to summarize, or in a broader sense work with, content that isn’t English. While this makes sense, because the LLM primarily runs locally on the device and is likely fine-tuned for English, the limitation is still a significant obstacle in day-to-day use.
AI done right
I’ve already mentioned it, but I genuinely like Apple Intelligence overall. It has its limitations and issues, but I’d describe it as AI done right.
Since the launch of ChatGPT in November 2022, we’ve collectively provided OpenAI with a wealth of data about ourselves: texts to summarize or write, access to our coding projects, and personal information in various forms. It’s refreshing to see a company like Apple taking a different approach from the conventional, cloud-based AI paradigm. Personally, I believe we shouldn’t have to compromise our privacy and security in exchange for using AI. Running LLMs mostly locally is an intriguing way to achieve this, and I haven’t even talked about Apple’s Private Cloud Compute yet.