iOS 18 tipped for major new AI feature — here's what you need to know
AI will be able to summarize notifications in iOS 18
Apple has its work cut out to close the AI gap with Google, but one particular upgrade coming to iOS 18, and specifically to Siri, could make for a valuable time-saving feature. As per the Power On newsletter from Bloomberg's Mark Gurman, Apple is working to give the iPhone's virtual assistant the ability to auto-summarize notifications.
Supposedly, Apple calls this "proactive intelligence" and, in keeping with the company's privacy ethos, the feature would rely on on-device processing. As Gurman tells it, iOS 18 will give Siri "services like auto-summarizing notifications from your iPhone, giving a quick synopsis of news articles and transcribing voice memos, as well as improving existing features that auto-populate your calendar and suggest apps."
Furthermore, the assistant's voice capabilities are set for an overhaul that'll give it a "more conversational feel".
Gurman also noted other AI features coming to iOS 18, including AI-based editing in Photos, but, interestingly, said a ChatGPT-style chatbot won't play a part at Apple's upcoming WWDC event.
A "proactive" Siri 2.0
Siri has taken something of a pasting when compared to Google's Assistant — and the latter is only becoming more powerful as it transmogrifies into Gemini. But Apple is committed to doing as much on-device as possible. The aforementioned privacy factor is at play, but on-device processing also offers a faster response time because there's no need to communicate with a server.
With Gemini Nano, Google has managed to condense a multimodal LLM to run on-device, and Apple seems to be deploying the same tactic for iOS 18 by taking advantage of the iPhone's Neural Engine.
Back in January, when we got our first look at the iOS 17.4 beta, there were references to a private framework called “SiriSummarization” that called out to the ChatGPT API.
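To make the rumored feature a little more concrete, here's a rough Swift sketch of what a notification summarizer could look like in principle. Every type and function name below is a made-up stand-in — this is not Apple's private SiriSummarization framework or any shipping iOS API — and the "summary" step is a simple placeholder where a real on-device model would sit.

```swift
import Foundation

// Illustrative only: a stand-in for the kind of on-device notification
// summarization iOS 18 is rumored to offer. None of these types are Apple APIs.
struct IncomingNotification {
    let appName: String
    let title: String
    let body: String
}

protocol NotificationSummarizer {
    // Returns a one-line synopsis of a batch of notifications.
    func summarize(_ notifications: [IncomingNotification]) -> String
}

// A trivial placeholder "model": in the rumored feature, this step would be
// handled by an on-device language model rather than basic string handling.
struct NaiveOnDeviceSummarizer: NotificationSummarizer {
    func summarize(_ notifications: [IncomingNotification]) -> String {
        guard !notifications.isEmpty else { return "No new notifications." }
        let apps = Set(notifications.map { $0.appName }).sorted().joined(separator: ", ")
        let headline = notifications.first!.title
        return "\(notifications.count) notifications from \(apps), starting with: \(headline)"
    }
}

let summary = NaiveOnDeviceSummarizer().summarize([
    IncomingNotification(appName: "Messages", title: "Dinner at 7?", body: "Does Thursday work for you?"),
    IncomingNotification(appName: "Mail", title: "Invoice attached", body: "Please find the March invoice attached."),
])
print(summary)
```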
But Apple has also developed multiple new models in its own research division that could play a part in Siri's next iteration. One of these, called ReALM (Reference Resolution As Language Modeling), is, like Gemini Nano, designed to run on a phone.
Apple's ReALM is a model that reconstructs what's displayed on screen, labels each entity and passes the result to Siri as additional context, giving the assistant clues to the user's request. You can immediately see how such a model would help with summarization: it would be able to capture the notification, label it and produce an accurate summary with no input needed from either the user or the cloud.
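To illustrate that kind of pipeline, here's a simplified Swift sketch. The entity types and helper function are hypothetical stand-ins rather than ReALM's actual code or any Apple API, but they show the basic idea of turning on-screen content into labeled text that an assistant could use as extra context for a request.

```swift
import Foundation

// Conceptual sketch of the ReALM-style flow described above: turn what's on
// screen into labeled text entities, then hand that text to the assistant as
// extra context. All names here are illustrative, not Apple's.
enum EntityKind: String { case notification, contact, phoneNumber, address, button }

struct ScreenEntity {
    let kind: EntityKind
    let text: String
}

// Step 1: a textual reconstruction of the screen. In a real system this would
// come from parsing on-screen content; here it's just hard-coded sample data.
let onScreen: [ScreenEntity] = [
    ScreenEntity(kind: .notification, text: "Messages: \"Dinner at 7?\""),
    ScreenEntity(kind: .contact, text: "Sam Jones"),
    ScreenEntity(kind: .phoneNumber, text: "+44 20 7946 0000"),
]

// Step 2: label each entity and flatten everything into a context string the
// assistant's language model can read alongside the user's request.
func contextPrompt(for entities: [ScreenEntity], request: String) -> String {
    let labeled = entities
        .map { "[\($0.kind.rawValue)] \($0.text)" }
        .joined(separator: "\n")
    return "On-screen context:\n\(labeled)\n\nUser request: \(request)"
}

print(contextPrompt(for: onScreen, request: "Summarize that notification"))
```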
Meanwhile, the clutch of iOS 18 rumors we've already collected has pointed to other AI capabilities like generated playlists in Apple Music, slide creation in Keynote, suggested text for Pages documents and a Safari browsing assistant that could help summarize content on web pages.
The good news is we won't have to wait long to hear about all this firsthand — WWDC is less than a month away.