A first look at Apple Intelligence and its (slightly) smarter Siri

Siri’s big upgrade starts now, but you’ll need the right iPhone to access it. Photo by Vjeran Pavic / The Verge

In iOS 18’s latest developer preview, Siri gets a glow-up. Like, the whole phone actually glows around the edges when you invoke Siri.

A splash screen reintroduces you to the virtual assistant once you enable Apple Intelligence, an early version of which is now available on the iPhone 15 Pro and Pro Max in a developer beta. You’ll know Siri is listening when the edges of the screen glow, making it pretty obvious that something different is going on.

The big Siri AI update is still months away. This version comes with meaningful improvements to language understanding, but future updates will add features like awareness of what’s on your screen and the ability to take action on your behalf. Meanwhile, the rest of the Apple Intelligence feature set previewed in this update feels like a party waiting for the guest of honor.

That said, Siri’s improvements in this update are useful. Double-tapping the bottom of the screen brings up a new way to interact with the assistant: through text. It’s also much better at parsing natural language, waiting more patiently through hesitations and “um”s as I stumble through questions. And it understands when I’m asking a follow-up question.

Outside of Siri, it’s kind of an Easter egg hunt finding bits of Apple Intelligence sprinkled throughout the OS. They’re in the Mail app, which now has a summarize button at the top of each email. And anywhere you can type and highlight text, you’ll find a new option called “writing tools” with AI proofreading, writing suggestions, and summaries.

“Help me write something” is pretty standard fare for generative AI these days, and Apple Intelligence does it as well as anyone else. You can have it make your text more friendly, professional, or concise. You can also create summaries of text or synthesize it into bulleted lists of key points or a table.

I’m finding these tools most useful in the Notes app, where you can now add voice recordings. In iOS 18, voice recordings finally come with automatic transcriptions, which is not an Apple Intelligence feature since it also works on my iPhone 13 mini. But Apple Intelligence will let you turn a recording transcript into a summary or a checklist. This is helpful if you want to just free-associate while recording a memo and list a bunch of things you need to pack for an upcoming trip; Apple Intelligence turns it into a list that actually makes sense.

These writing tools are tucked out of the way, and if you weren’t looking for them, you might miss them entirely. The more obvious new AI features are in the Mail app. Apple Intelligence surfaces what it deems to be important emails in a card that sits above the rest of your inbox, marked as priority. Below that, emails show a brief summary in place of the first line or two of text that you’d normally see.

There’s something charming about AI’s sincere attempt to summarize promotional emails, trying to helpfully pull out bits of detail like “Backpacks and lunch boxes ship FREE” and “Organic white nectarines are sweet and juicy, in season now.” But the descriptions in my inbox were accurate — helpful in a few instances and harmless at worst. And the emails it gave priority status to were genuinely important, which is promising.

The search tool in the Photos app now uses AI to understand more complicated requests. You can ask for pictures of a particular person wearing glasses or all the food you ate in Iceland, all in natural language.

Source: https://www.theverge.com/2024/7/31/24209910/apple-intelligence-ios-18-preview-siri
