For the last few years, Apple’s macOS releases have been interesting, if not particularly exciting. But that’s all set to change this year with the launch of macOS Sequoia, and it’s all thanks to one feature: Apple Intelligence.
Apple’s artificial intelligence (AI) platform has the potential to completely change how you use your Mac on a daily basis. From generating images, rewriting emails, and summarizing your audio recordings to revamping Siri into a much more capable virtual assistant, Apple Intelligence could be the most significant new macOS feature in years.
Now that it’s available in the latest macOS Sequoia beta, I thought I’d take Apple Intelligence for a spin to see whether it’s worth your time. Read on to see my first impressions.
Siri
The good news is that you can try out Apple Intelligence on your Mac for free by downloading the macOS 15.1 developer beta. The bad news is that a large number of Apple Intelligence’s features — the majority, in fact — are currently missing in action and unavailable to test out. So, while I was able to try a few parts of Apple Intelligence, it was far from a complete look at the new system.
Let’s start with Siri. Apple’s virtual assistant squandered its early lead after launching in 2011 and has fallen far behind its rivals in recent years. Apple Intelligence is a massive chance to close the gap, injecting the assistant with a much-needed dose of attention (and, you know, artificial intelligence).
Unfortunately, many of the new Siri features are evidently not ready, and Apple hasn’t added them to macOS Sequoia yet. Awareness of a question’s context, as well as what’s happening on your screen, is totally absent, for example, and the same goes for Siri’s ability to run functions inside other apps, among other features.
With that in mind, what about the remaining new Siri features that you can try out? One of the additions that Apple talked up this year was Siri’s ability to understand you if you change your mind or stumble over your words.
Except this doesn’t work very well at all, at least not in my testing. Every time I deliberately changed my mind partway through a timer request, the Siri window showed that it was accurately logging what I was saying: the displayed text was correct. The problem is that across several attempts, Siri barely ever set the correct timer. Clearly, more work is required here.
What else is new? Typing to Siri is much easier than before: just select the Siri icon in your Mac’s menu bar, and you can type to the assistant right away. This change makes Siri much more useful on the Mac, since you can now use the assistant in a library or coffee shop without disturbing the people around you by talking to it.
Siri also has a new user interface, with glowing edges that pulse as it works on your query. But it still feels very limited since most of its features are still absent from the macOS Sequoia beta.
Writing Tools
When you think of artificial intelligence, writing tools are probably among the first things that come to mind, and macOS can now do that sort of thing natively. Unlike Siri, this feature feels a lot more fleshed out. Somewhat predictably, Apple calls it Writing Tools: just highlight some text, right-click the selection, and you’ll see the Writing Tools menu item.
These tools are generally pretty good, whether you want to rephrase your text, get a summary, or make it easier to understand. In particular, I can see these tools being useful for writing formal or important emails, or for drafting documents before you make later edits.
I don’t think it’s the sort of thing I’ll be using every single day, but it might be helpful to get some ideas for rewriting my words every now and then. The text manipulation tools (summarizing, turning text into a list or table) feel a bit more useful to me, though, especially when I’m confronted with a large wall of text and just don’t have the energy to read through it all.
Transcription
Another useful Apple Intelligence feature that has made it into the latest macOS Sequoia beta is audio transcription. I tried it out on a 38-minute audio recording in the Voice Memos app.
Unfortunately, the transcription was peppered with wrongly transcribed words. That’s not so unusual: even the best AI-powered audio-transcription services make mistakes. What is annoying is that you can’t correct any of these errors, as there’s no way to edit the text.
For now, Apple Intelligence’s transcription tools remain a work in progress, but there’s definitely potential there as long as Apple can refine the rough edges.
What’s missing?
That’s more or less all the Apple Intelligence features that are available in macOS Sequoia right now. But there are plenty more waiting in the wings that have yet to be released, with predicted launch dates ranging anywhere from shortly after macOS Sequoia’s release to well into 2025.
In a way, that itself neatly sums things up for Apple Intelligence in macOS Sequoia’s latest beta. Right now, there are a few things to try out, and many of them are pretty good so far, but there are a whole lot of features that are in rough shape or downright absent.