I'm so sick and tired of AI. Which is a bit of a problem given AI has been the focus of pretty much every single tech launch this year.
I wouldn't mind except for the hype.
We keep being told AI is the answer to all our prayers. It'll boost our productivity. Give us more free time. Answer all our emails and make our dinner reservations for us.
Unfortunately, from what I've seen so far, most people seem to use it to generate funny pictures, compose terrible music and answer questions they could have figured out themselves with a basic Google search.
And to compound the issue, every day it seems like there's a different story about how AI got something wrong. Really wrong.
All that being said, it can't be denied there seems to be a big demand for AI and many have been surprised at how long it's taken Apple to roll its version out.
Well, the wait is almost over. In New Zealand, with the iOS 18.2 update in early December, compatible Apple devices will finally become sentient. Well... you know what I mean.
You may or may not be aware that in the lead-up to a big iOS update, Apple releases Beta versions users can try at their own risk, to help iron out any last-minute bugs before the wider public release. Even before then, there's a Developer version (or more often, several developer versions), primarily to allow app makers to test their own updated products in the new environment.
Over the last week or so, I've boldly jumped into this experimental universe in order to try out the forthcoming suite of Apple Intelligence features on your behalf. Some are so silly they don't interest me at all. Some are so good, they're scary. Many are just Apple's version of existing features we've already seen from other manufacturers and a few have become essential tools for me that I now use pretty much every day.
While it's easy to think of Apple Intelligence as just an iPhone thing, it's also coming to M1 and later iPad Airs and iPad Pros, the just-released A17 Pro iPad mini and any Mac powered by an M1 chip or later.
Right now there are just six iPhones that'll run it: last year's iPhone 15 Pro and Pro Max and all four 2024 iPhone 16-series handsets. This is because, unlike a lot of AI out there, a big chunk of Apple Intelligence happens on device, using the Neural Engine (Apple's NPU) built into the latest "A" or "M" chips.
Not only does this mean a more integrated, faster and smoother response from any AI functions but it also guarantees a better degree of privacy, something Apple seems to prize more highly than many of its competitors.
Even when Apple Intelligence needs to access larger, server-based models for more complex tasks, Apple assures users its Private Cloud Compute protocols mean your data is never stored on those servers or made accessible to anyone else, not even Apple.
What I've noticed so far with a lot of the new features is many of them aren't all that obvious, even when you're using them. For example, the new Clean Up tool in the Photos app just appears as another editing option. This is for when you need to remove something from your shot, like that photo-bombing bystander in the background. In many cases, Apple Intelligence will identify what you're trying to do automatically as soon as you press the Clean Up icon. If not, you can do it manually as I demonstrate in the video below.
The question is, is this an AI feature, or just a photo editing feature? Who cares? It works pretty well and you don't have to go hunting through a special AI menu or app to access it.
In fact, although the new-look Photos app took a lot of iPhone users by surprise when it appeared in iOS 18, the way you can now use Apple Intelligence to search your library for certain people, locations and other things - perhaps to create your own instant album or montage - is a smooth and seamless evolution.
The same can be said for Siri. In fact, you could argue Siri has always been a form of AI but now it's really been levelled-up, taking more notice of the context in which you're using it. Things like where you are and what apps you have open will affect Siri's responses and thanks to Apple's ever-increasing compatibility with more and more IoT devices, I definitely feel like I can depend on Siri to run my smart home more effectively than before.
There are still some inconsistencies. Ask Siri to take a screenshot? No problem. Ask Siri to start a screen recording? It refers you to the Apple website. How weird is that? At least you can now use text prompts to talk to Siri by double-tapping the bottom of the screen.
Another integration feature brings two AIs together. In the new Apple Intelligence menu you can toggle on ChatGPT to work with Siri, even signing into your ChatGPT account if you have one. I love this kind of cross-platform collaboration and I look forward to more of it in the future.
Writing Tools is going to save a lot of people a lot of time. Basically, anywhere you use text, Writing Tools can help you proofread, summarise and even suggest small tweaks to the tone and style of what you write.
This works hand-in-hand with other features like Visual Intelligence, which makes recognising and capturing text with your camera easy - even translating and reading it out loud if you want.
Of course, an AI isn't an AI unless you can use it to create silly pictures and yes, Apple Intelligence comes to the party here too. Image Wand utilises the power of Apple Pencil to turn your rough sketches into professional illustrations. On the other hand, Genmoji is just a way to create your own bespoke emojis. I never use emojis, but I'm sure it's great. (Insert boomer joke here)
Image Playground is perhaps the most creative way to use Apple Intelligence - whether you use existing photos or drawings to create new enhanced concepts or simply describe an image to bring it to life.
But probably my favourite of the new Apple Intelligence features is the way both emails and notifications are now organised and prioritised. I'm not usually very trusting when it comes to letting AI decide what I need to deal with now and what can wait till later but even I have to admit, not having to sift through a dozen press releases from PR companies and today's great offer from Nespresso in order to get to my actual work emails has been a godsend.
And that's just a taste. Every day I'm discovering more tweaks Apple Intelligence can make to the way I use my phone, my Mac and my iPad - in many cases it was so well integrated I didn't even realise it was happening at first... and that's just the way I like it.