Apple Intelligence is cool, useful, and perhaps most importantly, charmingly unfinished. Unlike some other AI projects, this upgrade to the software that powers iPhones, iPads, and Macs doesn’t seem to threaten our existence. Its standout features are supposed to include privacy and a Siri that actually works. But none of it works yet, and despite its imminent launch, it probably won’t for many months.
At this year’s annual iPhone announcement, Apple revealed that it would finally release Apple Intelligence in October, months after first announcing it. Apple Intelligence will only work on the latest iPhone models, including the iPhone 15 Pro, and on Macs and iPads with M1 chips or better. It will not ship as a finished product. When you update your devices with Apple Intelligence in a few weeks, you’ll get a beta version that may or may not work better than the one I’ve been testing on my phone and laptop for the past few weeks.
Even in its unfinished state, I can see how Apple Intelligence will change the way I use my iPhone and my Mac. It’s a subtle but meaningful change, though I doubt any of these new AI-driven habits will change my life, at least not in the next year or so.
Before we delve into what it’s like to use Apple Intelligence, let’s review Apple’s surprisingly modest promises about the product. Apple Intelligence isn’t designed to blow you away. Apple calls it “AI for the rest of us,” which translates to “AI that won’t scare you” when you consider the competition. Google’s Gemini ran into trouble earlier this year when its image generator showed unusual biases. ChatGPT has been scaring people since its launch in the fall of 2022. Meanwhile, experts say the steady advance of super-powerful AI technology, which we don’t fully understand and which consumes vast amounts of resources, could kill us all.
So yes, I personally want the diet version of this. I am “the rest of us.” Apple Intelligence is for me. It’s too bad I won’t really be able to use it anytime soon.
Apple Intelligence is slow and steady and certainly not scary
The new Apple Intelligence technology is baked into the latest versions of iOS, iPadOS, and macOS. When you update your software, you get the option to turn it on, and even then you may not notice it, because some of the features are buried in menus. When you find them, though, they sort of work!
In its iPhone 16 announcement on Monday, Apple described four pillars of Apple Intelligence: language, images, actions, and personal context. The first two refer to features found in many generative AI products. Language refers to Apple Intelligence’s ability to read and summarize things like your email inbox and notifications, as well as rewrite things you write in apps like Notes and Pages. Images refers to its image-editing features, such as Clean Up, which lets you remove objects from photos.
All of these features were available in the developer beta I tested. (Beta versions are pre-release versions of software that developers release to a large group of people to see how they hold up under stress.) The new Apple Intelligence features are good. The summaries are adequate, if sometimes error-prone. Email summaries were the first Apple Intelligence feature I noticed, though you won’t see them if you use the Gmail app. The writing tools feature can do a lot more, if you feel like highlighting text and tapping to bring up its panel. There you can do some pretty simple rewriting to make the text friendlier or more professional. You can also record meetings or phone calls and have Apple Intelligence transcribe and summarize what was said. None of this is revolutionary, but even basic writing tools can save you time.
On the image front, the Clean Up feature is neat, though very similar to what you can already do with Magic Eraser in Google Photos. Once you activate the Clean Up tool, you can literally use your finger or mouse to erase the part of the image you want to remove, and it will magically disappear. However, it makes my photos look obviously manipulated. Removing a swan from a photo of my wife and daughter next to a pond made the water look doctored. But at least that gets around the problem of AI creating fake images you can’t tell are fake.
There are two other image-generation features — an Image Playground that lets you create images, and one called Genmoji that lets you create your own custom emoji — that will roll out “later this year and in the months to come,” according to Apple.
That disclaimer applies to the other two pillars of Apple Intelligence: actions and personal context. Generally speaking, these vague terms refer to the new Siri and how it can act on your behalf and learn more about you. The example Apple keeps offering for this more futuristic side of Apple Intelligence is that you can tell Siri to send photos of a certain group of people to a certain person from a certain event (like “text grandma our family pictures from last weekend’s barbecue”) and Siri will do just that. You can now also type your Siri request into a new menu that you pull up by double-tapping the bottom edge of the screen. This is a game changer for people like me who don’t like talking to computers in public.
I have no idea if this works, because these new Siri features weren’t available in the version of Apple Intelligence I tested, and it’s unclear when they will be. But the new Siri, or at least the initial features Apple has released so far, is definitely better. If you ask a follow-up question after an initial question, it can understand the context. Siri will also understand what you’re saying if you stutter or change your mind mid-sentence. That’s not a revolution in natural language processing, but it’s a step in the right direction for an assistant that is famously limited and clumsy.
When Apple Intelligence is released to the public, however, we’ll also start to see what third-party apps do with the new Siri functionality. “Siri will gain screen awareness,” Apple senior vice president Craig Federighi said at Monday’s event. “It will enable hundreds of new actions in your app.”
In other words, Siri will know what you’re doing on your phone when you make requests and can act on them, hence the buzzwords “actions” and “personal context.” But again, as advanced as these features sound, they aren’t ready yet.
We have to wait for the best features. This is a good thing.
To say Apple Intelligence has changed the way I use my iPhone and MacBook would be a gross overstatement. Because the AI-powered features are so limited and out of sight, I actually forget they’re there. It’s also worth emphasizing that I’m testing an unreleased version of the software. While it probably looks pretty close to what Apple will release next month, the version of Apple Intelligence I’m using is buggy and unfinished. Apple will work out many of the bugs in the coming weeks, but it will not be releasing a finished product in October.
Apple hasn’t said when Apple Intelligence will graduate from beta. It could be years. After all, Google left Gmail in beta for five years before dropping the label. And as long as Apple wants to distance itself from Apple Intelligence’s mistakes (generative AI’s hallucinations are basically by design), we should expect the label to stick around.
One potential limitation Apple faces in its quest to bring more advanced AI features to its users is its commitment to privacy. I mentioned privacy as a key feature earlier, though I didn’t describe what it looks like, because it’s invisible. While AI technologies like Google Gemini and OpenAI’s ChatGPT require you to send a lot of data to servers in the cloud, Apple, which is famously serious about privacy and security, promises that its proprietary AI models will do as much as they can on your device, as Apple does with most of your data. And when Apple does need to send your data to servers, it will do so securely through a new system called Private Cloud Compute. Since more advanced features require more computing power, it remains to be seen how this system stacks up against its competitors.
Then there’s the matter of cost. Apple Intelligence was free for me to test through the iOS 18 beta, but it’s not clear that all of its features will be free for everyone. First, you’ll need a device that supports Apple Intelligence, which, again, means only the latest hardware. For most people, that means buying a new iPhone. And for the most advanced features, Apple will reportedly charge a monthly fee at some point in the future. So while a super-smart Siri sounds useful, your life probably won’t be completely transformed unless you’re willing to pay for the privilege.
So here I am, having more or less stumbled into Apple Intelligence. I upgraded to an iPhone 15 Pro before Apple announced Apple Intelligence, and if I hadn’t, I wouldn’t have bought a new phone just to get the new AI technology. But since Apple Intelligence works on my phone, it makes a lot of things I do a little easier. When I remember to use it, Apple Intelligence saves me moments of work I would otherwise spend reading every notification or editing a photo. At this point, I have to double-check the AI’s work, but that doesn’t bother me.
It’s the near future, when AI’s work is indistinguishable from our own, that I fear. For now, I’m happy with the simplicity of Apple’s diet AI, and I don’t mind that it screws up sometimes. So do I.
A version of this story also appeared in the Vox Technology newsletter.