Apple (AAPL) is finally releasing its first set of Apple Intelligence features for the iPhone, iPad, and Mac. It’s a massive moment for the world’s most valuable company by market cap, and a number of Wall Street analysts are banking on the platform to help reaccelerate iPhone sales.
It’s also a big bet for Apple. The company is pushing the software to some of its most important products. If it doesn’t hit the mark out of the gate, it could sour users’ perception of Apple Intelligence moving forward.
The initial version of Apple Intelligence — available via iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 — includes notification and email summaries, priority email alerts in Mail, text Writing Tools, natural language search and image editing in Photos, and an improved version of Siri.
Probably the most important thing to keep in mind when it comes to Apple Intelligence is that it’s not going to be available for all iPhone users. The software will only run on the iPhone 15 Pro, iPhone 15 Pro Max, and the iPhone 16 lineup. So if you’ve got an older iPhone, you’re out of luck. That’s why analysts believe the platform will spark a rush of iPhone upgrades: if most of Apple’s user base can’t use Apple Intelligence, the thinking goes, they’ll want to buy newer models to get access to it.
That thinking has helped send shares of Apple soaring 38% over the last six months, which makes the platform’s success a must for the company.
I’ve been using a beta version of the first set of Apple Intelligence features for a few weeks already, and while I don’t think they’re the type of upgrades that will get consumers to rush out the door to buy the latest iPhone, they do help set the stage for the company’s next set of AI updates. That includes ChatGPT integration, the debut of Apple’s Visual Intelligence, and more.
But it’ll be some time before those capabilities are available for the general public, as developers are just getting their hands on them via Apple’s latest developer beta. In the meantime, here’s what to expect when you download the latest version of iOS and finally get to start using Apple Intelligence.
Notification summaries
The most useful tool in the early Apple Intelligence lineup, notification summaries make the never-ending stream of texts, Slacks, and news updates just a bit easier to deal with. Summaries use AI to condense your notifications into bite-sized bits of information, surfacing the parts of your messages you most need to know. It’s proven to be my favorite part of Apple Intelligence.
Text message summaries grab the most pertinent information from your texts. They typically appear only for texts that are relatively long or complicated, so don’t expect every quick back-and-forth to get a summary of its own.
The Mail app also gets email summaries, which can help when you’re scrolling through hundreds of messages after getting back to the office from a day off. Priority emails, another Mail feature, automatically pulls out what Apple Intelligence thinks are your most pressing messages and puts them in a Priority Messages box at the top of the screen. Those tend to be emails that include some kind of time-sensitive information or request, but the feature doesn’t always surface the emails I’m actually waiting on, which tend to be my real priorities.
Writing tools
Apple Intelligence’s Writing Tools are available across the operating system, including in third-party apps. They can make your writing sound more friendly, professional, or concise; turn it into a list; summarize it; pull out key points; or format it as a table.
Overall, Writing Tools does a great job at all of these tasks. Because it’s powered by generative AI, though, there are bound to be instances when Apple Intelligence doesn’t quite give you what you were hoping for. I haven’t run into any problems yet, but once millions of people are using the feature, I’m sure some will crop up here and there.
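For developers curious how third-party apps pick this up: here’s a minimal sketch, assuming the iOS 18 UIKit API for Writing Tools (the writingToolsBehavior property on UITextView), of how an app’s text view can opt in. It’s illustrative, not any particular app’s actual code.

```swift
import UIKit

// Minimal sketch: a compose screen whose text view opts into the system
// Writing Tools (assumes the iOS 18 writingToolsBehavior API on UITextView).
final class ComposeViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        if #available(iOS 18.0, *) {
            // .complete lets Writing Tools rewrite text inline;
            // .limited restricts it to an overlay panel; .none opts out.
            textView.writingToolsBehavior = .complete
        }
        view.addSubview(textView)
    }
}
```

Standard text views get a sensible default behavior automatically, so in practice many apps support Writing Tools without writing any code at all.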
Photo Clean Up and natural language search
Apple has also added Apple Intelligence capabilities to the Photos app. The one that will get the most attention at first will likely be Clean Up. An AI-assisted editing tool, Clean Up lets you remove unwanted people or items from your photos. Don’t expect to be able to crop your ex out of your vacation shots, though. Clean Up is meant to help you remove smaller items, like a person walking across the background of your group photo.
Natural language search in Photos is exactly what you think it is: the ability to search for photos and videos using sentences and phrases rather than just dates and locations. I typed in “John and Daniel drinking” and quickly got shots of my friend John and me throwing back a few adult beverages.
As Apple improves the search feature over time, I can imagine it finally ending those moments when you want to show someone a photo from a vacation years ago but end up scrolling through your phone for an uncomfortably long time before giving up entirely.
Siri gets a bit smarter
Siri also gets an upgrade in the first iteration of Apple Intelligence, though it’s not the smarter, ChatGPT-equipped version yet. This version lets you use Siri without its responses taking over your whole screen, and it sports a slick new interface that makes the edges of your screen glow.
Siri will also recognize, without getting confused, when you fumble your words and need to restart a sentence, and it can answer product knowledge questions about your Apple device, like “How do I scan documents with my phone?” That in itself will be helpful for users who aren’t too familiar with some of the iPhone’s more niche capabilities.
You’re also finally able to type questions to Siri by double-tapping the bottom of the screen. It’s a nice addition, since sometimes you either can’t, or don’t want to, speak your questions to Siri out loud.
I appreciate the new features, especially typing to Siri, but the assistant just isn’t as smart as I’d hoped yet. That, hopefully, will change when Apple brings ChatGPT to the software.
Data privacy
One of the big concerns surrounding generative AI is how user data is processed. With on-device processing, your data never leaves your device: any requests or commands you give AI apps are handled by your phone’s own processor. With cloud processing, your data is uploaded to and processed on a company’s servers before results are sent back to your device.
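Apple hasn’t detailed exactly which Apple Intelligence requests stay local, but for a sense of what purely on-device processing looks like, here’s a minimal sketch using Apple’s existing NaturalLanguage framework (a separate on-device API, not Apple Intelligence itself). Nothing in this snippet touches the network.

```swift
import NaturalLanguage

// On-device text analysis: the sentiment model ships with the OS,
// so the text below is never uploaded anywhere.
let text = "Dinner was amazing, let's do it again next week!"

let tagger = NLTagger(tagSchemes: [.sentimentScore])
tagger.string = text

// sentimentScore ranges from -1.0 (negative) to 1.0 (positive),
// computed entirely by the local model.
let (sentiment, _) = tagger.tag(at: text.startIndex,
                                unit: .paragraph,
                                scheme: .sentimentScore)
print("Sentiment:", sentiment?.rawValue ?? "n/a")
```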
Apple says Apple Intelligence uses a mix of on-device and cloud processing depending on the complexity of your requests. But Apple has also staked its reputation on user privacy and security. To square those two things, the company has set up what it calls Private Cloud Compute servers.
Apple says data sent to its Private Cloud Compute servers is never stored or shared with the company and that it’s not used to train any models, just to answer requests. Apple also promises to allow independent experts to check its servers’ code to ensure the company isn’t using its users’ data for anything beyond its stated needs.
Overall, Apple’s first raft of Apple Intelligence features is a solid jumping-off point for the company’s AI ambitions. And while photo editing and text tools are helpful, notification summaries are sure to stand out the most. Now it’s up to Apple to continue to deliver.