While the rest of the tech world races to cram AI features into every product and service imaginable, Apple has stayed quiet. In the eighteen months since OpenAI changed the game with the launch of ChatGPT, Apple has yet to bring any significant AI features to the iPhone, iPad, and Mac, even as Microsoft, Google, and Meta seem to have little else on their minds.
However, if rumors are to be believed, that will change this year: Apple is expected to unveil new AI features for iOS 18 and macOS 15 at WWDC in June. Rumors have been circulating for months about the trillion-dollar company’s artificial intelligence plans, and the clues keep coming. According to Bloomberg’s Mark Gurman, who has a strong track record of reporting Apple rumors, the company is planning an approach to artificial intelligence that won’t be as flashy as some of its competitors’ efforts. Instead, Apple will introduce AI features that integrate with apps iPhone and Mac users already know and use, such as Photos, Notes, and Safari. The initiative is called “Project Greymatter.” AppleInsider also confirms this and reports on several additional AI features.
As an AI skeptic, I like this plan. It allows Apple to enter the AI space while offering AI-powered features that people can actually use, rather than wasting resources on “revolutionary” AI capabilities that most users will ignore once the novelty wears off.
How Apple plans to power its AI features
Whatever Apple announces this year, its AI features will have to be powered by… something. Your iPhone or Mac probably already has an NPU (neural processing unit), a piece of hardware designed specifically to perform AI tasks. (Apple already uses it for some AI features, including Live Text, which relies on the NPU for processing.) The company made a huge deal about the NPU on the M4 chip in the new iPad Pros, and once the M4 comes to the Mac lineup, it will likely power many of Apple’s upcoming AI features.
However, not all features are ideal for on-device processing, especially on older iPhones and Macs. Gurman predicts that most of Apple’s features will be able to run locally on devices manufactured in the last year or so. If your iPhone, iPad, or Mac is brand new, the hardware should keep up. For older devices, or for any features that are particularly power-hungry, Apple plans to offload the computing to the cloud.
Apple is reportedly in talks with both OpenAI and Google to lease those companies’ cloud AI processing to support some new features, but it’s unclear if (or when) those deals will come to fruition. Gurman says Apple plans to run some cloud-based features from its own server farms built on M2 Ultra chips. If Apple can handle the cloud computing on its own, I’m sure it would prefer that to signing a contract with a competitor. We’ll likely see how Apple’s big AI plan shakes out at WWDC.
Speaking of plans, here are the AI features Apple is expected to reveal in June:
AI-generated emoji
Emojis are a huge part of any iOS or macOS update (we may have already seen some of the new emojis coming in future versions of iOS 18 and macOS 15). However, Gurman suggests that Apple is working on a feature that will create unique emojis with generative AI based on what you are currently typing. This sounds genuinely fun and useful if done right. While there are already plenty of emojis to choose from, if nothing suits your particular mood, perhaps an icon based on what you’re actively talking to a friend about is a better choice. Android users have had “Emoji Kitchen” for several years now, which lets you combine specific emojis to create something completely new. Apple seems ready to offer its own take on the idea.
Building on this idea, AppleInsider claims that Apple is working on something called “Generative Playground” that allows users to create and edit images generated by artificial intelligence. Perhaps emojis are just one part of this experience.
Siri powered by artificial intelligence
I don’t know about you, but I’ve never been too thrilled with Siri. The smart assistant often fails to comply with my requests, whether because it misunderstands my question or simply ignores me entirely. For some reason, it’s especially bad on macOS, to the point where I don’t even bother trying to use Siri on my MacBook. If Apple manages to supercharge Siri with AI, or at least make it work reliably, that sounds great to me.
We learned about the possibility of Apple integrating AI into Siri earlier this month, based on information leaked to The New York Times by an anonymous source. Gurman’s latest report states that Apple plans to make interactions with Siri “more natural-sounding,” and while it’s possible the company will outsource this work to Google or OpenAI, Gurman says Apple wants to use its own in-house large language models (LLMs). The AI-enhanced Siri may even work on the Apple Watch.
That doesn’t mean Apple is turning Siri into a chatbot — at least not according to Gurman, who reports that Apple wants to find a partner who can deliver a chatbot to the company’s platforms in time for WWDC, without Apple having to build one itself. Right now it looks like the company has sided with OpenAI over Google, so we might see ChatGPT on the iPhone this year.
According to AppleInsider, Apple also plans to make other AI changes to Siri, including the ability to ask Siri on one device to control media playback on another device. (For example, asking Siri on iPhone to pause a show on Apple TV).
Smart Search in Safari
Earlier this month, we learned that Apple was planning at least one AI feature for Safari: Smart Search. This feature reportedly scans any website and highlights keywords and phrases to create a generative summary of that page. While Gurman says Apple is working on an improved search experience for Safari, we don’t know much else about how Smart Search works yet.
AI in Spotlight Search
Speaking of better search, Apple could use AI to make on-device searches in Spotlight more useful. This feature always gives me mixed results, so I’d love it if generative AI could not only speed up my searches but also return more relevant results.
AI-powered accessibility features
Apple recently announced that several new accessibility features are coming “later this year,” which almost certainly means “in iOS 18 and macOS 15.” While not all of these features are AI-powered, at least two seem really interesting. First, Apple is using AI to let users control an iPhone or iPad with just their eyes, no external hardware required. There’s also a “Listen for Atypical Speech” feature that uses on-device machine learning to learn and recognize specific speech patterns.
Automatic replies to messages
As of last year, iOS and macOS suggest words and phrases as you type, helping you finish sentences faster. (Don’t confuse this feature with the three predictive text options in Messages that have been available for years.) In iOS 18 and macOS 15, Apple may also add automatic replies. This means that when you receive an email or text message, the app can suggest a full reply based on what you’re responding to. Soon we’ll all be able to communicate with just the tap of a button. (Hmm, do I want to tap “Sounds good: I’ll meet you there” or “Sorry, I can’t. Raincheck!”)
Notes will get an artificial intelligence upgrade
AppleInsider sources say that artificial intelligence will completely change Notes: In iOS 18 and macOS 15, you will be able to record audio directly in a note, and artificial intelligence will also transcribe this recording. Apple already transcribes voicemails sent in the Messages app (quite quickly in my experience), so using AI to transcribe recordings in Notes makes sense. Gurman suggests that voice notes will also have this transcription feature.
Additionally, AppleInsider says the Notes app will also get something called Math Notes, which recognizes mathematical notation and answers math-related questions. As you write out equations, “Keyboard Math Predictions” may suggest ways to auto-complete your entries. I don’t have much use for complicated math these days, but I imagine these changes could make the Notes app popular with students.
Smart summaries
Gurman says one of Apple’s primary focuses is “smart summaries”: AI-generated summaries of information you might have missed while away from your iPhone, iPad, or Mac, covering notifications, messages, websites, articles, notes, and more. iOS already has a notification summary feature, but these smart summaries sound more sophisticated.
AppleInsider confirms this claim and even offers a working name: Greymatter Catch Up. According to the outlet, you’ll be able to request these AI-generated summaries from Siri as well.
Photo retouching
AppleInsider also corroborates another Gurman leak: AI is coming to the Photos app. While Gurman simply says that Apple plans to use its AI tools to “retouch photos,” AppleInsider claims this feature will let you select objects to remove from your photos, similar to other AI photo-editing tools already on the market.
However, it’s unclear whether more AI editing tools are on the way. The company has already built an AI image editor that accepts natural-language prompts to make changes; it’s possible some of those capabilities will be folded into the AI photo “enhancer” in the Photos app on iOS and macOS.