Apple Intelligence got off to a difficult start, especially with regard to Siri. The assistant still leaves a lot to be desired, and it should certainly be at the forefront of Apple’s priorities.
Be that as it may, Bloomberg’s Mark Gurman reports that Apple plans to extend Apple Intelligence’s current capabilities to additional apps in iOS 26, so I thought I’d throw out some ideas I’d like to see.
Summaries in more places
I think summarization is probably one of the best use cases for large language models. Apple introduced notification summaries in iOS 18, and although there were notable inaccuracies at the start, things seem mostly fine now. Apple recently began enabling Apple Intelligence by default on compatible devices, rather than making it opt-in.
For one, I think it would be great if there were an API for developers to use summarization models in their apps. I’m sure Apple would put strict guardrails in place, but allowing third parties to tap into Apple’s summarization models would be a big win. It would let independent developers build AI features without having to worry about an OpenAI bill.
Beyond that, I’d really like to see summarization improvements in the Messages app, especially in group chats. If you’ve missed a 100-message conversation, Apple should provide a more detailed summary than what can fit in two lines.
Or, say you’re a student – imagine being able to summarize the notes you took in a class after the fact. You’d still need to read the notes to get an in-depth understanding, but a summary could serve as an excellent way to jog your memory if you’re trying to quickly recall something.
Genmoji for everyone
Genmoji is probably one of the most popular Apple Intelligence features unveiled at WWDC24. Unfortunately, it’s only available on some of the most recent iPhone models: iPhone 15 Pro/Pro Max, iPhone 16e, iPhone 16/16 Plus, and iPhone 16 Pro/Pro Max.
If you have anything older, including last year’s iPhone 15, you can’t use Genmoji.
I don’t expect Apple to make its models run locally on less capable hardware, as nice as that would be. However, the company did announce Private Cloud Compute – private servers for handling Apple Intelligence requests in the cloud.
Those servers were probably limited in capacity when they first started rolling out, but it will have been more than a year since the rollout began by the time iOS 26 ships to the public.
While I don’t expect Apple to give away free use of Private Cloud Compute, I think it would be pretty clever if the company bundled Genmoji into iCloud+ subscriptions for users with older devices – giving people a taste of what Apple Intelligence has to offer.
More customizable Focus modes
One of my favorite features in iOS 18 was the new Reduce Interruptions Focus mode. In short, it analyzes every incoming notification and surfaces only what it deems important. The rest stays in Notification Center.
I’d really like to see Apple offer additional granularity here. For example, imagine being able to set up a Focus mode that only lets through notifications containing keywords you configure. I could also see the opposite being useful, where you normally allow an app to notify you, but want notifications matching certain keywords to be silenced.
This only scratches the surface, but I really think there could be a lot of opportunity for AI to enable more granular notification management. The new Reduce Interruptions Focus is only the beginning.