All the Android features Apple announced at WWDC 2025


Apple announcing Android features years after Google shipped them is a tale as old as time, but that doesn't make it any less fun to point out each time it happens. This year's WWDC was particularly Android-y, not helped by the fact that Siri mostly sat out this year's announcements while Apple put its new Liquid Glass design language front and center.

The imitation goes in both directions: Android is launching its own version of iOS Live Activities and following Apple's example by adding more customization options to Quick Settings. Still, I couldn't help but notice a handful of new features in Apple's keynote that I've definitely seen somewhere before. Not that Apple would ever admit to borrowing them.

Call screening and Hold Assist

Call screening dates back to Android 12, and Pixel phones have offered a version of the feature for even longer. Earlier versions required you to invoke it manually, but on the Pixel 7 and newer, you can have it automatically answer and screen incoming calls that are likely to be spam. Apple's version launching with iOS 26 picks up automatically.

Call screening is something I definitely miss when I switch from Android to iOS, so as long as it works reasonably well, I think it will be a welcome feature on the iPhone.

Call screening is such a boon.
Image: Barbara Krasnoff / The Verge

Hold Assist is another familiar phone feature. Google's version debuted in 2020 on Pixel phones and then started trickling out to the rest of the ecosystem last year. The feature works roughly the same way as it does on iOS 26: instead of having to stay on the line and listen to hold music, you can put your phone down and get an alert when a human is ready to talk to you.

It's super handy! Lately I find myself pushed to resolve my customer service problems with chatbots on the web more than over the phone, but on the rare occasions when I do have to hold, it's usually for an unreasonably long time. I'll take it.

Translations! In the Phone app

Recent Samsung phones already offer live translation baked into the phone app that looks a lot like what Apple unveiled this week. Both provide real-time spoken translations from the caller's language into the recipient's and vice versa. Don't expect to carry on a long, nuanced conversation using either of these features, but Samsung's version, at least, is capable enough for its intended use: short, transactional exchanges like booking a table or a hotel room.

On Apple and Samsung phones (pictured here), chat translations go in both directions so that both participants can benefit.
Image: Vjeran Pavic / The Verge

In both cases, translations extend to messaging, too. Samsung's version will also offer to adapt your messages to different writing styles so you can avoid sounding too casual at the wrong moment. Could be useful!

Suggesting actions based on what's on your screen

Google has been pursuing the whole "use contextual awareness to surface information" idea since the dawn of time, or at least since 2012. The same goes for searching what's on your screen, thanks to Google Lens. In the generative AI era, that has evolved into Circle to Search, which uses AI to try to better identify what you're looking for. Now Apple is offering a version of it based on screenshots.

iOS gets its own version of Circle to Search.
Image: Apple

On phones with Apple Intelligence, you'll see new options when you take a screenshot. If there's a time and date on the screen, it will suggest creating a calendar event. You can also circle, er, highlight something on the screen to search for it.

Apple's version always starts with a screenshot, which is smart. People who aren't aware of the new feature will probably discover it in a place they already know. Google's Circle to Search requires a unique gesture, usually a long press on the handle at the bottom of the screen. Personally, I'm still training myself to use that gesture rather than opening a new tab and typing a search into Chrome. I doubt I'm alone.

Toggling between photo and video recording in the camera app

iOS 26 shakes up the camera app's interface by hiding most shooting modes by default, leaving just two: photo and video. The rest appear when you scroll left or right, so you can still find your portrait and panorama options. But the simple video/photo dichotomy recalls the toggle between those two options on Pixel phones.

The Pixel Camera app includes a persistent video/photo toggle.

In iOS 26, the toggle between video and photo modes is more prominent.

In the Pixel Camera app, it's a standalone toggle, so it's always at hand no matter which mode you're shooting in. But I appreciate that both options put this core functionality front and center. And hey, if you want to turn your regular photo into a panorama, you can always use AI after the fact. Right??
