WWDC shows that the myth about Apple copying Android features is true


There is a longtime myth about Apple which says that the company takes a feature Android has had for a few years, adds it to the iPhone, gives it a new name, and acts as if it had invented the best thing since sliced bread. If you watched the WWDC 25 keynote today, or read our coverage, you know that it is not a myth. Three features that I wanted on the iPhone, Hold Assist, Live Translation, and Call Screening, were taken from Google AI and Galaxy AI, found on Pixel and Samsung handsets respectively.

Apple finally catches up with Android on a feature released with the Pixel 6 line in 2021

When Google introduced the Pixel 6 and Pixel 6 Pro in 2021, it revealed a new feature for users called Hold for Me. When activated, Google Assistant monitors the phone line while you are on hold. With Google Assistant at work, you can turn your attention to something else without having to keep listening for someone at the other end of the call to return to the conversation. As soon as a live person is ready to help you, Google Assistant alerts you audibly.

It is an excellent feature and one that I missed a lot when I returned to the iPhone. Today, during the WWDC keynote, Apple announced its version of Hold for Me, which it calls Hold Assist. Let's say you need to call an airline to ask a question about a reservation you made. When you are placed on hold, you can ask your iPhone to sit through the wait while you return to your work. When an agent is ready to help you, you will be notified so that you can give your attention to the call.

During a call, tap the on-screen button displaying the dial pad; this is the button with a three-dot icon. Tapping this button gives you several options, the last of which is Hold Assist. Tap Hold Assist and you will see in the Dynamic Island that the feature has been activated, the call is on hold, and you will receive a notification when a live person is on the line.

Another new feature coming to the iOS Phone app is called Live Translation. This is another useful feature, and one that Samsung offers with its Galaxy AI suite of features under the similar name Live Translate. I had hoped that Apple would add it to the iPhone, and it arrives with iOS 26. Integrated into Phone, FaceTime, and Messages, Live Translation uses on-device AI to translate what a caller says to you in a foreign language via audio or text. The translation is done in real time, on the fly.

Keeping the technology on-device allows spoken or written conversations to stay private. Your response is translated in real time into the other party's language, allowing a seamless two-way conversation.

Finally, a feature on Pixel models called Google Call Screen uses AI to ask a caller to reveal their name and reason for calling before the call is connected. Apple now has a similar feature it calls Call Screening. The idea is to gather enough information from the caller to allow the AI to decide whether to block the call or let it through.

As with Hold Assist, iPhone users will find Live Translation and Call Screening to be extremely useful features. I am happy to see them available on iOS. As for that myth, how can anyone not see that it is a real thing? This shows you the difference between Google and Apple when it comes to their operating systems. Android changes to improve the user experience; Apple waits before adding these useful new features, then offers them under a somewhat similar name. Still, having said that, I am happy to see Apple add these features to iOS and I can't wait to use them.
