
There’s a long-standing theory that Apple will take a feature that’s been on Android for years, add it to the iPhone, give it a new name, and then tout it as some great innovation.
That's exactly what Apple did at its WWDC 2025 keynote, announcing a number of new Apple Intelligence features that will arrive when the software update launches later this year.
"Clearly copied"
A series of new AI-powered features that improve the user experience will appear on iPhone, iPad, Mac, Apple Watch, and Vision Pro. The most prominent is Live Translation.
*The Live Translation feature in FaceTime calls. Photo: Apple.*
Designed to break down language barriers in conversation, Live Translation is integrated into Messages, FaceTime, and the Phone app. The feature runs entirely on the device to keep conversations private.
Texts can be automatically translated as users type, and incoming replies are instantly translated as well. On FaceTime, live captions display translations while users hear the other party's voice. Phone calls are also translated in real time.
In fact, Samsung already introduced this capability as part of Galaxy AI, and the Korean brand even gave it a nearly identical name: Live Translate.
Like Apple's version, Samsung's Live Translate converts call content into another language in real time. Thanks to a built-in language model and optimized hardware, recording and processing take place on the device to protect user privacy.
Live translation was also showcased by Google at I/O 2022, in the first glimpse of an augmented reality (AR) glasses prototype then under development. In Google's video, the device translates speech and displays the text directly in front of the wearer's eyes.
"You can see what I'm saying in real time, like subtitles," a Google representative said. The video shows Google's AR glasses fitted with external speakers and designed to look much like ordinary fashion eyewear.
As a latecomer, Apple will need to address the inherent weaknesses of this kind of translation. In particular, users of live translation often struggle to judge when to start and stop speaking so the AI can translate.
When CNET reporter Lisa Eadicicco tried a similar feature at a busy market in Paris, she found it hard to focus on the conversation because she had to keep watching the screen. In the end, she bought her items the old-fashioned way: pointing, gesturing, and describing them in broken French.
Answer or listen to calls on behalf of the user
When Google introduced the Pixel 6 and Pixel 6 Pro in 2021, it announced a genuinely useful feature called Hold for Me. When activated, the Pixel's assistant stays on the line while the user is on hold and listens for a real person to pick up.
This lets users comfortably do other things instead of listening to hold music to check whether anyone has returned. Once the person on the other end comes back, the Pixel's virtual assistant notifies the user with a sound.
*The feature that waits on hold on the user's behalf is called Hold Assist. Photo: Apple.*
This is a great feature, and many users have long called for Apple to bring it to the iPhone. That wish has now come true: Apple announced a similar version called Hold Assist.
According to Apple, when placed on hold, users can leave their iPhone waiting silently while they get back to work. When an operator becomes available, the user is notified so they can return to the call.
Finally, there is Google Call Screen, a Pixel feature that uses AI to ask callers to state their name and reason for calling before the call is connected.
Apple now has a similar feature it calls Call Screening, designed to help users avoid distractions. Like Google Call Screen, the system collects information from the caller and surfaces the details users need to decide whether to answer.
Hold Assist, Live Translation, and Call Screening are all established, useful features on Android that many iPhone users have long awaited.
According to PhoneArena reporter Alan Friedman, this trio of AI features illustrates the difference in how Google and Apple approach their operating systems: Android is continually tweaked to improve the user experience, while Apple waits before adding such features and then gives them slightly similar names.
"Still, I'm happy to see Apple adding these features to iOS and can't wait to use them," Friedman commented.
Source: https://znews.vn/apple-lai-hoc-android-post1559633.html