
Is Apple 'learning from' Android again?

New features like Hold Assist, Live Translation, and Call Screening that Apple introduced at its annual WWDC 2025 event have all already appeared on Android devices.

ZNews, 10/06/2025

There’s a long-standing theory that Apple will take a feature that’s been on Android for years, add it to the iPhone, give it a new name, and then tout it as a major innovation.

That's exactly what Apple did during its WWDC 2025 keynote, when it revealed a number of new features that will be available in Apple Intelligence when a software update launches later this year.

"Copy" blatantly

A series of new AI-powered features that improve the user experience will appear on iPhone, iPad, Mac, Apple Watch, and Vision Pro. The most prominent is Live Translation.


Live Translation feature in FaceTime calls. Photo: Apple.

Designed to break down language barriers, Live Translation is integrated into the Messages, FaceTime, and Phone apps. It runs entirely on the device to keep conversations private.

Messages can be translated automatically as users type, and incoming replies are translated instantly. On FaceTime, live captions display translations while users hear the other party's voice. Phone calls are also translated in real time.

In fact, this is a useful feature that Samsung had already introduced with Galaxy AI. The Korean brand even gave it a similar name: Live Translate.

Like Apple's version, Samsung's Live Translate translates call content into another language in real time. Thanks to a built-in language model and optimized hardware, audio capture and processing take place entirely on the device to protect user privacy.

Google also showed live translation at I/O 2022, alongside the first glimpse of an augmented reality (AR) glasses prototype it was developing. In Google's video, the device translates speech and displays the result directly in front of the user's eyes.

“You can see what I’m saying in real time, like subtitles,” a Google representative shared. In the video, you can see that Google’s AR glasses are equipped with external speakers, and the design is quite similar to regular fashion glasses.

As a latecomer, Apple will need to overcome the inherent weaknesses of this kind of translation feature. Specifically, when using live translation, users find it hard to judge when to start and stop speaking so the AI can translate.

When CNET reporter Lisa Eadicicco tried it out at a bustling Paris market, she found it difficult to focus on the conversation because she had to pay attention to what was on the screen. In the end, she ended up buying the items the old-fashioned way: pointing, gesturing, and using broken French to describe them.

Answering or holding calls on the user's behalf

When Google introduced the Pixel 6 and Pixel 6 Pro in 2021, it announced a very useful feature called Hold for Me. When enabled, the Pixel listens while the call is on hold and alerts the user when someone picks up, so the conversation can continue.

The feature is handy because it lets users get on with other things instead of listening for someone to come back on the line. Once the person on the other end returns to the conversation, the Pixel's virtual assistant plays a notification sound.


The feature that waits on hold on the user's behalf is called Hold Assist. Photo: Apple.

It is a feature many users have been asking Apple to bring to the iPhone, and that wish came true when Apple announced a similar version called Hold Assist.

According to Apple, when placed on hold, users can leave the iPhone waiting silently while they get back to other work. When an operator becomes available, the user is notified so they can return to the call.

Finally, a feature previously available on Pixel models, Google Call Screen, uses AI to ask the caller for their name and reason for calling before putting the call through.

Apple now has a similar feature, Call Screening, that helps users avoid interruptions. Like Google Call Screen, the system collects information from the caller and surfaces the details the user needs to decide whether to answer.

Hold Assist, Live Translation, and Call Screening are all useful features on Android and have been long-awaited by many iPhone users.

According to PhoneArena reporter Alan Friedman, this trio of AI features shows the difference in how Google and Apple approach their operating systems.

Android is continually tweaked to improve the user experience, while Apple waits before adding the same useful features and then introduces them under a slightly different name.

“Still, I'm happy to see Apple adding these features to iOS and can't wait to use them,” Friedman commented.

Source: https://znews.vn/apple-lai-hoc-android-post1559633.html

