Google recently announced plans to remove pornographic deepfake content from Search. The move comes as the technology is increasingly used for malicious purposes or to spread misinformation. A deepfake is an image or video in which a person's face or body has been digitally altered.
While users have long been able to request the removal of such images from Google Search, new systems are being developed to make the process easier. Specifically, when a user requests the removal of fake content that depicts them without their consent, Google's systems will also filter explicit results from similar searches about that person. Duplicates of the removed images can be processed at the same time, helping to prevent the deepfakes from resurfacing.
Google says this approach has been tested and proven effective in dealing with other types of non-consensual imagery.
The company is also making it harder for websites that host deepfake imagery to appear prominently in search results. Google is known to deploy ranking updates that demote sites carrying such fake content.
This new feature marks another step forward in Google's efforts to tackle the deepfake problem.
Source: https://kinhtedothi.vn/google-tran-ap-deepfake.html