Meta, the parent company of Instagram, has announced a series of new safety features to protect teenagers on its platforms, including displaying information about the accounts messaging them and an option to block and report with a single tap.
Meta also said it had removed hundreds of thousands of accounts that left sexually explicit comments or requested explicit photos, according to a statement on the company's blog. The offending accounts targeted profiles that were run by adults but featured children under the age of 13. Of these, 135,000 were removed for explicit comments and 500,000 for "inappropriate interactions."
The move comes as social media platforms face growing scrutiny over their impact on the mental health and safety of young users, particularly amid concerns that predators could target teens and use nude photos to blackmail them.
According to Meta, after safety notices reminded users to "be careful when private messaging and report anything you see that makes you uncomfortable," more than one million accounts were blocked and more than one million violations were reported.
Earlier this year, Meta began testing artificial intelligence (AI) technology to detect users who lied about their ages on Instagram, a platform that is only for people over 13.
If a violation is detected, the account is automatically converted to a teen account, which carries more restrictions than an adult account.
Teen accounts are private by default and accept messages only from people the user follows or has previously connected with, a policy that has been in place since 2024.
Meta is currently facing lawsuits from dozens of US states accusing the company of intentionally designing addictive features into Instagram and Facebook, thereby harming the mental health of young people.
(Vietnam News Agency/Vietnam+)
Source: https://www.vietnamplus.vn/meta-tang-cuong-bien-phap-bao-ve-thieu-nien-tren-instagram-post1051542.vnp