According to the WSJ, Meta safety staff warned management that some parents managing their children's accounts were using monetization tools on Facebook and Instagram to sell content — including photos of their children in revealing clothing or leotards, and private chats — to interested parties.
While these images did not violate Meta's policies on nudity or illegal content, Meta staff found evidence that some parents were fully aware they were producing content to satisfy pedophiles. Some of these parents also engaged in sexually explicit conversations about their children, or even exposed their daughters to malicious messages from paying subscribers.
Algorithms on Facebook and Instagram are said to be "enabling" pedophiles by allowing some parents to sell images of their own children.
Meta employees also found that the company's recommendation algorithms on Facebook and Instagram surfaced accounts of child models, young athletes, and others, which attracted pedophiles. In some cases, the algorithms even suggested that parents sell "extra" content on other platforms.
The safety team proposed banning the sale of subscriptions to child models' accounts outright, as TikTok and paid platforms like Patreon and OnlyFans have done. It also recommended that Meta require accounts selling subscriptions to register so they could be monitored.
However, Meta's leadership ignored these suggestions and instead opted to build a system that automatically detects and blocks pedophiles who pay to follow parent-run accounts. The problem is that offenders can easily bypass the system by creating new accounts.
Meta also launched a "gifting" feature. The company said it charges no commissions or fees on subscription payments, so it has no financial incentive to encourage users to pay to "follow" — but Meta does collect fees from the gifting feature.
Meanwhile, the NYT reported that some popular child accounts on Instagram can earn $3,000 per post, with monthly income from paid subscriptions reaching hundreds of thousands of dollars. The newspaper commented that Meta "has not made adequate efforts to censor," responding to only one of 50 reports of questionable child-related content in the past eight months.
Meta has long struggled to address child safety on its platforms. Last year, the WSJ reported that the company's algorithms helped connect and promote a covert network of groups that sexually exploited minors. In 2020, an internal Meta study found that roughly 500,000 child Instagram accounts were involved in "inappropriate" interactions every day.