Meta Just Got Banned from Monetizing Minors' Data

The Federal Trade Commission has proposed a blanket ban that would bar Meta from profiting from minors' data, saying the company has failed to uphold child privacy protections, including those required under the Children's Online Privacy Protection Act (COPPA).


The FTC also said the social media giant had misled parents about the control they had over who their kids could interact with on the Messenger Kids platform, and about the access app developers had to children's private information, thereby breaching a 2020 privacy agreement it had reached over its role in the Cambridge Analytica scandal. "Facebook has repeatedly violated its privacy promises," said Samuel Levine, director of the FTC's Bureau of Consumer Protection. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."


The changes proposed by the FTC would prevent Meta from monetizing any form of data from users under 18 across all its platforms. Meta would also face limits on its use of facial recognition technology and be required to offer users additional privacy protections. Meta has 30 days to respond; if the company disagrees with the commission's decision, it can ask an appeals court to review it.


Child development experts have been skeptical of Messenger Kids since the service launched in 2017. In 2018, a group of child health experts filed a complaint against the app, arguing that it violated federal law by collecting information about kids without their parents' consent. Contesting the claims, Facebook (now Meta) maintained at the time that parents were always in control of how their kids interacted with the app.


"Despite the company's promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children in certain circumstances were able to communicate with unapproved contacts in group text chats and group video calls," the FTC said.