Concerns over addicted kids spur probe into Meta and its use of dark patterns

[Image: An iPhone screen displays the app icons for WhatsApp, Messenger, Instagram, and Facebook. Credit: Getty Images | Chesnot]

Brussels has opened an in-depth probe into Meta over concerns it is failing to do enough to protect children from becoming addicted to social media platforms such as Instagram.

The European Commission, the EU’s executive arm, announced on Thursday it would look into whether the Silicon Valley giant’s apps were reinforcing “rabbit hole” effects, where users get drawn ever deeper into online feeds and topics.

EU investigators will also look into whether Meta, which owns Facebook and Instagram, is complying with legal obligations to provide appropriate age-verification tools to prevent children from accessing inappropriate content.

The probe is the second into the company under the EU’s Digital Services Act. The landmark legislation is designed to police content online, with sweeping new rules on the protection of minors.

It also has mechanisms to force Internet platforms to reveal how they are tackling misinformation and propaganda.

The DSA, which was approved in 2022, imposes new obligations on very large online platforms with more than 45 million users in the EU. If Meta is found to have broken the law, Brussels can impose fines of up to 6 percent of a company's global annual turnover.

Repeat offenders can even face bans in the single market as an extreme measure to enforce the rules.

Thierry Breton, commissioner for internal market, said the EU was “not convinced” that Meta “has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram.”

“We are sparing no effort to protect our children,” Breton added.

Meta said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”

In the investigation, the commission said it would focus on whether Meta’s platforms were putting in place “appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors.” It added that it was placing special emphasis on default privacy settings for children.

Last month, the EU opened the first probe into Meta under the DSA over worries the social media giant is not properly curbing disinformation from Russia and other countries.

Brussels is especially concerned about whether the social media company's platforms are properly moderating content from Russian sources that may try to destabilize upcoming elections across Europe.

Meta defended its moderating practices and said it had appropriate systems in place to stop the spread of disinformation on its platforms.

© 2024 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
