EU investigates Facebook and Instagram over addictive effects on children

US tech giant Meta's platforms, Facebook and Instagram, could "exploit the weaknesses and inexperience of minors", the European Commission has said.

European Union regulators on Thursday opened investigations into US tech giant Meta over the potentially addictive effects of Instagram and Facebook on children, an action with far-reaching implications because it cuts to the heart of how the company's products are designed.

Meta's products can "exploit the weaknesses and inexperience of minors" and create behavioral addictions that threaten their mental well-being, the European Commission, the executive branch of the 27-member bloc, said in a statement. EU regulators could ultimately fine Meta up to 6% of its global revenue, which was $135 billion last year, as well as impose other product changes.

The investigations are part of a growing effort by governments around the world to rein in services like Instagram and TikTok to protect minors. Meta has faced criticism for years that its products and recommendation algorithms are optimized to attract children. In October, three dozen U.S. states sued Meta for using “psychologically manipulative product features” to lure children, in violation of consumer protection laws.

European Union regulators said they had been in contact with their American counterparts about the investigations announced Thursday. Regulators said Meta could be violating the Digital Services Act, a law approved in 2022 that requires large online services to more aggressively monitor their platforms for illegal content and put policies in place to mitigate risks to children. People under 13 are not supposed to be able to create an account, but EU investigators said they would review the company's age verification tools as part of their inquiry.

“We will now fully investigate the potential addictive and ‘rabbit hole’ effects of the platforms, the effectiveness of their age verification tools and the level of privacy afforded to minors in the operation of recommendation systems,” Thierry Breton, the European Union commissioner for the internal market, who is overseeing the investigations, said in a statement. “We spare no effort to protect our children.”

On Thursday, Meta said its social media services were safe for young people, highlighting features that enable parents and children to set time limits on their use of Instagram or Facebook. Teens are also subject to more restrictive content and recommendation settings by default. Advertisers are prohibited from serving targeted ads to underage users based on their activity on Meta applications.

