Children's rights groups slam TikTok's 'design discrimination'

A study examining the default settings and terms and conditions offered to minors by social media giants TikTok, WhatsApp and Instagram in 14 different countries, including the United States, Brazil, India, Indonesia and the United Kingdom, revealed that the three platforms do not offer the same level of privacy protection and safety for children in all markets in which they operate.

The level of protection minors receive on a service may depend on where in the world they are, according to the new report, titled "Global Platforms, Partial Protections", which found "significant" variation in children's experience across different countries on "seemingly identical platforms".

The research was conducted by Fairplay, a non-profit organization that advocates for an end to marketing to children.

TikTok has proven particularly problematic in this regard. Alongside the release of Fairplay's report, the company was singled out in a joint letter, signed by nearly 40 child safety and digital rights advocacy groups, calling on it to adopt a "Safety by Design" and "Children's Rights by Design" approach globally, rather than providing the highest standards only in places like Europe, where regulators have taken early action to protect children online.

Citing information in Fairplay's report, the 39 child protection and digital rights organizations from 11 countries, including the 5Rights Foundation in the UK, the Tech Transparency Project, the Africa Digital Rights Hub in Ghana and the Eating Disorders Coalition for Research, Policy & Action, co-signed the letter to TikTok CEO Shou Zi Chew, urging him to address the key design discrimination highlighted in the report.

These include discrepancies in where TikTok offers an "age-appropriate" design experience for minors, such as defaulting accounts to private (as it does in the UK and some EU markets), while elsewhere it was found to default 17-year-old users to public accounts.

The report also identified many (non-European) markets where TikTok does not provide its terms of service in young people's native language. It also criticizes the lack of transparency around minimum age requirements, finding that TikTok sometimes provides users with conflicting information, making it difficult for minors to know whether the service is appropriate for them.

"Many young TikTok users are not European; TikTok's largest markets are in the United States, Indonesia and Brazil. All children and young people deserve an age-appropriate experience, not just those who come from Europe,” say the authors of the report.

Fairplay's research methodology involved central researchers, based in London and Sydney, analyzing the platforms' privacy policies and terms of service, with support from a global network of local research organizations, which included creating experimental accounts to explore variations in the default settings offered to 17-year-olds in different markets.

The researchers suggest their findings challenge the social media giants' claims that they care about protecting children, since they clearly do not provide the same safety and privacy standards to minors everywhere.

Instead, social media platforms appear to exploit gaps in the global patchwork of legal protections for minors to prioritize business goals, such as boosting engagement, over children's safety and privacy.

In particular, children in the Global South and some other regions are exposed to more manipulative designs than children in Europe, where legal frameworks have already been enacted to protect their online experience, such as the UK's Age Appropriate Design Code (effective since September 2020), or the European Union's General Data Protection Regulation (GDPR), which began to apply in May 2018, requiring data processors to take extra care to build in safeguards when services process minors' information, with the risk of significant fines for non-compliance.

Asked to summarize the research findings in one line, a Fairplay spokeswoman told TechCrunch: "In terms of a one-line summary, it's that regulation works and tech companies don't act without it." She also suggested it is fair to conclude that a lack of regulation leaves users more vulnerable to the "quirks of the platform's business model".

In the report, the authors make a direct appeal to lawmakers to put in place settings and policies that provide "the best protection for the well-being and privacy of young people."

The report's findings are likely to add to calls for lawmakers outside Europe to step up...
