Britain's Ofcom says a third of under-18s lie about their age on social media

Companies like Instagram are being hit with hefty fines (and dragged through the mud in the press) for mishandling children's privacy on their platforms. But if a recent Ofcom report is accurate, maybe they're getting off lightly.

The UK's media watchdog today released research which found that a third of all children aged 8-17 use social media with a fake adult age, mostly by signing up with a false date of birth.

It also noted that social media use among these young consumers is significant: of those aged 8 to 17 who use social media, approximately 77% use one of the largest platforms under their own profile, and 60% of the youngest in this group, aged 8-12, have accounts under their own profile (the rest appear to use their parents').

Up to half of these minors registered on their own, while up to two-thirds were assisted by a parent or guardian.

The three studies, commissioned by Ofcom from three separate organisations (Yonder Consulting, Revealing Reality and the Digital Regulation Cooperation Forum), are being published ahead of the UK pushing forward its Online Safety Bill.

Years in the making (and still being amended, it seems, with every political tide shift in the country), the bill is expected by Ofcom to finally pass into law in early 2023. But its mandate is tricky (if not potentially self-contradictory), aiming both to "make the UK the safest place in the world to be online" while "defending freedom of expression".

In this respect, the research published by Ofcom could be read as a warning sign about what not to overlook, and about what could easily tip into harm if not managed properly, whatever platform these young users are on right now. It also highlights the case for taking different approaches to different types of 18+ content.

Ofcom notes that even in the realm of children and digital content, there appears to be a fundamental gray area in how adults perceive risk: some content nominally reserved for adults, such as social media and games, is seen as relatively "less risky" than other adult content like gambling and pornography, which remains clearly inappropriate for underage users. The former is more likely to rely on simple checks (easy to circumvent); parents and children, according to the research, were more likely to favour "hardware identifiers" as identity verification for the latter kinds of sites.

The choices parents make also highlight how interwoven digital platforms have become in the lives of their young ones and how good intentions can go wrong.

Ofcom said parents noted that, in cases where they viewed content as "less risky" (such as on social media or gaming platforms), they were balancing child safety against the peer pressure their children faced (not wanting to feel left out) and the idea that, as their children got older, they wanted them to learn to manage the risks themselves.

But that doesn't mean social media is always less risky: the recent UK court case investigating the death of a teenage girl found that self-harm and suicide content the girl had found and viewed on Instagram and Pinterest was a factor in her death. That case highlights how sites like these control the content that appears on their platforms and how they steer users towards or away from it. And given that a child who lies about their age at 8 will still only be 13 five years later, the consequences of a misstated age can play out over years.

The goal of keeping free speech intact may well be increasingly challenged. Ofcom notes that it is approaching its first full year of regulating video-sharing platforms. Its first report will focus "on the measures platforms have in place to protect users, including children, from harmful content and will set out our strategy for the year ahead".
