Big Tech sues California, claims child safety law violates First Amendment

Image credit: Mayte Torres | Moment

In the last half of 2022 alone, many services, from gaming platforms designed for children to all-ages apps like TikTok and Twitter, have been accused of endangering young users by exposing minors to self-harm and to financial and sexual exploitation. Some kids have died, their parents have sued, and some tech companies have been shielded from those legal challenges by Section 230. As regulators and parents continue to examine how kids get hooked on web destinations that could put them at risk of serious harm, mounting pressure has pushed tech companies to take more responsibility for protecting children's safety online.

In the United States, protecting children from online harm is still a responsibility largely left to parents, and some tech companies would rather it stay that way. But in 2024, a first-of-its-kind California child online safety law is set to take effect, designed to shift some of that responsibility onto tech companies. California's Age-Appropriate Design Code Act (AB 2273) will require tech companies to design products and services with child safety in mind, mandating age verification and limiting features such as autoplay or the discovery of underage accounts through friend-finder tools. That won't happen, however, if NetChoice succeeds.

The tech industry trade association, whose members include Meta, TikTok, and Google, filed a lawsuit this week to block the law, arguing in its complaint that the law is not only likely unconstitutional but also allegedly causes harms to minors that the state has ignored.

Some tech companies don't like the California law, NetChoice said in a statement, because it allegedly violates the First Amendment in multiple ways. They also say it grants California unchecked power to compel whatever speech the government prefers. Because the law's terms are deliberately vague and never really define what counts as 'harmful,' even companies that try to comply in good faith could find themselves charged with unforeseen violations, according to the complaint.

Some tech companies have already taken steps this year to strengthen online protections for young users. AB 2273 is based on a UK child online safety law passed last year that prompted many tech companies, including Google, Instagram, Facebook, Pinterest, TikTok, Snap, and YouTube, to change their policies, The New York Times reported. None of these tech companies immediately responded to Ars' request for comment.

California's law goes further, however, by requiring tech companies to submit "data protection impact assessments" (DPIAs) that detail risks and child safety protections before launching new features. All online businesses must submit these DPIAs before AB 2273 comes into force in July 2024 and then submit to biennial reviews.

These DPIAs are intended to increase accountability by prompting companies to think through how product features might harm young users and then create timelines for mitigating any harms they identify. They also work to ensure that companies actually enforce their own posted policies, a requirement NetChoice's complaint specifically claims is unreasonable while the state leaves the law's key terms undefined...
