February 9, 2023

A tech industry trade association sued the state of California on Wednesday in an attempt to block a new child online safety law, a legal challenge that comes amid heightened public concern about the potential risks that content on popular platforms like Instagram and TikTok poses to young users.

The new law, known as the California Age-Appropriate Design Code Act, would require many online services to install comprehensive protections for minors, including shielding children from potentially harmful content and turning off features that could allow adult strangers to connect with young people. Gov. Gavin Newsom signed the bill, the nation’s first child online safety law of its kind, in September.

A trade association called NetChoice is suing to block the law before it is scheduled to take effect in 2024. The group’s members include Amazon; TikTok; Google, which owns YouTube; and Meta, the parent company of Facebook and Instagram.

NetChoice said in the suit, filed in the U.S. District Court for the Northern District of California, that the legislation would require online services to act as content moderators, violating constitutional protections for free speech. The group also argues that the law would harm minors and others by preventing them from accessing free and open online resources.

The law “forces companies to act as roving censors of Internet speech,” the NetChoice complaint says. “Such over-moderation,” it added, “will limit the availability of information to users of all ages and stifle vital resources, particularly for vulnerable youth who rely on the Internet for life-saving information.”

Over the past few years, children’s groups, parents and researchers have raised concerns about algorithms on platforms like TikTok and Instagram promoting content about eating disorders and self-harm to younger users. In response, lawmakers and regulators in the United States and Europe have stepped up safeguards for children’s online privacy and safety.

The California child safety measure was a bipartisan effort that passed both chambers of the state legislature by unanimous vote. It builds on children’s online safety rules that took effect in Britain last year.

The British regulations require online services that are likely to have minors as users to prioritize children’s safety. In practice, this means that many popular social media and video game platforms must turn on the highest level of privacy settings for young users in the United Kingdom. They must also turn off certain features that might keep children online for hours on end, such as autoplay, in which videos play automatically one after another.

Last year, as the British rules came into effect, Google, Instagram, Pinterest, TikTok, Snap, YouTube and others rolled out new protections for young users around the world. YouTube, for example, turned off its default video autoplay feature for minors.

California’s rules also require online services to turn off features such as autoplaying videos for children.

In the complaint, NetChoice argues that such rules would affect an overly broad range of online services and would impair a platform’s ability to freely select and promote content for users. In particular, the technology trade group describes features such as autoplay and content recommendation algorithms as “benign.”

In response to a reporter’s question about why NetChoice was seeking to block the California law when many of its members already comply with similar rules in Britain, the group said the state’s law was unconstitutional under the First Amendment.

“While Britain has similar laws, it has neither the First Amendment nor a long tradition of protecting speech online,” said Chris Marchese, a lawyer for NetChoice.

