June 9, 2023

Britain’s data protection agency fined the popular video-sharing app TikTok $15.9 million on Tuesday, saying the platform failed to comply with data protection rules designed to protect children online.

The Information Commissioner’s Office said TikTok improperly allowed up to 1.4 million children under the age of 13 to use the service in 2020, breaching UK data protection rules that require organizations to obtain parental consent for the use of children’s personal information. TikTok failed to obtain that consent, even though it should have been aware that young children were using the service, the regulator said.

The British investigation found that despite TikTok’s rules prohibiting children under 13 from creating accounts, the video-sharing app had not done enough to identify underage users or remove them from the platform. Regulators said TikTok failed to act even as some of the platform’s senior employees raised concerns internally about underage children using the app.

TikTok, owned by the Chinese internet giant ByteDance, also faces scrutiny in the United States. Last month, members of Congress questioned its chief executive, Shou Zi Chew, about the potential national security risks posed by the platform.

The TikTok privacy fine underscores public concern about the mental health and safety risks the popular social network may pose to some children and teens. Last year, researchers reported that TikTok began recommending content related to eating disorders and self-harm to 13-year-old users within 30 minutes of joining the platform.

UK Information Commissioner John Edwards said in a statement that TikTok’s practices could put children at risk.

“TikTok collected and used personal data from an estimated 1 million children under the age of 13 who were inappropriately granted access to the platform,” Edwards said. “This means their data may have been used to track and profile them, potentially delivering harmful, inappropriate content on their very next scroll.”

In a statement, TikTok said it disagreed with the regulator’s findings and was reviewing the case and considering next steps.

“TikTok is a platform for users 13 and older,” the company said. “We invest heavily to help keep children under 13 off the platform, and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.”

This isn’t the first time regulators have cited the popular video-sharing app over concerns about children’s privacy. In 2019, the operators of Musical.ly (the platform now known as TikTok) agreed to pay $5.7 million to settle FTC charges that the app violated the US Children’s Online Privacy Protection Act.

Since then, lawmakers in the United States and Europe have enacted new rules in an attempt to strengthen protections for children online.

In March, Utah passed a sweeping law barring social media platforms like TikTok and Instagram from allowing minors in the state to have accounts without parental consent. Last fall, California passed a law requiring many social media, video game and other apps to turn on the highest level of privacy settings for minors by default and to turn off potentially risky features, such as friend-finding functions that allow adult strangers to contact children.


