November 28, 2023

The Federal Trade Commission escalated its fight against the tech industry’s biggest companies on Wednesday, moving to impose what it called a “blanket ban” on Facebook parent Meta’s collection of personal data on young people.

The commission wants to significantly expand a record $5 billion consent order with the company from 2020, saying Meta has failed to fully meet the legal commitments it made to overhaul its privacy practices and better protect its users.

The regulator also said Meta misled parents into thinking they had the ability to control who their kids communicated with on its Messenger Kids app and misrepresented the permissions it granted certain app developers to access users’ private data.

The proposed changes mark the third time the agency has taken action against the social media giant over privacy concerns.

“The company’s reckless actions put young users at risk,” Samuel Levine, director of the FTC’s Bureau of Consumer Protection, said in a press statement. “Facebook needs to be held accountable for its failure.”

The FTC’s administrative action, an internal agency process known as an “order to show cause,” was an initial warning to Meta that the regulator believed the company had violated its 2020 privacy agreement. The document sets out the commission’s allegations against Meta and its proposed restrictions.

Meta has 30 days to challenge the order; the FTC gave the company no advance notice of the action.

Once Meta responds, the commission said, it will consider the company’s arguments and make a decision. Meta could then appeal the agency’s decision to a federal appeals court.

The FTC’s proposed changes would prohibit Meta from profiting from the data it collects on users under the age of 18, and would apply across Meta’s businesses, including Facebook, Instagram and the company’s new virtual reality platform, Horizon Worlds. Regulators want to bar the company from monetizing that data even after those users turn 18.

That means Meta could be banned from using details about young people’s activity to show them ads based on their behavior or to market digital goods to them, such as virtual clothes for their avatars.

Whether such changes will ultimately be approved is unknown. In a statement on Wednesday, Commissioner Alvaro M. Bedoya, who voted to issue the order, said he had concerns about whether the agency’s proposal to limit Meta’s use of young people’s data was sufficiently related to the original case.

In a statement, Meta called the FTC’s administrative warning a “political stunt” and said the company had launched an “industry-leading” privacy program under its agreement with the FTC. The company has vowed to fight the agency’s actions.

“Despite three years of ongoing engagement with the FTC surrounding our agreement, they did not provide an opportunity to discuss this new and entirely unprecedented theory,” Meta said in a statement.

Meta has already announced restrictions on advertising to users under the age of 18. In 2021, the company said advertisers would be able to tailor ads to minors based on their location, age and gender, but would no longer be able to target ads based on young people’s interests or their activities on other sites. This year, Meta said it would also stop ad targeting based on a minor’s gender.

The FTC’s aggressive action marks the first time the commission has proposed a blanket ban on data use in an effort to protect minors’ privacy online. It is the government’s broadest move to insulate young Americans online since the 1990s, when the commercial internet was still in its infancy.

Legislators in at least two states have introduced bills in the past year that would require certain websites, such as social networks, to bar or restrict young people from using their platforms. Regulators are also stepping up efforts to impose fines on online services whose use or misuse of data could put children at risk.

Over the past few years, critics have accused Meta of recommending self-harm and extreme dieting to teenage girls on Instagram, as well as failing to adequately protect young users from child sexual exploitation.

The FTC’s case against the social media giant dates back more than a decade.

In 2011, the agency accused Facebook of deceiving users about their privacy. In the settlement, Facebook agreed to implement a comprehensive privacy program, including agreeing not to misrepresent its privacy practices.

But the FTC struck again in 2018 after news reports emerged that a voter analytics firm, Cambridge Analytica, had harvested data on millions of Facebook users without their knowledge.

In a consent decree finalized in 2020, Facebook agreed to reorganize its privacy programs and practices and allow independent evaluators to examine the effectiveness of the company’s privacy programs. The company also paid a record $5 billion fine to settle the agency’s charges.

The FTC said Facebook has violated that agreement. In Wednesday’s order, the agency cited the privacy assessor’s report, which found “holes and weaknesses” in Meta’s privacy program that required substantial additional work.

Although much of the report was redacted, it indicates that the assessors found problems with how Meta evaluates privacy risks to user data and manages privacy incidents. It also cited concerns with Meta’s oversight of its data-sharing arrangements with third parties.

The FTC’s crackdown on Meta is the latest signal that the agency is following through on a pledge by its chair, Lina M. Khan, to rein in the power of dominant companies in the tech industry. In December, the agency moved to halt consolidation among video game makers, filing a lawsuit to block Microsoft’s $69 billion acquisition of Activision Blizzard, the company behind the hugely popular Call of Duty franchise.

The FTC has also become more aggressive in privacy regulation. Rather than simply trying to protect consumers from increasingly powerful surveillance tools, regulators are working to ban certain types of data collection and use they deem to be high-risk.

The FTC in December accused Epic Games, the company behind the popular Fortnite game, of illegally collecting data on children and putting them at risk by pairing them with strangers and enabling live chat. Epic agreed to pay a $520 million fine to settle these and other charges. The settlement order also requires Epic to turn off real-time voice and text chat by default — the first time regulators have imposed such a remedy.

But the agency now wants to go further with the data restrictions it would impose on Meta.

The FTC’s proposed changes would prohibit Meta-owned sites and products from monetizing youth data. The company’s platforms, such as Horizon Worlds, would be allowed to collect and use minors’ information only to provide services to users and for security purposes.

The FTC also wants to bar Meta from releasing any new products or features until the company can certify, through written confirmation from an independent privacy assessor, that its privacy program fully complies with the 2020 consent order.


