Meta, the parent company of two of the most popular social media platforms, Instagram and Facebook, has “routinely documented” children under 13 on Instagram and collected their data, according to a newly unsealed complaint. The complaint, brought by the attorneys general of 33 states, alleges that the tech giant has received more than 1.1 million reports of users under the age of 13 on Instagram since early 2019, yet “disabled only a fraction” of those accounts.
Recent court documents allege that Meta knowingly exploited the psychological vulnerabilities of young users to make its social media platforms more addictive for kids. The company is accused of collecting children’s personal information, such as their locations and email addresses, without parental permission, in violation of a federal children’s privacy law. If the allegations are proven, Meta could face hundreds of millions of dollars, or more, in civil penalties.
The privacy charges are part of a larger federal lawsuit, filed last month by 33 states in the U.S. District Court for the Northern District of California. The lawsuit accuses Meta of unfairly ensnaring young people on its Instagram and Facebook platforms while concealing internal studies showing user harm, and it seeks to bar Meta from using certain product features on young users.
“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed,” the complaint said, “and zealously protected from disclosure to the public.” The unsealed complaint provides new details from the States’ lawsuit using snippets from internal emails, employee chats, and company presentations.
The Allegations
The unsealed filing said that Meta “continually failed” to prioritize effective age-checking systems and instead used approaches that enabled users under 13 to lie about their age when setting up Instagram accounts. It also accused Meta executives of falsely stating in congressional testimony that the company’s age-checking process was effective and that the company removed underage accounts when it learned of them — even though they knew there were millions of underage users on Instagram.
According to The Wall Street Journal’s report, an internal Meta presentation from 2020 noted that “Teens are insatiable when it comes to ‘feel good’ dopamine effects.” It described the company’s existing product as already well-suited to providing the sort of stimuli that trigger the potent neurotransmitter. “And every time one of our teen users finds something unexpected their brains deliver them a dopamine hit,” the presentation stated.
Meta’s Side Of The Story
“Tweens want access to Instagram, and they lie about their age to get it now,” Adam Mosseri, the head of Instagram, said in an internal company chat in November 2021, according to the court filing. But just a month later, in Senate testimony, Mr. Mosseri said: “If a child is under the age of 13, they are not permitted on Instagram.”
Meta said in a statement that it had spent a decade working to make online experiences safe and age-appropriate for teenagers, and that the states’ complaint “mischaracterizes our work using selective quotes and cherry-picked documents.” The statement also noted that Instagram’s terms of use prohibit users under the age of 13 in the United States, and that the company has “measures in place to remove these accounts when we identify them.”
Concerns about well-being, particularly among younger teens, were acknowledged internally within Meta. Karina Newton, the Head of Policy at Instagram, in May 2021 sent an email that stated, “It’s not ‘regulators’ or ‘critics’ who think Instagram is unhealthy for young teens — it’s everyone from researchers and academic experts to parents. The blueprint of the app is inherently not designed for an age group that don’t have the same cognitive and emotional skills that older teens do.”
Meta’s Laxity
Meta has said that verifying people’s ages for online services is a “complex” challenge, especially for younger users who may not have school IDs or driver’s licenses. The company said it would like to see federal legislation requiring “app stores to get parents’ approval whenever their teens under 16 download apps,” rather than having young people or their parents supply personal details like birth dates to many different apps.
The privacy charges in the case center on the Children’s Online Privacy Protection Act, a 1998 federal law. Under that law, online services with content aimed at children must obtain verifiable parental consent before collecting personal details, such as names, email addresses, or selfies, from users under 13. Fines for violating the law can run to more than $50,000 per violation.
The lawsuit argues that Meta chose not to build effective systems to detect and exclude underage users because it viewed children as a crucial demographic to capture for continued growth. An internal company chart displayed in the unsealed material showed how Meta tracked the percentage of 11- and 12-year-olds who used Instagram daily, the complaint said.
Meta also knew about accounts belonging to specific underage Instagram users through company reporting channels. But it “automatically” ignored certain reports of users under 13 and allowed them to continue using their accounts, the complaint said, as long as the accounts did not contain a user biography or photos. The company’s own algorithms estimated that Meta had as many as four million underage users in the United States.
In one case in 2019, Meta employees discussed in emails why the company had not deleted four accounts belonging to a 12-year-old, despite requests and “complaints from the girl’s mother stating her daughter was 12,” according to the complaint. The employees concluded that the accounts were “ignored” partly because Meta representatives “couldn’t tell for sure the user was underage,” the legal filing said.
This is not the first time the social media giant has faced allegations of privacy violations. In 2019, the company agreed to pay a record $5 billion, and to alter its data practices, to settle charges from the Federal Trade Commission of deceiving users about their ability to control their privacy. It may be easier for the states to pursue Meta for children’s privacy violations than to prove that the company encouraged compulsive social media use among young people.
Since 2019, the F.T.C. has successfully brought similar children’s privacy complaints against tech giants including Google and its YouTube platform, Amazon, Microsoft, and Epic Games, the creator of Fortnite. Will Meta be held accountable for hooking young kids on its social media platforms while violating their privacy? Only time will tell.