A bipartisan group of state attorneys-general has announced it will launch a nationwide investigation into Meta, formerly known as Facebook, to examine whether the company has violated state consumer protection laws and put the public at risk.
Among other things, the investigation will examine whether Meta provides and promotes its social media platform Instagram to children and teens, despite knowing that such use is associated with physical and mental health harms. It will also target the techniques used by Meta to increase the frequency and duration of engagement by teens, and the alleged harms resulting from such extended engagement.
Involved in the investigation are attorneys-general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont.
"Facebook, now Meta, has failed to protect young people on its platforms and instead chose to ignore or, in some cases, double down on known manipulations that pose a real threat to physical and mental health -- exploiting children in the interest of profit," Maura Healey, Massachusetts attorney-general who is co-leading the investigation, said in a statement.
"As attorney-general it is my job to protect young people from these online harms. Today I am co-leading a nationwide coalition to get to the bottom of this company's engagement with young users, identify any unlawful practices, and end these abuses for good. Meta can no longer ignore the threat that social media can pose to children for the benefit of their bottom line."
The investigation follows recent leaks of internal documents that explored a variety of topics, including the use of different content moderation policies for high-profile users, the spread of misinformation, and the impact of Instagram on teenagers' mental health. These internal documents are known as the Facebook Files and were first published by The Wall Street Journal.
Behind the leak was Frances Haugen, a former lead product manager on the social media giant's civic misinformation team.
In October, when the whistleblower fronted a Senate inquiry into Facebook's operations, she declared the company morally bankrupt, casting "the choices being made inside of Facebook" as "disastrous for our children, our privacy, and our democracy."
Haugen also told Senate members that "Facebook knows that its amplification algorithms can lead children from innocuous topics -- such as healthy food recipes -- to anorexia-promoting content over a short period of time".
Following these claims, Meta founder and CEO Mark Zuckerberg hit back, saying the criticism was "just not true".
"We care deeply about issues like safety, wellbeing, and mental health. It's difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don't recognize the false picture of the company that is being painted," Zuckerberg wrote in a note to Facebook employees that he publicly posted on his Facebook page.
"The argument that we deliberately push content that makes people angry for profit is deeply illogical," he continued.
"We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content. And I don't know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction."
In response to the launch of the inquiry, a Meta spokesperson made similar remarks, telling ZDNet the "accusations are false and demonstrate a deep misunderstanding of the facts".
"While challenges in protecting young people online impact the entire industry, we've led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders," the spokesperson said.
"We continue to build new features to help people who might be dealing with negative social comparisons or body image issues, including our new 'Take a Break' feature and ways to nudge them towards other types of content if they're stuck on one topic. We continue to develop parental supervision controls and are exploring ways to provide even more age-appropriate experiences for teens by default."