This week, over 40 US states sued Meta, the parent company of social media platforms Instagram, WhatsApp, Facebook, Messenger, and Threads. The lawsuit alleges that Meta knowingly designed its social media sites to be highly addictive to young people, inflicting mental health damage.
The lawsuit alleges that Meta runs a "scheme to exploit young users for profit" by increasing engagement, harvesting data, falsely advertising safety features, and promoting unhealthy social expectations, sleep habits, and body image.
Most importantly, the lawsuit alleges that Meta knows how its platforms affect young people yet fails to act to protect them. Additionally, the plaintiffs claim that Meta, concerning Instagram and Facebook, is in noncompliance with the Children's Online Privacy Protection Rule (COPPA).
The lawsuit states that Instagram and Facebook collect the personal information of children without parental consent, do not verify parental consent before collecting such data, and violate COPPA by engaging in these practices while marketing the platforms to children.
The lawsuit points to Meta's algorithms and claims they are exploitative and predatory. There's not much public information available about how Meta's algorithms work, but we do know that multiple algorithms are responsible for the content users see on Meta's platforms.
Another aspect of the lawsuit claims that Meta knowingly markets its platforms to children despite evidence that Instagram can have a direct impact on mental health and body image issues, especially for teenage girls.
In 2021, Facebook whistleblower Frances Haugen shared internal Facebook documents and research with members of the US Congress. Haugen alleged that the documents, collectively called the Facebook Papers, proved that Meta repeatedly and knowingly prioritized profits over the public good.
The documents also suggested that Meta knew that Instagram's content, filters, and features fueled teenagers' body image issues, anxiety, and depression.
The federal lawsuit includes 33 US attorneys general. Instead of suing the company individually, the 33 plaintiffs are combining their resources and legal expertise to present a united front against online harms to children. Many legal experts and news publications liken this lawsuit to those brought against Big Tobacco and Big Pharma, both of which faced severe ramifications and costly payouts.
"Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximize its profits at the expense of public health, specifically harming the health of the youngest among us," Phil Weiser, Colorado's attorney general, said in a statement.
The plaintiffs in the federal lawsuit are seeking financial penalties from Meta, while the eight individual states are seeking injunctive relief to stop Meta from using certain features mentioned in the lawsuit that allegedly harm children.
The lawsuits are expected to involve a lengthy legal battle, as Meta is likely to fight them. According to The New York Times, Weiser said in a news conference that he filed the lawsuit because he was unable to reach an out-of-court settlement with Meta.
Why does this matter?
The last few years have been pivotal for those seeking extra child protection online. Tech companies like Amazon, Google, YouTube, Microsoft, and Meta have been hit with lawsuits in the US and the European Union for failure to comply with online child safety laws.
TikTok is also embroiled in ongoing legal issues regarding child safety, as an investigation was launched in 2022 by 46 US attorneys general to find out if TikTok violated consumer protection laws.
Although research remains limited, many researchers and mental health advocates blame social media for declines in teens' social skills, socialization, and emotional and mental well-being.
Many parents share this sentiment, but they are caught between a rock and a hard place: they don't want to deprive their children of online interaction, yet they want to preserve their kids' real-life relationships, mental health, and social skills.
Lawmakers seem to understand the importance of children staying connected online in the digital age, but insist that this must happen within proper boundaries. The problem is that parents work, raise multiple children, care for their aging parents, maintain a household, and sustain their own personal and social lives. Most parents do not have the bandwidth to constantly and accurately monitor their children online, which leaves their kids vulnerable to Big Tech.
But what happens when Big Tech prioritizes money over impressionable and vulnerable children? Attorneys and lawmakers agree tech companies have a responsibility to improve the safety measures on their platforms to keep kids connected yet safe and healthy.