A coalition of 14 states and the District of Columbia filed lawsuits on Tuesday against TikTok and its Chinese parent company ByteDance, accusing the platform of harming youth mental health by designing the app to be addictive to children. The lawsuits, filed in state courts, allege that TikTok’s algorithm and features such as endless scrolling and push notifications are intentionally addictive, leading to psychological issues including anxiety and depression.
The states, led by California Attorney General Rob Bonta and New York Attorney General Letitia James, claim that TikTok targets children to boost ad revenue despite being aware of the potential harms. They criticize the platform’s safety measures as insufficient and allege that it facilitates unregulated virtual economies that exploit young users.
“TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content,” Bonta wrote. “When we look at the youth mental health crisis and the revenue machine TikTok has created, fueled by the time and attention of our young people, it’s devastatingly obvious: Our children and teens never stood a chance against these social media behemoths.”
The lawsuit from New York further alleges that TikTok has been responsible for the deaths of children and teens who attempted dangerous social media challenges.
“In New York and across the country, young people have died or gotten injured doing dangerous TikTok challenges and many more are feeling more sad, anxious, and depressed because of TikTok’s addictive features,” James said while announcing the lawsuit. “Kids and families across the country are desperate for help to address this crisis, and we are doing everything in our power to protect them.”
TikTok has denied these accusations, asserting that it implements robust safety measures for young users.
“We’re proud of and remain deeply committed to the work we’ve done to protect teens and we will continue to update and improve our product,” a company spokesperson said. “We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”
This legal action is part of broader scrutiny of TikTok’s practices, including a potential federal ban and other lawsuits concerning children’s data privacy.