A US jury has delivered a landmark verdict finding Meta and YouTube liable for designing social media platforms in ways that harm children and young people. Amnesty International described the ruling as a major turning point in acknowledging the dangers posed by platform designs that prioritize user engagement over wellbeing. The case focused on how major social media companies allegedly used features such as infinite scroll, autoplay, and constant notifications to keep young users online for longer, creating patterns of compulsive use that can damage mental health.
Amnesty said the verdict reflects years of concern that tech companies have profited from addictive design features that are especially harmful to younger audiences. During the trial in Los Angeles, a 20-year-old plaintiff testified that she began using YouTube at age six and Instagram at age nine, eventually spending most of her day online as a child. According to her testimony, this compulsive use intensified over time and contributed to struggles with addiction and worsening depression, illustrating the real-world impact of platform design on vulnerable users.
The jury in the KGM case found Meta and YouTube negligent and ordered them to pay $6 million in damages. Amnesty argued that the decision makes clear that these platforms are “unsafe by design” and that meaningful reforms are urgently needed. The organization said governments should focus on requiring major changes to how platforms are built and operated—particularly addressing addictive design features—rather than relying only on broad restrictions such as banning younger teens from social media.
The ruling could have wider implications for other legal cases involving alleged harm caused by social media companies. Meta and Google have both said they disagree with the verdict and plan to appeal. Snap and TikTok were originally included in the case, but both companies settled before the trial began. In a separate case in New Mexico, another jury also found Meta liable for harming children’s mental health and safety, further increasing pressure on social media firms over how their products affect young users.