The recent verdict issued by the Los Angeles court on March 25, 2026, in K.G.M. v. Meta Platforms and Google LLC cannot be treated as just another decision against major digital platforms.
The jury's finding marks a symbolic and legal turning point in how the judiciary views social networks: no longer as neutral spaces for the circulation of content, but as products structured to capture attention, prolong use, and, under certain circumstances, cause concrete harm, especially to younger users. The jury concluded that the companies failed to warn of these risks, awarding an additional US$ 3 million in damages. Liability was apportioned 70% to Meta and 30% to Google¹.
The case was brought by a young woman identified as K.G.M., who began using YouTube at age 6 and Instagram at age 9 and developed depression, self-harm, and body dysmorphic disorder attributed to addiction to the platforms.
After roughly six weeks of trial on the claim that social networks can harm users' health, the jury concluded that Meta Platforms and Google acted negligently and failed to adequately warn of the risks associated with using Instagram and YouTube, setting total damages at US$ 6 million. The verdict exposes what the plaintiff's counsel called "dependency engineering". Design features were described as "Trojan horses": tools that appear harmless and useful but are built to seize control of the user's time and attention, much as cigarettes and online casinos do².
The significance of the decision lies not in the size of the award but in the legal reasoning the court accepted. The plaintiff shifted the axis of liability, so that the dispute concerned not only third-party content but the very design of the platforms.
This shift is legally relevant because it weakens the traditional thesis of mere intermediation and introduces liability for structural engagement mechanisms, such as infinite scrolling (the engine of "doomscrolling"), automatic video playback, and algorithmic recommendation systems, all identified as inducing compulsive use.
The plaintiff's argument, widely noted in global press coverage, strategically echoes historic litigation against the tobacco industry. The point is not that the cases are identical, but that the method is similar.
The focus shifts from the individual user's behavior to the product itself: its engineering and the incentives embedded in its architecture. The analysis thus centers on corporate decisions and the deliberate structuring of the user experience.
Reports and documents made public in related investigations and lawsuits indicate that Meta Platforms knew of intensive usage patterns among teenagers and even pre-adolescents, and pursued strategies to expand that engagement, even at a time when use by this age group was not permitted, exposing contradictions in the company's public statements.
The Los Angeles verdict did not occur in isolation. In parallel, a jury in State of New Mexico v. Meta Platforms concluded that Meta violated state consumer protection law by deceiving users about the safety of its platforms and failing to adopt adequate measures to protect children from sexual exploitation. Damages in that case reached US$ 375 million³. A second procedural phase is also scheduled to examine structural remedies, which may impose direct constraints on platform operations.
This second case is particularly relevant because it targets the very machinery that sustains Big Tech's business model: algorithmic personalization. The same logic that maximizes engagement and advertising revenue can also bring malicious actors closer to vulnerable users. Not by chance, discussions of this kind have been accompanied, across different proceedings and investigations, by accounts from former employees of internal warnings about the risks faced by younger users.
Amid this scenario, relevant collateral movements are also emerging. Meta Platforms itself has confirmed recent changes to Instagram's messaging features, showing that technical and product decisions now respond directly to the regulatory and litigation environment.
In the United States, there is still no comprehensive federal legislation on social networks, particularly regarding use by children and adolescents. By contrast, a growing movement at the state and international levels seeks greater control and regulation of such use.
Several states have approved rules to protect children and adolescents in the digital environment, including age verification requirements, restrictions on use in schools, and additional duties of care for platforms. This fragmented landscape has fueled the judicialization of the issue, handing the judiciary the leading role in defining limits and responsibilities.
The erosion of the neutrality thesis also finds echo in Latin America, as the Richter v. Google case in Mexico demonstrates. A Mexican court ordered Google to pay approximately US$ 250 million (5 billion pesos) in moral and punitive damages. The judgment rested on the platform's failure to remove a blog that usurped the identity of lawyer Ulrich Richter Morales and spread false information, holding that a company that does not enforce its own usage policies after notification becomes liable for the resulting harm. The case is emblematic for its use of punitive damages (or social retribution) to deter repeated unlawful conduct by Big Tech⁴.
In Brazil, in turn, the discussion has already reached a more structured normative stage. The Digital Child and Adolescent Statute, instituted by Law No. 15.211/2025, came into force on March 17, 2026, and was regulated by the Executive the following day. As detailed in our previous analyses of the Child and Adolescent Statute and the impacts of Law No. 15.211/2025⁵, the statute imposes concrete obligations on digital platforms, including robust age verification mechanisms, parental supervision tools, risk prevention duties, and reinforced safety standards for underage users. It is a binding norm with real potential to alter the operational design of companies doing business in the country.
This regulatory and judicial pressure is not unprecedented, but it is gaining scale. As early as 2019, the FTC imposed a record US$ 5 billion fine on Facebook for privacy violations and established a compliance and oversight structure over its operations⁶.
It is here that the intersection with the international scenario becomes most evident. In Brazil, progress comes via legislation; in the United States, via the courts. In both cases, the direction converges: the progressive erosion of the platform neutrality thesis, which each day retreats further from its position of comfortable equidistance.
Taken together, these elements reveal a significant transformation in the legal treatment of digital platforms. A logic of accountability is gradually consolidating that is not limited to content but reaches the very architecture of the systems and the economic incentives that sustain them. The technical neutrality argument loses force as the companies' active role in shaping user behavior, including psychological and behavioral harm, becomes evident.
The message emerging from this new scenario is unequivocal: it is not enough to intermediate information. When a platform actively intervenes in how information is distributed, prioritized, and consumed, legal liability may also attach to the effects of that structure. The law, however gradually and non-linearly, is beginning to shift its focus from what platforms say to how they actually function and what consequences their use produces⁷.
[1] UNITED STATES. Los Angeles Federal Court. K.G.M. v. Meta Platforms and Google LLC. Verdict on negligence and addictive design in social networks. March 2026.
[2] GONÇALVES, André Luiz Dias. Condemnation of Meta and YouTube exposes harmful effects of social networks. TecMundo, March 26, 2026.
[3] UNITED STATES. New Mexico State Court. State of New Mexico v. Meta Platforms. Verdict finding failures in protecting children against sexual exploitation. March 2026.
[4] MEXICO. Tenth Civil Court of Mexico City. Ulrich Richter Morales v. Google Inc. Judgment awarding moral and punitive damages. March 2021.
[5] Adequação Regulatória à Nova Realidade do ECA Digital ("Regulatory Adaptation to the New Reality of the Digital ECA"). Published March 17, 2026.
[6] UNITED STATES. Federal Trade Commission (FTC). FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook. July 2019.
[7] UNITED STATES. Department of Justice (DOJ). Department of Justice Wins Significant Remedies Against Google. Washington, DC, September 2, 2025. In this decision, the court imposed "significant remedies" against Google, prohibiting exclusivity contracts for the distribution of Google Search, Chrome, and Google Assistant, and ordering the sharing of search and interaction data with competitors to restore competition in the market. The ruling found that the company deliberately acted to maintain its monopoly, in violation of Section 2 of the Sherman Act.
AUTHOR