Why tech giants shouldn’t be held liable for creating addictive platforms

While social media companies are culpable in many ways and haven’t done nearly enough to protect children on their platforms, they still shouldn’t be held liable based on claims that they create addictive and dangerous places online.
Last week, a trial began in Los Angeles Superior Court in a lawsuit filed by a woman, referred to in news accounts as Kaley G.M., against tech giants YouTube and Instagram. (TikTok settled with her earlier.) The plaintiff contends that these platforms are designed in a way that makes children addicted to them. Hers is just one of more than 2,500 cases now pending that raise a variety of legal claims against some of the world’s biggest companies.
At the heart of these lawsuits is the claim that internet and social media companies, including those owned by Meta and Google, should be held liable on the same theory famously used against Big Tobacco: that the brands intentionally created an addictive product. But the analogy fails for one simple reason. Internet and social media companies engage in speech, which is protected by the First Amendment, while no constitutional right is implicated in regulating cigarettes and other tobacco products.
The lawsuits against social media companies say they design their platforms in a way that keeps kids engaged for long periods of time and keeps them coming back for hours on end. But the same can be said of all kinds of media. Books, including those for children, are often written with cliffhangers at the end of each chapter to keep people reading. Television series do the same, encouraging people to keep watching, or “binge,” for as long as they can. Video games are notoriously designed to keep people, including children, playing late into the night.
Algorithms are speech
Holding any media company liable for the content of its speech raises serious First Amendment issues. The plaintiffs in these suits say the algorithms are designed and manipulated to keep individual users hooked. But algorithms are themselves a form of speech, and there is no reason to treat this speech any differently from TV scripts, or novels, or the code that makes video games work. As Supreme Court Justice Elena Kagan wrote in a 2024 opinion, “The First Amendment …”
The Supreme Court’s decision in Brown v. Entertainment Merchants Association (2011) is instructive here. The case involved a California law that made it a crime to sell or rent violent video games to those under 18 without parental consent. The Supreme Court, in an opinion by Justice Antonin Scalia, declared the California law unconstitutional. At the outset, the court rejected the argument that the speech deserved less constitutional protection because the law was designed to protect children.
The court instead declared that “minors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them.”
California asserted that playing violent video games has a negative effect on children, predisposing them to violent acts. The court rejected this argument, emphasizing the heavy burden of proving causation that must be met when regulating speech.
Scalia, writing for the majority, concluded: “California cannot meet [strict scrutiny]. At the outset, it acknowledges that it cannot show a direct causal link between violent video games and harm to minors. … The State’s evidence is not compelling. … They show at best some correlation between exposure to violent entertainment and minuscule real-world effects, such as children’s feeling more aggressive or making louder noises in the few minutes after playing a violent game than after playing a nonviolent game.”
The court concluded that the government could not prove the causation necessary to hold video game companies liable for their content. The same, of course, is true of internet and social media companies, which are simply a different communications platform.
Indeed, as the Supreme Court observed in Packingham v. North Carolina (2017), social media are “the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge.” The court emphatically concluded that it “must exercise extreme caution before suggesting that the First Amendment provides scant protection for access to vast networks in that medium.”
Technology is not a cigarette
There are other legal barriers to holding internet and social media companies liable for creating addictive and dangerous online environments for children. Section 230 of the Communications Decency Act provides that these platforms cannot be held liable for content posted on their sites by users, whether the claim is about leaving material up or taking it down. The pending lawsuits against internet and social media companies cannot overcome this provision.
None of this is to deny that some children are harmed by time spent on social media. Research shows that use of the platforms is linked to depression, low self-esteem and bullying. There are also studies linking violent video games to antisocial behavior. But the solution is not to restrict speech or to hold speakers liable for it. Instead, parents need to make careful decisions about when and how to allow their children to use social media. And the tech giants should be more careful about the features they aim at children.
Ultimately, it will be up to the Supreme Court, not a judge in Los Angeles Superior Court, to decide whether social media companies can be held liable on these grounds. The answer should be obvious: Social media is speech, cigarettes are not, and that makes all the difference.
Erwin Chemerinsky is dean of UC Berkeley Law School. ©2026 Los Angeles Times. Distributed by Tribune Content Agency.



