Using the Big Tobacco and Remington Plaintiffs’ Playbook, Seattle School System Sues Big Tech for Social Media’s Role in Youth Mental Health Crisis

Jan 18, 2023 | Blog

Background

In the age of the social media algorithm, sometimes the greatest damage isn’t the product itself but the way it’s marketed and promoted, especially when that marketing and promotion is aimed at children.

On January 6, 2023, Seattle Public Schools (“SPS”) filed a lawsuit against the Big Tech companies behind the world’s most popular social media platforms: Facebook, Instagram, TikTok, YouTube, and Snapchat. (Seattle School District No. 1 v. Meta Platforms Inc. et al., 2:23-cv-00032 (W.D. Wash. Jan. 6, 2023)). The Seattle Public School system, which is the largest K-12 school system in Washington state and serves over 49,000 students, is forcing industry giants such as Meta, Alphabet, Snap, and ByteDance to contend with the role they’ve allegedly played in the worsening mental health crisis among America’s youth.

SPS is employing many of the same legal theories that states used successfully in lawsuits against Big Tobacco and that the Sandy Hook parent plaintiffs in Connecticut used to sue gun manufacturer Remington. Those theories include knowledge and concealment of the harm caused by the products (in this case, social media platforms aimed at youths) and marketing that deliberately ignores the risk of harm in order to get consumers (young users) addicted to the product, resulting in objectively provable mental health injuries.

A Burgeoning Mental Health Crisis

SPS alleges that it has had to divert time, funds, personnel, and other resources to handle an overwhelming mental health crisis among its students. The school system claims that much of this crisis can be traced directly to social media, and specifically to the algorithms the platforms employ to push content to pre-teens and teenagers. SPS argues that these platforms (and their algorithms) are designed to manipulate young users and are deliberately marketed to youths, despite the known harm these sites can cause. A 2021 leak of internal documents by a whistleblower and former Meta employee revealed that even Meta’s own research had concluded that its platforms were damaging the mental health of its young users.

The suit also cites a variety of other surveys and studies linking the use of social media among pre-adolescents and adolescents to serious mental health issues such as higher rates of cyberbullying, anxiety, depression, thoughts of self-harm, suicidal ideation, and disordered eating. One of many statistics included in the complaint states that “from 2009–19, the rate of high school students who reported persistent feelings of sadness or hopelessness increased by 40 percent (to one out of every three kids).”

SPS’s standing to bring this suit will no doubt be challenged, but the school district is likely to counter by showing economic harm: the increased cost of providing mental health crisis intervention and more general health services to address the elevated rates of depression, suicidal ideation, and bullying that the district attributes to these social media platforms. SPS posits that school systems bear the brunt of the harm caused by social media because they are among the largest providers of mental health services for school-aged children.

SPS also argues that the behavioral and emotional issues spurred by social media make it more difficult for teachers to teach and for students to learn. The suit asks for funds to hire additional mental health personnel, train teachers, educate students about social media, expand disciplinary services, investigate threats made via social media, and even repair property damaged by students “acting out.”

Social Media & the Young Brain

Pre-teens and teenagers make up a significant portion of social media platforms’ user bases and, consequently, are critical to these platforms’ revenue streams. SPS contends that despite knowing the detrimental effects these platforms can have on young users’ mental health, the defendants continued to design and market their platforms specifically to young users in order to increase profits.

Surveys cited in the complaint indicated that while many young people realized social media was harming their mental health, many also felt unable to stop using it, which, according to the complaint, is largely by design. The complaint explains how social media relies on generating brief releases of dopamine to the brain that keep users coming back for more, much as other addictive behaviors, such as gambling and recreational drug use, affect the brain.

A mechanism called “intermittent variable rewards” is at play here, which “works by spacing out dopamine triggering stimuli with dopamine gaps—allowing for anticipation and craving to develop, which strengthens the desire to engage in the activity with each release of dopamine.” The suit likens using social media to playing a slot machine. A user might repeatedly check a post they’ve made to see if they’ve “won” this time, i.e., obtained new likes, shares, views, comments, etc. When someone refreshes their feed, these platforms are purposefully designed to take a few seconds to update in order to create a sense of anticipation of what the next piece of content will be.
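
To make the slot-machine analogy concrete, the short sketch below simulates a variable-ratio reward schedule of the kind the complaint describes. It is a minimal, hypothetical illustration only; the probability, delay, and function name are invented for the example and are not drawn from any defendant’s actual code.

```python
import random
import time

def refresh_feed(reward_probability: float = 0.3,
                 delay_seconds: float = 2.0) -> bool:
    """Simulate one feed refresh on a variable-ratio reward schedule.

    The artificial pause mirrors the deliberate load delay described in
    the complaint, building anticipation before content appears; whether
    the refresh yields a "reward" (new likes, comments, shares) is
    unpredictable, which is what makes the loop habit-forming.
    """
    time.sleep(delay_seconds)                     # anticipation gap
    return random.random() < reward_probability   # unpredictable payoff

# A user checking the same post over and over, slot-machine style:
wins = sum(refresh_feed(delay_seconds=0.0) for _ in range(20))
print(f"{wins} of 20 refreshes produced a 'reward'")
```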

The platforms use a variety of other tactics as well to pull users in and keep them engaged. For example, algorithms curate personalized content that appears on users’ feeds in a virtually infinite scroll. Push notifications allow alerts to pop up on a user’s screen, enticing the user to once again interact with the platform. Content that disappears after a set time limit, pages of suggested content, and automatically playing videos are other methods the platforms employ to keep users coming back to (and staying on) the platform as much as possible.
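
As a rough illustration of how these tactics combine into an engagement loop, the sketch below ranks a hypothetical user’s next “page” of an endless feed. The data model and scoring rule are assumptions made up for the example, not any platform’s actual recommendation algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    engagement_score: float  # likes, shares, and watch time rolled into one number

def next_batch(posts: list[Post], user_interests: set[str],
               batch_size: int = 5) -> list[Post]:
    """Return the next page of an effectively endless, personalized feed.

    Posts matching the user's inferred interests are boosted, so each
    scroll surfaces content tuned to keep the user on the platform; the
    client simply requests another batch as the user nears the bottom.
    """
    ranked = sorted(
        posts,
        key=lambda p: p.engagement_score * (2.0 if p.topic in user_interests else 1.0),
        reverse=True,
    )
    return ranked[:batch_size]

# Example: one scroll's worth of content for a user interested in gaming.
feed = [Post("gaming", 0.9), Post("news", 0.8), Post("gaming", 0.4), Post("sports", 0.7)]
print(next_batch(feed, {"gaming"}, batch_size=2))
```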

The suit argues that these types of addictive, reward-based systems are more predatory when it comes to young people: “Adolescents’ low capacity for self-regulation means they are particularly vulnerable to the immediately pleasurable, but ultimately harmful, effects of the repeated dopamine spikes caused by an external stimulus, such as ‘likes’ that activate the reward system in the brain.”

Section 230 Issues

This case may test the boundaries of Section 230 of the Communications Decency Act, which platforms have invoked to insulate themselves from liability for damages arising from the publication of third-party content. The statute provides: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Anticipating that the defendants will argue that they are covered by Section 230, SPS posits in its complaint that the defendants’ active roles in designing and marketing their platforms, and in promoting and recommending harmful content, are not shielded by Section 230.

There is support for this argument in the text of Section 230, which applies to providers only in their roles as publishers or speakers. The Ninth Circuit Court of Appeals, in its opinion in Lemmon v. Snap, wrote that Section 230 immunity applies only where the subject platform is a “publisher or speaker of any information provided by another information content provider.” As in the Seattle matter, the plaintiffs in Lemmon alleged that Snap’s speed filter, which allegedly rewarded driving at excessive speeds, was a dangerous and negligently designed product that caused injuries. The Ninth Circuit reversed the District Court’s dismissal of the claim, holding that Section 230 does not provide immunity against claims that the product (app or platform) itself was negligently designed. The Seattle School District has made such claims in its public nuisance lawsuit and may well amend them to focus on the product liability aspects noted by the Lemmon court.

Conclusion

Regardless of how a court rules on a motion to dismiss in the Seattle matter, it is a good bet that this won’t be the last such case. Big Tech should prepare for a wave of lawsuits, many of them styled as class actions, alleging mental health injuries arising from its algorithms, promotional activities, and product designs.

If you have any questions pertaining to digital platforms and online youth protection policies or claims, please contact Kenneth Rashbaum.

 

If you or someone you know is experiencing a mental health crisis, please dial 9-8-8 to reach the 988 Suicide & Crisis Lifeline, which provides confidential 24/7 support, or find more resources at www.988lifeline.org.