Social media platforms like TikTok, Instagram, Snapchat, and Facebook have faced significant scrutiny from parents, educators, and even the U.S. Congress over how they regulate and promote content on their platforms. One of the most recent efforts to hold social media companies accountable comes from the San Mateo County Office of Education, a county education agency in California. It has filed a social media youth injury lawsuit to recover compensation for the losses it has sustained.
The agency argues that it deserves to be compensated for the thousands, and eventually millions, of dollars it spends dealing with the behavioral problems and mental health crises it attributes to toxic social media algorithms that promote damaging content. The social media youth harm lawyers of Dolman Law Group have been at the forefront of this issue, assisting plaintiffs in demonstrating the impact of harmful algorithms on their financial, mental, and emotional health.
Schools Report Mental Health and Behavioral Issues Connected to Social Media Use
The San Mateo County Office of Education is far from the first educational body to pursue a social media youth harm lawsuit. Other districts, such as Seattle Public Schools, have taken steps to mitigate the financial damage they have sustained as a result of social media companies’ negligence. They cite concerns about a growing trend of teens experiencing problems related to underregulated social media use, including:
- Eating disorders
- Suicidal thoughts
- Sleep issues
Younger users are engaging with social media at higher rates than ever, with virtually all teens using the internet every day and about half describing their use as “almost constant.” Teens are less equipped than adults to process toxic content and filter harmful messages, and parents and educators have largely been left to deal with the fallout.
Social Media Companies Face Product Liability Lawsuits For Their Algorithms
Social media algorithms are designed to anticipate the type of content a user will like in order to capture their interest and keep them on the platform longer. In effect, this often means users are presented with increasingly shocking, radical, or controversial content. Children and teens using the internet can easily be tunneled into toxic content that promotes harmful behaviors like eating disorders or bullying.
Companies like Meta, which owns Facebook and Instagram, argue that they are protected from liability by the Communications Decency Act of 1996. They cite Section 230, which says that a tech company isn’t liable for content posted by a third party. However, since 1996, the scope of the internet has grown exponentially. Algorithms now promote content tailored to individual users.
The Fate of Social Media Companies’ Liability Remains Undecided
Plaintiffs argue that social media companies are still liable for how they choose to promote the content that is posted on their platforms. They claim that a given social media company’s algorithm is a product that is separate from the content posted by third parties. This issue is currently being decided before the Supreme Court in Gonzalez v. Google.
The case argues that YouTube, which Google owns, should be held liable for an algorithm that promoted terrorist content and helped radicalize individuals who went on to kill over 100 people in the 2015 Paris attacks. At this time, SCOTUS has heard oral arguments but seems inclined to let the legislature handle the nuances of this complex issue. If the Supreme Court rules in favor of Gonzalez, that would create a clearer path for families and school districts to pursue claims against social media companies for the impact of their algorithms.
Recovering Damages in a Social Media Youth Harm Lawsuit
The wave of social media youth harm lawsuits being brought by school districts across the country is different from typical personal injury litigation, as the defendants are not being asked to compensate individual students for their losses. Instead, school districts claim that their resources for mental health and discipline have been unfairly burdened by the lack of regulation of social media algorithms. Right now, most claims seek economic damages for financial relief.
School districts are seeking compensation for costs like:
- Teacher training
- Educational materials
- Securing experts to provide training and guidance
- Counseling and mental health services for students
Many public schools are already facing severe funding crises, particularly after the pandemic, and this drain on their resources is unsustainable. Groups like the San Mateo County Board of Education say that the claims against social media companies are as much about demanding regulatory change as they are about recovering compensation.
Pursuing Compensation in a Social Media Youth Harm Lawsuit
At their heart, these claims are product liability claims, even though social media companies’ algorithms are intangible. Product liability claims have a specific set of criteria that plaintiffs must meet to be eligible for compensation: the design, manufacturing, or marketing of the product must be proven defective, and the defendant must have known, or reasonably should have known, that the product was harmful.
Plaintiffs must be able to demonstrate that the defendants, in this case social media companies like Meta and Google, breached the duty of care they owed the plaintiffs. That breach must have caused harm, which in turn produced damages that can be compensated. This area of law is still relatively new and untested, meaning plaintiffs will need compelling evidence to support their claims.
As schools sue tech companies for compensation for the effects of harmful algorithms, potential plaintiffs should be aware that the best way to mount an effective claim for damages is to contact a personal injury attorney who is familiar with the legal nuances of this issue and has experience handling social media claims. It will likely take meticulous investigation and documentation to establish a chain of causation in a social media youth harm lawsuit.