A Maryland school district is suing the parent companies of Instagram, Facebook, TikTok, Snapchat, and YouTube for “intentionally cultivat[ing]” harmful product features that have created “a mental health crisis among America’s youth.” These products, the suit alleges, trigger crises that lead young people to skip school, abuse alcohol or drugs, and overall act out in ways that harm Howard County’s “ability to fulfill its educational mission.”
The county says that the strain has become so unbearable that it is at a “breaking point.”
Children, specifically, have been targeted by tech giants for corporate gain, the suit reads. Meta, ByteDance, Google, Snap, and others have focused their attention on creating “self-destructive feedback loops” that exploit the developing brains of young people to boost engagement with their products. While these products are marketed as “social,” they actively promote forms of “disconnection [and] disassociation” that drive kids to forgo “the intimacy of adolescent friendships.”
The suit then attempts to tie teen use of social media to a 57-percent spike in youth suicide rates and a 117-percent growth in emergency room visits for anxiety disorders (the source of these specific stats is not specified, but there has been a general rise in youth suicides in recent years). In 2019, the suit adds, “one in five high school girls had made a suicide plan,” presumably because of mental health struggles exacerbated by social media.
The goal of the lawsuit is to hold tech companies accountable and put in place “comprehensive, long-term planning and funding to drive sustained reduction in the mental health crises its students experience” as a result of social media.
Howard County is not alone in this crusade to curb big tech’s reach in the classroom and beyond. The Verge notes that two other school districts in Maryland, as well as districts in at least seven other states, have filed similar lawsuits over the harms of social media use by young people. In October 2022, a British coroner ruled that content on Instagram and Pinterest contributed to the death of 14-year-old Molly Russell, who died by suicide after viewing suicide, self-harm, and depression-related content on the platforms.
A Google spokesperson told The Verge that the company has “built age-appropriate experiences for kids and families on YouTube, and provide[s] parents with robust controls,” and a Snap spokesperson said it “vet[s] all content before it can reach a large audience, which helps protect against the promotion and discovery of potentially harmful material.” Antigone Davis, Meta’s head of safety, told The Verge that the company had “invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us.”
Elizabeth is a culture reporter at Mashable covering digital culture, fandom communities, and how the internet makes us feel. Before joining Mashable, she spent six years in tech, doing everything from running a wifi hardware beta program to analyzing YouTube content trends like K-pop, ASMR, gaming, and beauty. You can find more of her work for outlets like The Guardian, Teen Vogue, and MTV News.