BIRMINGHAM, Ala. (WBRC) – Three Alabama school districts, including Tuscaloosa City Schools, are suing Meta (the parent company of Facebook and Instagram), TikTok, YouTube and Snapchat, claiming the social media companies are contributing to the youth mental health crisis.
“We are bringing these lawsuits due to the mental health crisis that these social media companies’ products have caused students, which has then bled into these districts,” said Davis Vaughn, an attorney with the Beasley Allen law firm.
Vaughn says the school districts are having to provide mental health resources to their students as a result of, among other things, students being addicted to social media.
“It’s the addictiveness. It’s the design. And unfortunately, the schools have been left to front a lot of the costs and expenses of the youth mental health crisis caused by these social media companies,” Vaughn said.
Vaughn claims social media companies have known about this for years. He points to the 2021 testimony of Facebook whistleblower Frances Haugen, who said Facebook knew that its product was harming youth, particularly young girls, claiming it was causing anxiety, depression and eating disorders.
“These products lack appropriate safeguards. They lack appropriate age verification measures. They lack parental controls, and when used to the point of addiction, they are incredibly harmful to our youth,” Vaughn said.
Baldwin County and Montgomery Public Schools also filed lawsuits. We’re told these school districts are among the first in the state to file this type of lawsuit against social media companies. The lawsuits were filed in California state court.
Here’s the full statement we received from Meta in response to the lawsuits:
“We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online. We’ve developed more than 30 tools to support teens and their families, including tools that allow parents to decide when, and for how long, their teens use Instagram, age verification technology, automatically setting accounts belonging to those under 16 to private when they join Instagram, and sending notifications encouraging teens to take regular breaks. We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us. These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.” – Antigone Davis, Head of Safety, Meta
We also received this statement from Snapchat:
“Nothing is more important to us than the wellbeing of our community. At Snapchat, we curate content from known creators and publishers and use human moderation to review user generated content before it can reach a large audience, which greatly reduces the spread and discovery of harmful content. We also work closely with leading mental health organizations to provide in-app tools for Snapchatters and resources to help support both themselves and their friends. We are constantly evaluating how we continue to make our platform safer, including through new education, features and protections.”
Copyright 2023 WBRC. All rights reserved.