Social Media Lawsuits: Victory?
With recent multimillion-dollar verdicts and settlements, it looks as though social media giants are finally being held accountable for the algorithms that have held children and adults hostage. Or is this just the cost of doing business? More than 10,000 social media lawsuits are currently pending against these firms, and with the latest judgments that number is expected to grow substantially. The harms claimed stem almost exclusively from algorithms promoting addiction rather than from content, for which these firms are generally protected. So what is next?
Changes to algorithms, parental controls and AI-based age verification. We have been down this road before. Prior legal challenges over videogame addiction never led to any serious awards, but they did lead to the inclusion of application parental controls that have since left the industry relatively unscathed. This despite the fact that Gaming Disorder is now officially recognized as a mental health condition in most of the world. In the United States, no legislation protects children from excessive videogame play or the algorithms behind it; instead, parental controls protect the videogame industry. While lawsuits targeting addiction potential have re-emerged, privacy and deceptive design elements that promote in-app purchases have dominated lawsuits and settlements against some of the large studios and retailers.
Problematic social media use is about more than the algorithms. Yes, they play a role, but over 20 years ago, well before social media and smartphones existed, mental health clinicians in my own doctoral research reported “Chat” as a major source of problematic computer use among individuals seeking treatment (Woog, 2004). There were no addictive algorithms, no swiping, and users spent their time on clunky desktop personal computers. So why was this problematic? For the same reason it is today: connection and belonging. Or, more precisely, lack of connection and the amount of time individuals spend seeking connection and belonging on platforms where users were either anonymous or could act as though they were. Unsafe, unsupervised and unregulated: same as today. This is difficult enough for adults to navigate (back in 2004, it was mostly adults reporting these problems), but children are severely ill-equipped for the challenge. That is why the former U.S. Surgeon General, Dr. Vivek Murthy, released the advisory “Social media and youth mental health” (Office of the Surgeon General, 2023).
It is clear that there is plenty of blame to go around. Many social media platforms are fundamentally unsafe due to both their algorithms and their content. Children are spending 3+ hours a day on these platforms, and a majority of parents are unable to adequately supervise them. Only about 1% of parents use the parental controls provided by social media platforms, and only 16% use any type of parental controls on their child’s cell phone. In addition to the time squandered, children are being exposed to dangerous content that often dramatically conflicts with family values. To make matters worse, many parents themselves spend significant time on screen devices. As a result, they are unable to encourage quality family time or provide online supervision, and they are certainly not modeling screen-time moderation and self-control.
Sentinel Computers designed the LaunchPad with all of these factors in mind; it helps parents delay the introduction of smartphones until at least 16 years of age. It is our belief that limited social media access before age 16 may be valuable for a child, but only with direct, in-person supervision. The LaunchPad was designed to help parents teach digital citizenship and provide a step-by-step process for introducing social media to their child in a safe and highly supervised manner. For more details on this process, click here. Once a child is on social media without direct, in-person parental support, parents must be able to enforce daily limits and provide ongoing supervision of online activity.
Since setting up parental controls across so many devices and applications can be daunting, Sentinel Computers has made it quick and simple. Answer a few basic questions about your child’s age and weekly schedule, and the five-minute Easy Setup does the rest, automatically restricting social media access by age and operating mode and applying use limits (including family screen-free time) based on professional recommendations. Screen monitoring helps parents quickly see the kinds of activities and content their child is engaged with. From anywhere in the world, parents can monitor in near real time, communicate with their child and even remove their child from unacceptable content if necessary.
While the 10,000 cases make their way through court or settlement, children still need to be safe on screen devices. Billions of dollars may ultimately be paid to compensate for harm, but Sentinel Computers is working hard to prevent those harms in the first place. The value of that… priceless.
Office of the Surgeon General. (2023). Social media and youth mental health: The U.S. Surgeon General’s advisory. U.S. Department of Health and Human Services.
Woog, K. (2004). A survey of mental health professionals’ clinical exposure to problematic computer use [Unpublished doctoral dissertation, Trinity College of Graduate Studies]. http://www.pcmoderator.com/research.pdf