Just How Dangerous is Facebook to Kids – and What Should the Law Do About It?


For years, we’ve heard that social media is dangerous, especially to kids. The data was sketchy, but the anecdotal reports were horrific – the perfect ground for band-aid remedies, poorly thought-out responses, inadequate legislation, and lawyers trolling for personal injury clients. But now a new threat has been detected: addiction. Is it real? Other countries seem to think so.

Suicide and Cyberbullying

Harms to teens from social media are not new. In 2006, thirteen-year-old Megan Meier killed herself after a social network user with a fake account told her the world would be better off without her. In 2013, twelve-year-old Rebecca Sedwick suffered the same fate: suicide following cyberbullying. By 2017, the suicide rate in the 15-19 “teen” age group had reached 11.8 deaths per 100,000, up from 8 per 100,000 in 2000. Today, the US ranks third globally in cyberbullying.

While suicide is rare, juvenile victims of cyberbullying are twice as likely as non-victims to engage in self-harm and to exhibit suicidal ideation and behavior. Anorexia and bulimia are other concerns, as social media users are belittled for their physical appearance or led to believe their bodies are damaged and disfigured compared with the Stepford-Wife photoshopped images of celebrities to which they are exposed online.

"Thirty-two percent of girls under the age of 26 said that when they felt bad about their bodies, Instagram made them feel worse. … Instagram researchers

Facebook knows that they are leading young users to anorexia content."

Testimony of Frances Haugen, former Facebook employee, before a US Senate committee.

Other harms, such as depression, have also been correlated with social media, an open invitation to law firms seeking plaintiffs. So far, all cases suing social media providers for personal injury damages have failed. The reason? Section 230 of the Communications Decency Act (CDA).

Section 230

Section 230 of the 1996 Communications Decency Act shields the platform or provider from liability on the grounds that it is not the publisher of content created by third parties and exercises no control over it:

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" 

(47 U.S.C. § 230)

Various states have memorialized this precept into precedent. New York, for example, specifically held that it does not “recognize cyberbullying or Internet bullying as a cognizable tort action.” [1]

The prevalent claim for seeking civil liability against social media (SM) platforms for these types of harms has been under a negligence theory – which requires a showing of wrongdoing or carelessness on the part of the defendant, along with causally related damages. The CDA shields these companies from liability, treating them as outside or third parties with no responsibility for, or involvement in, producing the content – leaving injured parties remedy-less.

What’s an injured (potential) plaintiff to do?

Egregious harms have been known to birth new legal remedies – or productively recycle old ones in new ways. But the challenge to obtaining recovery for harms occasioned by social media platforms is twofold – identifying a harm more horrific than suicide (which didn’t seem to engender a valid legal response) and finding a cause of action suitable for repurposing that will withstand legal scrutiny.

Let’s talk about the harm first.

Enter Addiction

If suicide, acute depression, and anorexia aren’t dramatic (or causally related) [3] enough to predicate allowable lawsuits, what might change the legal landscape?

Enter addiction – the national rallying cry when abuse is created by a third party. In the case of social media, the theory is not all that different from opiate-induced addiction, including a “cause” attributable to a third party – here, the social media platforms.

Social media addiction is a behavioral addiction “driven by an uncontrollable urge to log on to or use social media, and devoting so much time and effort to social media that it impairs other important life areas.”

According to the Addiction Center, “addictive social media use will look much like any other substance use disorder” and may include:

  • mood modification
  • increasing use or tolerance
  • withdrawal symptoms when social media use is restricted or stopped
  • conflict with others over usage
  • relapse after abstinence
  • risky and impaired decision-making

“Behavioral addictions have much the same effect on the brain as drugs and alcohol, and social media is no different. For those who engage with social media apps on a regular basis, the process … releases the chemical dopamine in the brain. Dopamine reacts with neurotransmitters, creating feelings of pleasure and reward and causing the formation of ‘addiction pathways’ in the brain that make it hard to resist urges or stop the behavior.” – Dan Meshi, Ph.D., Journal of Behavioral Addictions

Thus, under this theory, anorexia is not a direct result of social media but a sequela of social-media-induced addiction. Provable addiction [4] is undoubtedly noteworthy, newsworthy, grievous, and perhaps a common enough harm to motivate the search for a legal cure.

Now we just need to get around the immunity provided by the CDA.

Repurposing a Legal Theory – addiction caused by social media platforms may overcome traditional negligence’s impediments

The “230” immunity of the CDA was predicated on the ground that social media platforms, unlike publishers, are not actively involved in promulgating content and hence can’t be considered negligent. Any harm that accrues from social media must arise independently of the content of the posting to sustain a legal claim. Photoshopped images or threats, alone, won’t suffice.

In 2021, the case of Lemmon v. Snap changed the legal landscape and offers hope that strict liability in tort (SLT) claims might work in the social media context. Under the theory of SLT, the wrongdoer is liable not for carelessness but for producing or introducing into the stream of commerce a product whose risks outweigh its utility – or, in modern parlance, a product marketed without a reasonable alternative design (or a warning) that would have reduced those risks. In some jurisdictions, the cause of action is described as one for “products that are defective and unreasonably dangerous.” In essence, the claim focuses on the unreasonableness of the product, while negligence focuses on the unreasonableness of the actions of the defendant.

The Lemmon case concerned three teenagers who died in a high-speed accident. The boys’ parents charged that Snap Inc. (the parent company of Snapchat) encouraged their sons to drive at negligent speeds (123 miles an hour) through the negligent design of the Snapchat application. In this capacity, Snap acted not as an outside content publisher (immune under Section 230) but as a product manufacturer. Specifically, the parents claimed the flaw causing the accident was the interplay between Snapchat’s reward system and its speed filter, violating Snap’s duty to provide a safe product whose benefits outweighed its risks. In other words, Snap was liable for creating a product-design defect, actionable under product liability law. This distinction, the parents claimed, and the court affirmed, removed the immunity provided under the CDA: Snap’s design, through the interplay of its reward system and speed filter, prompted the teens to drive recklessly.

“In short, Snap, Inc. was sued for the predictable consequences of designing Snapchat in such a way that it allegedly encouraged dangerous behavior.”

Judge Kim McLane Wardlaw

Deus ex Machina – Algorithms: A Perfect Storm

Claims of addiction [2] (or suicide, or depression) don’t involve Snap’s superimposed “filter device,” which prompted the speeding in Lemmon. So, to effectuate the addiction claims, a new and different defect – one wholly under the control of the social platform – must be identified. Enter the algorithm.

Because social media aims to maximize user interaction – keeping users active in each app as long as possible (the first step in triggering addiction) – it’s programmed to respond to individual behaviors, hooking users by presenting more of whatever attracts them. This is done via an algorithmic amplification process – created (i.e., designed) by human programmers retained or employed by the social media company. The algorithm – owned and/or operated by the social media platform – responds to whatever gets people to click and comment, amplifying that material to keep users “hooked” in the app for hours, leading to dangerous behaviors.
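
To make the mechanism concrete, here is a minimal, purely hypothetical sketch of such an engagement-driven feedback loop in Python. The topics, weights, and function names are all invented for illustration; real platform ranking systems are proprietary and vastly more complex.

```python
# Purely illustrative sketch of an engagement-driven amplification loop.
# All names, topics, and weights are hypothetical; real platform ranking
# systems are proprietary and far more complex.
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: int
    topic: str
    base_score: float  # stand-in for a model's predicted engagement

@dataclass
class User:
    # Running tally of past engagements, keyed by topic.
    topic_engagement: dict = field(default_factory=dict)

def rank_feed(user: User, candidates: list[Post]) -> list[Post]:
    """Order posts so that whatever the user already clicks on
    floats to the top - the 'amplification' step."""
    def score(post: Post) -> float:
        affinity = user.topic_engagement.get(post.topic, 0)
        return post.base_score * (1.0 + affinity)  # more clicks -> more of the same
    return sorted(candidates, key=score, reverse=True)

def record_engagement(user: User, post: Post) -> None:
    """Each click strengthens the loop: future feeds favor this topic."""
    user.topic_engagement[post.topic] = user.topic_engagement.get(post.topic, 0) + 1

# Demo: two clicks on dieting content push it to the top of every later feed.
user = User()
feed = [Post(1, "sports", 0.50), Post(2, "dieting", 0.40), Post(3, "news", 0.45)]
record_engagement(user, feed[1])
record_engagement(user, feed[1])
print([p.topic for p in rank_feed(user, feed)])  # ['dieting', 'sports', 'news']
```

The point of the sketch is the feedback: the more a user engages with a topic – dieting content, say – the more of it the system serves up, which is precisely the self-reinforcing loop the addiction theory targets.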

The double whammy of an egregious injury, say addiction leading to anorexia, coupled with a viable cause of action – strict liability in tort (aka product liability) for social media’s designed algorithms – may have legal traction. Multiple cases are pending, and it will take time for them to wind through the tort system. Such claims are not a slam dunk, however. Liability for errant programming has been the subject of many scholarly legal articles, with academics scrounging to find a basis. This is especially true for AI’s “black-box” technology, where we don’t exactly know how or why the program does what it does. The difference here is that the algorithmic programming is not errant but intentionally motivated – purposely designed to create addictive states. Think of adding menthol to the cigarette, for which liability does accrue against the cigarette manufacturer, other immunity protections notwithstanding. As noted above, some jurisdictions frame the distinction this way: SLT (aka products liability law) focuses on the defectiveness (or unreasonable danger) posed by the product, while negligence law focuses on the reasonableness of the actions (or inactions) of the defendant.

Perhaps the most pointed defense is that strict liability in tort (SLT) applies only to products, not services. You can’t sue a doctor in strict liability for using the wrong blood type, although you might be able to sue the blood bank that sold the “product.” Meta and its universe might successfully claim that they are providing a service, not a product, to which the theory won’t apply. We’ll have to wait and see.

Another simple defense to SLT is to impose warnings – which, of course, are easy to circumvent in practice. Nevertheless, these might shield the platforms from liability. So, be on the lookout. Another theory that might be tapped is public nuisance law, similar to its use in opioid cases, but that approach hasn’t been raised so far.

What about now?

That great schizoid state, California, is currently debating a bill that would enable parents and the state attorney general to sue social platforms over the errant algorithms that addict children. Minnesota sought to ban the use of algorithms entirely in recommending content to anyone under 18. This legislation may well grease the path to lawsuits. Better, of course, would be for the social media companies to rein in the content themselves.

How real a threat is this?

Perhaps the scariest implication of online excess, even if not outright addiction, is that our global friends and competitors are worried. Very worried.

China has implemented restrictions on gaming and live-streaming and has seriously restricted usage by children under 16. Under its so-called ‘Cinderella Laws,’ use of social media is banned after 10 pm. The Chinese aren’t worried about suicide or anorexia – they are worried about brain drain and mind control, as social media use interferes with creative thinking. Italy has implemented similar laws.

According to law professor and legal scholar Bob Brain, India has implemented a disincentivizing methodology, such as reducing points or otherwise penalizing players for excessive use – an approach that might be tried here, too.
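
For flavor, here is a minimal, equally hypothetical sketch of what such a usage-based disincentive might look like. The threshold, penalty rate, and function names are invented for this example, not drawn from any actual system.

```python
# Illustrative sketch only: one way a platform could penalize excessive
# daily use, in the spirit of the disincentive approach described above.
# The threshold and penalty rate are invented for this example.
DAILY_LIMIT_MINUTES = 90   # hypothetical "healthy use" threshold
PENALTY_PER_MINUTE = 2     # hypothetical points docked per excess minute

def adjusted_points(base_points: int, minutes_used_today: int) -> int:
    """Dock reward points once usage exceeds the daily limit, so that
    continued engagement becomes a losing proposition."""
    excess = max(0, minutes_used_today - DAILY_LIMIT_MINUTES)
    return max(0, base_points - excess * PENALTY_PER_MINUTE)

print(adjusted_points(500, 60))   # under the limit: keeps all 500 points
print(adjusted_points(500, 150))  # 60 minutes over: 500 - 120 = 380
```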

If our enemies and competitors are worried, maybe we should be as well. As to how we ratchet down algorithms designed to make us the golden goose for social media’s owners, we now have a choice: lawsuits or legislation. Your pick.

 

[1] Finkel v. Dauber, 906 N.Y.S.2d 697, 703 (N.Y. App. 2010)

[2] In June, Alexis Spence and her parents filed suit against Meta, Facebook’s parent company, claiming her eating disorder arose from her addiction to Instagram use.

[3] In the wake of the offline role-playing Dungeons and Dragons controversy, the American Association of Suicidology, the U.S. Centers for Disease Control and Prevention, and Health and Welfare Canada all concluded that there is no causal link between fantasy gaming and suicide.

[4] Assuming social media inflicts physiological and psychological harms on the brain similar to those of opiates, psychologists note that only 5 to 10% of Americans meet the criteria for addiction, and it will be up to plaintiffs’ lawyers to prove their clients suffer from the syndrome.