Social Media Is to Blame for Mental Health Harms, a Jury Finds
Is this about personal responsibility? Parental duties? Or a public exhausted by rapacious and all-powerful tech companies?
On Wednesday, a jury found Meta and YouTube liable for the mental health harms experienced by a young woman who says she was addicted to their products. The $6 million in damages is small given the size and profitability of these companies, but this is just one case. Its success virtually guarantees that there will be many more.
The truth is that this was a very hard case to prove, and I think it’s unlikely that the jury made their decision based only on the facts in evidence. The plaintiff, a young woman who started using social media when she was just 6, had a tumultuous life, something Meta’s lawyers pointed to in arguing that it was impossible to claim their product was the determinative cause of her mental health issues. The research on social media and mental health is still emerging, and the evidence that it is harmful is not as clear or as nearly universal as it is for, say, cigarettes. I suspect that Meta and YouTube went into this case believing they would easily win.
They did not read the room.
This decision is as much about the specific life of this young woman as it is about a growing social consensus that these companies have too much power and are wielding it to our collective detriment. At trial, the plaintiffs presented evidence that Meta and YouTube designed their products to be addictive — to keep users infinitely scrolling and compulsively clicking — and knew that their products were harmful, especially to young people. And there is significant evidence of this, including internal Meta emails and discussions.
We’ve spent 20 years running an almost totally unfettered social experiment on the public. The potential harms really were not clear, at least in the beginning. But social media companies seem to have known well before the rest of us that their products weren’t opening up new worlds for their users, but keeping them from living in the real one — tethering them to their devices, affirming their worst impulses. They didn’t design their products to give girls body dysmorphia or pitch boys down the misogynist rabbit hole of the manosphere any more than Roundup was designed to give people cancer, but they did design their products to keep users of all ages scrolling and scrolling and scrolling and watching and watching and watching — and when they learned that what they were scrolling through and watching was doing enormous damage, they largely shrugged. Social media isn’t the same as cigarettes, but it’s also true that tobacco companies didn’t design their products to kill people. When they realized they did — long before the general public — they had a reaction similar to these social media companies: The profits matter more. And they offered a similar defense: Don’t blame us for your lack of self-control; it’s every individual’s personal responsibility to make healthy choices.
In the wake of the Meta/YouTube lawsuit, I keep hearing that phrase, “personal responsibility.” The whole point, though, is that these companies designed their products to override one’s ability to exercise personal responsibility. Some of the smartest people in the world have toiled for years to keep your attention glued to Meta and YouTube. Of course you can close the app. But most of us, I would guess, have had the experience of thinking “I should close this app” only to look up 30 minutes later and realize we’ve been mindlessly scrolling away. That isn’t a personal failure; that is by design. And most of us are adults; children, whose abilities to regulate their own impulses are still developing, are even more vulnerable.
Frankly, I would like to see more companies held accountable for the harms they cause in pursuit of profit (ideally I’d like our government to impose reasonable restrictions and guardrails, but I’m not holding my breath on that). The betting apps that put a little casino in every user’s pocket, and have caused a spike in debt and personal bankruptcies particularly among young men? I would love to see what a legal discovery process might find about their strategies for drawing in new users and keeping them engaged. The big food companies that spend millions to figure out the exact taste and texture points that will keep consumers eating endlessly — pleased but not quite satiated, always wanting more — and that spend many more millions figuring out what kids crave and then advertising to them? They absolutely deserve to answer for their actions in court.
We can quibble over the definition of the word “addiction,” but these companies, like cigarette makers, all intentionally seek to create products that users will consume compulsively — products that users will have a very hard time saying no to. These companies have invested hundreds of millions of dollars to figure out how to get more of their food in your mouth (and your kids’ mouths), and more of your time on their apps (and more of your kids’ time). Then they have the audacity to tell you it’s a personal failing when their carefully honed strategies work.
Others are asking, “Where are the parents?” And that’s a good question. Parents should absolutely keep smartphones and tablets out of their kids’ hands for as long as possible. It is absolutely an abdication of parental responsibility to let a six-year-old scroll freely on YouTube or give an Instagram account to a nine-year-old.
But some parents don’t do their jobs. Some parents try really hard to do their jobs but are busy or overwhelmed or simply imperfect. Others are utterly neglectful. Others, like the parents of a child who started using social media 14 years ago, did not have the same information about social media’s potential harms as we have today. But their kids shouldn’t suffer additionally because social media companies can make more money drawing in the eyeballs and attention of the country’s most vulnerable children. Good parents also don’t buy their kids cigarettes. But one of the outcomes of the tobacco litigation of the 1990s was that tobacco companies were barred from advertising to kids — something I would hope we can all agree was righteous and right.
Some have raised free speech questions about this lawsuit, and I do think it’s worth pausing any time we’re potentially constraining speech and expression. But I’m also not convinced that’s what is happening here. This lawsuit wasn’t specifically about the content on Meta or YouTube — and as private companies, they are under no obligation to host whatever content people post (and they do not do so). It was about the algorithm, the inner workings behind the machine. I am not convinced that design features (infinite scroll) are “speech” in any meaningful way. And the truth is that these companies make content moderation decisions all the time. OnlyFans is (in theory) not available to minors. Instagram is not a platform that features torture videos or explicit porn. I’m sympathetic to free speech concerns when it comes to social media bans and government regulations, but in social media product liability and personal injury cases, the free speech case feels pretty half-baked.
I think product liability and personal injury lawsuits are really imperfect vehicles through which to regulate big companies, whether it’s Big Tobacco, Big Food, or Big Tech. But under-regulation at the federal level, and elected politicians who frankly wouldn’t know how to wisely regulate Big Tech even if they had the desire to, have left the public with few options. And those who defend Big Tech seem to argue that the answer is… do nothing. Take personal responsibility. Leave it to the parents. We’ve tried that. It’s been a disaster. Don’t like the results of this case? Come up with something better — because the status quo is not working, and the public is using whatever tools are at their disposal to fight back.
xx Jill

