(CNN) — Christopher James Dawley, known as CJ to his friends and family, was 14 years old when he signed up for Facebook, Instagram and Snapchat. Like many teenagers, he documented his life on those platforms.
CJ worked as a busboy at Texas Roadhouse in Kenosha, Wisconsin. He loved playing golf and watching “Doctor Who,” and was highly sought after by top-tier colleges. “His counselor said he could get a free ride anywhere he wanted to go,” his mother Donna Dawley told CNN Business during a recent interview at the family’s home.
But throughout high school, he developed what his parents felt was an addiction to social media. By his senior year, “he couldn’t stop looking at his phone,” she said. He often stayed up until 3 a.m. on Instagram messaging with others, sometimes swapping nude photos, his mother said. He became sleep deprived and obsessed with his body image.
On January 4, 2015, while his family was taking down their Christmas tree and decorations, CJ retreated into his room. He sent a text message to his best friend — “God’s speed” — and posted an update on his Facebook page: “Who turned out the light?” CJ held a .22-caliber rifle in one hand, his smartphone in the other, and fatally shot himself. He was 17. Police found a suicide note written on the envelope of a college acceptance letter. His parents said he never showed outward signs of depression or suicidal ideation.
“When we found him, his phone was still on, still in his hand, with blood on it,” Donna Dawley said. “He was so addicted to it that even his last moments of his life were about posting on social media.”
Now, the Dawleys are joining a growing number of families who have recently filed wrongful death lawsuits against some of the big social media companies, claiming their platforms played a significant role in their teenagers’ decisions to end their lives. The Dawleys’ lawsuit, which was filed last week, targets Snap, the parent company of Snapchat, and Meta, the parent company of Facebook and Instagram. The suit accuses the two companies of designing their platforms to addict users with algorithms that lead to “never-ending” scrolling, as part of an effort to maximize time spent on the platforms for advertising purposes and profit.
The lawsuit also alleges the platforms exploit minor users’ limited decision-making and impulse-control capabilities, which it attributes to their “incomplete brain development.”
Donna Dawley said she and her husband, Chris, believe CJ’s mental health suffered as a direct result of the addictive nature of the platforms. They said they were motivated to file the lawsuit against Meta and Snap after Facebook whistleblower Frances Haugen leaked hundreds of internal documents, including some that showed the company was aware of the ways Instagram can damage mental health and body image.
In public remarks, including her testimony before Congress last fall, Haugen also raised concerns about how Facebook’s algorithms could drive younger users toward harmful content, such as posts about eating disorders or self-harm, and lead to social media addiction. (Meta CEO Mark Zuckerberg wrote a 1,300-word post on Facebook at the time claiming Haugen took the company’s research on its impact on children out of context and painted a “false picture of the company.”)
“For seven years, we were trying to figure out what happened,” said Donna Dawley, adding she felt compelled to “hold the companies accountable” after she heard how Instagram is designed to keep users on the platform for as long as possible. “How dare you put a product out there knowing that it was going to be addictive? Who would ever do that?”
Haugen’s disclosures and Congressional testimony renewed scrutiny of tech platforms from lawmakers on both sides of the aisle. A bipartisan bill was introduced in the Senate in February that proposes new and explicit responsibilities for tech platforms to protect children from digital harm. President Joe Biden also used part of his State of the Union address to urge lawmakers to “hold social media platforms accountable for the national experiment they’re conducting on our children for profit.”
Some families are now also taking matters into their own hands and turning to the courts to pressure the tech companies to change how their platforms work. Matthew Bergman, the Dawleys’ lawyer, formed the Social Media Victims Law Center last fall after the release of the Facebook documents. He now represents 20 families who have filed wrongful death lawsuits against social media companies.
“Money is not what is driving Donna and Chris Dawley to file this case and re-live the unimaginable loss they sustained,” Bergman said. “The only way to force [social media companies] to change their dangerous but highly profitable algorithms is to change their economic calculus by making them pay the true costs that their dangerous products have inflicted on families such as the Dawleys.”
He added: “When faced with similar instances of outrageous misconduct by product manufacturers, juries have awarded tens of millions of dollars in compensatory damages and imposed billion-dollar punitive damage awards. I have every reason to anticipate a jury, after fairly evaluating all the evidence, could render a similar judgment in this case.”
In a statement to CNN Business, Snap spokesperson Katie Derkits said it can’t comment on active litigation but “our hearts go out to any family who has lost a loved one to suicide.”
“We intentionally built Snapchat differently than traditional social media platforms to be a place for people to connect with their real friends and offer in-app mental health resources, including on suicide prevention for Snapchatters in need,” Derkits said. “Nothing is more important than the safety and wellbeing of our community and we are constantly exploring additional ways we can support Snapchatters.”
Meta also declined to comment on the case because it is in litigation but said the company currently offers a series of suicide prevention tools, such as automatically providing resources to a user if a friend or AI detects a post is about suicide.
Tech companies under pressure to make changes
Although alarms have been raised about social media addiction for years, Haugen’s testimony — coupled with concerns around kids’ increased time spent online during the pandemic — has made the issue a national talking point. But change hasn’t come fast enough for some families.
Jennifer Mitchell, who said her 16-year-old son Ian died of a self-inflicted gunshot while on Snapchat, is also working with the Social Media Victims Law Center to file a lawsuit against Snap. She said she hopes it will make more parents aware of the dangers of social media and encourage lawmakers to regulate the platforms.
“If we can put age restrictions on alcohol, cigarettes and purchasing a gun, something needs to be done when it comes to social media,” she told CNN Business. Snapchat’s age requirement for signing up is 13. “It’s too addictive for kids.”
In August 2019, Mitchell had just landed in Alaska on a business trip from Florida when she received a series of voice messages saying her son had died of a self-inflicted gunshot wound. She said police later told her they believed Ian was recording a video at the time of the incident.
“After trying to get into some of his social media accounts, we found video of him [taken] on Snapchat that looked like he was playing Russian roulette with the gun,” Mitchell said. “We don’t know who he was sending it to or if he was playing with someone. The phone was found not too far from his body.”
Snap declined to comment on the incident.
The emergence of wrongful death lawsuits against social media companies isn’t limited to teenagers. In January, Tammy Rodriguez filed a lawsuit, alleging her 11-year-old daughter Selena struggled with social media addiction for two years before taking her own life in July 2021. (Instagram and Snapchat, the two sites her daughter is said to have used most, require users to be at least 13 years old to create accounts, but as with many social platforms, some kids younger than that still sign up.)
According to the lawsuit, Selena Rodriguez had spent more time on those social networks during the pandemic and started communicating with older men on the platforms. She responded to requests to send sexually explicit images, “which were subsequently shared or leaked to her classmates, increasing the ridicule and embarrassment she experienced at school,” the suit alleged.
“Throughout the period of Selena’s use of social media, Tammy Rodriguez was unaware of the clinically addictive and mentally harmful effects of Instagram and Snapchat,” the lawsuit said. It also cited the lack of sufficient parental controls at the time as a contributing factor, an issue that has been a focus of some recent criticism among lawmakers.
Both Snap and Meta declined to comment on the case but referenced their resources to help users struggling with their mental health.
“If a person walks into a bad neighborhood and is assaulted, that’s a regrettable incident,” said Bergman, who is also representing the Rodriguez family. “But if a tour guide says, ‘Let me show you around the city or I’ll show you the top sites,’ and one of those [spots] is a very dangerous neighborhood where a person is assaulted, the tour guide appropriately has some responsibility for putting the tourist in harm’s way. That’s exactly what these platforms do.”
“It’s not random that teenage girls are directed toward content that makes them feel bad about their bodies. That is the way the algorithms work; it’s by design,” he added.
A long and uncertain legal road
Carl Tobias, a professor at the University of Richmond School of Law, believes these wrongful death lawsuits against social media companies could hold up in court despite inevitable challenges.
“The problem, at least in the traditional notion in the law, has been that it’s difficult to prove addiction that then leads to taking somebody’s life or doing serious damage to somebody that’s self-inflicted,” he said. “But judges and juries in certain situations might be more open to finding liability and awarding damages.”
He said Haugen’s “damning” testimony before Congress and the “seemingly troubling” data companies collect about young users, as revealed in the documents, could potentially support a ruling in favor of the plaintiffs, depending on each case.
“There’s a lot of information we didn’t have before,” Tobias said. “When a company, entity or an individual knows they’re exposing someone else to a risk of harm, then tort law and product liability law is sometimes willing to impose liability.”
While he said it’s “unclear” if the lawsuits will indeed be successful, the “arguments being made by plaintiffs and their lawyers in some of these cases are something the companies have to take seriously.”
Individual lawsuits have been filed against social media companies in the past, but the companies typically have a broad legal liability shield for content posted on their platforms. However, Tobias said because families are now targeting how the platforms are designed, it “might persuade a court to distinguish the new allegations from other actions by defendants that judges found immune.”
In the months following the leaked internal documents, Instagram has rolled out a handful of safeguards aimed at protecting its young users, including a tool called Take a Break, which aims to encourage people to spend some time away from the platform after they’ve been scrolling for a certain period. It also introduced a tool that allows parents to see how much time their kids spend on Instagram and set time limits, and brought back a version of its news feed that sorts posts in reverse chronological order rather than ranked according to the platform’s algorithms.
Last month, dozens of attorneys general wrote a letter to TikTok and Snap calling on the companies to strengthen the platforms’ existing parental tools and better work alongside third-party monitoring apps, which can alert parents if children use language that suggests a desire for self-harm or suicide.
“Your platforms do not effectively collaborate with parental control applications or otherwise provide an adequate opportunity for parental control within the platform,” the letter said. “We ask that you conform to widespread industry practice by giving parents increased ability to protect their vulnerable children.”
Snap told CNN Business in a response it is currently working on new tools for parents that give more insight into what their teens are doing on Snapchat and who they’re talking to. TikTok did not respond to a request for comment. However, the company has expanded its safety features over the years. In 2019, TikTok introduced a limited app experience called TikTok for Younger Users which restricts messaging, commenting and sharing videos for users under age 13. In 2020, it rolled out the ability to disable direct messaging for users under the age of 16.
Bergman said he anticipates a “long fight” ahead as he plans to “file a lot of cases” against social media companies. “The only thing that’s certain is the level of opposition that we’re going to face from companies that have all the money in the world to hire all the lawyers,” he said. “They want to do everything they can to avoid standing up in a courtroom and explain[ing] to a jury why their profits were more important than the life of CJ Dawley.”
Donna Dawley said the last time she saw her son, on the day of his death, he was looking down at his phone, appearing sad. “I just wish I would have grabbed him and hugged him,” she said.
“[This lawsuit] is not about winning or losing. We’re all losing right now. But if we can get them to change the algorithm for one child — if one child is saved — then it’s been worth it.”
™ & © 2022 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.