Meta Platforms, TikTok, and Alphabet's Google and YouTube faced courtroom scrutiny this week over allegations that their platforms are fuelling a youth mental health crisis, as the national debate over kids’ screen time enters a new phase.
WHAT DID THE LOS ANGELES JURY FIND?
The jury ordered Meta and Google on March 25 to pay a combined US$6 million in damages to plaintiff Kaley G.M., a 20-year-old who said she suffered from depression and suicidal thoughts after becoming addicted to the companies’ platforms at a young age because of their attention-grabbing design. The jury found that both Meta and Google were negligent in designing their platforms and failed to warn consumers about their risks.
WHAT HAPPENED IN NEW MEXICO?
Separately, in a lawsuit brought by the state's attorney general, a jury in New Mexico on March 24 ordered Meta to pay US$375 million after finding the company misled users about the safety of Facebook and Instagram while enabling child sexual exploitation on those platforms.
WHY ARE THESE TRIALS IMPORTANT?
The trials were the first to test whether Big Tech can be held liable for the design of apps blamed for harming young people's wellbeing. Meta, Snapchat and parent Snap Inc., Google's YouTube, and TikTok and parent ByteDance are facing thousands of lawsuits in federal and state courts over claims they knowingly designed their platforms with features that addict children and teens, fuelling a mental health crisis.
SOCIAL MEDIA FACES LEGAL BACKLASH
In addition to cases like Kaley's in state court, the social media companies face more than 2,300 similar lawsuits filed by parents, school districts and state attorneys general in federal court.
“It’s true that in the United States all companies have goals to make money,” Donald Migliori, an attorney for the New Mexico attorney general, told the jury. But, he added, “Meta made its profits while publicly misrepresenting that its platforms were safe for youth, downplaying or outright lying about what it knows about the dangers of its platforms.”
Meta’s attorney Kevin Huff told the New Mexico jury the company has made extensive efforts to protect its users and has warned about the risk of bad content on its platforms.
The wave of litigation in the U.S. is part of a global backlash against social media platforms over children's mental health. Australia has prohibited access to social media platforms for users under age 16, and other countries including Spain are considering similar curbs.
Despite Australia's social media ban, however, one-fifth of Australian teenagers under 16 were still using social media two months after the country barred platforms from allowing minors, industry data showed, raising questions about the effectiveness of the platforms' age-gating methods.
The number of 13-to-15-year-olds using TikTok and Snapchat, among the most popular social media apps with Australian teenagers, fell between December, when the ban took effect, and February, but more than 20% were still using the apps, according to a report by parental control software maker Qustodio provided to Reuters.
The data is among the first to show the effects on youth online behaviour since Australia rolled out the ban, which is being copied by governments around the world.
Fears that teenagers might migrate to unregulated platforms have not materialised, the data showed, although WhatsApp recorded a small uptick in use among 13-15-year-olds.
The Australian government and at least two university studies are tracking the ban's impact but none has published data yet.
TECH INDUSTRY
Social media platforms, including TikTok, Facebook and Snapchat, say users must be at least 13 to sign up.
Child protection advocates say the controls are insufficient, and official data in several European countries shows huge numbers of children under 13 have social media accounts.