Girl wins social media addiction lawsuit — Meta and Google ordered to pay damages over how their platforms are built
A high-profile trial in the US sought to hold social networks accountable not for their content, but for the very design of their platforms. The jury sided with the user, TechCrunch writes. Details are reported by devby.io.

A Los Angeles court found Meta and Google negligent in a case over the harm their platforms caused to a teenager's mental health.
The plaintiff was a girl named Kaylie, who is now 20 years old. She claimed that in her teenage years, she developed an addiction to Instagram and YouTube, which led to anxiety, depression, and a body image disorder.
The jury found the companies liable not for specific posts, but for how their platforms are designed. The trial examined, for example, infinite scroll and other mechanics that push users to spend more time in the apps. The jury awarded about $6 million in damages: approximately $4.2 million from Meta and $1.8 million from Google.
The plaintiff's lawyers sought to prove that the companies knew about the risks to teenagers. Internal documents presented in court showed that the companies had studied how addictive their services could be for young users, yet continued to roll out features that increased engagement. Company executives, including Meta CEO Mark Zuckerberg, testified in defense of those decisions, such as the lifting of a temporary ban on beauty filters, which the company internally considered potentially harmful to teenage girls.
This case is seen as a bellwether: a test case for thousands of similar lawsuits currently pending in California courts. In recent years, pressure on tech companies in the US has been growing over the influence of social networks on children and teenagers. Legislation is not keeping pace with technology, so the battle is increasingly moving to the courts.
The companies disagree with the decision and plan to appeal. But the verdict itself could already have consequences for the entire industry: if courts begin to recognize that harm can be caused by the structure of platforms, not just their content, social networks may have to change their algorithms, interfaces, and engagement mechanics, even if that slows down their business.
Comments
So what happens now, will they deliberately make products as bad as possible so that nobody likes them or gets addicted?
Design that stimulates endless consumption should be opt-in, not on by default, with all the warnings. And, like on a cigarette pack, the feed should periodically show shock images about the harm that overuse does to mental health.