LOS ANGELES (AP) — The world’s biggest social media companies face several landmark trials this year that seek to hold them responsible for harms to children who use their platforms. Opening statements for the first, in Los Angeles County Superior Court, began on Monday.
Instagram’s parent company Meta and Google’s YouTube face claims that their platforms deliberately addict and harm children. TikTok and Snap, which were originally named in the lawsuit, settled for undisclosed sums.
Jurors got their first glimpse into what will be a lengthy trial characterized by dueling narratives from the plaintiffs and the two remaining defendants. Proceedings are taking place at the Spring Street Courthouse in downtown Los Angeles.
Mark Lanier delivered the opening statement for the plaintiffs first, in a lively display where he said the case is as “easy as ABC,” which he said stands for “addicting the brains of children.” He called Meta and Google “two of the richest corporations in history” who have “engineered addiction in children’s brains.”
At the core of the Los Angeles case is a 19-year-old identified only by the initials “KGM,” whose case could determine how thousands of other, similar lawsuits against social media companies will play out. She and two other plaintiffs have been selected for bellwether trials — essentially test cases for both sides to see how their arguments play out before a jury and what damages, if any, may be awarded, said Clay Calvert, a nonresident senior fellow of technology policy studies at the American Enterprise Institute.
America has a litigation culture not because people are particularly fond of lawsuits, but because problems that other countries generally solve through legislation or regulatory action aren't solved that way in the US, so the only way to find out who is right is to go to court.
They are going to play the same old “freedom of choice” defense… aren’t they.
"It's not our fault we made it purposefully addictive, you could just not watch it." Hasn't this been the defense of every tobacco, soda, fast food, etc. company? For example, take the mainstream idea that weight gain is about caloric imbalance rather than about what you actually consume. That idea is mainstream because it helps the food companies sway public opinion in their favor. It's not our food that is horrible slop, disruptive to metabolism and engineered to make people eat more and more and still crave more, it's the people, who could just not eat it, and if they do eat it they could just run 10 km to sweat off the effects of, like, one sandwich.
They always shift the responsibility to individuals when they are pressed on their wrongdoings. "Freedom of choice" is the great lie that keeps society running, and it is the main defense against any complaint about why something is systematically awful and fundamentally inhuman, from food to labor markets.
I think the food analogy is a good one here. I have personally debated this false sense of choice a lot: in reality you are bombarded with every psychological tactic to keep you hooked. Instagram is no different in this sense. Whether this lawsuit leads anywhere, I do not know, but at some point the whole apparatus of manipulative algorithms should be addressed (by whom and when are the biggest questions).
I have long been of the opinion that all the big multinational social media platforms should be treated as global technical, communication and media infrastructure. All the companies should be seized and put under some global foundation or the UN, everything open sourced, costs paid by member states, and the platforms forced to remain impartial and to be organized around improving the human condition, development, communication and understanding. If there is no need for profit, then there is no need to entrap users in toxic swamps of algorithm hell for the sake of platform engagement.
Parents, right? That’s always the solution to platforms.
Edit: all the ironic upvotes. I was being sarcastic. Parents won’t keep their predator sons and daughters off Roblox.
removing or changing section 230 would also allow lemmy instances to be sued or taken down as well, for the content posted by users. it would increase government surveillance and basically allow the american government to dictate content across the entire internet. no more freedom of speech, whistleblowers, organization of protests, etc.
this all sounds well and good "for the sake of the chillren" but it's a trojan horse for government censorship.
the only people who would be able to afford the bill for what happens after this would be american social media companies. anything "independent" or emerging like the fediverse would get bot-swarmed with "illegal content" and then immediately sued into oblivion and outright removed.
this ensures complete loyalty of the digital space to the whims of the american government.
it would also allow them to remove things like wikipedia, the way back machine, the internet archive, and sites holding or spreading things around like the epstein files or at least sites holding peoples opinions of them.
And the multi-billion-dollar companies will use every bent strategy available to delay, prevent, obfuscate evidence, attack and destroy witnesses, etc. They will water down the impact to a harm-minimisation outcome and so set the precedent for how bad companies can be and still get away with it. We really need that precedent to be seriously strong.
is that nail ok? it's awfully curved, is that fine?