cross-posted from: https://lemmy.world/post/3320637
YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead::The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.
YouTube needs to be punished for their hypocrisy.
Average Joe gets a community guidelines strike for “promoting violence” because he said “Dead” instead of “Unalived”, but Penis Prager can advocate for beating your gay kids till they turn straight and YouTube just throws it into everyone’s playlists without so much as a “Boys will be boys”
This is so so stupid. We should also sue the ISPs then; they enabled the use of YouTube and Reddit. And the phone provider for enabling communications. This is such a dangerous slippery slope to put any blame on the platforms.
I think blaming/suing the company that is nearest to the user should work fine. (The following is hyperbole.) If you don’t draw the line there, then yes, it would be slippery, because eventually the Big Bang would need to be sued. But that makes no sense.
I think the issue isn’t just providing access to the content, but using algorithms that make it more likely for deranged people to view more and more content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen, all because the platforms make more money the more people watch, whether it is harmful or not.
What if it isn’t algorithms but upvotes? What if Lemmy is next?
Yeah, the difference is in whether or not the company is choosing what to put in front of a viewer’s eyes.
For the most part an ISP just shows people what they request. If someone gets bomb making directions from YouTube it would be insane to sue AT&T because AT&T delivered the appropriate packets when someone went to YouTube.
On the other end of the spectrum is something like Fox News. They hire every host, give them timeslots, have the opportunity to vet guests, accept advertising money to run against their content, and so on.
Section 512 of the DMCA treats “online service providers” like YouTube and Reddit as if they’re just ISPs, merely hosting content that is generated by users. OTOH, YouTube and Reddit use ML systems to decide what the users are shown. In the case of YouTube, the push to suggest content to users is pretty strong. You could argue they’re much closer to the Fox News side of things than to the ISP side these days. There’s no human making the decisions on what content should be shown, but does that matter?
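To make that distinction concrete, here’s a minimal, purely illustrative sketch (not YouTube’s or Reddit’s actual code; the `Video` class, the “predicted watch time” signal, and the numbers are all made up) of passive delivery versus engagement-ranked recommendation:

```python
# Toy illustration only -- not any real platform's system.
# An ISP-like service returns exactly what was asked for;
# a recommender actively chooses what to surface next,
# typically by ranking candidates on some engagement metric.

from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical engagement signal


def isp_like_fetch(catalog: dict, requested_id: str) -> Video:
    """Passive delivery: return only what the user explicitly requested."""
    return catalog[requested_id]


def engagement_ranked_feed(candidates: list, k: int = 3) -> list:
    """Active curation: the service decides what the user sees next,
    ordered by whatever metric it optimizes (here, predicted watch time)."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]


if __name__ == "__main__":
    candidates = [
        Video("calm history lecture", 12.0),
        Video("outrage-bait rant", 45.0),  # highest engagement wins the ranking
        Video("science explainer", 20.0),
    ]
    for v in engagement_ranked_feed(candidates):
        print(v.title)
```

The only point of the sketch is that the second function embodies an editorial choice of ranking metric, which is the part that arguably pushes a platform away from the ISP end of the spectrum.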
Yep. I often fall asleep to long YouTube videos that are science or history related. The algorithm is the reason why I wake up at 3am to Joe Rogan. It’s like a terrible autocomplete.
The algorithm is tailored to you. This says more about you. I never get recommended Rogan.
Everytown Law is about to get a lesson on how Section 230 works.
Pretty sure SCOTUS has a case they’re hearing currently that may very well change the scope of section 230, so I’d maybe reserve your quips until after that shakes out lol
The two big cases this year were already decided: https://en.wikipedia.org/wiki/Twitter,_Inc._v._Taamneh and https://en.wikipedia.org/wiki/Gonzalez_v._Google_LLC
Although both dodged the S230 claims, both made it clear that Twitter and Google, respectively, had no liability.
Is there another case I missed?
Reddit enables more than just racists; it’s a nasty cesspool on the level of 4chan, riddled with bots, and the CEO himself is a POS.
It’s a fucked up website, but if you think it’s remotely as bad as 4chan then I’d assume you’ve never been to /pol/. Reddit doesn’t allow the n word.
deleted by creator
It’s literally rule 1…
Rule 1: Remember the human. Reddit is a place for creating community and belonging, not for attacking marginalized or vulnerable groups of people. Everyone has a right to use Reddit free of harassment, bullying, and threats of violence. Communities and users that incite violence or that promote hate based on identity or vulnerability will be banned.
Go to r/4chan or r/Greentext; it’s all over the comment sections there.
Will be dismissed on section 230 grounds.
Their content promotion algorithms are not protected by section 230. Those algorithms are the real problem, pushing more and more radical content onto vulnerable minds. (The alt-right YouTube pipeline is pretty well documented. Reddit, I think, less so. But they still promote “similar content”)
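As a rough illustration of that “similar content” feedback loop (a toy sketch only; the tag weights, catalog, and similarity measure are invented, not how YouTube or Reddit actually rank anything), repeatedly recommending whatever is most similar to the last thing watched naturally walks a user further in the same direction:

```python
# Toy "more like this" loop -- illustrative only, not any real platform's recommender.
import math
from typing import Optional


def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse tag-weight vectors."""
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Hypothetical catalog: each item described by made-up tag weights.
catalog = {
    "fitness tips":       {"fitness": 1.0, "selfhelp": 0.5},
    "discipline podcast": {"selfhelp": 1.0, "grievance": 0.3},
    "anti-feminist rant": {"grievance": 1.0, "politics": 0.5},
    "extremist channel":  {"grievance": 1.0, "politics": 1.0},
}


def more_like_this(current: str, seen: set) -> Optional[str]:
    """Recommend the unseen item most similar to what was just watched."""
    unseen = [k for k in catalog if k not in seen]
    if not unseen:
        return None
    return max(unseen, key=lambda k: cosine(catalog[current], catalog[k]))


watched = "fitness tips"
seen = {watched}
while (nxt := more_like_this(watched, seen)) is not None:
    print(f"{watched} -> recommended: {nxt}")
    watched, seen = nxt, seen | {nxt}
```

Each hop looks locally reasonable by similarity, but the chain only ever drifts toward one cluster of tags, which is the pipeline effect described above.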
Section 230 requires “good faith removal or moderation of third-party material” (https://en.wikipedia.org/wiki/Section_230). Selection of click-bait algorithms is a choice. At least, they could select out promotion of murder and maiming without a sled ride.
One of the top subs on Reddit is literally called “master race” but if you call them out on it they say “it’s just a joke”.