- cross-posted to:
- technology@lemmy.world
Did Covid-19 make everyone lose their minds? This isn’t even about being cruel or egotistical. This is just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026… for Deepfake porn including of minors??? From the Fortnite company guy???
Who else just did a search on the Epstein files for “Tim Sweeney”?
I didn’t find anything on jmail, but there’s still a lot that hasn’t been released, and a lot of stuff is still redacted.
As if we needed more reasons to hate on Epic Games.
They are all such vile people 💀
sigh I’ll get my boycotting tools.
What a surprise, Tim Sweeney is still a shit person
Pedo and fascist defender Tim Sweeney. Burn his business down by withdrawing your patronage, money, time, etc. Boards can find CEOs who are not supporters of pedos and fascists.
This guy needs to be gimped and fucked by zombie epstein with vance warming up with a recliner waiting on his turn. Can twitter make that happen?
I’m no fan of banning this or that particular platform (it’s like trying to get rid of cheeseburgers by banning McDonalds; the burgers are still available from all the other burger chains and all the people who use the one will just switch to others) but this is a hilariously wrong way to get to the right answer.
This is almost as sus as the specific preferred age range terminology for pedophiles that comes up now and again in the most uncomfortable of scenarios.
The only “charitable” take I can give this is that he’s been fighting Apple and Google over store fees and the like and that he feels like if he says that Apple/Google can do this then they should be able to restrict EGS as well.
I don’t know why AI-generated CSAM is the hill you’d make this point with, though.
Zionazi oligarchist supremacism controlling media/speech, promoting hate and genocide, is reason to zero out his finances and media control. That the bipartisan establishment loves all of this means this performative whining over image-generation tools that can be used to fake offense is the permitted pathetic discourse the establishment masquerades as democracy.
Yeah. I’m as tired of the argument that pretty much anything goes as far as free speech goes as I am of the “everything is a slippery slope when we make laws to keep people from doing harmful shit.”
I mean, what’s the required damage before people put a stop to inciteful speech and objectively harmful lies? Or making CSAM of kids using a platform like X? Germany had to kill a few million people before they decided that maybe displaying Nazi symbols and speech wasn’t a good idea.

So we have a platform being used to make CSAM. What’s it going to take before someone says that this is a bad idea and shouldn’t be done? How many kids will commit suicide after being taunted and shamed for their images being used? How many is “enough”?

There should be immediate action to end the means to use these tools to make porn. There’s plenty of porn available on the internet, and making it from user-submitted images on a major public platform is a horrible idea, but too many people make up all kinds of reasons why we can’t do that… economic, censorship, whatever.
IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.
Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.
This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.
And it applies to children and adults. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will be always non-consensual, thus always victimising the children in question.
Now, someone else mentioned that Bart’s dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, he is a fictional character. There’s no victim.
That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.
Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.
There ARE victims, lots of them.
That is a lot of text for someone that couldn’t even be bothered to read a comment properly.
Non-consensual porn victimises the person being depicted
This is still true if the porn in question is machine-generated
The real thing to talk about is the presence or absence of a victim.
Which they then talk about and point out that victims are absolutely present in this case…
If this is still too hard to understand i will simplify the sentence. They are saying:
“The important thing to talk about is, whether there is a victim or not.”
It doesn’t matter if there’s a victim or not. It’s the depiction of CSA that is illegal.
So no, talking about whether or not there’s a victim is not the most important part.
It doesn’t matter if you draw it by hand with crayons. If it’s depicting CSA it’s illegal.
Nobody was talking about the “legality”. We are talking about morals. And morally there is major difference.
Talking about morals and morality is how you end up getting things like abortion banned. Because some people felt morally superior and wanted to enforce their superior morality on everyone else.
There’s no point in bringing it up. If you need to bring up morals to argue your point, you’ve already failed.
But please do enlighten me. Because personally, I don’t think there’s a moral difference between depicting “victimless” CSAM and CSAM containing a real person.
I think they’re both, morally, equally awful.
But you said there’s a major moral difference? For you maybe.
That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.
Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.
There ARE victims, lots of them.
You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying I didn’t read the text. (I did.)
Learn to read dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.
Is this clear now?
Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:
The real thing to talk about is the presence or absence of a victim.
Which has never been an issue. It has never mattered in CSAM if it’s fictional or not. It’s the depiction that is illegal.
Tim Sweeney is a jackass.