• KelvarCherry@lemmy.blahaj.zone · 42 minutes ago

    Did Covid-19 make everyone lose their minds? This isn’t even about being cruel or egotistical. This is just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026… for deepfake porn, including of minors??? From the Fortnite company guy???

  • SpaceCowboy@lemmy.ca · 1 hour ago

    Who else just did a search on the Epstein files for “Tim Sweeney”?

    I didn’t find anything on jmail, but there’s still a lot that hasn’t been released, and a lot of stuff is still redacted.

  • MrSulu@lemmy.ml · 3 hours ago

    Pedo and fascist defender Tim Sweeney. Burn his business down by withdrawing your patronage, money, time, etc. Boards can find CEOs who are not supporters of pedos and fascists.

  • MehBlah@lemmy.world · 47 minutes ago

    This guy needs to be gimped and fucked by zombie Epstein, with Vance warming up in a recliner waiting on his turn. Can Twitter make that happen?

  • Sunsofold@lemmings.world · 4 hours ago

    I’m no fan of banning this or that particular platform (it’s like trying to get rid of cheeseburgers by banning McDonald’s; the burgers are still available from all the other burger chains, and the people who use that one will just switch to others), but this is a hilariously wrong way to get to the right answer.

  • Grass@sh.itjust.works · 5 hours ago

    This is almost as sus as the specific preferred-age-range terminology for pedophiles that comes up now and again in the most uncomfortable of scenarios.

  • criss_cross@lemmy.world · 6 hours ago

    The only “charitable” take I can give this is that he’s been fighting Apple and Google over store fees and the like, and he feels that if he concedes Apple/Google can do this, then they could restrict EGS on the same grounds.

    I don’t know why AI-generated CSAM is the hill you’d make this point on, though.

  • humanspiral@lemmy.ca · 4 hours ago

    Zionazi oligarch supremacism controlling media/speech and promoting hate and genocide is reason to zero out his finances and media control. That the bipartisan establishment loves all of this means this performative whining over image-generation tools that can be used to fake offense is the permitted, pathetic discourse the establishment masquerades as democracy.

  • RememberTheApollo_@lemmy.world · 9 hours ago (edited)

    Yeah. I’m as tired of the argument that pretty much anything goes in the name of free speech as I am of the “everything is a slippery slope when we make laws to keep people from doing harmful shit” argument.

    I mean, what’s the required damage before people put a stop to inciteful speech and objectively harmful lies? Or to making CSAM of kids using a platform like X? Germany had to kill a few million people before deciding that maybe displaying Nazi symbols and speech wasn’t a good idea. So we have a platform being used to make CSAM. What’s it going to take before someone says this is a bad idea and shouldn’t be done? How many kids will commit suicide after being taunted and shamed over their images being used? How many is “enough”?

    There should be immediate action to end the means to use these tools to make porn. There’s plenty of porn available on the internet already, and making it from user-submitted images on a major public platform is a horrible idea. But too many people make up all kinds of reasons why we can’t do that… economic, censorship, whatever.

  • Lvxferre [he/him]@mander.xyz · 12 hours ago

    IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated and the sexual acts being depicted did not happen, like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And it applies to children and adults. The only difference is that adults can still consent to have their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

    Now, someone else mentioned that Bart’s dick appears in the Simpsons movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.

    • Atomic@sh.itjust.works · 12 hours ago

      That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

      Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      • unexposedhazard@discuss.tchncs.de · 11 hours ago

        That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

        Non-consensual porn victimises the person being depicted

        This is still true if the porn in question is machine-generated

          • unexposedhazard@discuss.tchncs.de · 9 hours ago

            Which they then talk about and point out that victims are absolutely present in this case…

            If this is still too hard to understand, I will simplify the sentence. They are saying:

            “The important thing to talk about is, whether there is a victim or not.”

            • Atomic@sh.itjust.works · 7 hours ago

              It doesn’t matter if there’s a victim or not. It’s the depiction of CSA that is illegal.

              So no, talking about whether or not there’s a victim is not the most important part.

              It doesn’t matter if you draw it by hand with crayons. If it’s depicting CSA it’s illegal.

                • Atomic@sh.itjust.works · 2 hours ago

                  Talking about morals and morality is how you end up getting things like abortion banned, because some people felt morally superior and wanted to enforce their superior morality on everyone else.

                  There’s no point in bringing it up. If you need to bring up morals to argue your point, you’ve already failed.

                  But please do enlighten me, because personally I don’t think there’s a moral difference between depicting “victimless” CSAM and CSAM containing a real person.

                  I think they’re both, morally, equally awful.

                  But you said there’s a major moral difference? For you maybe.

      • Lvxferre [he/him]@mander.xyz · 11 hours ago

        That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

        Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

        There ARE victims, lots of them.

        You’re only rewording what I said in the third paragraph, while implying I said the opposite, and bullshitting/assuming/lying that I didn’t read the text. (I did.)

        Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

        Is this clear now?

        • Atomic@sh.itjust.works · 10 hours ago

          Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

          The real thing to talk about is the presence or absence of a victim.

          Which has never been an issue. For CSAM it has never mattered whether it’s fictional or not; it’s the depiction that is illegal.