As policy makers in the UK weigh how to regulate the AI industry, Nick Clegg, former UK deputy prime minister and former Meta executive, claimed a push for artist consent would “basically kill” the AI industry.

Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it wasn’t feasible to ask for consent before ingesting their work.

“I think the creative community wants to go a step further,” Clegg said, according to The Times. “Quite a lot of voices say, ‘You can only train on my content, [if you] first ask’. And I have to say that strikes me as somewhat implausible, because these systems train on vast amounts of data.”

“I just don’t know how you go around, asking everyone first. I just don’t see how that would work,” Clegg said. “And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight.”

    • GenderNeutralBro@lemmy.sdf.org

      This is true almost every time someone says, “but without <obviously unethical thing>, these businesses couldn’t survive!” Same deal with all the spyware that’s part of our daily lives now. If it’s not possible for you to make a smart TV without spying on me, then cool, don’t make smart TVs.

      If your business model crumbles under the weight of ethics, then fuck your business model and fuck you.

      Related: https://www.eff.org/deeplinks/2019/06/felony-contempt-business-model-lexmarks-anti-competitive-legacy

    • themurphy@lemmy.ml

      There’s a big difference between generative image AI and AI for, let’s say, the medical industry, DeepMind, etc.

      And yes, you can ban the first without the other.

      Going after AI as a whole makes no sense, and this politician also makes it seem like it’s all the same thing.

      Saying “AI” is like just saying “the internet” when you want to ban a specific site.

      • megopie@beehaw.org

        There is a very interesting dynamic occurring, where things that didn’t used to be called AI have been rebranded as such, largely so companies can claim they’re “using AI” to make shareholders happy.

        I think it behooves all of us to stop referring to things blanketly as “AI” and to name the specific technologies and companies that are the problem.

  • FergleFFergleson@infosec.pub

    I’m starting to think we need to reframe this a little. Stop referring to “artists”. It’s not just lone, artistic types that are getting screwed here, it’s literally everyone who has content that’s been exposed to the Internet. Artists, programmers, scientists, lawyers, individuals, companies… everyone. Stop framing this as “AI companies versus artists” and start talking about it as “AI companies versus intellectual property right holders”, because that’s what this is. The AI companies are choosing to ignore IP law because it benefits them. If anyone, in any other context, tried to use this as a legal defense they would be laughed out of the courtroom.

  • HelixDab2@lemm.ee

    Nick Clegg says asking artists for use permission would ‘kill’ the AI industry

    I fail to see any downside to this.

  • tuhriel@infosec.pub

    If your business model only works when you don’t follow moral or actual laws… it shouldn’t exist!

    Unfortunately, capitalism doesn’t work like that…

  • Kichae@lemmy.ca

    I bet door-to-door salespeople would make way more money if they could just break into your homes, leave their junk on your table, and steal your credit card, and yet we don’t let them do that.

  • LandedGentry@lemmy.zip

    This is the same shit sites like YouTube use to get out of being accountable for anything they do. “We are too big. It is unreasonable to ask us to follow the law, so our benchmark of what counts as a good-faith attempt should suffice.”

    Motherfucker, then don’t be so big! If I’m a real estate developer and my building collapses, killing 100 people, I can’t go, “My empire is too big. It is unreasonable to expect me to follow all the various codes and ordinances designed to keep people safe.”

  • IllNess@infosec.pub

    Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models.

    No, it should be the opposite. The creative community should have to opt in. AI can run off the uploaded pieces. Everything else is theft.

    But he claimed it wasn’t feasible to ask for consent before ingesting their work first.

    What the fuck…?! Send a fucking email. If you don’t get an answer, then it’s a “No”. Learn to take no for an answer.

    • tuhriel@infosec.pub

      The big issue is that they don’t just fail to ask; they also actively ignore it when someone says “no” upfront, e.g. in a robots.txt.
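
      For reference, a site’s upfront “no” usually looks something like this. A sketch of a robots.txt that refuses some AI crawlers (the user-agent tokens shown are examples, and compliance is entirely voluntary on the crawler’s side):

      ```text
      # Example robots.txt: refuse known AI training crawlers,
      # allow everything else. Honoring this is up to the crawler.
      User-agent: GPTBot
      Disallow: /

      User-agent: CCBot
      Disallow: /

      User-agent: *
      Allow: /
      ```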

    • Saleh@feddit.org

      Yeah, if they can’t be bothered to check for an opt-in, why should we trust them to respect an opt-out?

  • will@lemm.ee

    Perhaps the government should collect money from the AI companies — they could call it something simple, like “taxes” — and distribute the money to anyone who has ever written something that made its way to the internet (since we can reasonably assume that everything posted online has now been sucked into the slop machines).

    • takeda@lemm.ee

      I think the primary goal of LLMs is to use them on social media to influence public opinion.

      Notice that all companies that have social media are heavily invested in them. Also, the recent fiasco with Grok talking about South African apartheid without being asked shows that such functionality is being added.

      I think talking about them replacing white collar jobs is a distraction. Maybe they can replace some, but the “daydreaming” (such a nice word for bullshit) makes the technology not very useful in that direction.

    • Avid Amoeba@lemmy.ca

      What a fucking shocking idea, right? My mind is blown, and I’m sure Mr. Clegg would be ecstatic when we tell him about it! /s

      Greedy dumb mfkers.

  • hperrin@lemmy.ca

    Oh no wouldn’t that be a shame. /s

    I’m sorry but if your industry requires that you commit a bunch of crimes to make money, it’s not a legitimate industry, it’s a criminal industry. We’ve had these for a long time, and generally they’re frowned upon, because the crimes are usually drugs, guns, murder, sex trafficking, or theft. When the crime is intellectual property theft, apparently we forget to care. Then again, same with wage theft.