• Devanismyname@lemmy.ca · 18 hours ago

    It’ll just keep getting better at it over time though. The current AI is way better than it was 5 years ago, and in 5 years it’ll be way better than it is now.

    • GenosseFlosse@feddit.org · 11 hours ago

      To get better it would need better training data. However, there are always more junior devs creating bad training data than senior devs creating slightly better training data.

      • SaraTonin@lemm.ee · 8 hours ago

        And now LLMs are being trained on data generated by LLMs. No possible way that could go wrong.

    • almost1337@lemm.ee · 17 hours ago

      That’s certainly one theory, but as we’re largely out of training data, there’s not much new material to feed in for refinement. Using AI output to train future AI is just going to amplify the existing problems.
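
      A toy way to see the amplification (a deliberately crude sketch, nothing like how real labs actually train models): fit a simple “model” to some data, sample from the fit, train the next “generation” only on those samples, and repeat. Everything in the snippet (the distribution, the sample size, the number of generations) is made up purely for illustration.

      ```python
      import random
      import statistics

      # Toy "model": a normal distribution fitted to whatever data it last saw.
      # Each generation is trained only on samples produced by the previous one.
      random.seed(42)
      samples_per_generation = 20  # deliberately small so estimation error bites
      data = [random.gauss(0.0, 1.0) for _ in range(samples_per_generation)]  # "human" data

      for generation in range(1, 201):
          mu = statistics.fmean(data)
          sigma = statistics.stdev(data)
          # Train the next generation purely on the current generation's output.
          data = [random.gauss(mu, sigma) for _ in range(samples_per_generation)]
          if generation % 40 == 0:
              print(f"generation {generation:3d}: mean={mu:+.3f}, stdev={sigma:.3f}")

      # In a typical run the spread (stdev) collapses toward zero while the mean
      # wanders off its starting value: estimation errors compound instead of
      # washing out, which is the "amplify existing problems" point in miniature.
      ```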

      • Devanismyname@lemmy.ca · 16 hours ago

        I mean, the proof is sitting there wearing your clothes. General intelligence exists all around us. If it can exist naturally, we can eventually reproduce it with technology. Maybe there need to be more breakthroughs before it happens.

        • Nalivai@lemmy.world · 7 hours ago

          Everything is possible in theory. That doesn’t mean everything has happened, or is just about to happen.

            • mindbleach@sh.itjust.works · 13 hours ago

            I mean - have you followed AI news? This whole thing kicked off maybe three years ago, and now local models can render video and do half-decent reasoning.

            None of it’s perfect, but a lot of it’s fuckin’ spooky, and any form of “well it can’t do [blank]” has a half-life.

              • SaraTonin@lemm.ee · 8 hours ago

                If you follow AI news you should know that the field is basically out of fresh training data, that returns on extra training diminish sharply (so more training data would only have a limited impact anyway), that companies are starting to train AI on AI-generated data, both intentionally and unintentionally, and that hallucinations and unreliability are baked into the technology.
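
                To put a rough shape on “limited impact”: published scaling-law fits have model loss falling roughly as a power law in dataset size, so each doubling of the data buys a smaller improvement than the doubling before it. A back-of-envelope sketch, with constants invented for illustration rather than taken from any real fit:

                ```python
                # Toy illustration of diminishing returns from more training data.
                # Assumes a Chinchilla-style data term, loss(D) = E + A / D**beta;
                # E, A and beta below are invented numbers, not a real fit.
                E = 1.7      # "irreducible" loss (invented)
                A = 400.0    # scale constant (invented)
                beta = 0.28  # data exponent (invented)

                def loss(tokens: float) -> float:
                    """Predicted loss after training on `tokens` tokens under the toy fit."""
                    return E + A / tokens ** beta

                previous = None
                for tokens in [1e11, 2e11, 4e11, 8e11, 1.6e12, 3.2e12]:
                    current = loss(tokens)
                    note = "" if previous is None else f"  (improvement {previous - current:.4f})"
                    print(f"{tokens:.1e} tokens -> loss {current:.4f}{note}")
                    previous = current
                # Each doubling improves the toy loss by less than the doubling before
                # it, which is the "limited impact" argument, if the power-law picture holds.
                ```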

                You also shouldn’t take the improvements at face value. The latest ChatGPT is better than the previous version, for sure. But its achievements are exaggerated (for example, it already knew the answers ahead of time for the specific maths questions it was shown answering, and it isn’t better than its predecessor or other LLMs at solving maths problems whose answers it doesn’t already have hardcoded), and the way it operates is to have a second LLM check its outputs, which means it takes, IIRC, 4-5 times the energy (and therefore cost) per answer for a marginal improvement in functionality.

                The idea that “they’ve come on in leaps and bounds over the last 3 years, therefore they will continue to improve at that rate” isn’t really supported by the evidence.

              • Korhaka@sopuli.xyz · 11 hours ago

              I’ve seen a few YouTube channels now that just churn out AI-generated content, usually audio only with a generated picture on screen. Vast amounts can be made that cheaply, and Google is going to have fun storing it all when each video only gets like 25 views. I think at some point they’re going to have to start deleting stuff.