The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could push daily energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. Spread over 24 hours, it works out to an average draw of about 1.9 GW. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the output of roughly two nuclear reactors, an amount that could be enough to power a small country.
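
The arithmetic above can be checked directly, using only the figures quoted in the article (18 Wh per query, 2.5 billion queries per day, 1–1.6 GW per reactor):

```python
# Sanity-check of the energy figures quoted above.
# Assumptions (from the article): 18 Wh per query, 2.5e9 queries per day.
queries_per_day = 2.5e9
wh_per_query = 18

daily_gwh = queries_per_day * wh_per_query / 1e9   # Wh -> GWh
avg_draw_gw = daily_gwh / 24                       # continuous draw over 24 h

print(f"{daily_gwh:.0f} GWh per day")        # 45 GWh per day
print(f"{avg_draw_gw:.2f} GW average draw")  # 1.88 GW average draw
# At 1-1.6 GW per reactor, that is about 1.2-1.9 reactors' worth of output.
```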

        • Encrypt-Keeper@lemmy.world · 4 minutes ago (edited)

          AI models require a LOT of VRAM to run. Failing that, they need some serious CPU power, but it’ll be dog slow.

          A consumer model with only a small fraction of the capability of the latest ChatGPT model would require a $2,000+ graphics card, if not more than one.

          Like, I run a local LLM with an RTX 5070 Ti, and the best model I can run on that thing is good for ingesting some text to generate tags and such, but not a whole lot else.
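
A rough way to see why VRAM is the bottleneck the commenter describes: just holding a model’s weights takes parameter count times bytes per parameter. This is a simplified, weights-only estimate (it ignores activations, KV cache, and framework overhead), and the model sizes below are illustrative, not taken from the comment:

```python
# Rough VRAM (GB) needed just to hold a model's weights in memory.
# Simplification: ignores activations, KV cache, and framework overhead.
def weight_vram_gb(params_billion: float, bits_per_param: int) -> float:
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

print(weight_vram_gb(7, 16))   # 7B at fp16: 14.0 GB -> needs a high-end card
print(weight_vram_gb(7, 4))    # 7B 4-bit quantized: 3.5 GB -> mid-range GPU
print(weight_vram_gb(70, 4))   # 70B at 4-bit: 35.0 GB -> beyond any single consumer card
```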

            • Encrypt-Keeper@lemmy.world · 37 seconds ago

              Like, make a query and then go make yourself a sandwich while it spits out a word every other second. That kind of slow.

              There are very small models that can run on mid-range graphics cards and all, but it’s not something you’d look at and say, “Yeah, this does most of what ChatGPT does.”

    • ckmnstr@lemmy.world · 16 hours ago

      Probably not a flash drive, but you can get decent mileage out of 7B models that run on any old laptop for tasks like text generation, shortening, or summarizing.