Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling water stays inside the loop, except in exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with net-negative water consumption?
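
For scale, here's the back-of-the-envelope math I'm doing in my head (the PC wattage and session length are made-up numbers, not measurements):

```python
# What would a gaming session "cost" in water at the quoted ~1.9 L/kWh rate?
WATER_L_PER_KWH = 1.9   # data-center figure quoted above
pc_watts = 400          # assumed whole-PC draw while gaming (illustrative)
hours = 4               # assumed session length (illustrative)

energy_kwh = pc_watts / 1000 * hours              # 0.4 kW * 4 h = 1.6 kWh
implied_water_l = energy_kwh * WATER_L_PER_KWH    # ~3 L
print(f"{energy_kwh:.1f} kWh -> ~{implied_water_l:.1f} L of water at data-center rates")
```

By that figure my PC should be "drinking" a few liters per evening, and it obviously isn't.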

  • brucethemoose@lemmy.world · 6 days ago

    To add to what others said, it’s a tradeoff.

    Your gaming PC doesn't just run up your electric bill at the wall; it runs up the AC's bill too, since the AC has to work to pump all that heat back out of the room.

    What the data center does is the equivalent of water cooling your PC and piping the loop to a hot tub outside. The tub would heat up and evaporate water faster, but the heat rejection is basically free and uses basically no electricity.

    That’s the tradeoff: water evaporation instead of heat pumps. You’re trading water consumption for a big cut in electricity consumption, which in some cases is well worth it (rough numbers sketched at the end of this comment).

    And what if you live in a cold climate, you say? Well, evaporative cooling is most cost-efficient in hot and (ironically) dry climates.
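
    Rough numbers for that tradeoff, if you're curious: assuming water's latent heat of vaporization near ambient (~2,400 kJ/kg) and a made-up chiller COP of 4, so treat this as a sketch rather than real plant data:

    ```python
    # Two ways of rejecting the heat from 1 kWh of computing:
    # evaporating water vs. running a compressor chiller (a heat pump).
    LATENT_HEAT_KJ_PER_KG = 2400   # approx. heat to evaporate 1 kg of water near ambient temps
    CHILLER_COP = 4.0              # assumed chiller efficiency (heat moved per unit of electricity)

    heat_kj = 1.0 * 3600           # 1 kWh of heat = 3,600 kJ

    water_kg = heat_kj / LATENT_HEAT_KJ_PER_KG    # ~1.5 kg, i.e. ~1.5 L of water evaporated
    chiller_kwh = 1.0 / CHILLER_COP               # ~0.25 kWh of electricity spent instead

    print(f"Evaporative cooling: ~{water_kg:.1f} L of water, almost no electricity")
    print(f"Compressor chiller:  ~{chiller_kwh:.2f} kWh of extra electricity, almost no water")
    ```

    That lands in the same ballpark as the ~1.9 L/kWh figure in the question: evaporate a liter or two of water per kWh of computing and you skip most of the chiller's electricity bill.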