Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water-cooled, the cooling water stays in the computer, except in exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
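For scale, here is a quick back-of-the-envelope sketch of what the quoted 1.9 L/kWh figure would imply for a home gaming session. The GPU wattage and session length below are assumed for illustration, not stated in the post:

```python
# Rough sanity check of the numbers in the question.
# Assumed values: a 300 W "hundreds-of-watts" GPU and a 4-hour session.

DATACENTER_WATER_L_PER_KWH = 1.9   # figure quoted in the question
GPU_POWER_W = 300                  # assumed GPU draw
SESSION_HOURS = 4                  # assumed session length

energy_kwh = GPU_POWER_W / 1000 * SESSION_HOURS
implied_water_l = energy_kwh * DATACENTER_WATER_L_PER_KWH

print(f"Energy used: {energy_kwh:.2f} kWh")
print(f"Water a data center would evaporate for that: {implied_water_l:.2f} L")
# -> about 1.2 kWh and roughly 2.3 L of water
```

So even at data-center rates, a long gaming session would only correspond to a couple of liters of evaporated cooling water.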

  • SaltSong@startrek.website · 4 days ago

    I’m an actual engineer with a degree and everything. Although this is not my area of expertise, it’s one I’m familiar with.

    They could do something like you suggest, but every step becomes more expensive and less effective. The exhaust from a coal-fired power plant is still plenty hot, and more energy could be extracted from it, but it takes more and more to make less and less (a rough sketch of the thermodynamic limit follows this comment).

    The curse of every engineer is to see a way to turn every waste stream into a useful product, but not being able to do so profitably. (Which means no one will approve the project.)
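To put rough numbers on those diminishing returns, here is a minimal sketch of the Carnot limit for recovering work from low-grade exhaust heat. The 120 °C exhaust and 25 °C ambient temperatures are assumed for illustration, not taken from the thread:

```python
# Minimal sketch: Carnot upper bound on converting low-grade waste heat to work.
# Temperatures are assumed for illustration only.

T_HOT_C = 120.0   # assumed exhaust temperature after the main power cycle
T_COLD_C = 25.0   # assumed ambient temperature

t_hot_k = T_HOT_C + 273.15
t_cold_k = T_COLD_C + 273.15

# Carnot efficiency: the best any heat engine could possibly do
carnot_limit = 1 - t_cold_k / t_hot_k

print(f"Carnot limit at {T_HOT_C:.0f} degC exhaust: {carnot_limit:.1%}")
# -> about 24% before any real-world losses, which is why the extra
#    equipment to capture even part of it rarely pays for itself.
```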