Is there any way to make it use less power as it gets more advanced, or will there be huge power plants dedicated solely to AI all over the world soon?
My understanding is that traditional AI essentially takes a brute-force approach to learning, and because it is hardwired, its ability to learn and make logical connections is impaired.
Newer technologies like organic computers that use neurons can change and adapt as they learn, forming new pathways for information to travel along, which reduces processing requirements and, in turn, power requirements.
https://www.techradar.com/pro/a-breakthrough-in-computing-cortical-labs-cl1-is-the-first-living-biocomputer-and-costs-almost-the-same-as-apples-best-failure
https://corticallabs.com/cl1.html
Machine learning has always felt like a very wasteful way to use data: even with ridiculous quantities of it, the results are still kinda meh. So you just dump in even more data, and eventually you get something that works.
Lower-voltage chip advancements, along with better cooling options, may come along someday.
They should consider building their data centers underwater in places like Iceland.
That's only a short-term solution; global warming will negate those benefits.
Southern Ocean currents just reversed and will likely cause rapid warming of water temperatures.
Southern Ocean circulation reversed
Southern Ocean current reverses for first time, signalling risk of climate system collapse
France and Switzerland just had to shut down their nuclear reactors because the water sources they use for cooling were too warm.
France and Switzerland shut down nuclear power plants amid scorching heatwave
When heat halts power: Europe’s nuclear dilemma
None of that is terrifying at all /s.