Subject: Re: Power usage by AI
Yes, the amount of money, land, and resources (electricity, water for cooling, etc.) is staggering.
FWIW, my prediction is that in ten years these huge data centers will be what shopping malls are now: white elephants (dead ones) whose corpses no one quite knows what to do with.
A lot of the power is consumed by the matrix multiplies inside the large neural nets. But there are advances in matrix-multiply algorithms (type "efficient matrix multiply for AI" into perplexity.ai; also try "approximate"); some of the new approaches claim speedups of up to 100x.
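To make the "approximate matrix multiply" idea concrete, here's a minimal sketch of one family of such tricks: replace an exact product A @ B with a cheaper low-rank version. This is just an illustration of the general idea, not any specific published algorithm; the matrix sizes and rank are arbitrary choices for the demo.

```python
import numpy as np

def lowrank_matmul(A, B, k):
    """Approximate A @ B by first compressing A to rank k via truncated SVD.

    If A is m x d and B is d x n, the product then costs O(k*d*n + m*k*n)
    multiply-adds instead of O(m*d*n) -- a win when k << d.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_k = U[:, :k] * s[:k]           # m x k, columns scaled by singular values
    return A_k @ (Vt[:k] @ B)        # two skinny multiplies instead of one fat one

rng = np.random.default_rng(0)
# Build a matrix that is exactly rank 8, so a rank-8 approximation loses nothing.
A = rng.normal(size=(64, 8)) @ rng.normal(size=(8, 64))
B = rng.normal(size=(64, 64))

exact = A @ B
approx = lowrank_matmul(A, B, k=8)
print(np.allclose(exact, approx))    # → True (A is exactly rank 8 here)
```

Real network weights aren't exactly low rank, so in practice this trades a little accuracy for the speedup; that's the "approximate" part.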
But that's just tweaking the implementation of current neural networks. The "neurons" used in current AI nets are very simple, so one needs a lot of them, which means a lot of connections, and hence lots of matrix multiplies. There's work on using more complicated (more neuromorphic) neurons and fewer connections (again, perplexity.ai is your friend).
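To see why simple neurons force big matrices: a standard layer of artificial "neurons" is nothing but a weighted sum plus a nonlinearity, so all the capacity (and all the arithmetic) lives in the connection weights. A minimal sketch, with layer sizes chosen purely for illustration:

```python
import numpy as np

def dense_layer(x, W, b):
    """One layer of simple artificial neurons: weighted sum + ReLU.

    Each neuron does almost nothing on its own; essentially all the work
    is the matrix-vector product W @ x.
    """
    return np.maximum(W @ x + b, 0.0)

n_in, n_out = 1024, 1024
rng = np.random.default_rng(1)
W = rng.normal(size=(n_out, n_in)) * 0.01   # one weight per connection
b = np.zeros(n_out)
x = rng.normal(size=n_in)

y = dense_layer(x, W, b)
# Capacity comes from connection count, which grows as n_in * n_out:
print(W.size)   # → 1048576 multiply-accumulates for this one modest layer
```

A model stacks hundreds of such layers at far larger widths, which is why the power bill is dominated by matrix multiplies; richer per-neuron dynamics could, in principle, buy the same capability with fewer connections.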
Lest one think there's been a revolution in our understanding of human thought processes behind the current explosion in AI: Hinton himself has remarked (sorry, I don't have the reference to hand) that although there have been some very nice technical and algorithmic advances, today's neural networks aren't that much different from the old backprop days. The bigger contributors to what's happening now are the advances in compute power and the availability of huge online training datasets.
There's also work on growing brain cells in a dish with implanted electrodes ("brain organoids"): Front. Sci., 27 February 2023, Volume 1 | https://doi.org/10.3389/fsci.2...
One can envision some hybrid of such "dish brains" with current simulated neural networks. Then there's Musk's Neuralink and similar efforts to interface with the fully developed human brain, as opposed to brain organoids in a dish.
Here's a number to ponder when thinking of current AI power requirements: the human brain runs on roughly 20 watts.
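To put a rough ratio on that 20-watt figure: the numbers below for accelerator power draw and cluster size are back-of-the-envelope assumptions on my part, not measured figures, but they give a feel for the scale.

```python
BRAIN_WATTS = 20            # rough power draw of a human brain (from the text above)
GPU_WATTS = 700             # ballpark for one modern datacenter accelerator (assumption)
GPUS_PER_CLUSTER = 10_000   # hypothetical size of one large training cluster (assumption)

cluster_watts = GPU_WATTS * GPUS_PER_CLUSTER
print(cluster_watts / BRAIN_WATTS)   # → 350000.0 "brain-equivalents" of power
```

So under these assumptions, one training cluster draws the power budget of a few hundred thousand brains, before counting cooling.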