It is well known, and easy to understand, that AI requires far more energy to function: the computers that execute its calculations need much greater computing power, and therefore consume more electricity.

The fact is that the data centers built until now drew around 30 megawatts (MW) of power, while the new ones equipped with AI-capable computers consume more than 80 MW, and this figure keeps growing. Some companies dedicated to building these data centers are even considering placing small nuclear plants of the SMR type alongside them to supply them with electricity.

It is estimated that AI already accounts for around 1% of global electricity consumption, a figure expected to rise significantly in the coming years. Some studies predict that AI's energy consumption could reach 20% of the global total by 2030, but these are all estimates made without real data.
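To put the article's power figures in perspective, here is a minimal back-of-envelope sketch. The 30 MW and 80 MW figures come from the text above; the global electricity total of roughly 27,000 TWh/year is my own rough assumption, used only for illustration.

```python
# Back-of-envelope check of the article's figures.
# The global consumption figure is an assumption, not a measurement.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_energy_twh(power_mw: float) -> float:
    """Annual energy of a facility running continuously at power_mw."""
    mwh = power_mw * HOURS_PER_YEAR  # megawatt-hours per year
    return mwh / 1_000_000           # convert MWh to TWh

# Figures from the article: older ~30 MW data centers vs. newer ~80 MW ones.
old_dc = annual_energy_twh(30)   # roughly 0.26 TWh/year
new_dc = annual_energy_twh(80)   # roughly 0.70 TWh/year

# Assumed global electricity consumption (rough, illustrative).
GLOBAL_TWH = 27_000
share_pct = 100 * new_dc / GLOBAL_TWH

print(f"80 MW data center: {new_dc:.2f} TWh/year "
      f"(~{share_pct:.4f}% of assumed global electricity)")
```

Under these assumptions, a single 80 MW facility running flat-out is only a few thousandths of a percent of global consumption; the 1% figure quoted above would therefore imply hundreds of such facilities (or their equivalent) worldwide.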

Because, currently, the companies that offer AI services (Microsoft, OpenAI, Google, etc.) provide hardly any data about their activities: they do not indicate how big their systems are, how many parameters their models handle, in which physical locations they are operating, and so on.

And of course they say nothing about their energy consumption or their financial results, although some "experts" claim that AI is beginning to generate positive returns. The reality is that opacity is almost total in this market niche, since competition is also very high and the future is yet to be decided.


An AI system consumes the most energy while it is "learning" (training), because that is when it processes the greatest amount of data. We should also know that generating photos and videos is more expensive than generating text.

Of course, there are various strategies to reduce AI energy consumption, including:

  • Developing more efficient algorithms: researchers are working on algorithms that require fewer computational resources to perform the same tasks.
  • Using more efficient hardware: manufacturers are developing specialized chips and servers for AI that are more energy efficient.
  • Implementing more efficient cooling: data centers are adopting technologies such as liquid cooling to reduce energy consumption.
  • Harnessing renewable energy: data centers can use renewable sources, such as solar and wind power, to run their systems.
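As a small illustration of the first two points, one common efficiency technique is reducing numerical precision: storing model weights as 8-bit integers instead of 32-bit floats cuts memory, and with it data movement and energy, roughly fourfold. The 7-billion-parameter model size below is a hypothetical example, not any company's real figure.

```python
# Sketch of how reduced precision shrinks a model's memory footprint.
# The parameter count is hypothetical, chosen only for illustration.

BYTES_FP32 = 4  # 32-bit floating point
BYTES_INT8 = 1  # 8-bit integer (quantized)

def model_size_gb(num_params: int, bytes_per_param: int) -> float:
    """Memory needed to store the weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

params = 7_000_000_000  # a hypothetical 7-billion-parameter model

fp32_gb = model_size_gb(params, BYTES_FP32)  # 28.0 GB
int8_gb = model_size_gb(params, BYTES_INT8)  # 7.0 GB

print(f"fp32: {fp32_gb:.1f} GB, int8: {int8_gb:.1f} GB, "
      f"reduction: {fp32_gb / int8_gb:.0f}x")
```

This only sketches the storage side; real quantization also changes accuracy and the arithmetic itself, but it shows why precision is one of the levers behind the "more efficient algorithms and hardware" strategies above.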

AI is a powerful tool with great potential to improve our lives, but its responsible use requires careful attention to its energy consumption. Implementing strategies to optimize the energy efficiency of AI is crucial to ensure sustainable development of this technology.

It is also important to think a little about who needs AI, and why. One question might be: do I really need AI for my personal affairs? And if I do, the need will be very limited, so my consumption should be very small. Using something without a real need is the most stupid and absurd thing there is, and we do that all the time today with our cell phones.

We upload many millions of photos and videos to the Internet every day just to pass the time. We consume without being aware of what we are doing. With AI, these effects would multiply exponentially at both a personal and a company level. We are stepping cheerfully into a "pond" whose depth we do not know, and it may be very, very deep.

I believe that it is essential to promote the development of responsible AI that is sustainable from an energy point of view. Because the future is either sustainable or it is not the future.

By Amador Palacios

Reflections of Amador Palacios on topics of Social and Technological News; other opinions different from mine are welcome
