Artificial intelligence is advancing at a speed no one anticipated. Models are growing, data is multiplying, and companies are competing to build ever-larger data centers. But there's an obstacle that's starting to slow this expansion: energy.

It may sound strange, but it's perfectly logical. A modern data center needs immense amounts of electricity, both to power thousands of chips and to cool them. And no matter how much money a company has, it can't circumvent the physical limitations of the power grid.

Building a data center can take two or three years. But reinforcing the power grid to supply it can take five to ten. High-voltage lines, substations, and permits are a very real bottleneck.

In fact, the CEOs of Microsoft and NVIDIA have already issued warnings along these lines: the growth of AI will be limited not by hardware, but by available energy.

And they're right. A single training cluster can consume as much energy as a small city. The current electrical grid isn't designed to support hundreds of similar centers operating simultaneously.
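To get a feel for the order of magnitude, here is a rough back-of-envelope sketch. All figures are illustrative assumptions, not measured data: a hypothetical cluster of 100,000 accelerators, each drawing around 700 W, with an extra 50% for cooling and power delivery, compared against the average continuous demand of a typical home.

```python
# Rough, illustrative back-of-envelope estimate (all figures are assumptions,
# not measurements) of how a large AI training cluster compares to a small city.

NUM_ACCELERATORS = 100_000       # hypothetical cluster size
WATTS_PER_ACCELERATOR = 700      # assumed draw of one high-end AI chip, in watts
OVERHEAD_FACTOR = 1.5            # assumed extra for cooling, networking, power losses
HOUSEHOLD_AVG_WATTS = 1_200      # assumed average continuous demand of one home, in watts

cluster_watts = NUM_ACCELERATORS * WATTS_PER_ACCELERATOR * OVERHEAD_FACTOR
cluster_megawatts = cluster_watts / 1e6
equivalent_households = cluster_watts / HOUSEHOLD_AVG_WATTS

print(f"Estimated cluster demand: {cluster_megawatts:.0f} MW")
print(f"Roughly equivalent to {equivalent_households:,.0f} average homes")
```

Under these assumptions the cluster lands around 100 MW, on the order of 80,000 to 90,000 homes: small-city territory, exactly as the comparison suggests.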

But not everyone agrees with the most alarmist predictions. Some experts argue that technological advances will make AI far more efficient: more compact models, personal computers with chips that handle AI workloads alongside general computing, local devices with built-in AI, and hybrid systems that don't depend on massive data centers.

In other words, it's not all about multiplying the number of mega-training centers. But until that future arrives—if it ever does—operators don't want to take any risks.

Those in charge of new data centers are looking for any available energy source. And literally anything will do:

- Old nuclear power plants about to close.
- Small modular reactors (SMRs) still under development.
- Coal-fired plants, highly polluting but reliable.
- Natural gas plants with rapid response capabilities.

The urgency is such that some tech companies have scaled back their climate commitments. Google, for example, removed its goal of achieving net-zero emissions by 2030 from its website in June. A clear message: AI demands more energy than they anticipated.

And as ground-based solutions begin to dwindle, some companies are looking to the heavens. This isn't a metaphor: they are proposing space-based solar power plants, panels that orbit the Earth and transmit their energy via microwaves. Google has already announced tests for 2027.

A bold, almost science-fiction-like idea, but one that demonstrates the extent to which the sector is seeking alternatives. Is it creativity? Is it desperation? Perhaps both.

The expansion of AI is linked to energy, not just silicon. If we don't figure out how to power the necessary infrastructure, AI will advance more slowly than expected. And if we solve it poorly—using coal or gas—we will pay the price in polluting emissions.

The challenge is significant, but it can also drive radical innovations in efficiency, storage, and generation.

Once again, technology is forcing us to rethink the world we are building.

By Amador Palacios

