The arrival of artificial intelligence has led to explosive growth in data centers worldwide. Every question asked by a chatbot, every generated image, and every trained model requires enormous computing power. The result is a frantic race to build gigantic infrastructures, with increasingly unsustainable energy and resource consumption.

But the question is inevitable: what if some of this weren't actually necessary?

In the current model, everything goes through the cloud. Most AI systems today work in a fairly straightforward way: the user makes a query, that information travels to a large data center where very powerful models process it, and the answer is returned to the user. This centralized model makes sense when the calculations are massive and must serve millions of people simultaneously.

The problem is that this approach is only within reach of a few large technology companies, with billions in investment, preferential access to advanced chips, and enormous energy contracts. Furthermore, it poses other challenges: cloud dependence, rising costs, privacy issues, and a significant energy footprint.

In this context, the work being done by researchers at EPFL (École Polytechnique Fédérale de Lausanne) is particularly interesting. They have developed new software—which has already spawned a startup—that avoids sending data to the cloud for many AI tasks.

Their proposal, called Anyway System, coordinates several machines within a local network and runs open-source AI models directly on them. In other words: distributed AI, close to the user and without relying on large external data centers.
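To make the idea concrete, here is a minimal sketch of what coordinating machines on a local network might look like. This is purely illustrative and not the real project's API: the names (`LocalWorker`, `Coordinator`), the round-robin scheduling, and the stubbed-out inference call are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from itertools import cycle

@dataclass
class LocalWorker:
    """One machine on the local network running an open-source model."""
    host: str
    handled: list = field(default_factory=list)

    def run(self, prompt: str) -> str:
        # Stand-in for actual local inference (e.g. a locally hosted
        # open-source model); here we just record and echo the prompt.
        self.handled.append(prompt)
        return f"[{self.host}] answered: {prompt}"

class Coordinator:
    """Dispatches queries round-robin across LAN machines.

    The key property of the approach: prompts and answers stay on the
    local network, never reaching an external data center.
    """
    def __init__(self, workers):
        self._next = cycle(workers)

    def query(self, prompt: str) -> str:
        return next(self._next).run(prompt)

# Example: two machines on a local network sharing the workload.
workers = [LocalWorker("10.0.0.2"), LocalWorker("10.0.0.3")]
coord = Coordinator(workers)
answers = [coord.query(p) for p in
           ("summarise report", "draft reply", "classify ticket")]
```

A real system would of course add model loading, load-aware scheduling, and fault tolerance; the sketch only shows the shape of the idea—queries distributed among nearby machines instead of a remote cloud.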

If this approach proves viable on a large scale, the impact could be remarkable. Many common AI tasks—internal queries, document analysis, corporate assistants, administrative automation—don't require massive models or millisecond responses.

In these cases, processing information locally reduces costs, improves privacy, and alleviates pressure on central infrastructure. And, incidentally, it challenges the current hype that justifies the accelerated construction of data centers around the world.

It's important to be realistic: not everything will be local, but neither will everything be centralized. The most complex calculations, large foundational models, and certain critical applications will still require very powerful data centers. But that doesn't mean everything has to go through them.

More and more researchers and companies are working on distributed AI, edge computing, and more efficient models capable of running closer to the user. It's a more balanced, less urgent, and almost certainly more sustainable approach.

Artificial intelligence is still very young in terms of mass adoption. We've only recently begun to experience it as "the hottest technology," and it wouldn't be surprising if its architecture evolves rapidly in the coming years.

Perhaps the solution isn't to eliminate data centers, but to use them more efficiently and only when truly necessary. In the meantime, we should calmly observe how proposals like EPFL's progress. EPFL's project has already passed the prototype stage and is being tested in companies and government agencies in Switzerland, and I imagine we'll have more concrete news about its true viability before long.

As is almost always the case with technology, time will tell, and things will fall into place.

Amador Palacios

Reflections by Amador Palacios on social and technological news; opinions different from mine are welcome.
