Today we're going to talk about one of those news stories that, beyond its novelty, raises a profound debate about a future we're already living in: Can a self-driving car be fined? And, more importantly, who is responsible when software commits a violation?
The incident occurred recently in San Bruno, California. A Waymo self-driving car made an illegal turn in full view of police officers. When the police stopped the vehicle, they found a scene that until recently would have been unthinkable: there was no one at the wheel.
After confirming that no one was aboard, the police allowed the car to continue on its way, but immediately sent a detailed report to Waymo, the company that operates the fleet.
This incident, presumably caused by an error in the driving algorithm, underscores a crucial point: in the autonomous driving ecosystem, the only party responsible for a traffic violation is the company that created and manages the driving software. If there is a penalty, it is the company that pays, not an absent "driver."

This incident is not the first to raise questions about where liability lies. We have seen cases of accidents, and even fatalities, involving Tesla vehicles using "Autopilot" mode.
The terminology used by Tesla is ambiguous, to say the least. By calling its system Autopilot, the company suggests complete autonomy, yet at the same time it requires the driver to keep their hands on the wheel. This legally murky duality serves to dilute responsibility.
However, in the case of a fully autonomous vehicle like Waymo's, where human intervention is not anticipated, the responsibility clearly falls on the technology company.
In response to the news, Waymo reacted quickly, providing data that puts the situation into perspective. The company claims that, to date, its autonomous vehicles have been involved in 80% fewer incidents than human-driven cars. The figure is Waymo's own, but it has so far gone unchallenged.
And, curiosities aside, self-driving cars are here to stay. Every day, more companies are deploying fleets in our cities, and more municipalities are welcoming them. Occasional glitches notwithstanding, algorithms drive cautiously: they don't get distracted, and they don't get tired.
It seems clear that, despite the legal and ethical challenges, autonomous vehicles will be a fundamental part of our future mobility, promising safer and more efficient streets. The question is not whether they'll drive, but how we'll hold them accountable when they do.
It's all a matter of time.