A Police Officer Pulled Over a Driverless Waymo. Nobody Knew What Happened Next.

When a police officer pulled over a driverless Waymo robotaxi, the interaction revealed something far more important than one awkward traffic stop: the legal and procedural systems governing autonomous vehicles are still being written in real time. A TikTok video posted by user joe_setaro captured the surreal moment when an officer initiated a traffic stop on a vehicle with no one in the driver's seat, forcing both the officer and the passenger to navigate uncharted territory.

What Actually Happened During the Stop?

The Waymo appeared to behave unpredictably on the road, according to the officer's account. The vehicle was stopping and starting repeatedly before eventually coming to a halt in the middle of the street, which warranted a traffic stop regardless of whether a human was driving. What made the situation genuinely unusual was what happened next: rather than dealing with a person behind the wheel, the officer conducted the stop through the car's speaker system, speaking with a Waymo customer service representative.

There was no driver to make eye contact with, no one to hand over a license and registration, and no human being to explain what had gone wrong. The officer was left addressing an empty front seat, trying to get answers from a remote support team. Setaro, the passenger, remained quiet throughout the clip, which made it impossible to determine whether he was unbothered, unsure, or simply enjoying the chaos for content purposes.

Who Gets the Ticket When There Is No Driver?

This question sits at the heart of why this incident matters far beyond the viral moment. In a traditional traffic stop, the answer is straightforward: the driver gets the citation. But when the car itself is the driver, everything becomes legally murky.

Traffic laws across most states are written around the assumption that a human is operating the vehicle. Autonomous vehicle legislation varies wildly depending on location. California, where Waymo operates heavily, has some of the more developed frameworks for autonomous vehicles, but even those frameworks do not always spell out exactly what happens when a robotaxi does something wrong in front of a police officer.

In a situation where the autonomous system itself causes a traffic violation, liability would most likely fall on the company operating the vehicle rather than the passenger inside. The passenger is not controlling anything; they are essentially a stressed-out ride-share customer. Waymo, as the operator of the vehicle and the technology behind it, would be the party responsible for the car's behavior. However, there is no simple mechanism for an officer on the scene to issue a citation to a corporation in real time, and that paperwork gets complicated quickly.

What Happens If Waymo Is Actually at Fault?

If the vehicle malfunctioned or made a bad decision that caused a traffic violation or, in a worse scenario, an accident, the legal burden would shift to Waymo as both the developer and operator of the autonomous system. Passengers in self-driving vehicles are generally not considered liable for what the vehicle does on its own, which makes logical sense: you would not blame someone in the back seat of a taxi for the driver running a red light.

Where it gets thornier is in cases involving property damage or personal injury. Waymo carries its own insurance and has an established process for handling incidents involving its fleet. But the officer on the ground does not have a clean protocol for issuing a citation to a cloud-based decision-making system. The practical reality is that the officer likely documents the incident, contacts the company, and the rest gets sorted out later through channels that do not involve a roadside conversation.

How to Prepare for Autonomous Vehicle Traffic Stops

The Waymo incident is funny on the surface, but it is also a useful stress test for a system that is still very much under construction. Several key areas need immediate attention as autonomous vehicles become more common on public roads:

  • Law Enforcement Training: Police officers need better training and clearer protocols for autonomous vehicle encounters. Officers should not have to figure out on the fly how to handle a car with nobody in the front seat. That kind of preparation should already be built into training programs in cities where these vehicles operate.
  • Company Communication Standards: Companies deploying autonomous vehicles on public roads have a responsibility to communicate more clearly with both passengers and local authorities about how to handle unexpected situations. A speaker connecting to customer service is creative, but there should probably be more structured guidance available in the vehicle itself.
  • Passenger Rights Clarification: Passengers need to understand what their role is during a stop. If a police officer pulls over your Waymo, what are you supposed to do? Can you be asked to exit the vehicle? Are you responsible for anything? These are questions worth having answered before you are sitting in a driverless car watching a cop talk to a speaker.

For now, companies like Waymo typically respond quickly to these situations, remotely monitor their vehicles, and have support teams available around the clock, which is exactly what appeared to happen in the video. However, this reactive approach is not sustainable as autonomous vehicle deployments scale across more cities and more jurisdictions.

The technology is moving fast. The legal and social infrastructure around it is moving considerably slower. Clips like this one are a reminder that the gap between those two speeds is where things get interesting, and occasionally a little chaotic. As robotaxis become more common, the questions raised by this traffic stop will shift from curiosities to everyday operational realities that require clear answers.