Google self-driving car pulled over for holding up traffic

When a California cop pulled over a Google self-driving car for holding up traffic this week, he knew he couldn’t send its robot driver to jail. But exactly where the responsibility lies for traffic problems caused by autonomous vehicles is not always so clear.

For minor offences like speeding or parking tickets, the person sitting behind the wheel is almost certainly going to face the music, says Bryant Walker Smith, an assistant professor in the School of Law at the University of South Carolina and an expert on self-driving car law: “Under existing law, someone who is most immediately and obviously the operator of the vehicle would likely be treated as the driver.”

The US states that have explicitly regulated autonomous vehicles so far – California, Nevada and Michigan – all require a responsible human safety driver who is ready and able to take over immediately if something goes wrong. But that is not the case in Texas, where Google’s prototype self-driving vehicles have also taken to the road.

If Google decided to start picking up random people on the street in Austin and giving them rides (there’s no evidence that this is happening), those passengers probably would not have any legal liability for everyday traffic tickets. “In that case, I’d expect some enterprising manager at the state highway patrol to send the ticket direct to Google,” says Smith.

The legal situation would be murkier if a self-driving car were, for example, to veer wildly into oncoming traffic and cause a serious or fatal accident. Criminal charges like vehicular manslaughter require negligence or malice on the part of the driver. “If the person in the driver’s seat was doing everything right, monitoring and observing properly, they would not have the mental culpability for manslaughter,” says Smith.

Read more at: When a robot car makes a mistake, a human always gets the ticket | Technology | The Guardian.