Uber Halts Driverless Tests Following Fatal Crash

An Uber driverless car struck and killed a person in Tempe, Arizona. Uber halted all tests.

Tempe Police says the vehicle was in autonomous mode at the time of the crash and a vehicle operator was also behind the wheel. No passengers were in the vehicle at the time.

Uber has paused self-driving operations in Phoenix, Pittsburgh, San Francisco and Toronto, a move the company says is standard.

Bloomberg notes the woman was crossing the road outside of a crosswalk when the Uber vehicle, operating in autonomous mode under the supervision of a human safety driver, struck her, according to the Tempe Police Department.

Bicyclist or Pedestrian?

Several stories say a bicyclist was killed; others say a pedestrian. It may have been a pedestrian walking a bicycle.


The fact remains that driverless cars have a better safety record than human-driven cars. Also, recall there was a safety driver behind the wheel who could have taken over but didn't.

Had this not involved an autonomous car, it would not have made the news.

These Tweets are from people who have the correct attitude.

This was bound to happen at some point, and it is hardly surprising that Uber is the company. However, it's highly likely no one is to blame but the woman stepping in front of a car.

Mike "Mish" Shedlock

Comments (71)
MarkBC

I strongly disagree with "However, It's highly likely no one is to blame but the woman stepping in front of a car." I am pretty sure that the "driver" of a vehicle is responsible for any damage/death caused by the vehicle. I don't think regular jay-walking (as opposed to darting out into traffic) changes the fault for the accident. That's exactly the sort of omission I want autonomous vehicle manufacturers to be penalized heavily for until they get it right. I don't want the odd errant pedestrian to be the externality that is ignored in the rush to profit generated by autonomous vehicles.

jivefive99

Wait till these companies that now use human drivers to deliver stuff replace those drivers with a computer and no longer have the minimum-wage owner/driver to dump the car maintenance and repair bills on. It's the Pizza Hut model of enriching your company: force the drivers to use personal cars and dump the car expense on the poorly paid driver and not on Pizza Hut.

abend237-04

Absolute nightmare for everyone involved, including the coders who will now respond and lengthen their safety lead on that lane-hopping, tailgating texter we've all come to know and hate.

Coasting2018

In the long term these cars will be safer, and the stock market and housing will be at their correct pricing. But the short term is where we live. Do people get to know that Uber is testing on particular streets? What is the exorbitant penalty that Uber and the car company and the software company have to pay?

Grumblenose

"It's highly likely no one is to blame but the woman stepping in front of a car." Wow. That's a leap. Here's an alternative leap: the human supervisory driver was not paying attention, and the self-driving software failed to detect and avoid a pedestrian/cyclist.

Jojo

You shouldn't think so hard. If a pedestrian or bicyclist violates the law, then the responsibility for any problems that happen rests with them. If you suddenly dart out from between cars, then you are in the wrong.

I once hit a bike rider coming out of a gas station onto a one-way street. I was looking in the direction traffic was coming from (the left) in order to turn right. As I started to go forward, a bike rider came from the opposite direction and I wound up knocking them about 10 feet out into the road. Luckily for them, the traffic was far enough away that the person did not get run over. Their bike was destroyed, though. A police report was filed and I was assured that this was all on the bike rider's shoulders because they were riding against traffic. The rider found a lawyer and tried to sue me, but the case was thrown out of court as, again, being the bike rider's fault.

The real question here is WHY the human overseer did not take control of the car in time? In all probability, the person was blissing out, texting or playing video games and not paying attention to the road, as they should have been. If anyone is to be held responsible on the car end, it is this person.

Mike Mish Shedlock

Editor

"The real question here is WHY the human overseer did not take control of the car in time? In all probability, the person was blissing out, texting or playing video games and not paying attention to the road, as they should have been. If anyone is to be held responsible on the car end, it is this person. "

AWC

"A pedestrian does not need a crosswalk to have the right of way at an intersection. Arizona law dictates that there is an implied invisible crosswalk at an intersection, even if one hasn't been placed there officially. However, Arizona law places the burden on the pedestrian to make sure it is safe to cross before stepping out into the street." An excerpt from the Arizona vehicle code.

Mike Mish Shedlock

Editor

Did the driver have any time to react? People do dumb things. Most of the time it does not matter.

Jojo

I have come close to running someone down in a crosswalk at night because they were wearing dark clothes in a poorly lit area and just walked into the street, not realizing that even at 20 mph it takes some amount of time to recognize the need to stop and then to do so. Too many people don't think.
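Jojo's stopping-distance point can be made concrete with a rough kinematics sketch. The reaction time and deceleration figures below are typical rule-of-thumb values, not numbers from the comment:

```python
# Rough stopping-distance estimate at 20 mph.
# Assumptions (not from the comment): ~1.5 s driver reaction time,
# ~7 m/s^2 braking deceleration on dry pavement.
mph_to_ms = 0.44704
v = 20 * mph_to_ms                   # speed in m/s (~8.9 m/s)
reaction_time = 1.5                  # seconds before braking begins
decel = 7.0                          # m/s^2

reaction_dist = v * reaction_time    # ground covered before the brakes engage
braking_dist = v**2 / (2 * decel)    # from v^2 = 2*a*d
total = reaction_dist + braking_dist

print(f"total stopping distance: {total:.1f} m")  # ≈ 19 m (about 62 ft)
```

Even at a modest 20 mph, most of that distance is covered before braking even starts, which is the commenter's point about pedestrians who "just walk into the street."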

AWC

And if a "person" must ultimately be responsible for the operation of a motor vehicle, why bother with all the autonomous aspects of running a 3500 lb missile on city streets? Who's next in the tort lineup, the code writer? The manufacturer? The guy who made the optical sensors? What a can of worms. The Bar Association's gotta love it. And just imagine the insurance rates after one of these "legally autonomous" robots kills a group of kids somewhere, someday.

Mike Mish Shedlock

Editor

As a pedestrian, I have almost gotten hit by not paying attention. I backed up off a ledge once taking a picture and fell several feet. Adjusting my radio the other day, I easily could have slammed into the rear of the car in front of me. My knob has a flaw: sometimes it operates in the reverse direction, so I was looking at it and not paying attention to the road. This would not happen with self-driving.

Grumblenose

Jojo said: "If a pedestrian or bicyclist violates the law then the responsibility for any problems that happens rests with them." Really? Where I live, road users of all types break the law quite frequently, but that doesn't give you the right to kill them. People turning left often go after the lights have changed, pedestrians often fail to finish crossing in the time allotted to them, cyclists swerve out to avoid potholes. When I see a cyclist, I slow down and give them a wide berth, expecting the unexpected. This is precisely why human drivers are better than self-driving: we see and understand the world around us at a much deeper level.

CzarChasm-Reigns

"Who's next in the Tort lineup, the code writer?"---AWC

SILLY RABBIT, you've forgotten Capitol Hill's TRIX to maximize Corporate earnings: "Loophole would protect self-driving car companies from lawsuits"

abend237-04
(deleted message)

It's obviously a group suicide; hold the wheel straight and decelerate asap

SleemoG

A tale told by an idiot, full of sound and fury, signifying nothing.

No one stops this train. Self-driving cars in 2020.

Grumblenose

I suggest everyone who lives in an area where they are "testing" self-driving cars get themselves a dash-cam. Expect a whole lot of victim-blaming whenever an accident happens and huge teams of expensive lawyers in place to "prove" that you're in the wrong.

abend237-04
(deleted message)

But even at a leisurely 2 GHz clock rate, the algorithm can probe over 600 million alternative option paths, picking the most likely in each case, while I'm still trying to signal my foot to begin moving toward the brake pedal.

abend237-04
(deleted message)

Depends entirely on whether we're Post-Singularity. If so, I'd expect the autopilot to branch to, "I don't have a clue who this guy is; he hijacked me two miles back."

xil

Even at this relatively early stage of the autonomous vehicle, my money's on the pedestrian/bicyclist being at fault (and I'm not a betting man).

Grumblenose

You don't have to create such complicated and unlikely scenarios. You're passing a cyclist and they suddenly swerve out in front of you. You can choose to hit the cyclist or go over the center line and possibly collide with a car going the other way. Which do you choose? Bear in mind that a car coming the other way will likely try to avoid you should you cross the center (but they may not!), but you will almost certainly kill or badly injure the cyclist if you strike them. Lots of factors to take into account - the size of the vehicle coming the other way, your speed, the other vehicle's speed, the width of the road, etc, etc.

Grumblenose

Actually the question isn't really, "Who's at fault?", the question is: "Did the self-driving system make a decision at least as good as a human driver would have?". A self-driving system has to be good at protecting lives whether they are at fault or not. Little Jimmy suddenly running out into the street chasing his ball doesn't deserve to die just because he is at fault.

abend237-04

A friend of mine killed a kid on a bike in Colorado some years back. The kid was riding east, barely upright, while waiting for a noisy, accelerating 18-wheeler going west to get past so he could turn west too. He likely never heard my friend's pickup approaching from his left rear and whipped left into the pickup with no warning. A. J. Foyt in his prime couldn't have avoided hitting him. I've thought about that accident for years now. I think I finally see something that might have prevented it, or at least minimized the damage done: autonomous driving.

JonSellers

Mish said, "The fact remains that driverless cars have a better safety record than human-driven cars." Do you have numbers on that, Mish? Maybe something along the lines of deaths per miles driven?

abend237-04
(deleted message)

Absolutely; would you not?

Grumblenose

Hard to do an apples-to-apples comparison when self-driving cars are currently only used in the easiest of scenarios (low traffic suburban streets, good weather, etc). Tesla claimed their autopilot is safer than a human driver, but they compared autopilot freeway miles in good conditions with death rates over all miles driven in all conditions.

RedQueenRace

"But human drivers kill just 1.16 people for every 100 million miles driven. Waymo and Uber and all the rest combined are nowhere near covering that kind of distance, and they’ve already killed one."

"“This is another major illustration that the technology we’re talking about is evolving over time and not necessarily road ready for wide deployment,” says Bryan Reimer, who studies human behavior and driverless vehicles at MIT."

I'm not going to say that the first quote proves driverless cars are less safe, but Mish also has no business claiming they are safer at this time.

This, along with his statement that the pedestrian was "highly likely" to be the one to blame (poor taste), comes across as him digging in his heels in his belief he is right about how quickly this stuff will be ready.
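RedQueenRace's sample-size point can be checked with simple arithmetic. Using only the 1.16-deaths-per-100-million-miles figure quoted above, this is the mileage a fleet would need to log before even one fatality is statistically expected at the human rate:

```python
# At the quoted human rate of 1.16 deaths per 100 million miles,
# how many miles must a fleet log before one fatality is expected?
human_rate = 1.16 / 100_000_000      # deaths per mile
miles_per_expected_death = 1 / human_rate

print(f"{miles_per_expected_death:,.0f} miles")  # ≈ 86 million miles
```

One death in the far smaller mileage the autonomous fleets had actually covered at the time is, as the comment argues, too small a sample to support a safety claim in either direction.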

abend237-04
(deleted message)

Now there's an optimist. An even better plan: Stay off the spot you're destined to die on. You'll live forever.

Greggg

"Tempe Police says the vehicle was in autonomous mode at the time of the crash and a vehicle operator was also behind the wheel." Maybe the vehicle operator was on Facebook at the same time. Both crashed.

abend237-04
(deleted message)

Yes: A forced-choice hypothetical scenario constructed by a lawyer to train other lawyers in how to dodge difficult, common sense decisions when confronted with one. Throw the switch, dammit.

Hammeringtruth1

Where are the safety statistics on autonomous vehicles?

Mike Mish Shedlock

Editor

Mar 1, 2018 - Waymo's vehicles have driven 4m miles on public roads; the only accidents they have been involved in while driving autonomously were caused by humans in other vehicles. AVs have superhuman perception and can slam on the brakes in less than a millisecond, compared with a second or so for human ...

AWC

Guess I'll just stick with my 7500lb truck, and if one of these little electric self driving jitneys gets in the way, I'll swing by the car wash and clean it off my grill.

AWC

@mish, "This would not happen with self-driving." Like the one in Arizona?

abend237-04
(deleted message)

The incumbents and their lawyers are always right...at first. Henry Ford's first business venture was in San Francisco, selling booster engines to local carriage companies, getting heavy loads up steep hills. It was a total bust: Noisy, scared the horses, sued left and right, etc. He packed up and went back to Detroit, restarting by just trying to put his little engine on a carriage without the horse. That's where I think we are with autonomous driving.

Grumblenose

There's road work on the other side of the 2-lane road you're on and no flagger. To pass it, oncoming traffic has to go slightly over the solid center line (which is illegal) into your lane. To give room to the oncoming traffic and avoid unpleasant head-on collisions, you and other vehicles on your side go into the cycle lane (which is illegal) since there are no cyclists. This is the kind of law-breaking, cooperative driving that human drivers do all the time in order to use the roads efficiently and safely. In such a case a self-driving car would probably go plowing down the middle of its lane (perfectly legally) and maybe cause a head-on collision, which would then be the fault (legally) of the oncoming human driver because he went over the center line! This is why self-driving cars are a long, long way from being ready.

Sunriver

The question of the adoption of self-driving cars has little to do with safety and/or economy, but everything to do with culpability. Society wants a criminal and a victim in all cases. The accused, whoever they are, cannot "hide" behind autonomy. There are 1.3 million lawyers in the United States and they must be fed! What the heck will the lawyers do if there is no longer gain to be made from automobile infractions? What about the cascading financial effect on the Department of Corrections, county courthouses, police departments?

El_Tedo

Add 5 years to your driverless-car projections, Mish. This isn't going to be a smooth transition.

Mike Mish Shedlock

Editor

Almost no chance this adds 5 years

MntGoat

First of all, when we are talking about "lots of driverless cars will be on the road soon," are we talking fully autonomous with NO driver at all? Or are we talking cars that drive themselves but with a "safety driver"? If it's the former, no way in 2 years. I do not see, by 2020, being able to get a ride in a fully autonomous car with NO DRIVER that picks me up at my house and takes me to the airport. Navigating city streets, potholes, kids running into traffic, 4-way stops with no lights, etc. There are a BILLION little things that go into this! This takes a shit ton of technology! People are reading the SD car company press releases and taking them as gospel. There are basically NONE of these fully operational on the road, yet they will be everywhere in 2 years? And Mish, how can you say that they are way safer than human drivers? There is NO DATA to back this up! There aren't really ANY self-driving cars driving around in large numbers yet, but you claim they are safer than human drivers? Is this because that's what the white paper on the car technology says, without any proof in the field? You need a lot more data to make this claim.

abend237-04

I notice the insurance companies are betting on the computer: They swag 21,700 lives and $450 Billion saved annually. These people are not bad with odds and numbers...

MntGoat

How do you know this when we have pretty much zero self-driving cars on the road yet to have data to work from? Someone in a cubicle somewhere just made up these numbers.

abend237-04

Spent a lot of years debugging complex high-performance systems. Rules of thumb: (1) Don't ship before you've killed every single bug you can find in-house. (2) Ship, and expect an explosion of bugs you couldn't find in-house. (3) If you don't weaken, you can cut the failure rate in half every 18 months. (4) You'll get two cycles of cutting the failure rate in half before you have to ship the next product and start all over again. If you fall off this forced-march path, don't worry about it; your troubles are over... your competitor(s) have won. AV is not plagued by throwaway; they're going to win.

Clintonstain

Sadly, the pedestrian was dead and therefore unable to Tweet the “correct attitude”.

Stuki

“Waymo's vehicles have driven 4m miles on public roads; the only accidents they have been involved in while driving autonomously were caused by humans in other vehicles.”

They were caused by the INTERACTION between humans and the AV. Virtually all traffic accidents, short of straight-up vehicular crime, are caused by complexity-emergent interactions. Not one guy being baaaaad-baaaad-call-the-ambulance-chasers-to-save-us-all while everyone else is "perfect."