Police Release Video of Uber Pedestrian Fatality

The video is out. And without a doubt, a driver in full control, in the same situation, would not have had time to react.

Here's the video.

The Uber driver did not seem to be paying much attention. I wouldn't have either. Nor would you. The point is, and it's obvious, there was no time for a human to react.

However, one might reasonably have expected the radar to perform better. In this instance, the self-driving system fared no better than a human would have. Not a great performance.

Would a Waymo vehicle have done better? I suspect so, but I do not know so.

Looking ahead: Radar will get better. Humans won't.

That's the all-important bottom line.

Will this slow down self-driving deployment?

Mike "Mish" Shedlock

Comments
JeffD

Medical and aviation software contains hundreds to thousands of lines of source code running an algorithm that clearly specifies a specific action. Self-driving car code runs to thousands or tens of thousands of lines, and the user has no idea how the software is making decisions, because the decisions are based on tens of megabytes to gigabytes of training data. There is a chasm of risk between a few well-defined steps in a medical/aviation algorithm and a cloud of data whose interpretation is known only to the machine-learning algorithm. Very, very different. BTW, I used to work on the Honeywell team that maintained the space shuttle engine controllers, in addition to other similar mission-critical software. My past experience in eliminating failure in complex systems is likely what compels me to be so vocal about carefully considering when high-risk technology is "ready".
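To make that contrast concrete, here is a minimal, purely hypothetical sketch. The function names, the 14,000 ft threshold, and the model interface are all illustrative; none of this is real avionics or Uber code.

```python
# Hypothetical sketch of the contrast JeffD describes.

# Aviation/medical style: a small, explicit rule that a reviewer can
# trace and certify line by line against a written specification.
def deploy_oxygen_masks(cabin_altitude_ft: float) -> bool:
    """Deploy masks above a fixed, documented altitude threshold."""
    return cabin_altitude_ft > 14_000  # threshold comes from a spec document

# Self-driving style: the code is short, but the decision lives inside
# a trained model whose behavior depends on gigabytes of training data
# that no human reviewer can audit line by line.
def should_brake(model, camera_frame) -> bool:
    """Ask an opaque learned model whether to brake for an obstacle."""
    return model.predict(camera_frame) == "BRAKE"
```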

FelixMish

@JeffD - Yes, you're asking the standard questions for device-control software. Medical and aviation, for example, are fields where software runs in exactly such high-reliability environments. Car companies have been building systems for such environments for some time now. Check the Internet for answers on how these things are done.

To give a flavor of this sort of engineering, think redundant systems, often built on completely different tech. E.g., use both LIDAR and a camera. One dies? The other is used to fail gracefully (pull to the side of the road, in the case of a car). Also, reliability testing is one reason Waymo talks of rolling its driver's odometer past ten digits.
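A rough sketch of that failover idea, assuming a sensor object with a `healthy()` check; all names here are hypothetical stubs, and no real vendor API is implied.

```python
# Illustrative redundant-sensor failover, per FelixMish's description.

def drive_normally(lidar, camera):
    return "DRIVE"          # hypothetical full-capability planner

def pull_to_roadside(sensor):
    return "PULL_OVER"      # hypothetical degraded-mode maneuver

def emergency_stop():
    return "STOP"           # last resort when all sensing is gone

def plan(lidar, camera):
    """Prefer fused sensing; degrade gracefully as modalities drop out."""
    if lidar.healthy() and camera.healthy():
        return drive_normally(lidar, camera)
    if lidar.healthy() or camera.healthy():
        # One modality died: use the survivor only to reach a safe stop.
        survivor = lidar if lidar.healthy() else camera
        return pull_to_roadside(survivor)
    return emergency_stop()
```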

JeffD

And this is the real problem. Let's say it is a simple bug. How often will the software be updated, introducing new bugs? How many simple bugs exist in the system? How often does technology have a 'shutdown'-level failure event compared to a human (i.e., humans passing out for no reason)? Technology is in total control here, and technology fails all the time. How often have you had to power-cycle your computer, cell phone, or <insert gadget here> because it mysteriously locked up? How can one insist that the computer software will, at some distant point in the future, work unfailingly, when the set of possible inputs is an undefined open system and not a well-tested closed system? These are the questions that have to be answered before one can claim, with a straight face and honesty in one's heart, that self-driving cars (with no steering wheel!) will have a better safety record than human drivers.
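The open-system vs. closed-system distinction is the crux, and it can be shown in a few lines. This sketch is purely illustrative; the gear example and model interface are invented for the purpose.

```python
# Closed system: the input space is finite, so it can be tested
# exhaustively. After this loop runs, every possible case is proven.
def gear_letter(gear: int) -> str:
    return {0: "P", 1: "R", 2: "N", 3: "D"}[gear]

def test_gear_letter_exhaustively():
    for gear in range(4):               # the ENTIRE input space
        assert gear_letter(gear) in {"P", "R", "N", "D"}

# Open system: inputs are camera frames from the real world. You can
# only sample them, never enumerate them, so "fully tested" has no
# well-defined meaning.
def test_perception_by_sampling(model, recorded_frames):
    for frame in recorded_frames:       # a sample, never the whole space
        model.predict(frame)            # not crashing here proves little
```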

FelixMish

Just noodling some more about this Uber kill: I'd not be surprised if it's found to be a simple bug - not related to the AI or the sensors or anything particularly groundbreaking in the tech. Instead, just some edge case that fell through the cracks because a loop missed the first or last item in a list. The kill looks like a classic single-point-of-failure bug.
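The kind of loop bug meant here is the classic off-by-one. This example is purely illustrative; nobody outside Uber knows what the actual defect was.

```python
obstacles = ["pedestrian", "bicycle", "trash_can"]

# Buggy: range(1, ...) silently skips index 0, so the FIRST detected
# obstacle is never evaluated.
for i in range(1, len(obstacles)):
    print("checking", obstacles[i])

# Fixed: iterate over the whole list.
for obstacle in obstacles:
    print("checking", obstacle)
```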

It's interesting that so little information has come from Uber on this. They should have published something explaining what went wrong by now. The engineers can't still be trying to find out!?! They would have replayed this event plenty of times by now, and it's not like anyone on their team has more important things to do.

whirlaway

More stupidity mixed with wishful thinking, lack of sensitivity, greed, and arrogance. The driver was the BACKUP! Which means he/she is expected to react AFTER it becomes clear that the automatic system has failed. So his/her reaction will necessarily be SEVERAL HUNDRED MILLISECONDS to A FEW THOUSAND MILLISECONDS TOO LATE.
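To put rough numbers on that latency argument, here is a back-of-envelope sketch. The 40 mph speed is an assumed, illustrative value, not a figure from the police report.

```python
speed_mph = 40.0
speed_m_per_s = speed_mph * 0.44704        # 1 mph = 0.44704 m/s (~17.9 m/s)

# "Several hundred milliseconds to a few thousand milliseconds" late:
for reaction_ms in (300, 1000, 2500):
    distance_m = speed_m_per_s * (reaction_ms / 1000.0)
    print(f"{reaction_ms:>5} ms late -> {distance_m:5.1f} m of extra travel")

# Prints roughly: 300 ms -> 5.4 m, 1000 ms -> 17.9 m, 2500 ms -> 44.7 m.
# A backup driver who only starts reacting after the system has visibly
# failed gives up meters to tens of meters of stopping distance.
```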