Tesla Hits Parked Police Car, Autopilot Blamed: Waymo vs Tesla

Mike Mish Shedlock

Hours ago there was yet another reported Tesla autopilot crash with a parked vehicle.

I picked up the image from a BBC report.

The amusing quote of the day is from NBC News.

"Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents," the company said in a statement.

Simple Question

Why is it that a Tesla cannot stay in its own lane?

To hit a parked car or a ramp is proof that it cannot.

On May 25, an Unofficial European Drive Tour came to a screeching halt when a Tesla Model 3 slammed into a median.

You You Xue, an early Tesla Model 3 owner, created the project to tour North America and Europe and show the new electric vehicle to the many reservation holders.

You You Xue commented “Vehicle was engaged on Autopilot at 120 km/h. Car suddenly veered right without warning and crashed into the centre median. (edit: divider at the exit fork). I’m unharmed.”

Electrek was sceptical.

When he says that it hits the median at an “exit fork”, the accident sounds reminiscent of the fatal Model X accident on Autopilot in Mountain View where it confused the median for a lane.

But in that case, Tesla showed that the driver had plenty of time to take control.

While using Autopilot, the driver always needs to stay attentive and be ready to take control at all times.

As for You You’s accident, he makes it sound like he had no time to respond as the car “suddenly veered,” but I have never seen Autopilot do that before.

I am not saying that You You is lying, but it is certainly a strange situation. Also, it looks like he was driving late at night and had been driving for days.

If it’s not a misuse of Autopilot and the system indeed “suddenly veered” into the median, then it might be the first example of Autopilot causing an accident, but I think we need to have the data logs before we get into that.

Revised Statement

I offer this revision to Tesla's official statement so that it makes some bit of sense:

"It has always been clear that Autopilot doesn’t make the car impervious to any accident."

I recommend this additional statement: "Frankly, we advise not using autopilot at all."

Comparison to Waymo

A reader, citing Tesla crashes, taunted me just yesterday with a silly comment regarding self-driving vehicles and my proposed timeline for them.

Tesla is not self-driving, Waymo is. Someone does not need to be at the wheel at all times to take control of a Waymo.

Waymo's accident rate is at least as good as the public at large and it will get better, much better.

Electrek was overly generous to Tesla.

Tesla's autopilot clearly cannot even stay in its own lane!

Mike "Mish" Shedlock

Comments (18)
xil

the problem with functionality which appears to work reliably is it instills a false sense of security and lulls the observer into complacency.

CautiousObserver

One has to wonder why Tesla has not yet been buried in product liability lawsuits for this. Less insidious defects have caused other manufacturers to go bankrupt, such as when people were injured because Blitz's fuel cans could explode when they were misused to pour gasoline on an open flame.

It appears that some attorneys might still be trying to recruit clients for Blitz claims despite Blitz filing for bankruptcy in 2012.

Archer1

I recently learned that an acquaintance has a Tesla model X. He loves the car but never uses the autopilot. He said he felt like it didn't 'see' the big picture. Especially when the right white lane marker moved away such as at the start of a right turn lane or exit ramp. The car would start drifting toward the turn or exit rather than stay on the road.

shamrock

"Waymo's accident rate is at least as good as the public at large" it would be interesting to know if you have a source for that assertion. I can't find any.

Jojo

Tesla is just helping us weed the herd a bit. We should thank Musk!

shamrock

Oh right, Waymo, the car that drives 25mph max, slower when it's confused. The car in which 14 of the 25 in operation in 2015 were involved in traffic accidents. The car which until recently always operated with a human driver, who took over for 25% of miles driven. It will probably be operating smoothly in total autonomous mode real soon now.

MntGoat

There is no data showing that Waymo's accident rate is as good as the public's. There are no true fully self-driving cars (with no safety drivers in them) in commercial use yet. Not one. And most test vehicles are likely driven in "easy" controlled environments AND with safety drivers. So there is zero data. I do believe, of course, that one day they will be far safer than human drivers.

kpmyers

Hey Mish, Autonomous Tech will be here, but only after the LIDAR components atop the car become more secure, less expensive and have a lower profile. The equipment looks very easy to steal. Or, it will get damaged in the process of stealing. At $7.5K-8K per pop, it offers an attractive target to thieves. Here's an article that discusses the cost of the LIDAR equipment.

PatS

What if autopilots in cars kill only half the people that humans do? Are we prepared to accept that robots will kill people, even at a lesser rate than humans? My inclination is that this will not be enough of a reduction to be acceptable; it will need to be more than a 90% reduction to be accepted.

My reasoning is based on human social behavior. You can get remorse and apology from a human, and you can punish and get a form of justice, but you will not get any of that with machines. When deaths and harm from machines become more common people will push back, because when tragedy strikes we need someone to blame.

Clintonstain

Per Mish, automated driving is just around the corner. No matter that automated driving is already proving deadly with cars, it shouldn't be a deterrent to employing the same technology with semis weighing 20 times as much immediately. Anyone who thinks differently is a luddite simpleton.

pgp

People struggle to know when and how to use simple cruise control, so it's not surprising that semi-automated driving would be even more perilous. Putting untrained humans in a supervisory position when operating complicated machinery is always a recipe for disaster. Tesla's mistake, in its poorly planned haste to impress the market, is to offer semi-automation at all.

Stuki

By far the biggest externally observable difference between Tesla’s autopilot and Waymo’s botcars is simply that the Teslas have been let out of the cage. Too early, according to some :(

Tesla needs to keep pushing the New-New hype, as this is their only revenue source. Waymo/Google can afford to treat the whole exercise as a cautiously approached Big Science project “for the public good.” Confident that if they ever manage to land their man on the moon, it will be such a game changer that whatever costs and effort they have poured into it, will at that point pay off handsomely regardless. While in the meantime, they can sit back, making more money off people searching for “Tesla crashes,” than Tesla will likely ever make.

pi314

If we exclude risky drivers, i.e. those who pay high insurance premiums, the human driver accident rate will be significantly lower. Until autonomous vehicles can beat that rate in all driving conditions, why should anyone be in an autonomous vehicle?

asteester

I watch the Waymo cars here in Chandler, AZ everyday. They seem to drive great. I'm a software engineer and I'm fascinated by the tech and the public safety challenge. I really appreciate what Waymo does. I went to their open house and learned that they record all road tests so that they can be replayed in the future thousands of times with different tweaks to the algorithms. Also, the sensor data looks redundant which helps when one sensor can't be trusted because of conditions.

asteester

At first the cars were just going down city streets at 40. They would wait for big gaps in traffic if turning left across traffic from a side street. You can see how they stick to the lanes. Never saw them confused. Then they started showing up in neighborhoods where there are no lines, parked cars, etc. Now, in the last month, they have started entering and exiting the highway. I saw one do a u-turn at a light but I'm not sure if it was in autonomous mode. Plus, Waymo built a town on an old airstrip to get a lot of things right before hitting the real streets.

asteester

Waymo also has a way for its drivers to record incident data by voice via a button on the steering wheel, as opposed to Uber, which made them type it. Also, Uber had its "brake on object detection" feature turned off when it hit and killed that woman in Tempe, because the feature was getting too many false positives and was suddenly braking for nothing. Too jerky, they said. I just really think Waymo is doing it right and could go live someday (soon?).

