How Many People Need to Die in Self-Driving Car Tests?

It was reported that a driverless Uber car struck a pedestrian and killed her. It was the first fatality of its kind involving a fully driverless car, though not the first accident or fatality involving driverless or hands-free technology. Here is the problem with this technology: it is a goldmine for the car manufacturers who get it right, and many players are rushing through the process to be first and grab market share.

The first company to get this right stands to make an enormous amount of money. Because of that, the greed factor is in full effect. This is nothing new to the automotive industry, which exemplifies that greed daily. Manufacturers rush through the production process, cut corners, and use faulty parts just to get inventory on the lots and sales on the books. They are constantly introducing new technologies that are obviously not adequately tested. Is it working?

All you have to do is look at the millions of automotive recalls year in and year out. These recalls exemplify technology and processes that are not ready for the road. Car manufacturers put drivers at risk every day with faulty parts and irresponsible processes. Just the other day, Ford issued a recall on millions of cars with a faulty steering wheel assembly; it appears the steering wheel can simply come off the steering column while driving.

With driverless technology you can't foresee and prepare for every scenario that could occur, so the technology will make mistakes in judgment. The developers learn what not to do through trial and error, using unsuspecting passengers as guinea pigs. How many deaths and accidents will have to occur before they get this technology right? With automakers rushing the process, there are bound to be enormous numbers of flaws and mistakes that could have been caught had they slowed down and been thoughtful with the process. The last to market will not be first, but the first will need deep pockets to survive all of the lawsuits as attorneys sue for negligence.

This tragic accident is a good example of what is wrong. Uber is "testing" the technology on the very roads we walk and drive on. Do you want to be on the road with driverless cars? How about on the highway with driverless 18-wheelers?

Comments
BobBrooks

Editor

@B-investor - good point - the parts will always be in play. The car companies? Who knows.

B-investor

Atkinsww1 brings up the moral question that AI developers and society are struggling with. What would you do in a half second: hit the pedestrian, or swerve off a cliff and die? Now put that in software code. I'd rather own Nvidia than the car companies or Uber. And if you didn't know, Nvidia makes the super-fast semiconductor brains that power video gamers' PCs.

BobBrooks

Editor

@atkinsww1 - From the way Uber reacted to the situation, it was clearly the fault of the car. They are keeping the details of the story pretty close to the vest. Yes, there are so many variables. A Tesla wrecked in CA because of the light reflected off the trailer of an 18-wheeler. How do you program for that one? I think the driver of the car died in that particular case.

atkinsww1

A couple of thoughts. 1) Do we know for sure what happened? We had a dear friend who killed a person who stepped into traffic from between two cars. Our friend was deemed not at fault but had to live with what happened. 2) I think we like the randomness of a human. We don't know what a human will do in any situation: hit the kid, hit the dog, drive into a tree, run off a cliff; any is possible. With programming, even "AI," someone has to predetermine who gets hit in a given situation, and who gets to make that decision?
