Yep, gotta love people, huh? This is pretty much why there have been accidents with Autopilot: people don't use it the way it's supposed to be used, or they over-trust something that, by Tesla's own admission, is not a fully driverless system.
And the man was a former SEAL. You'd think he'd have more discipline than that.
I heard they programmed some autopilot cars (in case they had to make a choice) to go for a solid object instead of a human when taking evasive action. So basically someone added code to kill the driver. Nice job for a programmer.
I sure hope the human the car is trying to steer around is not one like this :-)
Good god, I'd hope the car would realize that it shouldn't be leaving the roadway and ergo wouldn't have to decide between striking an inflatable Santa and the house belonging to the family of said inflatable Santa.
I think that's making the best of a bad situation. Look at it this way.
The car has a choice: strike a pedestrian, or hit something else.
If it strikes the unprotected pedestrian, the driver will probably not be injured. The pedestrian will most likely be killed if the speed of the vehicle is above 30 mph; below 30, he/she will probably be seriously injured.
If the car avoids the pedestrian and strikes something solid, the driver of the car has the protection of the car's crumple zones which are designed to absorb crash energy, the supplemental restraint systems (air bags) designed to protect the driver from the internal components of the car's interior, and the seat belt which is designed to secure the driver in the proper position so that he/she doesn't go bouncing around the interior of the car and can be in the optimal position for the other safety systems to do their job.
In addition, if the speed of the collision is under 30 mph, the driver of the car will most likely walk away. If the speed is under 60 mph, the driver may be injured, even hospitalized, but will probably survive.
Now couple all that with the fact that both the driver and the autonomous vehicle itself must yield to pedestrians, and it makes the most sense for a programmer to design the system to avoid pedestrians even if it means striking a more solid object.
All that aside, ask yourself this: would you rather have your car strike a pedestrian at 45 mph and know that you were responsible, in some way, for taking that person's life, or would you rather hit a wall at 45 mph, go to the hospital, and recuperate to go on and lead a relatively normal life?
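To make that tradeoff concrete, here's a toy sketch of the kind of rule being described. This is purely my own illustration, not anything from Tesla's actual code, and the risk numbers are rough figures pulled from the thresholds discussed above, not real crash statistics:

```python
def choose_evasive_action(speed_mph: float) -> str:
    """Toy model of the pedestrian-vs-solid-object tradeoff.

    Assumed (illustrative) risk figures: a pedestrian struck above
    ~30 mph is very likely killed; a belted driver hitting a fixed
    object under ~30 mph likely walks away, and under ~60 mph
    likely survives thanks to crumple zones, airbags, and the belt.
    """
    # Pedestrian fatality risk rises sharply with speed.
    pedestrian_fatality_risk = 0.9 if speed_mph > 30 else 0.4

    # Driver is protected by the car's safety systems.
    if speed_mph < 30:
        driver_fatality_risk = 0.01
    elif speed_mph < 60:
        driver_fatality_risk = 0.1
    else:
        driver_fatality_risk = 0.5

    # Swerve into the solid object whenever that minimizes expected deaths.
    if driver_fatality_risk < pedestrian_fatality_risk:
        return "avoid pedestrian, strike solid object"
    return "brake in lane"
```

With these assumed numbers the protected driver always fares better than the unprotected pedestrian, which is exactly the argument above for programming the car to take the solid object.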
What's going to suck is that the dude's insurance (auto, and possibly life as well) will avoid paying, since he was most likely breaking the law as well as not following the instructions included with the vehicle (thus, Tesla is off the hook).
Have you ever heard the phrase "a disclaimer is only as strong as the paper it's written on"? Did you know that there are certain rights you cannot sign away? I'm not an attorney and don't know all the details, but Tesla is far from off the hook in a civil case.
As for the insurance company not paying because he was breaking the law: my insurance company has paid when it was reported I was going over the speed limit and was involved in an accident. When people have died because they were on their cell phone, not paying attention, and wrecked their car, the insurance companies have paid. I'm not sure him watching Harry Potter will absolve them from paying the family whatever he was insured for.
I don't understand how the radar/sonar (whichever) system didn't kick in... like you see in the ?Mazda? commercial where they say it has the highest crash rating EVER because it simply avoids accidents...
Maybe those shorter ranged systems did detect the truck but it was too late to stop the vehicle?
Maybe that's why Tesla said the bit about the truck being too high? (I.e., the non-visual avoidance systems couldn't see it either?)
I wondered about that too and have come up with some theories.
- The truck was out of the sensor's horizontal angle of view, meaning that since the truck was "crossing the T" of the Tesla, if you will, it was initially too far off to the side to be seen by the vehicle. This, to me, seems like a horribly bad design if that was the case.
- The truck was out of the sensor's line of sight. There might have been something between the sensor and the truck, so that the sensor didn't see the truck until it was too late... again, though, this seems unlikely, since there was no indication the sensor EVER saw the truck, and the report makes it sound like there was nothing blocking the line of sight of the driver or the car.
- The truck was initially out of the sensor's range by distance. Again, this doesn't make sense, as one would think the sensor should be able to see far enough ahead that the car could stop in time, at the speed it is traveling, if something got in its path.
- The sensor straight up malfunctioned. This seems the most logical, but then why would the system continue to operate? It leads me to believe there is a condition the programmers and designers didn't anticipate, under which the sensor is basically blind. That could be the issue of the truck being very close in color to the sky. Still, you would think it would see the wheels and other darker bits of the truck and trailer and still detect the hazard.
I thought about the issue of the truck being too high... after all, the report makes it sound like everything below the windshield level of the car went clean under the truck, and I'm sure most of the sensors would be in that area. However, the car traveled under the trailer between the last axle on the tractor and the first axle on the trailer. That means the tractor and its three axles SHOULD have been detected by the sensor and triggered a braking event.