Saturn Skyline Forum


·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #1
Model S Autopilot crash under NHTSA investigation

A crash that killed the driver of a Model S is being investigated by the National Highway Traffic Safety Administration (NHTSA), Tesla announced on Thursday. Tesla's Autopilot software was engaged at the time of the accident. This is the first known fatality in over 130 million miles of driving with Autopilot activated, the company said.

Tesla described how the accident occurred in a blog post.


What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.


Before Autopilot can be engaged, Tesla is explicit with drivers that the technology is still beta software and merely an assist feature, not something drivers should rely on to do the driving for them. As such, Tesla's visual warnings tell drivers to keep their hands on the wheel at all times so they can take over control from the software. The system also makes frequent checks to ensure a driver's hands are on the wheel.

Tesla expanded on the Model S's Autopilot features and why human intervention is still required in certain situations.


As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.


It's unclear whether the driver of the Model S had his hands on the wheel or whether he was alert to the situation on the road. Many pundits have criticized Tesla's Autopilot feature for giving drivers the false impression that the software is doing more than it really is. NHTSA's investigation will determine whether Tesla's software "worked according to expectations."

In other words, was this human error or is Tesla’s software to blame?
NHTSA launches probe into Tesla self-driving car death

The National Highway Traffic Safety Administration said Thursday it has launched a preliminary investigation into a fatal highway crash involving a 2015 Tesla Model S that was operating with its automated driving system activated.

It's believed to be the first U.S. death in a vehicle operating with a semi-autonomous driving feature engaged.

The federal regulator said it received reports from Tesla about a May 7 crash in Williston, Florida, near Gainesville, with the vehicle operating in Autopilot mode. NHTSA said preliminary accident reports indicate the crash happened when a semitrailer turned left in front of the Tesla at an intersection "on a non-controlled access highway" and the driver died of injuries sustained in the crash.

NHTSA said the investigation will “examine the design and performance of any automated driving systems in use at the time of the crash.”

Tesla, in a blog posting, said the death is the “first known fatality in just over 130 million miles where Autopilot was activated.”

NHTSA, in a statement, said the Office of Defects Investigation has launched the investigation and will gather more data about the incident and other info about automated driving systems.

“The opening of the preliminary evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles,” the regulator said in a statement.

The review involves about 25,000 vehicles, the agency said.

Tesla, in the blog posting, said the crash happened on a divided highway, with the semitrailer driving across the highway perpendicular to the Model S. The Palo Alto, California-based company said it learned Wednesday that NHTSA would launch the investigation.

“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said. “The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”

Tesla said it disables Autopilot by default and “requires explicit acknowledgment that the system is new technology and still in a public beta phase before it can be enabled.”

The car company says that when drivers activate Autopilot, it reminds drivers that it is an “assist feature” and requires drivers to keep their hands on the steering wheel at all times and that drivers are responsible for maintaining control of the vehicle.

Tesla said each time the system is in use, the car reminds a driver to keep hands on the wheel and to prepare to take over the driving at any time. The company said the system also makes “frequent checks” to make sure a driver’s hands are on the wheel, providing visual and audible alerts if they aren’t detected on the steering wheel. The car will gradually slow down until hands are detected, according to the blog.

Tesla said it was saddened by the news of the driver’s death.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the blog says.


“We do this to ensure that every time the feature is used, it is used as safely as possible,” Tesla said. “As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.”

Tesla stock closed up 1 percent Thursday to $212.28 a share, though the stock was trading lower after hours.
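One more thought before the discussion starts: the "frequent checks" described above basically amount to an escalation timer. Just to picture it, here's a rough sketch in Python of how that kind of logic could work. This is purely my own illustration with made-up thresholds, not Tesla's actual code.

# Hypothetical sketch of an escalating hands-on-wheel check.
# The timing thresholds are invented for illustration; they are not
# Tesla's actual parameters.

VISUAL_ALERT_AFTER_S = 15    # show a visual reminder after this many seconds
AUDIBLE_ALERT_AFTER_S = 30   # add an audible chime after this long
SLOWDOWN_AFTER_S = 45        # begin gradually slowing the car after this long

def hands_off_response(seconds_without_hands, current_speed_mph):
    """Return the actions the system would take for a given hands-off duration."""
    actions = []
    if seconds_without_hands >= VISUAL_ALERT_AFTER_S:
        actions.append("visual alert on instrument cluster")
    if seconds_without_hands >= AUDIBLE_ALERT_AFTER_S:
        actions.append("audible alert")
    if seconds_without_hands >= SLOWDOWN_AFTER_S:
        # gradually reduce the set speed until hands are detected on the wheel
        reduced = max(0.0, current_speed_mph - 5.0)
        actions.append("reduce set speed toward %.0f mph" % reduced)
    return actions

# Example: a driver who has ignored the wheel for 50 seconds at 65 mph
print(hands_off_response(50, 65))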
 

·
Moderator
Joined
·
4,973 Posts
It will certainly get a lot of attention.

I can understand the car's vision system failing to pick up the white trailer against a bright sky, but it seems unlikely that an alert driver would have totally missed it.
 

·
Registered
Joined
·
236 Posts
Yeah, this seems like a failure of the Autopilot system due to the circumstances, but in no way is the accident Tesla's fault. I don't know how you can easily miss a huge trailer while driving normally. I've personally seen someone reading a book while their Model S drove on Autopilot, so my guess is this driver was completely not paying attention.
 

·
Registered
Joined
·
2,025 Posts
Of course this is the car maker's fault; they built it. Hope they get sued out of existence. If they can't build an electric vehicle that goes 300 miles and costs less than $20,000 retail then they are wasting their time. The autopilot component is just the dream cherry on top. Tesla is but a billionaire's little pet project. Driverless cars are a silly fantasy designed by geeks living in a bubble, many of whom do not like to drive. They are designed for millennials, many of whom do not or cannot drive, and promoted by hucksters and an adoring, hyperactive mass media. P.T. Barnum would be proud of Elon Musk.
 

·
Registered
Joined
·
1,279 Posts
Of course this is the car maker's fault; they built it. Hope they get sued out of existence. If they can't build an electric vehicle that goes 300 miles and costs less than $20,000 retail then they are wasting their time. The autopilot component is just the dream cherry on top. Tesla is but a billionaire's little pet project. Driverless cars are a silly fantasy designed by geeks living in a bubble, many of whom do not like to drive. They are designed for millennials, many of whom do not or cannot drive, and promoted by hucksters and an adoring, hyperactive mass media. P.T. Barnum would be proud of Elon Musk.
:surprise:

:|

:lol:

:p
 

·
Premium Member
Joined
·
1,567 Posts
I have driven heavy trucks with adaptive cruise control and lane departure warning. Believe me, there are bugs to be worked out. :surprise:
 

·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #7
Of course this is the car maker's fault; they built it. Hope they get sued out of existence. If they can't build an electric vehicle that goes 300 miles and costs less than $20,000 retail then they are wasting their time. The autopilot component is just the dream cherry on top. Tesla is but a billionaire's little pet project. Driverless cars are a silly fantasy designed by geeks living in a bubble, many of whom do not like to drive. They are designed for millennials, many of whom do not or cannot drive, and promoted by hucksters and an adoring, hyperactive mass media. P.T. Barnum would be proud of Elon Musk.
Not exactly the car maker's fault if they have provided adequate information to the owner making clear that this is not a set-and-forget feature...which the articles claim they did. Of course, we'll see what the outcome is after the lawyers and judges have their way with it.

Seems like you're asking a lot from a fledgling technology. Electric cars are still very new in the grand scheme of things. The S90 is the first really usable electric vehicle, and a 290-mile range is pretty good for an electric car today. I mean, keep in mind that when the automobile was about as old as electric cars are today, going over 100 miles was a HUGE deal and you had to be a bit of a mechanic, have a roadside tool kit, and carry a couple of spare tires.

Driverless cars are not a fantasy; people are working hard to make them a reality. How fast that will happen, and how much of an impact they will have on what we know driving to be today, remains to be seen, but there is no doubt in my mind that between Musk and the folks at Google, driverless cars will become more common. Things like adaptive cruise and lane assist or lane-drift warnings are already becoming more prevalent in new cars, and these features are the gateway to more autonomous vehicles. The big issue I see is what happens when they become more mainstream and you have folks like us who ENJOY driving our cars.

As for an electric car costing less than $20k retail, go look at the price of new cars and see how many brands even have models starting at under $20k without them being some sort of entry-level commuter vehicle.
 

·
Moderator
Joined
·
4,973 Posts
If Tesla made an error here it was in calling their system "Autopilot". No matter how many warnings you issue, the name still suggests a hands-off control system.

That said, anyone would be incredibly foolish to trust the current technology to the degree that this driver apparently did.
The latest news I heard was that he was watching a video of some sort while the car drove itself, so he certainly wasn't doing any "hands-on" monitoring of the car.
Beyond that, the incident happened on a non-controlled-access highway.

This isn't all because of inappropriate reliance on technology, of course.
I remember vividly cruising down an interstate 20-odd years ago, being passed by a car, looking over, and seeing the driver reading a newspaper that he had spread across the steering wheel and dashboard.
Then there was the day that I was driving to work in rush-hour traffic, passed a car, and saw the driver eating breakfast with a bowl of cereal in one hand and a spoon in the other.
Then .....
 

·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #9 (Edited)
LOL John, oh yeah....so many "distracted drivers" out there doing things that just make you wonder if they are trying to get killed.

And the autopilot thing...I seem to recall a story about a man in the '70s who bought a motor home with this newfangled feature called "Cruise Control". He was driving his new motor home on the highway, set the cruise control, and then left the driver's seat to go make lunch in the motor home's galley. He was shocked when the motor home ran off the road.

Apparently he thought Cruise Control was Autopilot. Now, I don't know whether the story is true, but every time I hear about self-driving cars, I think about the guy I call Mr. Lunchables.
 

·
Registered
Joined
·
4,377 Posts
Driver did not have his hands on the steering wheel.
It is his fault.
End of discussion.

Autopilot will only work if all vehicles have autopilot and are registering their paths. This is why airplane flights are highly regulated: to keep one plane from flying through the path of another.

We are nowhere near that on the streets yet.
 

·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #11 (Edited)
Autopilot will only work if all vehicles have autopilot and are registering their paths. This is why airplane flights are highly regulated: to keep one plane from flying through the path of another.

We are nowhere near that on the streets yet.
Don't think we ever will be, considering the number of cars on the road and the small total square mileage of road available versus the number of planes in the sky and the spacious cubic miles of atmosphere available to them.

Of course this is the car maker's fault, they built it.
If Tesla made an error here it was in calling their system "Autopilot". No matter how many warnings you issue, the name still suggests a hands-off control system.
Driver did not have his hands on the steering wheel.
It is their fault.
end of discussion.
You know, this brings up an interesting debate.

I am certain there is a law on the books in just about every state that says something to the effect of "a driver must be in control of their vehicle at all times." Now how that is worded will be important in a case like this.

Right now autonomous cars are so new that the laws currently on the books probably do not address them...of course, there could be some laws out there that, by the way they are worded, could be interpreted to cover such vehicles, but I doubt they would be very specific. Thus, from a criminal standpoint, the car itself would not be to "blame" for contributing to the accident, as the driver of that car, by law, should have been in control of the vehicle at all times...in theory.

On the civil side though, I do believe the family of the Tesla driver would have a case. Like John mentions, the fact that Tesla refers to the feature as "Autopilot" could be interpreted to mean that no active supervision of the controls is necessary, which could make Tesla somewhat liable for damages.

On the other hand, Tesla is very up front that while the car can "drive itself," this doesn't mean the driver can take a nap and enjoy the ride. They specify that the driver must remain attentive at all times. IIRC, even with aircraft autopilot systems a pilot must be at the controls at all times, monitoring what is happening around him so that he can instantly take over control of the aircraft if something goes wrong. Ergo, even though the system is referred to as an "Autopilot" system, that...in and of itself...doesn't necessarily mean the driver need not continue monitoring or taking action while the system is in control of the vehicle.

I am certain this will go to the courts and it will be a major factor in the future of autonomous vehicles.
 

·
Registered
Joined
·
4,377 Posts
You know, this brings up an interesting debate.

I am certain there is a law on the books in just about every state that says something to the effect of "a driver must be in control of their vehicle at all times."
You are correct. This is why, in the eyes of the law, it is almost impossible to not be at fault if you hit someone from the rear.
 

·
Registered
Joined
·
236 Posts
SkyVue2....... seriously, get the sand out of your... ahem. I know future shock is pretty harsh for some people, but jeez, you sound like a raving lunatic with that statement, basically the same as the people who were against cars while clutching their horses and carriages.
It's the wave of the future. Prices are dropping fast (the Model 3 is proof), driverless cars are already working even if not yet in production, and the technology helps a lot more than just "millennials"; it's mostly to help old people who can't and shouldn't be driving due to their physical/mental state.
There are always going to be cars you drive yourself, but those will become more for people who love to drive, not for the average person who just wants to get from point A to point B.

Police: Driver In Tesla Autopilot Crash Had Portable DVD Player In Car, May Have Been Watching Movie (Updated)

The guy was watching Harry Potter while driving, completely not paying attention to the road.
The car itself tells you explicitly to NOT do that since it's effectively a beta, so this is really all the driver's fault in just about every way.
 

·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #14 (Edited)
I have no problem with what's been posted so far, but let's be careful not to take this personally or let this thread get out of hand. Just trying to be proactive here and get ahead of the snowball. LOL

You are correct. This is why, in the eyes of the law, it is almost impossible to not be at fault if you hit someone from the rear.
Criminally, yes, I agree with you. However, in a civil matter where it's the driver's family suing Tesla in something like a wrongful death suit, the burden of proof is lower and the lines of blame become less clear. This is partially why you hear about lawsuits like the famous one where a lady burned herself because a McDonald's coffee cup was not assembled correctly and spilled scalding coffee on her. While we all know coffee is hot, what most people don't hear about that case is that the coffee in question was considerably hotter than what is considered a safe temperature, which raised the question of McDonald's corporate policy on the issue. So it will come down to whether this driver's family pursues a civil case and what the outcome of that case is. When more details are known, what is criminally 100% on the driver may wind up being 50% Tesla's fault in a civil matter.

It's the wave of the future. Prices are dropping fast (the Model 3 is proof), driverless cars are already working even if not yet in production, and the technology helps a lot more than just "millennials"; it's mostly to help old people who can't and shouldn't be driving due to their physical/mental state.

There are always going to be cars you drive yourself, but those will become more for people who love to drive, not for the average person who just wants to get from point A to point B.

Police: Driver In Tesla Autopilot Crash Had Portable DVD Player In Car, May Have Been Watching Movie (Updated)

The guy was watching Harry Potter while driving, completely not paying attention to the road.
The car itself tells you explicitly to NOT do that since it's effectively a beta, so this is really all the driver's fault in just about every way.
The first part of this is very insightful. I know when we get older and get to the point where we SHOULDN'T be driving, it is hard to give up that independence. Autonomous cars would certainly allow many seniors to remain on the road, keep their independence, and be safer for both themselves and others on the road.

Personal story time. My father developed dementia at a very early age (he died from it at the age of 69). Shortly after he had retired, he was driving home from church and got lost. Had no idea where he was. Eventually he figured it out and got home safely. That night, my father gave up his keys and never drove an automobile again. Sadly, he is in the minority when it comes to Seniors and driving. Many stay on the road far longer than they should and are reluctant to give it up. It is understandable but autonomous cars could certainly help with this problem.

The last part...Harry Potter...REALLY?
 

·
Registered
Joined
·
236 Posts
The last part...Harry Potter...REALLY?
Yep, gotta love people, huh? This is pretty much why there have been accidents with Autopilot: people don't use it the way it's supposed to be used, or they over-trust something that Tesla itself admits is not a fully driverless function.
 

·
Registered
Joined
·
1,279 Posts
What's going to suck is that the dude's insurance (auto, and possibly life as well) will avoid paying since he was most likely breaking the law as well as not following the instructions included with the vehicle (thus, Tesla is off the hook).

I don't understand how the radar/sonar (whichever) system didn't kick in... like you see on the ?Mazda? commercial where they say it has the highest crash rating EVER because it simply avoids accidents...

Maybe those shorter ranged systems did detect the truck but it was too late to stop the vehicle?

Maybe that's why Tesla said the bit about the truck being too high? (IE the non visual avoidance systems couldn't see it either?)
 

·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #17 (Edited)
Yep, gotta love people, huh? This is pretty much why there have been accidents with Autopilot: people don't use it the way it's supposed to be used, or they over-trust something that Tesla itself admits is not a fully driverless function.
And the man was a former SEAL. You'd think he'd have more discipline than that.

I heard they programmed some autopilot cars (in case they had to make a choice) to go for a solid object instead of a human when taking evasive action. So basically someone added code to kill the driver. Nice job for a programmer. :) I sure hope the human the car is trying to steer around is not one like this :)

Good god, I'd hope the car would realize that it shouldn't be leaving the roadway and therefore wouldn't have to decide between striking an inflatable Santa and the house belonging to the family of said inflatable Santa.

I think that's making the best of a bad situation. Look at it this way.

The car has a choice: strike a pedestrian or hit something else.

If it strikes the unprotected pedestrian, the driver will probably not be injured. The pedestrian will most likely be killed if the speed of the vehicle is above 30 mph; if the speed is below 30, he or she will probably still be seriously injured.

If the car avoids the pedestrian and strikes something solid, the driver of the car has the protection of the car's crumple zones which are designed to absorb crash energy, the supplemental restraint systems (air bags) designed to protect the driver from the internal components of the car's interior, and the seat belt which is designed to secure the driver in the proper position so that he/she doesn't go bouncing around the interior of the car and can be in the optimal position for the other safety systems to do their job.

In addition, if the speed of the collision is under 30 mph, the driver of the car will most probably walk away. If the speed is under 60 mph, the driver may be injured and even hospitalized but will probably survive.

Now couple all that with the fact that both the driver of the automobile and the autonomous vehicle itself must yield to pedestrians, and it makes the most sense for a programmer to design the system to avoid pedestrians even if that means striking a more solid object.

All that aside, ask yourself this: would you rather have your car strike a pedestrian at 45 mph and know that you were responsible, in some way, for taking that person's life, or would you rather hit a wall at 45 mph, go to the hospital, and recuperate to go on and lead a relatively normal life?
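To put that trade-off in concrete terms, here's a toy sketch using the rough speed cutoffs above. The "harm scores" are completely made up and just show how a programmer might weigh the two outcomes; a real system would obviously be far more sophisticated.

# Toy comparison of the two outcomes described above.
# Harm scores are arbitrary illustration values, not real injury statistics.

def pedestrian_harm(speed_mph):
    # likely fatal above roughly 30 mph, serious injury below that
    return 10 if speed_mph > 30 else 6

def occupant_harm(speed_mph):
    # crumple zones, airbags, and belts protect the occupant at lower speeds
    if speed_mph < 30:
        return 1   # likely walks away
    if speed_mph < 60:
        return 4   # possible hospital stay, likely survivable
    return 8       # serious risk even with restraints

def preferred_target(speed_mph):
    """Pick whichever option carries the lower assumed harm at this speed."""
    if occupant_harm(speed_mph) < pedestrian_harm(speed_mph):
        return "solid object"
    return "pedestrian"

print(preferred_target(45))   # -> "solid object" with these made-up scores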

What's going to suck is that the dude's insurance (auto, and possibly life as well) will avoid paying since he was most likely breaking the law as well as not following the instructions included with the vehicle (thus, Tesla is off the hook).
Have you ever heard the phrase "A disclaimer is only as strong as the paper it's written on?" Did you know that there are certain rights you cannot sign away? I'm not an attorney and don't know all the details but Tesla is far from off the hook in a civil case.

As for the insurance company not paying because he was breaking the law, I know my insurance company has paid when it was reported I was going over the speed limit and was involved in an accident. I know when people have died because they were on their cell phone not paying attention and wrecked their car, the insurance companies have paid. Not sure him watching Harry Potter will absolve them from paying the family whatever he was insured for.

I don't understand how the radar/sonar (whichever) system didn't kick in... like you see on the ?Mazda? commercial where they say it has the highest crash rating EVER because it simply avoids accidents...

Maybe those shorter ranged systems did detect the truck but it was too late to stop the vehicle?

Maybe that's why Tesla said the bit about the truck being too high? (IE the non visual avoidance systems couldn't see it either?)
I wondered about that too and have come up with some theories.


  • The truck was horizontally outside the sensor's angle of view, meaning that since the truck was "crossing the T" of the Tesla, if you will, it was initially too far off to the side to be seen by the vehicle. This, to me, seems like a horribly bad design if it was the case.

  • The truck was out of the sensor's line of sight. There might have been something between the sensor and the truck so that the sensor didn't see the truck until it was too late...again though, this seems unlikely since there was no indication the sensor EVER saw the truck and the report makes it sound like there was nothing blocking the line of sight of the driver or car.

  • The truck was initially beyond the sensor's range. Again, this doesn't make sense, as one would think the sensor should be able to see far enough ahead that the car could stop in time, at the speed it was traveling, if something got in its path.

  • The sensor straight up malfunctioned. This seems like the most logical explanation, but then why would the system continue to operate? It leads me to believe there is a condition the programmers and designers didn't anticipate, one under which the sensor is basically blind. That could be the issue of the truck being very close in color to the sky. Still, you would think it would see the wheels and other darker bits of the truck and trailer and still detect the hazard.
I thought about the issue of the truck being too high...after all, the report makes it sound like everything below the windshield level of the car went clean under the truck, and I'm sure most of the sensors would be in that area. However, the car traveled under the trailer between the last axle of the tractor and the first axle of the trailer. That means the tractor and its three axles SHOULD have been detected by the sensor and triggered a braking event.
 

·
Moderator
Joined
·
4,973 Posts
Tesla's response to the radar not seeing the truck was that the car was aimed right at the gap between the wheels, and the radar is programmed to ignore anything above a certain height so that overhead signs won't cause the car to stop. It is also probably programmed to only look for things directly in front of the car. I am guessing that they will maybe raise the radar cutoff height to at least the height of the car .....
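Something along these lines is what I'm picturing, with completely made-up numbers (just a sketch of the idea, not Tesla's actual logic):

# Sketch of a radar filter with a height cutoff for overhead structures.
# Both numbers below are invented for illustration.

OVERHEAD_CUTOFF_M = 1.4    # assumed: returns above this height are ignored
LANE_HALF_WIDTH_M = 1.8    # assumed: only objects roughly straight ahead count

def is_braking_target(height_m, lateral_offset_m):
    """Would this radar return trigger automatic braking?"""
    if abs(lateral_offset_m) > LANE_HALF_WIDTH_M:
        return False   # off to the side of the car's path
    if height_m > OVERHEAD_CUTOFF_M:
        return False   # treated like an overhead sign or bridge
    return True

# A trailer floor riding above the cutoff, centered over the lane, gets
# filtered out exactly like an overhead sign would:
print(is_braking_target(height_m=1.5, lateral_offset_m=0.0))   # False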
 

·
Super Moderator
Joined
·
11,040 Posts
Discussion Starter #19
I am human, so I decide when it is happening. An autopilot cannot do that. Many things play a role in a human's decisions. I might react differently when my kid is sitting next to me; I might prefer driving into a wall over a ravine, but not a wall over water; I might prefer hitting a pony over an 18-wheeler, but not when my son is riding that pony. I wonder if an autopilot can ever make decisions like that.
But your comment wasn't about a human driving a car vs. autopilot...it was about an engineer's decision to have the autopilot choose to hit a solid object rather than a human being on foot.

Tesla's response to the radar not seeing the truck was that the car was aimed right at the gap between the wheels, and the radar is programmed to ignore anything above a certain height so that overhead signs won't cause the car to stop. It is also probably programmed to only look for things directly in front of the car. I am guessing that they will maybe raise the radar cutoff height to at least the height of the car .....
I'm not buying that unless the truck was stopped. For the truck to cross his path, the tractor had to come first, and that big tractor of a semi-truck should have been more than visible to the Tesla's radar.

Now, if they are saying that the Tesla saw it as a gap in traffic instead and figured it could make it, then I question two things. One, the space between the rear axle of the tractor and the front axle of the trailer isn't that big...maybe a car length. If it's seeing that as an acceptable gap to drive through, I would say that is not acceptable. Second is the issue of it not seeing above that height. This is part of the fourth point I made above...a condition the engineers and programmers didn't think about.
 

·
Moderator
Joined
·
4,973 Posts
Until the details of the incident are known, we are just speculating. Sensors and control algorithms for this sort of thing have to be detuned to let the vehicle function. If one sensor is blinded, and the other is fooled by a temporarily stationary vehicle, the programming has nothing to work with. Hence the requirement for the driver to stay in control.
 