Saturn Sky Forums: Saturn Sky Forum

Saturn Sky Forums: Saturn Sky Forum (https://www.skyroadster.com/forums/)
-   Roadster/Small RWD Market/Other Vehicle General Discussion (https://www.skyroadster.com/forums/f11/)
-   -   This could set self driving cars back a bit. (https://www.skyroadster.com/forums/f11/could-set-self-driving-cars-back-bit-78290/)

Robotech 06-30-2016 06:00 PM

This could set self driving cars back a bit.
 
Model S Autopilot crash under NHTSA investigation

Quote:

A deadly crash that killed the driver of a Model S is being investigated by the National Highway Traffic Safety Administration (NHTSA), Tesla announced on Thursday. At the time of the accident, Tesla's Autopilot software was engaged. This is the first such incident in over 130 million miles driven with Autopilot activated, the company said.

Tesla described how the accident occurred in a blog post.

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.

Before autopilot can be engaged, Tesla is explicit with drivers that the technology is still beta software and is merely an assist feature, not something drivers should rely on when driving. As such, Tesla’s visual warnings tell drivers to keep their hands on the wheel at all times should they need to take over control from the software. Frequent checks are also done to ensure a driver’s hands are on the wheel.

Tesla expanded further on the Model S’s autopilot features and why human intervention is still required in certain situations.

As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

It’s unclear whether the driver of the Model S had their hands on the wheel and whether they were alert to the situation on the road. Many pundits have criticized Tesla’s Autopilot feature for falsely giving drivers the impression that the software is doing more than it really is. The NHTSA’s investigation will determine whether Tesla’s software “worked according to expectations.”

In other words, was this human error or is Tesla’s software to blame?
NHTSA launches probe into Tesla self-driving car death

Quote:

The National Highway Traffic Safety Administration said Thursday it has launched a preliminary investigation into a fatal highway crash involving a 2015 Tesla Model S that was operating with its automated driving system activated.

It’s believed to be the first U.S. death in a vehicle operating with a semi-autonomous driving feature engaged.

The federal regulator said that the agency received reports from Tesla about a crash in Williston, Florida, near Gainesville, on May 7 with the vehicle operating in autopilot mode. NHTSA said preliminary accident reports say the crash happened when a semitrailer turned left in front of the Tesla at an intersection “on a non-controlled access highway” and the driver died due to injuries in the crash.

NHTSA said the investigation will “examine the design and performance of any automated driving systems in use at the time of the crash.”

Tesla, in a blog posting, said the death is the “first known fatality in just over 130 million miles where Autopilot was activated.”

NHTSA, in a statement, said the Office of Defects Investigation has launched the investigation and will gather more data about the incident and other info about automated driving systems.

“The opening of the preliminary evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles,” the regulator said in a statement.

The review involves about 25,000 vehicles, the agency said.

Tesla, in the blog posting, said the crash happened on a divided highway, with the semitrailer driving across the highway perpendicular to the Model S. The Palo Alto, California-based company said it learned Wednesday that NHTSA would launch the investigation.

“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said. “The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”

Tesla said it disables Autopilot by default and “requires explicit acknowledgment that the system is new technology and still in a public beta phase before it can be enabled.”

The car company says that when drivers activate Autopilot, it reminds drivers that it is an “assist feature” and requires drivers to keep their hands on the steering wheel at all times and that drivers are responsible for maintaining control of the vehicle.

Tesla said each time the system is in use, the car reminds a driver to keep hands on the wheel and to prepare to take over the driving at any time. The company said the system also makes “frequent checks” to make sure a driver’s hands are on the wheel, providing visual and audible alerts if they aren’t detected on the steering wheel. The car will gradually slow down until hands are detected, according to the blog.
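The escalation sequence the article describes (frequent hands-on-wheel checks, then visual and audible alerts, then a gradual slowdown) could be sketched roughly as follows. This is purely an illustration of that described behavior; the function name, thresholds, and action labels are all hypothetical, not Tesla's actual implementation.

```python
def monitor_action(seconds_since_hands_detected):
    """Return the assist system's response based on how long it has been
    since the driver's hands were last detected on the wheel.
    All thresholds are made up for illustration."""
    if seconds_since_hands_detected < 10:
        return "ok"             # hands recently detected; no action needed
    elif seconds_since_hands_detected < 20:
        return "visual_alert"   # first nag: dashboard warning to grab the wheel
    elif seconds_since_hands_detected < 30:
        return "audible_alert"  # escalate: chime in addition to the warning
    else:
        return "slow_down"      # gradually reduce speed until hands are detected
```

For example, `monitor_action(5)` returns `"ok"` while `monitor_action(60)` returns `"slow_down"`, matching the article's description of the car slowing itself when the driver stops responding.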

Tesla said it was saddened by the news of the driver’s death.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the blog says.


“We do this to ensure that every time the feature is used, it is used as safely as possible,” Tesla said. “As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.”

Tesla stock closed up 1 percent Thursday to $212.28 a share, though the stock was trading lower after hours.


JohnWR 06-30-2016 07:35 PM

It will certainly get a lot of attention.

I can understand the car's vision system failing to pick up the white trailer against a bright sky, but it seems unlikely that an alert driver would have totally missed it.

Tech_God 07-01-2016 08:11 AM

Yeah, this seems like a failure of the Autopilot system under the circumstances, but the accident is in no way Tesla's fault. I don't know how you could easily miss a huge trailer while driving normally. I've personally seen someone reading a book while driving a Model S on Autopilot, so my guess is this driver was completely not paying attention.

SkyVue2 07-01-2016 09:04 AM

Of course this is the car maker's fault, they built it. Hope they get sued out of existence. If they can't build an electric vehicle that goes 300 miles and costs less than $20,000 retail then they are wasting their time. The autopilot component is just the dream cherry on top. Tesla is but a billionaire's little pet project. Driverless cars are a silly fantasy designed by geeks living in a bubble, many of whom do not like to drive. They are designed for millennials, many of whom do not or cannot drive; and promoted by hucksters and an adoring hyperactive mass media. P.T. Barnum would be proud of Elon Musk.

Nova-Exarch 07-01-2016 12:07 PM

Quote:

Originally Posted by SkyVue2 (Post 1083546)
Of course this is the car maker's fault, they built it. Hope they get sued out of existence. If they can't build an electric vehicle that goes 300 miles and costs less than $20,000 retail then they are wasting their time. The autopilot component is just the dream cherry on top. Tesla is but a billionaire's little pet project. Driverless cars are a silly fantasy designed by geeks living in a bubble, many of whom do not like to drive. They are designed for millenials, many of whom do not or cannot drive; and promoted by hucksters and an adoring hyperactive mass media. PT Barnum would be proud of Elon Musk.

:surprise:

:|

:lol:

:P

Rusty Boltz 07-01-2016 12:41 PM

I have driven heavy trucks with adaptive cruise control and lane departure warning. Believe me, there are bugs to be worked out. :surprise:

Robotech 07-01-2016 01:35 PM

Quote:

Originally Posted by SkyVue2 (Post 1083546)
Of course this is the car maker's fault, they built it. Hope they get sued out of existence. If they can't build an electric vehicle that goes 300 miles and costs less than $20,000 retail then they are wasting their time. The autopilot component is just the dream cherry on top. Tesla is but a billionaire's little pet project. Driverless cars are a silly fantasy designed by geeks living in a bubble, many of whom do not like to drive. They are designed for millenials, many of whom do not or cannot drive; and promoted by hucksters and an adoring hyperactive mass media. PT Barnum would be proud of Elon Musk.

Not exactly the car maker's fault if they have provided adequate information to the owner addressing the fact that this is not a set-and-forget feature...which the articles claim they did. Of course, we'll see what the outcome is after lawyers and judges have their way with it.

Seems like you're asking a lot from a fledgling technology. Electric cars are still very new in the grand scheme of things. The S90 is the first truly usable electric vehicle, and a 290-mile range is pretty good for an electric car today. I mean, keep in mind that when the automobile was about the age electric cars are today, going over 100 miles was a HUGE deal, and you had to be a bit of a mechanic, have a roadside tool kit, and carry a couple of spare tires.

Driverless cars are not a fantasy. They are working hard to make them a reality. How fast that will happen and how much of an impact they will have on what we know driving to be today remains to be seen but there is no doubt in my mind that between Musk and the folks at Google, driverless cars will become more common. Things like adaptive cruise and lane assist or lane drift warnings are already becoming more prevalent in new cars and these items are the gateway to more autonomous vehicles. The big issue I see is when they become more mainstream and you have folks like us who ENJOY driving our cars.

As for an electric car costing less than $20k retail, go look at the price of new cars and see how many makers even have models starting under $20k without them being some sort of entry-level commuter vehicle.

JohnWR 07-01-2016 03:30 PM

If Tesla made an error here it was in calling their system "Autopilot". No matter how many warnings you issue, the name still suggests a hands-off control system.

That said, anyone would be incredibly foolish to trust the current technology to the degree that this driver apparently did.
The latest news I heard was that he was watching a video of some sort while the car drove itself, so he certainly wasn't doing any "hands-on" monitoring of the car.
Beyond that, the incident happened on a non-controlled-access highway.

This isn't all because of inappropriate reliance on technology, of course.
I remember vividly cruising down an interstate 20-odd years ago, being passed by a car, looking over, and seeing the driver reading a newspaper that he had spread across the steering wheel and dashboard.
Then there was the day that I was driving to work in rush-hour traffic, passed a car, and saw the driver eating breakfast with a bowl of cereal in one hand and a spoon in the other.
Then .....

Robotech 07-01-2016 03:33 PM

LOL John, oh yeah....so many "distracted drivers" out there doing things that just make you wonder if they're trying to get killed.

And the autopilot thing...I seem to recall a story I heard once about a man in the '70s who bought a motor home with this newfangled feature called "Cruise Control". He was driving his new motor home on the highway, set the cruise control, and then left the driver's seat to go make lunch in the motor home's galley. He was shocked when the motor home ran off the road.

Apparently he thought Cruise Control was Autopilot. Now, I don't know whether that story is true, but every time I hear about self-driving cars, I think about the guy I call Mr. Lunchables.

Elff 07-01-2016 04:04 PM

The driver did not have his hands on the steering wheel.
It is his fault.
End of discussion.

Autopilot will only work if all vehicles have autopilot and are registering their paths. This is why airplane flights are highly regulated: to keep one plane from flying through the path of another.

We are nowhere near that on the streets yet.

Robotech 07-01-2016 04:23 PM

Quote:

Originally Posted by Elff (Post 1083802)
Autopilot will only work if all vehicles have autopilot and they are registering their paths. This is why Airplane flights are highly regulated. It is to avoid one plane flying through the path of another.

We are no where near that on the streets yet.

Don't think we ever will be, considering the number of cars on the road and the small total square mileage of road available to them, versus the number of planes in the sky and the spacious cubic miles of atmosphere available to those.

Quote:

Originally Posted by SkyVue2 (Post 1083546)
Of course this is the car maker's fault, they built it.

Quote:

Originally Posted by JohnWR (Post 1083770)
If Tesla made an error here it was in calling their system "Autopilot". No matter how many warnings you issue, the name still suggests a hands-off control system.

Quote:

Originally Posted by Elff (Post 1083802)
Driver did not have his hands on the steering wheel.
It is their fault.
end of discussion.

You know, this brings up an interesting debate.

I am certain there is a law on the books in just about every state that says something to the effect of "a driver must be in control of their vehicle at all times." Now how that is worded will be important in a case like this.

Right now autonomous cars are so new that the laws currently on the books probably do not address them. Of course, there could be some laws out there that, by the very way they are worded, could be interpreted to cover such vehicles, but I doubt they would be very specific. Thus, from a criminal standpoint, the car itself would not be to "blame" for contributing to the accident, since the driver of that car, by law, should have always been in control of the vehicle...in theory.

On the civil side, though, I do believe the family of the Tesla driver would have a case. Like John mentions, the fact that Tesla refers to the feature as "Autopilot" could be interpreted to mean that no active supervision of the controls is necessary, and Tesla could thus be somewhat liable for damages.

On the other hand, Tesla is very up front that while the car can "drive itself," this doesn't mean the driver can take a nap and enjoy the ride. They specify that the driver must remain attentive at all times. IIRC, even with aircraft autopilot systems, a pilot must be at the controls at all times, monitoring what is happening around him so that he can instantly take over control of the aircraft if something goes wrong. Ergo, even though the system is referred to as an "Autopilot" system, that in and of itself doesn't necessarily mean the driver need not continue monitoring, or be ready to take action, while the system is in control of the vehicle.

I am certain this will go to the courts and it will be a major factor in the future of autonomous vehicles.

Elff 07-01-2016 04:32 PM

Quote:

Originally Posted by Robotech (Post 1083834)
You know, this brings up an interesting debate.

I am certain there is a law on the books in just about every state that says something to the effect of "a driver must be in control of their vehicle at all times."

You are correct. This is why, in the eyes of the law, it is almost impossible to not be at fault if you hit someone from the rear.

Tech_God 07-01-2016 04:37 PM

SkyVue2....... seriously, get the sand out of your... ahem. I know future shock is pretty harsh for some people, but jeez, you sound like a raving lunatic with that statement, basically the same as the people who clutched their horses and carriages while railing against cars.
It's the wave of the future. Prices are dropping soon (the Model 3 is proof), driverless cars are working today even if they're not yet in production, and the technology helps a lot more than just "millennials"; mostly it will help old people who can't and shouldn't be driving due to their physical or mental state.
There will always be cars you drive yourself, but those will increasingly be for people who love to drive, not for the average person who just wants to get from point A to point B.

Police: Driver In Tesla Autopilot Crash Had Portable DVD Player In Car, May Have Been Watching Movie (Updated)

The guy was watching Harry Potter while driving, completely not paying attention to the road.
The car itself tells you explicitly NOT to do that since it's effectively a beta, so this is really the driver's fault in just about every way.

Robotech 07-01-2016 05:02 PM

I have no problem with what's been posted so far but let's be careful not to take this personally and let this thread get out of hand. Just trying to be proactive here and get ahead of the snowball. LOL

Quote:

Originally Posted by Elff (Post 1083858)
You are correct. This is why, in the eyes of the law, it is almost impossible to not be at fault if you hit someone from the rear.

Criminally, yes, I agree with you. However, in a civil matter where the driver's family sues Tesla in something like a wrongful-death suit, the burden of proof is lower and the lines of blame become less clear. This is partially why you hear about lawsuits like the famous McDonald's coffee case: a woman was severely burned when a coffee cup that was not assembled correctly spilled scalding coffee on her. While we all know coffee is hot, what most people don't hear about that case is that the coffee in question was considerably hotter than what is considered a safe temperature, and hotter than McDonald's own corporate policy specified. So it will come down to whether this driver's family pursues a civil case, and what the outcome of that case is. When more details are known, what is criminally 100% on the driver may wind up being 50% Tesla's fault in a civil matter.

Quote:

Originally Posted by Tech_God (Post 1083866)
It's the wave of the future, the prices are dropping very soon (The Model 3 is proof), driverless cars are working currently but not in production yet, and the technology helps a lot more than just "Millenials", it's mostly to help old people who can't and shouldn't be driving due to their physical/mental state.

There's always going to be cars you drive yourself, but that'll become more specifically for people that love to drive, not for your average person who just wants to get from point A to B.

Police: Driver In Tesla Autopilot Crash Had Portable DVD Player In Car, May Have Been Watching Movie (Updated)

The guy was watching Harry Potter while driving, completely not paying attention to the road.
The car itself tells you explicitly to NOT do that since it's effectively a beta, so this is really all the driver's fault in just about every way.

The first part of this is very insightful. I know when we get older and get to the point where we SHOULDN'T be driving, it is hard to give up that independence. Autonomous cars would certainly allow many seniors to remain on the road, keep their independence, and be safer for both themselves and others on the road.

Personal story time. My father developed dementia at a very early age (he died from it at the age of 69). Shortly after he had retired, he was driving home from church and got lost. Had no idea where he was. Eventually he figured it out and got home safely. That night, my father gave up his keys and never drove an automobile again. Sadly, he is in the minority when it comes to Seniors and driving. Many stay on the road far longer than they should and are reluctant to give it up. It is understandable but autonomous cars could certainly help with this problem.

The last part...Harry Potter...REALLY?

Tech_God 07-01-2016 05:18 PM

Quote:

Originally Posted by Robotech (Post 1083898)
The last part...Harry Potter...REALLY?

Yep, gotta love people, huh? This is pretty much why there have been accidents with Autopilot: people don't use it the way it's supposed to be used, or over-trust something that is self-admittedly not a fully driverless system.

