Saturn Sky Forum

1 - 20 of 111 Posts

Super Moderator · 11,238 Posts · Discussion Starter #1
https://www.mercurynews.com/2018/01/22/tesla-on-autopilot-slams-into-parked-fire-truck-on-freeway/


This was from 1/22/2018.

A Tesla Model S reportedly on “Autopilot” smashed into the back of a fire truck parked at a freeway accident scene Monday morning, authorities said.

The union representing Culver City firefighters whose truck was hit around 8:30 a.m. on Interstate 405 in Culver City tweeted that the Tesla driver said he had been using Tesla’s Autopilot system, which performs automated driving tasks.

The California Highway Patrol and Culver City Fire Department confirmed the southbound Tesla had struck the fire truck, but could not immediately confirm whether the vehicle had been on Autopilot.

The fire truck had been parked in the left emergency lane and carpool lane, blocking off the scene of a previous accident, with a CHP vehicle behind it and to the side, said Culver City Fire Department battalion chief Ken Powell.

Both emergency vehicles had their lights flashing, Powell added.
I'll be looking for follow-up stories on this to confirm the car was on Autopilot and to see whether Tesla investigates why it missed a fire truck directly in front of it. I have a feeling the fire truck was "stealthed". What do I mean? Look at the way it was parked, then look at the F-117 stealth fighter. The F-117 is shaped that way to scatter incoming radar waves away from the receiver, making it effectively invisible to radar. I bet the same thing happened with the Tesla's collision radar because of the angle the fire truck was parked at across the freeway, so it was "invisible" to the Tesla's radar.
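Here's a rough back-of-the-envelope illustration of that "stealth" idea (toy 2-D geometry with made-up angles, nothing from the actual crash data): with a radar that listens where it transmits, the strong mirror-like return off a flat surface only comes straight back if the surface faces the sensor nearly head-on. Angle the surface and the return goes off to the side.

Code:
import numpy as np

def specular_return_angle(panel_yaw_deg):
    """Toy model: angle between the specular reflection off a flat panel
    and the direction back to the radar, for a panel rotated panel_yaw_deg
    away from facing the sensor head-on (2-D, single bounce only)."""
    d = np.array([1.0, 0.0])                    # radar beam travels in +x
    yaw = np.radians(panel_yaw_deg)
    n = np.array([-np.cos(yaw), np.sin(yaw)])   # panel surface normal
    r = d - 2 * np.dot(d, n) * n                # mirror reflection of the beam
    back = -d                                   # direction straight back to the radar
    cos_a = np.clip(np.dot(r, back), -1.0, 1.0)
    return np.degrees(np.arccos(cos_a))

for yaw in (0, 10, 30, 45):
    print(f"panel rotated {yaw:2d} deg -> specular return misses the sensor by {specular_return_angle(yaw):.0f} deg")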

Things like this have to be figured out before automated cars become more commonplace.
 

Registered · 952 Posts
I think the issue with Tesla's vehicle avoidance system is that it gets disabled once you interfere with it: if you slam on the accelerator, you will hit the object in front of you no matter what. Blaming Autopilot is the obvious excuse nowadays if you do something stupid.
 

Registered · 4,377 Posts
From that article
Tesla, after the incident, said Autopilot is “intended for use only with a fully attentive driver.” The Model S owner’s manual has numerous warnings that attention to the road is vital while using Autopilot and other Tesla semi-autonomous driving functions.

IMHO, this is the fault of the driver.
It is also a reason why I think these systems aren't even close to being ready. I hope this sets them back a bunch. If you don't want to drive, take a train, an Uber, or a Lyft.
 

Premium Member · 1,293 Posts
DELETED COMMENTS ----

Will just say it's not a setback to automation, just more confirmation that some vehicles may not really have all the technology needed to be autonomous.
 

Registered · 300 Posts
Charles Lindbergh survived his famous first non-stop flight over the Atlantic by staying awake and alert. To do this, he had incorporated a slight instability into his airplane so that it required constant correction. Perhaps the more demanding driving is, the safer it may be. Knowledge that you are legally liable isn't going to prevent lapses of attention due to boredom.
 

Registered · 394 Posts
Charles Lindbergh survived his famous first non-stop flight over the Atlantic by staying awake and alert. To do this, he had incorporated a slight instability into his airplane so that it required constant correction. Perhaps the more demanding driving is, the safer it may be. Knowledge that you are legally liable isn't going to prevent lapses of attention due to boredom.
Maybe that's what automation of cars needs... a "dead man's pedal" like in trains! Not making fun of your post by any means; you are correct in what you're saying. Unfortunately, anytime big business does something to make life easier, there's always that small group that does not pay attention to the warnings. I can give many examples of this!

My next question is this: if Tesla has stated in the owner's manual that the driver/passenger must remain alert, and GM is going to release some taxi-service cars without any controls visible to the passengers... who's at fault now?

I have all sorts of questions regarding automation of vehicles. How does the car know it's been in a minor fender bender that doesn't produce enough shock/G's to register as an accident? What if the car is going down the road and hits a big pothole (because third-world, war-torn countries have better roads right now than we do here in Detroit) and it sets off the G sensors? Does the car come to a stop in the middle of the freeway because of this? Someone I know who lives in another country owns a Tesla; they have a 30-minute freeway commute where they set the car to Autopilot and read the Wall Street Journal. There are so many unknowns right now with autonomous cars...
 

Super Moderator · 11,238 Posts · Discussion Starter #7
DELETED COMMENTS ----

Will just say it's not a setback to automation, just more confirmation that some vehicles may not really have all the technology needed to be autonomous.
What I mean by a setback to automation is not the capability itself but rather the proper application of it. As with all such things, just when we think we know what we need, something happens that makes us realize there are possibilities we haven't covered.

Most believed the Tesla system was a near-perfect system. This shows it's not, and that is probably ONE reason why Tesla has that warning. Things like this show people that there have to be redundant systems with different strengths from one another to function properly. For instance, if these two vehicles both had radar systems and talked to one another, the fire engine might still have been invisible to the Tesla, but the Tesla may not have been invisible to the fire engine, and the fire engine could have communicated back to the Tesla that a collision was imminent if the Tesla didn't take different action.
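Very roughly, something like this (the message fields and thresholds are invented for illustration; this isn't any real V2V protocol): the parked truck tracks the approaching car on its own radar, computes a time-to-collision, and broadcasts a warning the car could act on.

Code:
from dataclasses import dataclass
from typing import Optional

@dataclass
class Track:
    range_m: float       # distance from the parked truck to the approaching car
    closing_mps: float   # closing speed in m/s (positive = getting closer)

def time_to_collision(track: Track) -> float:
    """Seconds until impact at the current closing speed (inf if not closing)."""
    if track.closing_mps <= 0:
        return float("inf")
    return track.range_m / track.closing_mps

def v2v_warning(track: Track, threshold_s: float = 4.0) -> Optional[dict]:
    """Build a hypothetical 'collision imminent' broadcast if TTC is short."""
    ttc = time_to_collision(track)
    if ttc < threshold_s:
        return {"type": "COLLISION_IMMINENT",
                "ttc_s": round(ttc, 1),
                "advice": "BRAKE_OR_CHANGE_LANE"}
    return None

# Example: car closing at 29 m/s (~65 mph) from 100 m out -> about 3.4 s to impact
print(v2v_warning(Track(range_m=100.0, closing_mps=29.0)))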
 


Premium Member · 1,293 Posts
I have all sorts of questions regarding automation of vehicles. How does the car know it's been in a minor fender bender that doesn't produce enough shock/G's to register as an accident? What if the car is going down the road and hits a big pothole (because third-world, war-torn countries have better roads right now than we do here in Detroit) and it sets off the G sensors? Does the car come to a stop in the middle of the freeway because of this? Someone I know who lives in another country owns a Tesla; they have a 30-minute freeway commute where they set the car to Autopilot and read the Wall Street Journal. There are so many unknowns right now with autonomous cars...
The answer to most questions that start with "how does a car know..." is: the same way you know. Whatever inputs you use, the car has, and it can run through the decision process far faster than you can. The car would know it's been in an accident because the lidar units would have been following the object that hit it for somewhere near an eighth of a mile, and would have calculated its velocity and trajectory right up to the point it hit the car.

Same with the pothole, except there the car knows precisely where its wheels are, so in theory it could be programmed to miss the pothole by an inch. If it had to hit it, it would have calculated that already, and then from the spike in tire pressure and the changes in the active suspension's acceleration sensors it could work out how hard the impact was and what, if anything, might have been damaged. I am pretty sure that, just as you would, the car will be programmed not to stop in the middle of the road even if it has a flat, but to run on the rim until it can get safely off the road.
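A toy version of that pothole/impact decision, with invented sensor names and thresholds (just to show the shape of the logic, not how any production car actually does it):

Code:
def classify_impact(tire_pressure_drop_kpa: float,
                    suspension_peak_g: float,
                    body_peak_g: float) -> str:
    """Toy classifier: decide what a jolt probably was and how to react.
    All thresholds are invented for illustration only."""
    if body_peak_g > 4.0:
        return "collision: slow down and pull over when safe"
    if suspension_peak_g > 2.0 and tire_pressure_drop_kpa > 30.0:
        return "pothole with possible tire damage: reduce speed, find a safe stop"
    if suspension_peak_g > 2.0:
        return "hard pothole, no damage detected: keep driving"
    return "normal road noise: ignore"

print(classify_impact(tire_pressure_drop_kpa=5.0, suspension_peak_g=2.5, body_peak_g=0.8))
print(classify_impact(tire_pressure_drop_kpa=45.0, suspension_peak_g=3.1, body_peak_g=1.0))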
 

Premium Member · 1,293 Posts
Not talking about automotive experts, as I know they know better. I'm talking about your average Joe on the street who has read the Tesla hype. I should have been more clear, but I stand by my statement with that additional variable defined.
So what you really meant is another setback in the perception of car "automation", because this crash was not a setback to car automation itself.

On another note, I wonder when Tesla will finally sell the Model 3 it promised would be for sale last September or whenever it was. Last I heard, they were still only making the version with the more expensive battery.
 

Premium Member · 4,154 Posts
My opinion is that almost no one cares. People generally don't pay attention anyway so I don't see it as a setback.

I have flown from Dulles to Denver in a 777 with the aircraft in full autopilot mode, including the takeoff and landing.

It's not a question of whether it can be done, but what it costs. With unlimited funds and an unconstrained configuration, a self-driving car could be done today, in my opinion. It would cost a lot and would probably look like crap because of all the sensors hanging off it.

See the cruise missile. They fly autonomously over the ground for hundreds or even thousands of miles and arrive at a very small window. The same basic technology could be applied to any fixed surface route with the addition of hazard- and object-avoidance sensors. I can see an effort to build interstate terminal-to-terminal route databases for long-haul trucks, for example. But it costs a butt load of money to build that database, and it does not take you to grandma's house.

The Tesla system is designed to be a smart aid to drivers. Manual control is required at all times. Every time I see an "autonomous" vehicle in a crash I cringe, because unless they are designed to operate without a driver, the driver is the key part of the system that is there to keep it from making fatal mistakes. And the driver is ALWAYS responsible. Like the captain of a ship, along with full authority comes full responsibility.

Personally, I don't like "self-driving cars" for me. I don't need it. But for those other idiots out there, I want THEM in hands-off mode, because they are idiots and a hazard to me.
 

Premium Member · 1,552 Posts
From that article
Tesla, after the incident, said Autopilot is “intended for use only with a fully attentive driver.” The Model S owner’s manual has numerous warnings that attention to the road is vital while using Autopilot and other Tesla semi-autonomous driving functions.
Here's why I think that statement, which Tesla loves to get in print, is BS:

If they really wanted to require a "fully attentive driver", they'd program the car to pull over, stop, or emit an obnoxious sound in the cabin if the car didn't detect human hands on the steering wheel for more than 10 seconds or so.

But they don't. So this statement strikes me as something they love to say while winking at owners.
 

Registered · 318 Posts
I would agree that it will set back the public perception of it as "safe", but I'm not sure that if it is found that there was driver error (beyond improper use), it will slow "the industry" down. The industry (Tesla in this case) is poring over all the data as we read this thread, and likely has the "problem" narrowed down already. So the public perception has probably suffered a greater setback than anything else has. I LOVE TO DRIVE, and I think it would be hard NOT to drive a car like a Tesla with no Autopilot (I mean aside from the range anxiety, which would surely fade). There are applications where an autonomous car (or a car anywhere along the continuum of "autonomous") would work wonderfully, but as many have pointed out, humans get inattentive, so accidents are likely still going to happen.

I'd like to see data on average accidents per mile for autonomous vehicles vs. average accidents per mile for "regular" vehicles. The problem is that if the comparison showed autonomous cars were safer than, or "as safe" as, regular ones, then humans would likely get even MORE inattentive/cavalier than they already are.
 

Registered · 104 Posts
My first thought was that anyone involved in an accident with an autonomy-capable car can now blame the car for their own mistakes. What if the Tesla driver was actually driving, but distracted by their phone or something?
Maybe they should have a sensor in the steering wheel that detects contact with the hands, kind of like the sensors that read your pulse on exercise equipment. Then, if the car is on Autopilot, the sensor activates; if it detects that the hands have been removed, it alerts the driver to put their hands back on the wheel, and if they don't, the car pulls over to the side of the road and turns off Autopilot.
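A bare-bones sketch of that escalation logic (the timings and the sensor/actuator hooks are made up, just to illustrate the idea):

Code:
import time

HANDS_OFF_CHIME_S = 10      # invented threshold: nag after 10 s hands-off
HANDS_OFF_PULL_OVER_S = 30  # invented threshold: pull over after 30 s

def monitor_hands(read_wheel_sensor, chime, pull_over_and_disable):
    """Poll a hypothetical capacitive wheel sensor while Autopilot is active."""
    hands_off_since = None
    while True:
        if read_wheel_sensor():          # True = hands detected on the wheel
            hands_off_since = None
        else:
            if hands_off_since is None:
                hands_off_since = time.monotonic()
            elapsed = time.monotonic() - hands_off_since
            if elapsed >= HANDS_OFF_PULL_OVER_S:
                pull_over_and_disable()  # stop at the roadside, turn Autopilot off
                return
            if elapsed >= HANDS_OFF_CHIME_S:
                chime()                  # audible warning to grab the wheel
        time.sleep(0.1)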
 

Super Moderator · 5,743 Posts
This might be more of a setback for Tesla's decision to not use LIDAR in favor of vision.

I just read that Magna is converting one of their military radar systems for automotive use. It is supposed to be 100 times more effective than anything currently in use.
 

Super Moderator · 5,743 Posts
I am far from an expert in this field, but the industry press (Ward's, AD&P, etc) have had a lot of articles about it lately, and I have picked up a lot of background information.

There are three main types of sensor used in vehicle automation:

Vision uses cameras to collect information. A camera is a passive device that collects light (visible and infra-red) emitted or reflected by an object.

LIDAR (LIght Detection And Ranging) uses laser light to illuminate objects in front of it and then reads the reflection of that light.

RADAR (RAdio Detection And Ranging) uses radio waves (microwaves) to illuminate objects in front of it and then reads the reflection of those waves.

Ultrasound is a fourth sensing technology but my understanding is that it is mainly used for short-range applications like parking and low-speed object detection.

Tesla has focused on vision backed up by radar for its vehicle guidance, while most of the rest of the industry has turned more to LIDAR. There is a lot of debate over which is better, and no clear winner.

LIDAR is more accurate than vision, and both are more accurate than the current generation of automotive RADAR. Vision and LIDAR are both hampered by dust, fog, rain, etc., while RADAR generally is not. LIDAR sensors have traditionally been significantly larger and more expensive than either vision or RADAR units, but the newest technology is both smaller and less expensive. RADAR is not currently very discriminating and loses small objects in the presence of large ones, but the next generation is said to be 100 times more discriminating than the current one and able to track many more individual objects.
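To tie this back to the earlier point about redundant sensors with different strengths, here is a toy fusion rule (the weights and condition names are invented, not from any production system): each sensor reports a detection confidence, and the fused result leans on whichever sensors handle the current conditions best.

Code:
# Toy sensor fusion: weight each detection by how well the sensor
# handles the current conditions. All weights are invented for illustration.
CONDITION_WEIGHTS = {
    "clear": {"vision": 0.9, "lidar": 0.95, "radar": 0.7},
    "fog":   {"vision": 0.2, "lidar": 0.3,  "radar": 0.8},
    "rain":  {"vision": 0.4, "lidar": 0.5,  "radar": 0.8},
}

def fused_confidence(detections, conditions="clear"):
    """detections: dict of sensor name -> detection confidence (0..1).
    Returns a weighted average that leans on whichever sensors
    work best in the given conditions."""
    weights = CONDITION_WEIGHTS[conditions]
    total_w = sum(weights[s] for s in detections)
    return sum(weights[s] * c for s, c in detections.items()) / total_w

# Example: in fog, a strong radar return outweighs weak camera/lidar returns
print(fused_confidence({"vision": 0.1, "lidar": 0.2, "radar": 0.9}, conditions="fog"))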
 

Premium Member · 1,293 Posts
Right, which is what prompted the GM exec's response to Musk's comment that his cars are Level 5 ready now... linked in an earlier post in this thread.
 