
Tesla "autopilot" disengages shortly before impact 9


MartinLe (Civil/Environmental)

"On Thursday, NHTSA said it had discovered in 16 separate instances when this occurred that Autopilot “aborted vehicle control less than one second prior to the first impact,” suggesting the driver was not prepared to assume full control over the vehicle."

Were I a legislating body, I would very much demand that safety-critical software be as transparent as any other safety feature. That would limit "AI"/machine-learning applications in these roles.
 

Did the programmers never attend driver ed?

That's a bit of a dodge, isn't it? Just because you personally don't know anything about driving doesn't mean you can't hire someone who does and who can lay out the requirements for defensive driving.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
IRS... I thought it was humour...

So strange to see the singularity approaching while the entire planet is rapidly turning into a hellscape. -John Coates

-Dik
 
Possibly, but every joke has a grain of truth. Nevertheless, companies like Tesla, Uber, Waymo, Amazon, etc., have a duty to do solid systems engineering, and the first two seemingly have failed miserably to do so, particularly Tesla, whose algorithms have continued to fail in ways that suggest the systems engineering was never done. Integrated and comprehensive driving logic appears not to exist at all.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
JohnRBaker said:
But from the photo, it's obviously NOT a Tesla

Mazda3

"If you don't have time to do the job right the first time, when are you going to find time to repair it?"
 
IRS... concur...

So strange to see the singularity approaching while the entire planet is rapidly turning into a hellscape. -John Coates

-Dik
 
As they should. The article references a fatal rear-end collision with a motorcyclist while using Autopilot in Utah last month at night.
This is something that even my company's rudimentary obstacle-avoidance logic would have avoided back in 1994 using conventional programming. Tesla's overhyped software should have had no problem warning the driver and preventing the collision on what should have been a nearly empty road (4 lanes + HOV) at 1 am.
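For a sense of what such conventional logic amounts to, here is a minimal sketch of a time-to-collision check; the names and thresholds are illustrative assumptions, not anyone's actual 1994 code:

```python
# Hypothetical sketch of a conventional time-to-collision (TTC) check.
# All names and thresholds are illustrative assumptions.

WARN_TTC_S = 3.0   # warn the driver if impact is predicted within 3 s
BRAKE_TTC_S = 1.5  # apply emergency braking within 1.5 s

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact, assuming constant closing speed."""
    if closing_speed_mps <= 0.0:
        return float("inf")  # not closing on the obstacle
    return range_m / closing_speed_mps

def obstacle_response(range_m: float, closing_speed_mps: float) -> str:
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc < BRAKE_TTC_S:
        return "BRAKE"
    if ttc < WARN_TTC_S:
        return "WARN"
    return "NONE"

# A motorcycle 60 m ahead, closing at 25 m/s (~90 km/h): TTC = 2.4 s -> WARN
print(obstacle_response(60.0, 25.0))
```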

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
The second article has the key conflict highlighted.

If indeed "Auto pilot" and "full Self Driving" are features able to be used, why would it require active driver supervision??

For too long Tesla have been trying to have it both ways - A system which drives the vehicle with the driver barely paying any attention and with its hands off the wheel ( how else can it turn itself??) and yet needing "officially" for the driver to be alert and providing active supervision.

According to Tesla's website, both technologies "require active driver supervision," with a "fully attentive" driver whose hands are on the wheel, "and do not make the vehicle autonomous."

Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
" why would it require active driver supervision??"

Because those features are SAE Level 2, and Level 2 requires active driver supervision, which means that the driver is technically supposed to be driving and only aided by the "feature."

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
L2 is well defined; it is driver assistance. L3 is amazingly badly defined; I can't make any sense of it. However, Honda and Merc have both released L3 cars, which sounds like a lawyers' picnic to me.



Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
Looking on the bright side, the riders in an autonomous vehicle won't ever be blamed for a wreck, since those cars are not supposed to have any sort of controls in them: no steering wheel, no brakes. It would be like riding in an elevator with no buttons. How many will trust that vehicle, especially with all the problems that exist now with autopilot systems in cars and planes? How many would like to trust their lives to nano-scale electronics and sensors that can be fooled, plus the millions of lines of code to run it all?
 
L3 is amazingly badly defined

I don't think that, but I also don't think it likely that any car can really meet Level 3, at least not now. Level 3 requires the car to do ALL of the traffic-condition monitoring and make all the real-time decisions, particularly when it recognizes that it cannot complete a driving task. Of course, any car system designed with an overarching "don't hit anything" mandate would come closer to Level 3 than Tesla does.

In any case, it has to exist on the waterfall toward truly full autonomy.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
I am no fan of software that makes decisions about how the car should be driven in relation to its surroundings. I cannot yet see those systems getting so sophisticated that they would be failure-proof under all circumstances. Even if they do attain that level, there is still the risk of other vehicles being driven by humans in a way that cannot reasonably be foreseen - think of somebody who wants to kill himself by driving into another vehicle on purpose...

Apart from that, there is another reason I am reluctant to use systems that take over control of the car you drive. Simple cruise control and the better adaptive systems are nice on long, boring drives - but as a driver you might lose concentration and not be able to act when needed. The same goes for navigation systems. In the past I carried a lot of road maps to help me get where I wanted to go. Nowadays I let the navigation system tell me how to get there - and thus have no idea where I am; I just react to instructions to turn left or right, etc.

My experience with all those features is that they tend to make you "lazy", paying less attention to what is happening at the actual moment - which can potentially be very dangerous. Not only when you are mountain climbing or walking in a "rough" neighbourhood, but also when driving a vehicle. You tend to lose attention, and that puts you in danger. It is nice that a sophisticated system is capable of taking some corrective action - but you should not have got into that situation in the first place.

When Alec Issigonis designed the Mini he purposely designed seats that were rather uncomfortable - with the argument that that would keep the driver attentive. I think he had a point there.
 
When those "levels" were defined, I don't think anyone clued in that safely implementing Level 3 means the car has to cleanly and safely hand over control to the driver under all circumstances. It can't just say "BEEP" "I don't know what to do, please take over" when it's a second away from getting into trouble while travelling 130 km/h.

If the car driving on Level 3 automation is reaching the end of circumstances under which it can be used, it has to handle a driver who is not responding, e.g. by switching on emergency flashers and pulling over into a breakdown lane.

If it encounters sudden heavy fog, or snow, or ice, that has to be handled safely even if the driver takes no action when prompted.

I do not think Level 3 can be safely implemented. If it can do Level 3, it can do Level 4.
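Sketched as logic, the fallback described above is a small state machine. This is a minimal sketch under assumed states, inputs, and timeout; it is not taken from the SAE J3016 standard or any vendor's implementation:

```python
# Hypothetical sketch of Level 3 takeover/fallback logic.
# States, inputs, and the timeout are illustrative assumptions only.
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()         # system driving within its design domain
    TAKEOVER_REQUEST = auto()  # driver prompted to resume control
    MANUAL = auto()            # driver has taken over
    MINIMUM_RISK = auto()      # driver unresponsive: hazards on, pull over

TAKEOVER_TIMEOUT_S = 10.0      # assumed grace period for driver response

def next_mode(mode: Mode, in_design_domain: bool,
              driver_responded: bool, seconds_since_request: float) -> Mode:
    if mode is Mode.AUTOMATED and not in_design_domain:
        return Mode.TAKEOVER_REQUEST   # e.g. sudden fog, snow, or ice
    if mode is Mode.TAKEOVER_REQUEST:
        if driver_responded:
            return Mode.MANUAL
        if seconds_since_request > TAKEOVER_TIMEOUT_S:
            return Mode.MINIMUM_RISK   # flashers on, steer to breakdown lane
    return mode
```

The hard part, of course, is everything hidden inside MINIMUM_RISK: executing that maneuver safely is exactly the capability that blurs the line between Level 3 and Level 4.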
 
That is, they will assume some actual objects are just colors and textures on the surface of the road.

That's why we have two eyes, but it appears that Tesla's forward cameras are placed closer together than is desirable for long-range triangulation, so it's indeed possible that parallax isn't strong enough to weed that sort of thing out. That said, the cited optical illusion is flat on the ground, so even Tesla's cameras should be able to figure out that the "girl" has no height.
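For a sense of scale: stereo depth is roughly z = f·B/d for focal length f (in pixels), baseline B, and disparity d, so the depth error for a given disparity error Δd grows as z²·Δd/(f·B), and a narrow baseline hurts most at long range. A rough numeric sketch with assumed camera parameters (not Tesla's actual specs):

```python
# Rough stereo depth-error estimate: dz ≈ z^2 * dd / (f * B).
# Camera parameters below are assumptions for illustration only.

def depth_error_m(z_m: float, baseline_m: float,
                  focal_px: float, disparity_err_px: float = 0.5) -> float:
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# Assumed: 1000 px focal length, 0.2 m camera spacing, 0.5 px matching error
for z in (20, 50, 100):
    print(f"range {z:>3} m -> depth error ~{depth_error_m(z, 0.2, 1000):.1f} m")
# At 100 m the error is ~25 m: too coarse to resolve distant obstacles.
```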

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
I had a friend with only one working eye. He was able to drive his VW bug all over the place. Safely.

So the spacing of the two forward cameras may not be all that important.

Driving a car is done over time. A single camera can take samples over time IF IT IS MOVING, and use those for triangulation.
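That idea can be quantified: a camera moving at highway speed accumulates an effective baseline of several metres between frames, far wider than any practical stereo rig, though motion parallax is weakest for objects dead ahead on the optical axis. A rough sketch with assumed numbers:

```python
# Motion-parallax sketch: a single camera moving at speed v gets an
# effective baseline of v * dt between frames taken dt seconds apart.
# All numbers here are illustrative assumptions.

def depth_error_m(z_m, baseline_m, focal_px=1000.0, disparity_err_px=0.5):
    # Same error model as stereo: dz ≈ z^2 * dd / (f * B)
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

speed_mps = 30.0                         # ~108 km/h highway speed
baseline_m = speed_mps * 0.1             # frames 100 ms apart -> 3 m baseline
print(depth_error_m(100.0, baseline_m))  # ~1.7 m error at 100 m range
```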

All that said, I am not impressed with a person who refuses to add lidar to his navigation system because it threatens his ego. I'd add smell-a-vision if it would lessen the chance of a problem!



spsalso
 
I had a friend with only one working eye. He was able to drive his VW bug all over the place. Safely.

But, does he drive like he might have driven had he two eyes? One can certainly drive with limitations on eyesight, or limbs, etc., but those drivers ought to drive with much more caution and more defensively. The fact that Teslas continually get into stupid accidents says that their software does not drive defensively.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
"But, does he drive like he might have driven had he two eye?"

Yes, he did. I did not know he was missing a working eye until sometime later. I sat next to him for several hours, as he drove, and never suspected he had only one working eye.

I think my point stands. Well, points:

1. An entity can safely travel California's roads and freeways with only one visual data gathering device.

2. A person can be so full of themselves that...


spsalso
 
