
Tesla Autopilot 2


GregLocock (Automotive)
I think it's worth breaking Tesla out from Uber; both the hardware and the software are different.

So, the second crash of a Tesla into a stationary firetruck looks like more than a coincidence. Even if Autopilot wasn't engaged (the police say it was), Automatic Emergency Braking should have stopped this.

From Tesla's website:

Standard Safety Features
These active safety technologies, including collision avoidance and automatic emergency braking, have begun rolling out through over-the-air updates

Automatic Emergency Braking

Designed to detect objects that the car may impact and applies the brakes accordingly


Side Collision Warning

Warns the driver of potential collisions with obstacles alongside the car


Front Collision Warning

Helps warn of impending collisions with slower moving or stationary cars


So the questions that need to be asked are: which of these were fitted to the crash vehicle? AEB is widely available on other cars, but according to Tesla forums it is possible that it was remotely disabled. According to one user, you can set AEB to warn only. That is a bizarre choice of UI design.
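For concreteness, here is a minimal sketch of the kind of time-to-collision (TTC) logic a generic AEB system runs, including a "warn only" setting like the one reported on the forums. The thresholds, function names, and the example numbers are illustrative assumptions, not Tesla's calibration.

```python
# Minimal TTC sketch of generic AEB decision logic. Thresholds and the
# "warn only" setting are illustrative assumptions, not Tesla's values.

WARN_TTC_S = 2.5   # issue a forward-collision warning below this TTC
BRAKE_TTC_S = 1.4  # command full braking below this TTC

def aeb_action(range_m: float, closing_speed_mps: float, warn_only: bool = False) -> str:
    """Return 'none', 'warn', or 'brake' for a single radar/camera track."""
    if closing_speed_mps <= 0.0:          # range opening: no threat
        return "none"
    ttc = range_m / closing_speed_mps     # seconds to impact at current closing speed
    if ttc < BRAKE_TTC_S and not warn_only:
        return "brake"
    if ttc < WARN_TTC_S:
        return "warn"
    return "none"

# A stationary fire truck approached at ~65 mph (~29 m/s), 40 m out:
print(aeb_action(range_m=40.0, closing_speed_mps=29.0))                  # 'brake'
print(aeb_action(range_m=40.0, closing_speed_mps=29.0, warn_only=True))  # 'warn'
```

The point of the sketch is the last line: with the reported "warn only" setting, the exact same threat assessment produces a chime instead of a brake application.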

Anyway, I think so far there have been three AP collisions with large objects on the road in front of the car in good viewing conditions: the idiot with the truck crossing the road, and two fire trucks.


Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 

I think Tesla's system is Level 2 (and seemingly a poor implementation, at that), but it presents itself to the end user (who knows no better) as Level 3, and it's marketed as having the hardware for Level 4 or 5.
 
DrawOh said:
Level 4 is not a functional concept.

On the contrary- level 4 is, in my opinion, the most that manufacturers are likely to ever attain, and may be the limit of what is actually possible in the physical universe which we currently occupy.

Level 4 is level 5 with restrictions- for example, a level 4 vehicle operates as level 5 when it is sunny and beautiful out, but cannot operate at level 5 in a blizzard.

Think about the simply massive range of conditions under which it is possible for a human being to successfully navigate a vehicle. Level 5 means operation under that COMPLETE set of conditions, with ZERO exceptions for weather (no matter how severe), obstacles, inconsistent surface conditions (think off-road applications), etc. That is not just a tall order- it's massive. Gigantic. It is not only impossible right now, regardless of cost, it might never be possible at all.

Level 5, truly, is not just a car with no steering wheel- it's a car with no steering wheel that still gets you there in a winter storm in Buffalo. That's a long way off and maybe never attainable.
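To make the "L4 is L5 with restrictions" idea concrete, a toy sketch: the feature drives itself only while conditions stay inside its operational design domain. The domain below is invented for illustration; it is not any manufacturer's actual specification.

```python
# Sketch: a level 4 system is "level 5 inside a fence". The ODD
# (operational design domain) here is a made-up example.

from dataclasses import dataclass

@dataclass
class Conditions:
    weather: str        # e.g. "clear", "light_rain", "blizzard"
    mapped_area: bool   # inside the geofenced, HD-mapped region?
    daylight: bool

def l4_can_engage(c: Conditions) -> bool:
    """True only when conditions are inside the restricted domain."""
    return c.weather in ("clear", "light_rain") and c.mapped_area and c.daylight

print(l4_can_engage(Conditions("clear", True, True)))     # True: behaves like L5
print(l4_can_engage(Conditions("blizzard", True, True)))  # False: human must drive
```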
 
I agree with you, but that opinion of ours busts a whole lot of bubbles in tech and political circles.

Who here believes that production Tesla cars actually have "hardware for self-driving capability"? I think it will be found that they don't have sufficient sensors, that they don't have sufficient computing power, and that they will never handle exceptions well enough. The list of such "exceptions" is long ... in fact, it is indeterminate - and that's only the things that we know we don't know.
 
Level 4: The car is highly automated. As the driver/passenger, I get in and indicate my destination, and I do something other than drive. I watch a movie. I read a book or magazine. I make out with my passenger. I do work on my laptop. I watch out the side window, perhaps taking pictures of interesting buildings as they go by.

How rapidly can I transition to control of the vehicle in an emergency? An alert teenager waiting for an instruction to brake needs three quarters of a second. How much time do I need to go "What the f&&k?" "Holy f&&k!" and hit the brakes or change lanes?

If I am not focused on driving, I don't want to be responsible. Okay, you the manufacturer are responsible for any accidents, including the ones that occur when I take control of the vehicle. I need to impress my female passenger somehow! Do you really want to be responsible?
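Some rough numbers behind that reaction-time point. The speed, the reaction times, and the 0.8 g deceleration are illustrative assumptions, not measured values:

```python
# Distance covered before and after a "driver" begins to brake.
# All figures are illustrative assumptions.

G = 9.81                      # m/s^2
V = 31.3                      # ~70 mph in m/s
ALERT_REACTION_S = 0.75       # alert teenager, primed to brake
DISTRACTED_REACTION_S = 3.0   # "What the f&&k?" plus hands back on the wheel

def distance_to_stop(v_mps: float, reaction_s: float, decel_g: float = 0.8) -> float:
    """Reaction distance plus braking distance, in metres."""
    return v_mps * reaction_s + v_mps**2 / (2 * decel_g * G)

print(round(distance_to_stop(V, ALERT_REACTION_S)))       # ~86 m
print(round(distance_to_stop(V, DISTRACTED_REACTION_S)))  # ~156 m
```

On these assumptions, the distracted handover alone adds roughly 70 m of travel before braking even starts - which is the whole argument against "the human is the backup" at L3.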

You have misunderstood L4. You are describing L3.





Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
Not according to SAE - nothing on the streets comes even close to this. BTW, this version is free to download after creating an account:

J3016_201609 Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles said:
NOTE 1: The user does not need to supervise a level 4 ADS feature or be receptive to a request to intervene while the ADS is engaged. A level 4 ADS is capable of automatically performing DDT fallback, as well as achieving a minimal risk condition if a user does not resume performance of the DDT. This automated DDT fallback and minimal risk condition achievement capability is the primary difference between level 4 and level 3 ADS features. This means that the user of an engaged level 4 ADS feature is a passenger who need not respond to requests to intervene or to DDT performance-relevant system failures.
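The quoted distinction boils down to what happens on a failure: L3 depends on the driver responding, L4 does not. A simplified toy routine, not SAE normative text:

```python
# Sketch of the L3-vs-L4 distinction in the SAE note above: on a system
# failure, L3 issues a request to intervene and depends on the driver;
# L4 performs the DDT fallback itself and reaches a minimal risk
# condition (e.g. pulls over and stops) if nobody takes over.

def handle_failure(level: int, driver_responds: bool) -> str:
    if level == 3:
        return "driver resumes DDT" if driver_responds else "UNHANDLED: no fallback"
    if level == 4:
        # The driver MAY take over, but the ADS does not depend on it.
        return "driver resumes DDT" if driver_responds else "ADS achieves minimal risk condition"
    raise ValueError("sketch covers levels 3 and 4 only")

print(handle_failure(3, driver_responds=False))  # the L3 gap
print(handle_failure(4, driver_responds=False))  # the L4 guarantee
```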


TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
GregLocock,

The situation is that you build a car. I get in it and head for a crowd of school children. Someone is responsible for ensuring no one gets run over. At L5, there is no question that you are responsible. I have no access to controls. At L3, I am responsible, and my full attention must be on my driving.

If you are responsible for an L4 car, why have controls? A good design approach would be for the steering wheel, accelerator and brakes to be connected to the computer, not the car. This gives the robot the option of ignoring driver inputs. If the car is L5, it would be a good idea to have a plug-in control pack, allowing an authorized human to drive the vehicle if necessary.
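A toy sketch of that arbitration idea - driver inputs routed through the computer, which may ignore them while engaged. This is an entirely hypothetical architecture, for illustration only:

```python
# Sketch of controls wired to the computer rather than the actuators,
# so the robot can arbitrate (and ignore) driver inputs. Hypothetical.

def arbitrate(driver_cmd: float, robot_cmd: float, robot_engaged: bool,
              driver_input_trusted: bool) -> float:
    """Return the steering/brake command actually sent to the actuator."""
    if robot_engaged and not driver_input_trusted:
        return robot_cmd   # robot may override "hold my beer" inputs
    return driver_cmd      # otherwise the human drives

# While engaged, a sudden full-lock steering input from the driver is ignored:
print(arbitrate(driver_cmd=1.0, robot_cmd=0.05,
                robot_engaged=True, driver_input_trusted=False))  # 0.05
```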

I am sitting in your L4 car and I say to my passenger "Hold my beer and watch this." In court afterwards, my lawyer points out that all the evidence against me is from logs generated by software that you wrote. You are not a disinterested party.

If you are building an L4 car, what is your attitude towards the passenger/driver? Somewhere in these levels - at L4, in my opinion - responsibility is transferred from the driver to the manufacturer. This is far more important than fine technical details.

--
JHG
 
The field of AI has had repeated 'Great Disappointments' over the decades. This is likely to be another example.

Famously, "A.I. is hard." - where "hard" is 'Comp Sci-speak' for exceedingly difficult.

They'll eventually figure out that, "A.I. outdoors is even harder."

 
"If you are responsible for an L4 car, why have controls"

Per SAE's definition, L4 is not 100% capable, AND, at all levels, including L5, the driver still has the right to drive the vehicle themselves:

EXAMPLE 1: The person seated in the driver’s seat of a vehicle equipped with a level 4 ADS feature designed to automate high-speed vehicle operation on controlled-access freeways is a passenger while this level 4 feature is engaged. This same person, however, is a driver before engaging this level 4 ADS feature and again after disengaging the feature in order to exit the controlled access freeway.
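The quoted example reduces to a small state machine: the same person flips between "driver" and "passenger" as the feature engages and disengages inside its domain. A simplified illustration of the SAE example, not its normative definition:

```python
# Sketch of the SAE example above: the user's role depends on whether
# the L4 freeway feature is engaged. Simplified illustration.

class L4HighwayFeature:
    def __init__(self) -> None:
        self.engaged = False

    def engage(self, on_controlled_access_freeway: bool) -> str:
        # L4 is not 100% capable: it only engages inside its domain.
        if on_controlled_access_freeway:
            self.engaged = True
            return "passenger"   # user role while the feature is engaged
        return "driver"

    def disengage(self) -> str:
        self.engaged = False
        return "driver"          # user resumes driving to exit the freeway

f = L4HighwayFeature()
print(f.engage(on_controlled_access_freeway=True))  # passenger
print(f.disengage())                                # driver
```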

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
From the definition "Driver involvement: In a shared car restricted to a defined area, there may not be any."

Outside of that area, yes, it will be the driver's job to avoid emergencies. Inside that area, it is not. Major OEMs have already said that for L4, when the robot is in control, they will have responsibility for the outcomes. That's why you won't see L3s from major OEMs.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
Just to throw a little more gas on the Tesla autopilot fire...

There will be an interesting talk at BlackHat this year:


Dan - Owner
 
betrueblood
Gaze Detection...I like that. How does it work (do you need a clean windshield, etc.)?

Something much easier: marijuana smoke detection. How about that one? Or a periodic breathalyzer?
 
"Driver involvement: In a shared car restricted to a defined area, there may not be any."

This isn't saying the driver can't be involved inside the defined area.
 
A deeper look into the actual NTSB report reveals another point that Tesla will be sure to harp upon: the driver set the cruise speed to 75 mph, which is why Autopilot accelerated into the barrier when it perceived that the slower-moving car in front of it was no longer there, because that car correctly followed the lane.

So far, there have been lots of Tesla accidents that are simply explainable by a functional mode that was completely oversold, misunderstood by its owners, and completely inappropriate as any serious means of aided driving.
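The failure mode described in that NTSB account can be sketched as a cruise controller that resumes its set speed the moment the lead target is lost, with no independent check that the lane ahead is clear. Illustrative numbers, not Tesla's control law:

```python
# Sketch of "resume set speed on lead loss" behavior. The set speed
# and example numbers are illustrative, not Tesla's parameters.

from typing import Optional

SET_SPEED_MPS = 33.5   # driver-set cruise speed, ~75 mph

def target_speed(lead_speed_mps: Optional[float]) -> float:
    """Track a detected lead car, else resume the driver-set speed."""
    if lead_speed_mps is None:
        return SET_SPEED_MPS                    # lead target lost: accelerate
    return min(lead_speed_mps, SET_SPEED_MPS)   # never exceed the set speed

print(target_speed(24.6))   # ~55 mph: follow the slower car ahead
print(target_speed(None))   # lead follows the lane away; car speeds up toward the barrier
```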

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 

Maybe this will make it all better.

I think not. The term "self driving features" bugs me. What's the list of "features" that a good human driver has, which permits them to drive (mostly) without crashing? If software is able to implement that list of "features", does that make them a good driver? What happens if one "feature" is missing or isn't fully developed - what are the repercussions?

Is this going to continue to encourage drivers to switch their brains off?
 
A deeper look into the actual NTSB report reveals another point that Tesla will be sure to harp upon: the driver set the cruise speed to 75 mph, which is why Autopilot accelerated into the barrier when it perceived that the slower-moving car in front of it was no longer there, because that car correctly followed the lane.

So far, there have been lots of Tesla accidents that are simply explainable by a functional mode that was completely oversold, misunderstood by its owners, and completely inappropriate as any serious means of aided driving.

Obviously, the car wasn't correctly following the lane. It was following the wrong side of a line.

As for your other point: funny how you crapped on me for pointing out that consumers understand neither the levels nor the consumer language converting the levels into a description of what was sold to them, and then you basically bring up my same point again.
 