
Tesla Autopilot 2

Status
Not open for further replies.

GregLocock

Automotive
Apr 10, 2001
Orbiting a small yellow star
I think it's worth breaking Tesla out from Uber; both the hardware and the software are different.

So, the second crash of a Tesla into a stationary fire truck looks like more than a coincidence. Even if Autopilot wasn't engaged (the police say it was), Automatic Emergency Braking (AEB) should have stopped this.

From Tesla's website:

Standard Safety Features
These active safety technologies, including collision avoidance and automatic emergency braking, have begun rolling out through over-the-air updates

Automatic Emergency Braking

Designed to detect objects that the car may impact and applies the brakes accordingly


Side Collision Warning

Warns the driver of potential collisions with obstacles alongside the car


Front Collision Warning

Helps warn of impending collisions with slower moving or stationary cars


So the questions that need to be asked are: which of these were fitted to the crash vehicle? AEB is widely available on other cars, but according to Tesla forums it is possible that it was remotely disabled. According to one user, you can set AEB to warn only. That is a bizarre choice of UI design.

Anyway, I think so far there have been three AP collisions with large objects on the road in front of the car in good viewing conditions: the idiot with the truck crossing the road, and two fire trucks.


Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 

Gaze detection: good idea, but does it go far enough? How about brain wave detection, to see if the driver is going to sleep? Or better, if his/her intelligence is below a standard level deemed necessary to drive (of course, this would prevent the majority of drivers in my area from even starting the car).
 
"Gaze detection: good idea, but does it go far enough? How about brain wave detection, to see if the driver is going to sleep?"

Most systems that can do gaze detection can do blink detection as well; blink detection, particularly the rapidity of eyelid open/close, can reliably detect onset of sleepiness, and even micro-sleeps.
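The blink-based metric described above can be sketched in a few lines. This is a minimal illustration, not a production algorithm: it assumes an upstream vision system supplies a per-frame "eye aspect ratio" (EAR), and the thresholds and the simulated data are made-up values for demonstration.

```python
# Sketch of blink-based drowsiness metrics from an eye-openness signal.
# Assumes an upstream vision system (e.g. facial landmarks) supplies an
# "eye aspect ratio" (EAR) per video frame: ~0.3 when open, ~0.05 closed.
# The 0.15 threshold is an illustrative assumption, not a calibrated value.

def drowsiness_metrics(ear_stream, closed_thresh=0.15):
    """Return (blink_count, perclos) for a sequence of per-frame EAR values.

    PERCLOS = fraction of frames with the eye mostly closed; a sustained
    high PERCLOS is a common proxy for the onset of micro-sleeps.
    """
    blinks = 0
    closed_frames = 0
    was_closed = False
    for ear in ear_stream:
        closed = ear < closed_thresh
        if closed:
            closed_frames += 1
        # Count a blink on each closed -> open transition
        if was_closed and not closed:
            blinks += 1
        was_closed = closed
    perclos = closed_frames / max(len(ear_stream), 1)
    return blinks, perclos

# Simulated 2-second clip at 30 fps: two quick blinks, otherwise eyes open.
ear = [0.30] * 20 + [0.05] * 3 + [0.30] * 15 + [0.05] * 3 + [0.30] * 19
blinks, perclos = drowsiness_metrics(ear)
print(blinks, round(perclos, 2))  # 2 blinks, low PERCLOS -> alert driver
```

A real monitor would also track blink *duration* and closing speed, which is where the "rapidity of eyelid open/close" signal mentioned above comes in.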

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
Spartan5,

I have responded to this previously. The SAE automation levels are useful for discussions about engineering and control. For the purposes of a consumer/driver, this is a binary state -- automated, and not automated. Words like "Automatic" and "Robot" should not be anywhere on the controls. Any word stronger than "Assist" should be banned from the controls of anything other than a full robot.

--
JHG
 
IRstuff said:
The consumer does not need to know what they mean, they only need to clearly know what their purchase can do, or not do.

I don't get your point. Knowing what their purchase can do is similar to knowing what those levels mean and where their purchase falls into that list. So, are you saying they still need to know the same thing as those levels, but worded differently????
 
LionelHutz said:
Knowing what their purchase can do is similar to knowing what those levels mean and where their purchase falls into that list. So, are you saying they still need to know the same thing as those levels, but worded differently????

Go to the middle of nowhere in Idaho and ask a random person at a grocery store what Tesla Autopilot can do, and you might get an answer that approximates the truth.

Ask this same person to explain the difference between level 3 and level 4 operation according to SAE J3016. You'll get a blank stare.
 
Perhaps one or more of those exquisitely-defined Levels are simply very bad ideas.

Perhaps in some future edition of the SAE standard J3016, they'll be forced to add notes such as:
"(This level has been proven dangerous and is not recommended.)", or
"(This level has been banned in all major jurisdictions.)"

Imaginary dialog:
"This seems like a very dangerous design concept."
"No, it's in accordance with SAE J3016 Level 3."
"Well I suppose that if it's been precisely defined, then it must be okay."


 
I'll try again. AFTER you re-write the "geek speak" level description into consumer language in an attempt to tell the consumer what the product will do, certain levels are still very bad ideas, because the resulting product will not be properly understood or used by the consumer. It doesn't matter much what the "geek" thinks the level should mean if giving the consumer a product built to that level is a very bad idea.
 
That presumes that the "bad" idea is simply tossed out there without safeguards. Automation, in general, isn't a bad idea; the implementations seen so far are simply not adequately safeguarded, mainly because the manufacturers haven't bothered to provide robust safeguards. We can look back at the development of the car itself for some lessons. We get into a car, push a button or turn a key, and the car starts, but the original cars weren't that straightforward; cranks and chokes and priming, etc., are all the things that fell by the wayside over the course of the maturation of cars. We're even eliminating "turn-key" as a benchmark of automation by eliminating the traditional key.

We've got industrial machines that can easily kill their operators, but under most circumstances, the high-school educated operator is able to safely operate the machines.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
The safety claims by Musk are hypocritical. He starts by stating the limitations that the users must accept, and concludes with spurious statistics.

"Whatever can go wrong, will go wrong" - Murphy's Law
 
One point that Tesla makes is that Autopilot was strictly intended to be used on highways.

But, duh, there are lots of map databases that can tell you that sort of thing, and Autopilot ought not to allow itself to be turned on when the car isn't anywhere close to a highway.
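The map-database gate suggested above amounts to a geofence check before engagement. Here is a minimal sketch under stated assumptions: the highway coordinates and the 200 m threshold are invented illustrative values, and a real system would query an actual map database rather than a hard-coded point list.

```python
# Sketch of a geofence gate for a driver-assist system: refuse to engage
# unless the car is within some distance of a known highway segment.
# The highway coordinates and 200 m threshold are made-up illustrative
# values; a real implementation would query a proper map database.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def may_engage(car_pos, highway_points, max_dist_m=200.0):
    """Allow engagement only if the car is near a mapped highway point."""
    lat, lon = car_pos
    return any(haversine_m(lat, lon, hlat, hlon) <= max_dist_m
               for hlat, hlon in highway_points)

# Hypothetical sampled centerline of a highway segment.
highway = [(37.0000, -122.0000), (37.0010, -122.0000), (37.0020, -122.0000)]
print(may_engage((37.0005, -122.0001), highway))  # on the highway -> True
print(may_engage((37.0500, -122.0500), highway))  # miles away -> False
```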

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
drawoh said:
Spartan5,

I have responded to this previously. The SAE automation levels are useful for discussions about engineering and control.
That's why I posted them. I had thought that that was what this was.
 
Spartan5,

Indeed it is; however, this is a confusing thing to communicate outside the engineering world. The differences between the levels are minor, and we do not seem to be worrying about them here. No one has discussed, say, Level 3 versus Level 4, for example. There is a lot of confusion in the outside world about how responsible drivers are when they use automated vehicles. The human interface is a critical part of most technologies.

When technology creates a hazard, there cannot be confusion over who is responsible for mitigating it. Either the driver is 100% responsible, or the manufacturer of the car is 100% responsible.

--
JHG
 
I have another thought here. Perhaps there is a maximum level of automation that can be permitted when the driver is responsible for safety. The driver must be kept alert.

--
JHG
 
Machine vision sensors work pretty well to detect drowsy and missing drivers. If needed, they could even detect heartbeat to eliminate fake heads.

I suspect, though, if autonomous cars become the majority, there will be movement to eliminate human driving altogether, except on special roads.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
IRstuff,

You can ban the human drivers from the road, but what about bicycles, pedestrians, pets, loose shopping carts and Bambi and Bullwinkle? If you build an access limited track occupied entirely by robots, you have all sorts of opportunity to make the robots do cool stuff. If the robots are out in the real world where we are, they have to cope with unpredictable elements.

--
JHG
 
drawoh,
The levels are important for just the reasons you mention. There is extensive need for regulation tied to each of those levels, from functional requirements to how they are marketed. There is no good reason that these lower-level systems can't use geofences and other active driver-monitoring systems to minimize their potential for abuse or misuse.

It seems a lot of people are talking past each other in these threads because they're not differentiating between the levels.

I mean, it shouldn't even be newsworthy that Teslas are plowing into things, because Autopilot is just a piddly Level 2 system doing exactly what a Level 2 system will do when the driver isn't 100% paying attention. Tesla markets it like Level 4, though, and has extremely limited safeguards in place to keep it from being used that way.
 
Spartan5,

Let's look carefully at those levels.

Level 5: The automated car is in control, responding only to destination instruction from the passenger. If the car causes an accident, the manufacturer is legally and morally responsible. No problems.

Level 4: The car is highly automated. As the driver/passenger, I get in and indicate my destination, and I do something other than drive. I watch a movie. I read a book or magazine. I make out with my passenger. I do work on my laptop. I watch out the side window, perhaps taking pictures of interesting buildings as they go by. How rapidly can I transition to control of the vehicle in an emergency? An alert teenager waiting for an instruction to brake needs three quarters of a second. How much time do I need to go "What the f&&k?" "Holy f&&k!" and hit the brakes or change lanes? If I am not focused on driving, I don't want to be responsible. Okay, you the manufacturer are responsible for any accidents, including the ones that occur when I take control of the vehicle. I need to impress my female passenger somehow! Do you really want to be responsible?

Level 3: Conditional Automation. "The driver must be available to take over at any time." To me, that means the driver is gripping the steering wheel and watching out the front window. The best way to keep focused on driving is to steer the car. Perhaps the robot can help parallel park or back in. The instrument panel can ring buzzers and remind the driver that they are responsible if the car hits something. I think this is where we are having accidents.

Level 2: Partial automation. Same as level 3. I have no problems with the car watching out for and signalling hazards.

Level 1: Driver assistance (what SAE J3016 calls "no automation" is actually Level 0). The driver is responsible.

For Levels 1 to 3, the driver is responsible for safety, and must do nothing other than drive the car. There is minimal opportunity to take advantage of automation. For Level 5, the manufacturer of the car is responsible for safety. Level 4 is not a functional concept.
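The hand-over concern in the Level 4 description above can be put in rough numbers: how far the car travels during the driver's reaction time, plus the braking distance itself. The 0.8 g deceleration and the reaction times below are illustrative assumptions, not measured values.

```python
# Rough arithmetic behind the hand-over concern: distance covered while a
# driver reacts, plus braking distance to a stop at constant deceleration.
# The 0.8 g deceleration and the reaction times are illustrative assumptions.
G = 9.81  # m/s^2

def stopping_distance_m(speed_ms, reaction_s, decel_ms2=0.8 * G):
    """Reaction distance plus braking distance (constant deceleration)."""
    reaction_dist = speed_ms * reaction_s
    braking_dist = speed_ms ** 2 / (2 * decel_ms2)
    return reaction_dist + braking_dist

mph = 70
speed = mph * 0.44704  # convert mph to m/s
for label, t in [("alert driver, 0.75 s", 0.75),
                 ("distracted driver, 2.5 s", 2.5)]:
    print(f"{label}: {stopping_distance_m(speed, t):.0f} m to stop from {mph} mph")
```

At highway speed, each extra second of "What the f&&k?" adds roughly 30 m of travel before the brakes even touch, which is the crux of the argument that a disengaged Level 4 occupant cannot be the safety backup.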

--
JHG
 