
Tesla "autopilot" disengages shortly before impact 9


MartinLe (Civil/Environmental)

"On Thursday, NHTSA said it had discovered in 16 separate instances when this occurred that Autopilot “aborted vehicle control less than one second prior to the first impact,” suggesting the driver was not prepared to assume full control over the vehicle."

Were I a legislating body, I would very much demand that safety-critical software be as transparent as any other safety feature, which would limit "AI"/machine-learning applications in these roles.
 

In case anyone still has doubts, here's the aerial photo from Google Maps of the rest stop

[Aerial photo of the rest stop]


TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
From the crash photo, it looks like the car hit the rear of the trailer almost dead-on. At a high speed.

So the car had to VERY ABRUPTLY make a hard left.

If the computer was driving, I expect it "thought" it was still on the roadway when it had veered off into the rest stop. So it maintained speed. Perhaps it saw the trees at the end of the lane, and "figured" it had to turn left to avoid them. Ran out of options.

Like many/most here, I don't understand why the manufacturer(s) did not design in the first law of (car) robotics: Don't hit anything, especially at a high speed.

I could suggest that our elected government officials forcefully remind them that they should do so, but.......



spsalso
 
Or what would happen if insurance companies had the balls to say: if you crash under autopilot, we don't cover you, or at least your deductible is 100x higher.

= = = = = = = = = = = = = = = = = = = =
P.E. Metallurgy, consulting work welcomed
 
I think, eventually, the insurance companies are going to require that autopilot only be used on... certified roads or in parking lots.

But one must admit that the driver in this case had time to react: the car exited a freeway offramp designed to let even large trucks slow from freeway speed, then crossed into the parking lot before striking the truck. The driver has to have been drunk or asleep.

If I remember correctly, the Walter Huang crash was a much more disturbing story. The car steered into the beginning of a center divider and eviscerated itself. Supposedly the driver was playing a video game, which doesn't help his case, BUT other drivers were able to recreate the situation and had to take the car out of Autopilot to keep the same incident from happening again. There were YouTube videos, but I can't find any right now.
 
Autopilot or no, the driver / occupants about had to be checked out in one way or another.

The problem with sloppy work is that the supply FAR EXCEEDS the demand
 
The car steered into the beginning of a center divider and eviscerated itself.


That's because the Tesla lane-following routine is stooopid. It has tons of information that could inform it to stay in the correct lane, but they chose to ignore all the ancillary data and rely only on a single-point-of-failure algorithm that's only marginally usable. That is coupled with the fact that the collision-avoidance routine is also stooopid, because they designed it to make assumptions about the things it detects and to base future behavior on an algorithm that's not 100% reliable. The gore-point incident referenced here, the collision with the semi in Florida, and the collision with the back end of a private jet all point to deep flaws in the systems engineering of the programs.


TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
I think self-driving is going to be the future, but the roads have to be built to support it. IFR airports require certification; self-driving roads will be the same.

Also, self-driving-only cars are idiotic, as there will always be an outlier situation that requires intervention. An unpaved road is the most obvious example.
 
Collision detection is plausible. Eventually we will invent an array of sensors that can identify an object in front of a car reliably. As humans we aren't good at preventing collisions, but we do well at panic braking, which reduces the severity of the collision; that is not without a semi-automatic aid called ABS. Self-driving cars will be better at preventing the collision in the first place.

However, it's the lane keeping that requires abstract thought, due to varying conditions, inconsistent markings, worn markings, etc. We aren't currently at a computing level that can support such a thing.
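
To put the collision-detection point in concrete terms, here is a minimal sketch (Python, with invented ranges, speeds, and thresholds; nothing from any production system) of a time-to-collision check that decides when to panic-brake without needing to know what the obstacle is:

```python
# Minimal time-to-collision (TTC) sketch: decide whether to brake using only
# the range and closing speed to whatever is ahead -- no classification needed.
# All values and thresholds are invented for illustration.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if nothing changes; infinite if we aren't closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

def braking_command(range_m: float, closing_speed_mps: float) -> str:
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc < 1.5:      # too close for a human to react: brake hard now
        return "FULL_BRAKE"
    if ttc < 3.0:      # enough margin to warn the driver and pre-charge brakes
        return "WARN_AND_PREPARE"
    return "NO_ACTION"

# Example: parked trailer 45 m ahead, car closing at 33 m/s (~75 mph)
print(braking_command(45.0, 33.0))   # -> FULL_BRAKE (TTC is about 1.4 s)
```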
 
Eventually we will invent an array of sensors that can identify an object in front of a car reliably.

That is the crux of the stupidity of the current algorithms; you don't need to identify a solid object to know not to hit it. Even in the cases of the Teslas colliding with the truck in Florida and the recent plane collision, there was plenty of information the Teslas had, even with their meager sensor suites, to determine that there was going to be a collision; it was simply stupid algorithm construction that led to ignoring the possibility of a collision. In the case of the Uber fatal collision in Arizona, the sensors and algorithms clearly detected a moving object on a collision path, but the algorithms ignored the collision possibility because they detected a different object on each detection cycle and blithely dismissed the threat until it was too late to even brake.
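
As a toy illustration of that failure mode (a sketch under assumptions, not anyone's actual code): if the tracker keeps one track per physical object and lets the label change freely, the collision assessment is based on the object's motion history and never resets just because the classifier changed its mind.

```python
# Toy sketch: keep one track per physical object and let its classification
# change freely, so the collision assessment follows the object's motion
# history and is never reset by a label flip. All names/numbers are invented.

from dataclasses import dataclass, field

@dataclass
class Track:
    positions: list = field(default_factory=list)   # (t, x, y) history
    label: str = "unknown"                           # may change every cycle

    def update(self, t: float, x: float, y: float, label: str) -> None:
        self.positions.append((t, x, y))
        self.label = label                           # relabel, but keep history

    def crossing_our_path(self) -> bool:
        """Crude check: is the object's lateral distance to our lane shrinking?"""
        if len(self.positions) < 2:
            return False
        (_, _, y0), (_, _, y1) = self.positions[0], self.positions[-1]
        return abs(y1) < abs(y0)

track = Track()
track.update(0.0, 50.0, 6.0, "unknown")
track.update(0.5, 45.0, 4.5, "bicycle")     # label flips...
track.update(1.0, 40.0, 3.0, "vehicle")     # ...and flips again
print(track.crossing_our_path())            # -> True: brake regardless of label
```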

We've had decent enough sensors for at least 10 years; the main issue is cost, and the lack of right-thinking systems engineering. Just consider the Tesla gore-point collision in Silicon Valley: the car, as a whole, "knew" it was on a road with a bend and "knew" other cars were following the road, so why would the lane-following ignore both pieces of information and basically follow spurious lane markings and leave the road?
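
A hedged sketch of what using that ancillary data could look like: a toy weighted fusion of independent lane-center estimates (camera markings, map geometry, the lead vehicle's path), where a single bad cue gets outvoted. The sources, offsets, and confidence weights are all invented for illustration.

```python
# Toy fusion of several independent estimates of where the lane center is,
# weighted by how much each source is trusted at the moment. A real system
# would estimate the confidences online and cross-check the sources.

def fuse_lane_center(estimates: dict[str, tuple[float, float]]) -> float:
    """estimates maps source name -> (lateral offset in metres, confidence 0..1)."""
    num = sum(offset * conf for offset, conf in estimates.values())
    den = sum(conf for _, conf in estimates.values())
    return num / den if den > 0 else 0.0

estimates = {
    "camera_markings": (1.8, 0.3),    # faded paint pulling toward the gore point
    "hd_map_geometry": (0.1, 0.9),    # map says the lane bends left here
    "lead_vehicle_path": (0.0, 0.8),  # the car ahead is staying in lane
}

print(fuse_lane_center(estimates))    # ~0.3 m: the bad camera cue is outvoted
```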

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
and more, "Safety officials have opened a 37th probe into a Tesla crash after a couple's car slammed into the back of a Walmart truck, shearing its roof off.

The National Highway Traffic Safety Administration (NHTSA) is investigating the wreck, which happened at the Paynes Prairie Rest Stop just south of Gainesville, Florida, on Wednesday."


So strange to see the singularity approaching while the entire planet is rapidly turning into a hellscape. -John Coates

-Dik
 
Notwithstanding the location, the roof was sheared off... and possibly even the occupants, too.

So strange to see the singularity approaching while the entire planet is rapidly turning into a hellscape. -John Coates

-Dik
 
SnTMan said:
Autopilot or no, the driver / occupants about had to be checked out in one way or another.

Blaming the occupants is easy. But it is the 'autopilot' that creates this problem, and in this case it was the direct cause of the accident. They might have been three hours into a trip down the interstate with no required interaction for a couple of hours. It is easy for ANYBODY to check out during that time. They might have checked back in after 5 seconds, but if the car is going 70 mph and you suddenly realise you are in a rest stop, the decision-making part of your brain still takes a few seconds to get back into gear.

The insidious part of 'autopilot' isn't just the time taken to realise that something isn't quite right; it is the time taken to assess the situation and engage an appropriate reaction. The same issue is present even when driving without autopilot: my ability to QUICKLY react to a completely unexpected circumstance is far better on twisty mountain roads than on a dead straight highway with no traffic.

It is interesting that they chose the word auto PILOT, when 'pilot' has most commonly been applied to aeroplanes and to marine navigation. In both cases you would generally have tens of seconds or even minutes to react to a surprise adverse event while otherwise travelling in a benign environment. On our roads you have seconds. In this case the adverse event (not following the interstate) was directly CAUSED by a poorly behaved autopilot.

IMO if TESLA were a traditional automaker, it would have been reined in by authorities long ago. But somehow tech companies have been given free rein for the last decade.

The TESLA 'AI' seems to be a good LEVEL 1 system that operates as a LEVEL 2 system but which users can readily treat as a LEVEL 3 system (a LEVEL 2 system still demands constant driver supervision; a LEVEL 3 system does not). It is a recipe for the occasional unforced catastrophe.
 
human909, not to be argumentative, but I don't believe it is known whether "Autopilot" was engaged or not, although I'd have to think, yes.

My point was that I believe an actively engaged driver would have been aware that a rest area was being approached, that entering / exiting traffic was likely, and that enhanced awareness was warranted. Such enhanced awareness should have permitted sufficient reaction time to DO SOMETHING: steer the car back to the highway, apply the brakes, steer around obstacles, or some combination.

Instead, the car appears to have struck the parked trailer dead-center at 75 mph or thereabouts. Nobody did anything.

EDIT: Apparently

The problem with sloppy work is that the supply FAR EXCEEDS the demand
 
SnTMan said:
My point was that I believe an actively engaged driver would have been aware that a rest area was being approached, that entering / exiting traffic was likely, that enhanced awareness was warranted.
I got your point. And I agree that they would not have been actively engaged with the driving.

Like I said, blaming the occupants is easy. The 'driver', if you want to call the person that, is almost certainly at fault, as it seems they likely weren't driving.

But we all know about HUMAN failures as drivers. I thought this was a discussion about engineering failures, of which Tesla's autopilot has many. Hence I quickly shifted to discussing the safety failures of Tesla.
 