Tesla Autopilot, fatal crash into side of truck 6

I wouldn't expect it to lose its lane sensor without disconnecting; it would be baffling if it didn't have that feature. It could entirely be that the autopilot did disconnect at impact, and that explains the long travel. Without a human or computer to operate the brakes it could have coasted the distance. The accident report shows the Tesla traveling roughly straight after the crash toward the bank; I suspect it accidentally threaded between the two trees and then struck the pole, which finally took all its speed and spun it out.

The Tesla Model S is only 56.5" high according to Google, while the underside of a typical semi-trailer sits at about 52". Thus, if it missed the wheels it could easily pass under, smashing the roof down but still keeping much of its speed.
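Putting rough numbers on it (a quick sanity check using the approximate figures above, nothing more):

# Rough clearance check using the figures quoted above (approximate, not measured).
model_s_height_in = 56.5      # quoted overall height of a Model S
trailer_underside_in = 52.0   # assumed height of the trailer underside above the road

overlap_in = model_s_height_in - trailer_underside_in
print(f"Roof structure overlapping the trailer: {overlap_in:.1f} in")
# -> about 4.5 in of roof (plus anything above 52 in, heads included) meets the
#    trailer, while the bumper, hood and crash structure pass clean underneath.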

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
It would have to miss the trailer's landing gear as well. Hitting that might even be worse than hitting the trailer's wheels.

Apparently the trailer in question did not have the side skirts that are being pitched for fuel-economy reasons; presumably those would have registered with the autopilot, and it would then have taken whatever actions were appropriate. Can Tesla's autopilot execute a high speed lane change maneuver and back again at an autocross level of intensity?

Never mind that the failure to see an obstruction 52" above the pavement when the Tesla itself is taller than that is really troubling. Even if you're willing to sacrifice the roof and optimistically assume that it would get cleanly sheared off, the top of any occupants' heads is likely to be at or slightly higher than 52" as well.


Norm
 
Norm said:
Can Tesla's autopilot execute a high speed lane change maneuver and back again at an autocross level of intensity?

I don't know about a maneuver that extreme, but the driver in the fatal accident had previously posted a video of his Tesla making an abrupt maneuver onto the shoulder to avoid a truck that tried to occupy the same space as the Tesla. However, I believe the safest response to being cut off will always be to brake straight ahead.

Agreed that missing an obstruction below the roof of the car seems like something that shouldn't happen.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
Why is there repeated mention of the sensor being incapable of seeing objects as high as the bottom of the trailer? Was there any factual statement released about the height limits of the sensor?

I only saw a statement saying the sensor was unable to differentiate the white truck from the bright, clear sky behind it. This would seem to suggest the problem is not one of geometry but of optical analysis.

Have I missed something else?
 
Tesla PR Letter said:
The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer

Elon Musk twitter post said:
Radar tunes out what looks like an overhead road sign to avoid false braking events

The impression (my own, not cited) is that the visual camera didn't distinguish the trailer due to the bright sky/light while the radar incorrectly categorized it as an overhead sign (possibly because the camera was not recognizing the truck's trailer).

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
@TehMightyEngineer: I see, I interpreted a bit differently before, but I think you're probably right.

Personally, I think the biggest fault is the user's decision to hand complete control to a system that doesn't merit it. It has never been advertised as able to replace driver attention, and you have to hit an 'OK' button (or similar) every time you turn on the Autopilot feature, agreeing that you, as the driver, understand you have to keep your hands on the wheel and pay attention, etc.

However, it's also called AUTOPILOT, which, in common parlance, basically means you can relinquish control and let the machine take over. I think the name is a bit unfortunate, but I don't know how much responsibility that puts on Tesla for any 'deception'. Obviously we're all accustomed to just hitting 'OK' without reading EULA-like text on a screen, so the effectiveness of those dialog windows only exists in a courtroom.

It's unfortunate. I don't put any blame on Tesla, personally. They never advertised the system as a foolproof object-avoidance guarantee, just as an augmented system designed to assist a driver in their normal driving habits. I think people have trouble separating the current state of the technology from Tesla's very public discussions of what they /want/ it to eventually be. I'm guessing early adopters may be more prone to overestimating the car's capabilities, as they are most likely very eager to see the new technology and may have their rose-tinted glasses on at times.

Or maybe it was simply a mistake born of the laziness and apathy that all drivers are capable of.

 
JNieman: I'd agree with that assessment (and Volvo engineers had a similar sentiment earlier, though obviously they have bias given that Tesla is a competitor).

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
The two hypotheses seem to be that the sensors were looking too low, or that they couldn't see a light-colored trailer against a light-colored sky. What about those dark-colored things called tires, which must have passed several times at sensor level in front of the high portion of the trailer?
 
TME said:
However, I believe the safest response to being cut off will always be to brake straight ahead.
One of the primary intentions of ABS is to allow maneuverability without requiring the driver to modulate the braking effort himself, so the actions of steering and braking shouldn't be considered mutually exclusive.

FWIW, Autopilot should be better at mixing the two because it isn't limited by such subjectives as panic or unfamiliarity with what the car is capable of doing and how to make that happen.
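To put rough numbers on the friction-circle point (the 0.9 g peak grip is an assumed round figure, not a measured Tesla value):

import math

# Friction-circle sketch: total tire force is capped, so grip not spent on braking
# remains available for steering. Purely illustrative numbers.
MU = 0.9   # assumed peak friction coefficient

def lateral_capacity_g(braking_g: float) -> float:
    """Cornering acceleration (in g) still available while braking at the given level."""
    braking_g = min(braking_g, MU)
    return math.sqrt(MU**2 - braking_g**2)

for brake_g in (0.0, 0.3, 0.6, 0.85):
    print(f"braking at {brake_g:.2f} g leaves up to {lateral_capacity_g(brake_g):.2f} g for steering")

Even at 0.6 g of braking there is still roughly 0.67 g of cornering capacity left, which is the point: a well-programmed controller can use both at once where a startled human tends to pick one.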

I've watched that video where the Tesla avoided getting hit, but getting around the much bigger truck would have required something more extreme at least in lateral travel if not necessarily in lateral and yaw accelerations.


Norm
 
Norm said:
FWIW, Autopilot should be better at mixing the two because it isn't limited by such subjectives as panic or unfamiliarity with what the car is capable of doing and how to make that happen.

Very good point; this was the primary reason my normal response while driving would be to brake straight ahead, but you would expect (hope?) that an autopilot could have a more proactive response without worsening the situation.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
I totaled my little Ranger pickup a while back. I was in the right lane on a freeway, with traffic behind me and to my left, and a truck moving very slowly merged in front of me. The mistake I made was assuming the truck had an acceleration lane, when in fact, due to long-term construction, there was none, so he had to move into my lane or stop. The key to avoiding that wreck would have been to anticipate that somebody else was going to do something and hit the brakes even when there were currently zero obstructions in my path. It makes me curious to what extent they can work that kind of anticipation into the software.

A similar issue: You're driving down a freeway at speed in the middle lane, and for some reason, traffic is backed up and stopped in the right lane. Your lane is perfectly clear. What do you do? Well, past experience shows that somebody in that right lane is going to pull in front of you, and if you wait until it happens to do anything, you have problems.
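Something like the following toy rule is what I have in mind; every threshold and function name is made up for illustration, not anything Tesla has published:

# Illustrative heuristic only: ease off pre-emptively when an adjacent lane is stopped
# or crawling, because a cut-in is likely even though our own lane is clear.
def anticipatory_speed_mps(ego_speed_mps: float,
                           adjacent_lane_speed_mps: float,
                           adjacent_lane_occupied: bool) -> float:
    """Return a target speed that hedges against a likely cut-in (made-up thresholds)."""
    if not adjacent_lane_occupied:
        return ego_speed_mps                    # nothing to anticipate
    speed_gap = ego_speed_mps - adjacent_lane_speed_mps
    if speed_gap > 15.0:                        # large differential: cut-in very likely
        return max(adjacent_lane_speed_mps + 10.0, ego_speed_mps * 0.6)
    if speed_gap > 5.0:
        return ego_speed_mps * 0.85             # mild pre-emptive slowdown
    return ego_speed_mps

# Cruising at 30 m/s next to a stopped right lane: slow to 18 m/s before anyone merges.
print(anticipatory_speed_mps(30.0, 0.0, True))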

In this case, a major part of avoiding that wreck would have been realizing what was going on when that truck started moving. Not seeing it when it was right in front of the car is bad enough. Not seeing it move into that position at a typical snaily truck pace is worse.

It'll be interesting to see what this costs Tesla.
 
JStephen: that's one aspect where I actually imagine a fully driverless car would excel. For example, Google's car has shown that it can keep track of objects that are normally obscured from the driver, and keep track of them in all directions. Thus, a car sufficiently programmed and experienced might realize that the likely outcome of the situation you were in would be vehicles merging into your lane, and it might be able to anticipate that to some degree.
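As a toy illustration of what "keeping track of obscured objects" could look like (the names and numbers are invented, not Google's or Tesla's actual code), a tracker can simply coast a hidden object forward on its last known motion:

from dataclasses import dataclass

@dataclass
class Track:
    x: float        # position along the road, m
    y: float        # lateral position, m
    vx: float       # estimated velocity, m/s
    vy: float
    occluded: bool = False

def predict(track: Track, dt: float) -> Track:
    """Advance the state estimate by dt even when no sensor currently sees the object."""
    return Track(track.x + track.vx * dt,
                 track.y + track.vy * dt,
                 track.vx, track.vy, track.occluded)

# A car hidden behind a truck keeps existing in the world model:
hidden_car = Track(x=40.0, y=3.5, vx=-2.0, vy=-0.5, occluded=True)
print(predict(hidden_car, dt=1.0))   # still tracked a second later, drifting toward our lane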

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 
The following is speculation, but probably true: The Autopilot would have seen the truck (the tractor part of the tractor-trailer), and it would have seen the truck's rear wheels under the front of the trailer. The truck's tires are presumably black, and presumably perfectly visible. These image features would have been exiting the lane towards the Tesla's right, and the Autopilot then presumably decided that the space behind the truck was empty. But even then, the system seemingly decided to do a 65 mph flyby mere feet behind the black tires that were still moving across the road in front of it. That's not something that any sensible driver would ever do.

If it's a system design failure, then the nicest thing we can say is that they're not done yet. It would be preferable if it were a simple hardware failure, as opposed to such a blatant system design failure. TBD what really caused this failure.

I half-expect that the US DoT will request that Tesla remotely disable the Autopilot feature on the entire fleet until it's finished, tested and properly certified. If this requires major hardware modifications, then it opens a new can of worms.

Very old advice: "A.I. is hard." (<- Many ships have crashed upon those rocks, so to speak.)

 
Is there any reason to believe that all of the sensors on the car were working, properly calibrated, pointed the right way, unobstructed, etc.?
Probably something that Tesla engineers can pull out of a log file, but is there also a diagnostic that the driver can access? Or a warning?



STF
 
What do you mean not finished? What does it claim to do that it cannot?
 
"What does it claim to do that it cannot?"

Legal disclaimers aside, there's an implied capability of driving down the road without crashing into the sides of trucks that happen to be painted white. One can argue whether this was an 'implied capability' or a naïve assumption. In any case, it's a blatant system failure of something. I'm sure that Tesla is looking into it...

There's been a huge amount of hype about 'self-driving cars', in the context of "A.I.". These sorts of failures are quite revealing. It all goes back to "A.I. is hard."

It's not very interesting (in terms of technology) to get into the legal disclaimers, or if the trucker failed to yield. The system failure (it not braking) is the main point. It's relevant to the hype, the actual state of development, the fitness for purpose, and how many years into the future until they'll be ready for widespread deployment on public roads.

I acknowledge that others may have differing opinions. Which is fine.
 
I suppose an interesting question is whether you can build a good enough driver by accreting lots of small tasks and skills, or whether you need a mighty monolithic platform. Tesla have obviously decided that the first approach may be sufficient.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
I have laser cruise control on my van. I don't use it often because it will just slow down behind a slowpoke driver. So I switch it off, overtake, and switch it back on only when the lane is clear or when the cars ahead are moving at a good pace. I would not have confidence using an autopilot driving system.
 
It makes sense to put the blame on the driver, but performing BETA testing with people's lives is a dumb thing to do.

Reading about the system further, it uses the windshield-mounted camera as an "eye" to classify objects, lane markers, signs, etc. A lot of image gathering and learning has been done so the fancy processing can figure out what it's actually looking at. It classifies the objects so the system can anticipate what they might do or what type of hazard they present. The camera is the primary sensor used to detect objects and plan the driving path. The detection system comes from Mobileye, with Tesla apparently adding their own self-learning algorithms. The system also has a forward-facing radar unit in the grille. To me, that says the radar unit is, at a minimum, a backup sensor used to ensure physical objects in front of the car are properly detected so the car doesn't drive into them.

So, sure, the camera didn't realize it was a trailer and possibly classified it as an overhead sign by mistake. But the radar unit also failed to tell the system there was a blockage the (whole) car could not drive past, and the white of the trailer has no relevance to how radar operates. If the radar unit didn't pick out the object, that also tells me it isn't looking high enough to ensure objects aren't in the way of the roof of the car.
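To make that failure mode concrete, here is a toy version of a fusion rule like the one in the quoted tweet. Only the "ignore what looks like an overhead sign" behavior comes from the tweet; every field name and threshold below is invented:

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float
    est_bottom_height_m: float        # radar's coarse estimate of the object's lowest point
    camera_confirms_obstacle: bool    # did the vision system classify it as something to brake for?

CAR_ROOF_HEIGHT_M = 1.44   # ~56.5 in, the figure quoted earlier in the thread

def should_brake(ret: RadarReturn) -> bool:
    if ret.camera_confirms_obstacle:
        return True
    # Without camera confirmation, returns whose bottom edge appears to clear the car
    # are dismissed as overhead signs/bridges to avoid false braking events.
    return ret.est_bottom_height_m < CAR_ROOF_HEIGHT_M

# A white trailer the camera missed, underside ~1.3 m up: if the radar over-estimates
# that height by a few tenths of a metre, it sails through the filter.
trailer = RadarReturn(range_m=60.0, est_bottom_height_m=1.5, camera_confirms_obstacle=False)
print(should_brake(trailer))   # False -> treated as an overhead sign, no braking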

The system detecting objects probably 30-40 ft apart on each side of its intended path and deciding it was OK to drive between them at full speed also seems to be a logic failure. With objects limiting the space the car has to pass through, at some point it should decide the path is clear but has a lower level of "safeness", and proceed with an appropriately elevated level of caution.
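Again purely as an illustration (all numbers invented), the kind of rule I mean is a speed scale factor that shrinks as the gap between detected objects narrows:

CAR_WIDTH_M = 2.0                   # roughly a Model S including mirrors
FULL_CONFIDENCE_CLEARANCE_M = 10.0  # total side clearance at which no slowdown is applied

def gap_speed_factor(gap_width_m: float) -> float:
    """Scale the set speed down as the gap between detected objects narrows."""
    clearance = gap_width_m - CAR_WIDTH_M
    if clearance <= 0.0:
        return 0.0                  # cannot fit: stop
    return min(1.0, clearance / FULL_CONFIDENCE_CLEARANCE_M)

# A ~10 m gap (roughly the 30-40 ft mentioned above) still leaves margin, but under a
# rule like this the car would shave some speed rather than thread the gap flat out.
print(round(gap_speed_factor(10.0), 2))   # 0.8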

I agree, "A.I. is hard."
 
LionelHutz said:
but performing BETA testing with people's lives is a dumb thing to do.

I wouldn't say I disagree with this, but I would say there are exceptions. Statistically (not enough data, I know, but for argument's sake let's say there is), it appears that the Tesla autopilot results in fewer accidents when used properly, according to Tesla. To me this is not very different from the FEMA and SAC joint venture response to the Northridge earthquake. They were effectively implementing steel seismic designs that were still in a "beta" testing phase. However, they showed that a flaw existed in current designs, and their fix was determined to result in improved performance even while still in "beta" testing.

I don't fault Tesla for offering this "beta" feature as long as it was demonstrated, prior to offering it, that it could statistically improve safety. If they offered it without qualifying its safety and just lucked out that it was safer than the average driver, then I entirely agree that it was a dumb thing to do.

Professional and Structural Engineer (ME, NH, MA)
American Concrete Industries
 