Eng-Tips is the largest engineering community on the Internet

Intelligent Work Forums for Engineering Professionals


Tesla Autopilot, fatal crash into side of truck

Status
Not open for further replies.
Is this a Darwin Award worthy event? Would you trust your life to a computer?
 
I see that speculation is starting to appear in the media that the driver might have been watching a DVD at the time of the accident. That wouldn't help any "diverse redundancy" claims in the safety case.

A.
 
TME "...video... capture of what the vehicle/driver saw?"

It's a near-certainty that the Tesla Autopilot system captures video and stores it upon a crash. So there should be video. As well as lots of related data.

Given that the Autopilot drove straight into the side of a truck, I wonder how devastatingly embarrassing the video would be, or if it shows something that provides some rational explanation.

This is certainly 'a learning moment' for the over-hyped Self-Driving Car industry, the regulators and public.

 
AI trying to kill itself because it realizes what it artificially is.
 
I assume that the detectors are mounted in the grille.
The detectors saw the open space under the trailer and deduced that there was enough room to pass under it.
The detectors themselves did make it safely under the trailer; it was only the upper part of the car, above the detectors, that did not clear it.
As for reflectors: not much use if the trailer is backlit and there is no source of illumination to reflect.

Bill
--------------------
"Why not the best?"
Jimmy Carter
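Bill's geometry argument can be sketched in a few lines. The heights here are rough assumptions for illustration (they are not from the crash report, and the actual sensor mounting is not public knowledge):

```python
# Why a bumper-height sensor can report "clear road" under a high trailer
# while the car's roof cannot pass. All numbers are assumptions.
sensor_height_m = 0.5      # assumed: radar/sonar mounted low, in the grille
roof_height_m = 1.44       # approximate Tesla Model S roof height
trailer_underside_m = 1.2  # assumed: typical gap under a semi-trailer

# From the sensor's point of view, the path ahead is open...
sensor_sees_clear = sensor_height_m < trailer_underside_m

# ...but the car as a whole does not fit under the trailer.
car_clears = roof_height_m < trailer_underside_m

assert sensor_sees_clear      # the detectors "made it safely under"
assert not car_clears         # the roof above them did not
```

The point of the sketch is that a clearance check has to use the tallest point of the vehicle, not the height of the sensor that happens to be doing the looking.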
 
This is NOT a case of the autopilot causing a crash; this is a case of the autopilot failing to prevent a crash caused by others.
The truck failed to yield right of way. So many times I come up on an 18-wheeler that pulls out because there is enough time for cross traffic to stop, or at least slow down, but not enough for the truck to clear the crossing, following the adage that might makes right.

Hopefully the driver of the truck was properly cited, involuntary manslaughter?

True, this particular driver pushed the Darwin button a few times before and got lucky, as seen in the videos posted. This time, not so lucky.

Hydrae
 
"The truck failed to yield right of way."

I've read quite a few of the news items on this incident and I've seen nothing to indicate any such thing. The consistent story is that the Tesla failed to brake either to slow down or stop while the truck made a perfectly normal left turn across traffic (I'm interpreting this to mean: plenty of room).

This happened weeks ago. If the trucker had failed to yield, then he'd likely have been ticketed by now based on the local police investigation.

It's possible I've missed something somewhere.
 
VE1BLL said:
It's a near-certainty that the Tesla Autopilot system captures video and stores it upon a crash. So there should be video. As well as lots of related data.

Except the car never realized it had even been in a wreck. It continued driving at the same speed for over three hundred yards!! It finally failed to stay on the road and went off-roading at high speed. It steered safely between two big trees and then.... absolutely pegged a power pole, staving in the front end about 5 feet.

I think this points out the need for 'secondary sensors' that should be present to detect a crash and get the whole show stopped.

Keith Cress
kcress -
 
OMG...

If the car didn't notice the roof being sheared off, then they've got some redesign to do. Perhaps they could borrow from aircraft practice: aircraft can have 'frangible switches' scattered around various crash-sensitive locations.

Even still, assuming the Tesla finally noticed the last pole, the circular buffer for video is likely at least several minutes, and these days could easily be hours.
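The circular-buffer idea above is simple to sketch. This is a generic illustration only; the class name, frame counts, and freeze trigger are assumptions for the example, not Tesla's actual design:

```python
from collections import deque

class CrashVideoBuffer:
    """Ring buffer holding the most recent N frames, frozen on crash detection."""

    def __init__(self, max_frames):
        # deque with maxlen silently discards the oldest frame on overflow,
        # which is exactly the circular-buffer behavior described above.
        self.frames = deque(maxlen=max_frames)
        self.frozen = False

    def record(self, frame):
        if not self.frozen:
            self.frames.append(frame)

    def freeze(self):
        # Called by a crash-detection trigger (e.g. a frangible switch):
        # stop overwriting so the pre-crash footage survives.
        self.frozen = True

# At 30 fps, a 5-minute buffer is 30 * 60 * 5 = 9000 frames.
buf = CrashVideoBuffer(max_frames=9000)
for t in range(20000):            # simulate a long drive; older frames roll off
    buf.record(t)
buf.freeze()                      # crash detected: preserve what we have
for t in range(20000, 20010):     # frames arriving after the trigger are ignored
    buf.record(t)

assert len(buf.frames) == 9000    # only the last 5 minutes survive
assert buf.frames[-1] == 19999    # ...ending at the moment of the trigger
```

The key design point is that the buffer only needs to be long enough to cover the gap between the event and the trigger; if the trigger fires hundreds of yards late, as described above, the buffer has to be sized for that delay too.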



 
"This is certainly 'a learning moment' for the over-hyped Self-Driving Car industry, "

That'd be the industry that has consistently pointed out that Tesla's Autopilot is not a Level 4 autonomous car. So, a fairly uninformed comment at best.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
That's contemporary product development though. Put something out there woefully unfinished, let the early users cobble together their own use cases, then bend the original design to suit the new requirements. If all goes well, the original design will still be malleable enough to bend into the new shape. Who'd ever have thought about planning for remote operation (selfie sticks) when putting cameras into telephones?

Steve
 
What annoys me about this is that for decades people have been pointing out that autopilots with human oversight are the wrong approach, but Tesla's half-arsed beta is more of the same. We aren't very good at intervening at a moment's notice when the machine packs up.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
GL "...industry that has consistently pointed out... ... fairly uninformed comment at best."

The complete quote also included "...the regulators and public."

"This is certainly 'a learning moment' for the over-hyped Self-Driving Car industry, the regulators and public."

The sentence should be parsed as follows:
1) The Self-Driving Car industry is over-hyped.
2) This is 'a learning moment' for industry*, regulators, public.

(* 'Industry' includes Tesla, but others may learn as well.)

Both of these points are defensible. This tragic incident provides clear evidence on both points.

I wasn't intending to blame all members of the industry for the over-hype, if that was your concern. ...and apologies if I've misinterpreted your concern.

The 'technology press' is likely the main contributor to the over-hype. But some in industry seem to use hype as a marketing tool.

None of these groups are monolithic. Some parts of industry may be responsible and cautious, while others in industry are literally running 'beta' tests on public highways; even while some members of the public may have too much misplaced faith in the over-hyped and still-immature technology.

Apologies that the quoted sentence caused concerns. Parsed correctly and understanding who is 'industry', I think that it's defensible.

PS: Forum posts by their nature are compact, and subtle meaning and intent can be lost. It requires far too much time to craft posts that cannot be misinterpreted.


 
VE1BLL

Look at the Washington Post link in Bimr's post. It has the crash report.

The intersection is uncontrolled, meaning no lights, so those turning left must yield right of way to oncoming traffic. Cross traffic, which has a stop sign, must also yield.

You can also look at the intersection on google earth 29°24'38.71" 82°32'22.44".

Hydrae
 
That would be 29°24'38.71"N 82°32'22.44"W (The original co-ordinates defaulted to somewhere in the Himalayas!)

This intersection looks as flat as can be with nothing that would have obstructed visibility between the two vehicles.
 
"...Washington post link in Bimr's post. It has the crash report."

The Washington Post has a map (which I've seen before) extracted from a report. But I don't see any crash report, or even any link to a crash report.

This happened weeks ago. If the trucker had failed to yield, then he'd likely have been ticketed by now. There's no mention anywhere that I've seen of the trucker being ticketed. Therefore it seems likely that there was adequate room.

If the truck had suddenly pulled into the Tesla's path, and the Tesla then applied full brakes, then that would be a different story. That's not what happened here.

 
There wasn't enough room, otherwise the car wouldn't have hit the truck. The truck leaving enough room for the car to safely yield its right of way, if the car or its occupant had actually applied the brakes, isn't the same as the truck leaving enough room to turn left in front of the car.

I'm with a few others here. Tesla was beta testing their system on the public and it caused someone to die.

I also agree with the comments about people not being capable of suddenly taking over the controls when they're not really paying attention. Even airplane pilots who have trained for that situation have issues when this occurs.

 
This sort of brings up the question: are these computers hackable, and do they have some sort of fail-safe for a rogue program? Not because the person behind the wheel may not notice, but because some have been talking about driving without a person behind the wheel at all.

 
