Tesla "autopilot" disengages shortly before impact


MartinLe (Civil/Environmental)

"On Thursday, NHTSA said it had discovered in 16 separate instances when this occurred that Autopilot “aborted vehicle control less than one second prior to the first impact,” suggesting the driver was not prepared to assume full control over the vehicle."

Were I a legislating body, I would very much demand that safety-critical software be as transparent as any other safety feature, which would limit "AI"/machine-learning applications in these roles.
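As a rough sketch of the kind of screening behind that NHTSA figure: assuming hypothetical log fields for the disengagement and first-impact timestamps (real EDR/telemetry schemas differ by manufacturer), a late-handoff check in Python might be no more than the following.

# Hypothetical sketch: flag crashes where the driver-assist system gave up
# control less than one second before the first impact, as in the NHTSA finding.
# Field names are illustrative, not any manufacturer's actual schema.
from dataclasses import dataclass

@dataclass
class CrashEvent:
    vehicle_id: str
    autopilot_disengage_s: float  # seconds into the event log
    first_impact_s: float         # seconds into the event log

def late_handoffs(events, threshold_s=1.0):
    """Return events where control was handed back less than
    threshold_s seconds before the first impact."""
    return [e for e in events
            if 0.0 <= e.first_impact_s - e.autopilot_disengage_s < threshold_s]

if __name__ == "__main__":
    events = [
        CrashEvent("A", autopilot_disengage_s=119.2, first_impact_s=120.0),  # 0.8 s handoff
        CrashEvent("B", autopilot_disengage_s=112.0, first_impact_s=120.0),  # 8.0 s handoff
    ]
    for e in late_handoffs(events):
        gap = e.first_impact_s - e.autopilot_disengage_s
        print(f"{e.vehicle_id}: control handed back {gap:.1f} s before impact")

Whether a one-second handoff counts as the driver being "prepared to assume full control" is the policy question; the log check itself is trivial once the data is disclosed.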
 

spsalso,

My cats are smaller than breadboxes. I don't want anybody hitting them either.

Don't hit anything!

--
JHG
 
IRstuff said:
"I'm not disagreeing, but someone needed to tell Uber's software gurus that little tidbit, and they possibly could have avoided killing a pedestrian on a nearly empty roadway."

The Uber car wisely hit a homeless person who did not have a family with a good lawyer. How do you work that out from a LiDAR signal?

--
JHG
 
Interesting little snip here.
I like the multiple shots of Musk from 2014 saying "next year..., next year..."

And they make the point that Tesla is only Level 2 and that the public is basically doing the beta testing.
Uber and Lyft, I think, have given up.

I'd missed this car jam story.
Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
Until ALL of the driving automation developers strictly follow some level of safety protocol, such as "don't f*ing hit anything," they should be held on a short leash.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
"short leash":

How about an automatic payout (from the manufacturer) of $100 million to the person/estate of anyone in one of their cars where the "auto pilot" was on within 5 seconds of the event? Maybe a few bucks less for your basic fender-bender.

Payout to be made within 14 days.

THAT would get my attention!

I am sure our Congress will consider this matter and come up with a plan, also within 14 days.

spsalso
 
By that logic, should a perfectly functioning autopilot simply turn itself off without human input when an unavoidable crash is five seconds out?
 
My choice of 5 seconds was based on what I guessed would be an adequate human reaction time once the autopilot gives up.

I was trying to get something between "Let's have it turn off a millisecond before the crash so we can deny responsibility." and "The autopilot's been off for 5 minutes and the stupid human crashed all on its own."

I expect the NTSB could come up with a nice number after an afternoon of consideration plus a few beers and snacks. We should probably video the event, in case the final results they produce are incoherent.



spsalso
 
In road traffic, it is rare that a collision situation can be identified as "unavoidable" 5 seconds before it happens. If there are 5 seconds, you have enough time to stop from any legal road speed except on a slippery surface, in which case you're going too fast to begin with, and at a more appropriate speed 5 seconds is still enough time to stop.

Developing situations can sometimes be identified 5 seconds out ... but that's while actions can still be taken to prevent the bad outcome.

It's more common that (let's say) another driver fails to signal a left turn and makes that turn in front of an oncoming driver who has neither enough time to stop nor an escape route available, maybe a second before impact ... it has happened to me. I used that second or so to cut speed from 60 km/h (legal) to probably 30-ish by the time of impact. I knew I was going to hit the other car, but I didn't know it until they started their unsignalled turn directly in front of me a second or so before impact.

If there's not enough time to react, then the collision is 100% the fault of the one who violated the right of way. If there's enough time for the other vehicle to take avoiding action without difficulty and it fails to do so, then a portion of the real-world responsibility is on them, even if the legal fault remains with the one who committed the right-of-way violation.

If a self-driving system fails to identify a tractor-trailer across its path multiple seconds before impact and it nevertheless leads to a collision ... that's on the self-driving system for failing to identify and react properly.
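For a rough sanity check on those numbers, here is a back-of-the-envelope calculation assuming a constant dry-road deceleration of about 7 m/s² (an assumed figure; actual capability varies with tires, load, and surface).

# Back-of-the-envelope check of the braking figures above.
# Assumes a constant deceleration of 7 m/s^2 (roughly full braking on dry
# asphalt); real-world capability varies with tires, load, and road surface.
DECEL = 7.0  # m/s^2, assumed

def kmh_to_ms(v_kmh):
    return v_kmh / 3.6

def time_and_distance_to_stop(v_kmh, decel=DECEL):
    v = kmh_to_ms(v_kmh)
    return v / decel, v * v / (2 * decel)

for speed in (60, 100, 120):
    t, d = time_and_distance_to_stop(speed)
    print(f"{speed} km/h: ~{t:.1f} s and ~{d:.0f} m to stop")

# Deceleration implied by scrubbing 60 km/h down to ~30 km/h in about one second:
decel_needed = (kmh_to_ms(60) - kmh_to_ms(30)) / 1.0
print(f"60 -> 30 km/h in 1 s takes ~{decel_needed:.1f} m/s^2, i.e. hard but feasible braking")

At those figures, stopping from typical legal road speeds takes roughly 2.5 to 5 seconds before any reaction time is added, which is consistent with the point above.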
 
Tesla has been ordered to recall 1.1 million cars, NOT because of problems with the 'autopilot' but because the windows go up too fast:

Tesla ordered to recall more than a million US cars


John R. Baker, P.E. (ret)
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without
 
I do tend to agree with Tesla here that a "recall" is what everyone else has to do, whereas they send out updates over the web to change the software, much like your laptop does. It seems to be a simple change to some small part of the code.

Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
Bad news for Elon and Tesla:

321,628 Tesla Vehicles Recalled Due To 'Heightened Risk Of Crash,' NHTSA Says


That being said, a 'recall', when it comes to a Tesla, often only involves an on-line software update.

If you're interested, here's the full safety recall report:



John R. Baker, P.E. (ret)
Irvine, CA
Siemens PLM:
UG/NX Museum:

The secret of life is not finding someone to live with
It's finding someone you can't live without
 
If it wasn't an autopilot fail, explain to me how a driver drives right into a big red truck covered with red flashing lights that's in the company of OTHER big red trucks with red flashing lights.

We'll have to wait for the computer logs, but a sleeping or distracted driver could certainly be a possibility. I've driven conventional cruise controls on newer cars, and they all would have come to screeching halts all on their own in this situation.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
It was 4:30 in the morning, if I remember correctly. A non-self-driving Toyota Corolla killed a pedestrian at the same spot at the same time the day before. That didn't make national news.
 
I-680 is one of those famous California freeways. Five lanes each direction. There ARE pedestrians on those freeways, but they are rare. And discouraged.

So a Toyota Corolla killing a pedestrian at that location the day before is a pretty incredible event.

Isn't it?

And perhaps not related to the possibility of "auto-drive".

Say. How does Tesla's auto-pilot work with pedestrians presenting themselves on a freeway?


spsalso
 
"People do that almost all the time; just need to check YouTube."

Well, not so much as you might think. I did a search on YouTube for "car hits fire truck on freeway". Maybe one or two. Hardly all the time.

If I am incorrect on this, correction would be appreciated.


spsalso
 
In the UK they decided to implement smart motorways, where the emergency lane is used as a running lane at peak periods.

It's now being phased out, as the driver quality is not up to it.

It does, though, work in Germany. But then again, the driving license training there is more akin to commercial pilot training than what I did for my car and articulated lorry licenses.

In fact, a private pilot gets fewer instructional hours and has fewer restrictions post-qualification than car license students do in Germany.

I really don't know what's involved in driver's ed in the USA, but I know that in my day in the UK it was pretty poor. From mates' kids that have been through it recently in the UK, it does seem to have more body to it now. Driving in Germany is clinical, with way, way less stupidity than I have seen in both the UK and the USA. Plus they enforce the rules: if you don't move to the outside of your lane on the highway to let emergency responders get to an incident up the lane divider, you will get a seriously large fine, and if you have a German license it will be removed for a period until you get retrained at your expense.
 
In the US there are 50 states, the territories, and the capital district, all with their own similar but not identical driver's license training requirements. Results vary: perhaps not as good as Germany, but much better than in the few less-developed (non-EU, non-US) locations I have visited.

We do have a few locations where the shoulder is used as a traffic lane; it is not a popular solution. Movable barricades / reversible lanes work better, in my opinion.

 