
2 dead in Tesla accident: "No one was driving the car"


MartinLe (Civil/Environmental)

“No one was driving” the fully-electric 2019 Tesla when the accident happened. There was one person in the front passenger seat and another in the rear passenger seat.

the vehicle was traveling at a high speed when it failed to negotiate a cul-de-sac turn, ran off the road and hit the tree.

The brother-in-law of one of the victims said relatives watched the car burn for four hours as authorities tried to put out the flames.

Authorities said they used 32,000 gallons of water to extinguish the flames because the vehicle’s batteries kept reigniting. At one point, Herman said, deputies had to call Tesla to ask them how to put out the fire in the battery.
 
Now imagine that times 50 in a pile-up?

Rather than think climate change and the corona virus as science, think of it as the wrath of God. Feel any better?

-Dik
 
So what good is the "L3" rating, other than to provide a false sense of security? L2, L3, and L4 should not exist. The line of responsibility is blurred. No one knows who's in charge, resulting in no one being in charge. The captain is four sheets to the wind, drunk in the bunk, i.e. the car hits a tree a number of meters off the road. It was better when a driverless car could not get out of the driveway. They only hit the garage door, or the neighbor's mailbox.

So in the near future nobody will survive any crash that dislodges the battery? I think I know what it is to feel like a marshmallow in waiting. That car was A-bomb toast. I'd rather get hit by a drone strike.
 
L3 should not exist, agreed.

Someone said that Tesla's test drivers (and the drivers in L3 vehicles) are Tesla's moral crumple zone: since the driver "must at all times be able to take control at no notice," any accident is the driver's fault, absolving the manufacturer of liability (whether it actually works like that in court, I don't know).

@JStephen
Speculative, but maybe the autopilot can navigate curved roads, just not reliably enough to rate as an actual autopilot (as was seen, sadly)? My feeling is that with a lot of the so-called AI / machine-learning things, "decent performance on many problems" and "horrible performance on some specific problems" can exist within the same "AI," and you can never know unless you test with the correct problem.
At least that is my impression, but I only read pop-sci about "AI."
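
To illustrate what I mean with made-up numbers (a minimal Python sketch, nothing measured on any real driving system): an aggregate score can look fine while one specific scenario fails badly, and you only see it if you break the evaluation out by scenario.

[code]
# Hypothetical evaluation counts per driving scenario: (correct decisions, total cases).
# The numbers are invented purely to show how an aggregate score can hide a weak spot.
results = {
    "straight road, clear lane lines": (9800, 10000),
    "gentle curve, clear lane lines":  (4700, 5000),
    "sharp curve, no lane lines":      (30, 200),   # rare case, terrible performance
}

total_correct = sum(c for c, n in results.values())
total_cases = sum(n for c, n in results.values())
print(f"aggregate accuracy: {total_correct / total_cases:.1%}")   # looks respectable

for scenario, (c, n) in results.items():
    print(f"{scenario}: {c / n:.1%}")   # the sharp-curve case only stands out here
[/code]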
 
The function of automatic parallel ("pocket") parking a car has been around for a long time.
It can be done without hands on the steering wheel, with no need for interaction or monitoring from the driver.
As I see it, that falls under L3.
Maybe the description in dik's post is a bit concise and not quite so clear?

Best Regards A


“Logic will get you from A to Z; imagination will get you everywhere.”
Albert Einstein
 
MartinLe

The definition posted by Duke says the driver must be capable of taking control WITH notice.

Now how much notice is needed is the crucial thing. Ten seconds is a lot different to one second. Of course you can't take control if you're either asleep or in the back seat...
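
To put rough numbers on it (simple distance = speed x time, with an assumed speed, since the actual speed in this crash hasn't been published):

[code]
# Distance covered while the "driver" is still reacting to a takeover request.
# 75 mph is an assumed highway-ish speed, not a figure from this accident.
speed_mph = 75
speed_m_per_s = speed_mph * 1609.344 / 3600   # about 33.5 m/s

for notice_s in (1, 3, 10):
    print(f"{notice_s:>2} s of notice -> {speed_m_per_s * notice_s:.0f} m travelled")
# About 34 m at 1 s versus about 335 m at 10 s: the amount of notice is everything.
[/code]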

If you look at aircraft incidents when the autopilot suddenly switches off this frequently leads to a crash as the pilots are just not fully engaged and they can't catch up with events fast enough to save themselves.

Same thing with level 3. It is simply not feasible to expect people to go from no control to full control in the time scale you need.

Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
If you have people in the car, but not actually driving, the AI needs to believe that they are engaged in other "business," probably monkey business. I mean, that's the whole point of having something like that, right? That might have been what those guys were up to. Hey, let's test the "DUI feature." And do you really want "to drive" without your hands on the wheel? It surely takes more mental effort to supervise the AI driving your car than if you did it yourself. You always know what you are going to do in the next few seconds, but with AI it is a continuous guessing game. That just increases driver stress, and after an hour of the AI getting it right, you're going to get tired of playing that game and drift off into other pastimes... BANG!

People do not have the attention span and focused concentration capabilities of a robot. That's good, because we can think of other fabulous things that robots can't, but bad when we need to be concentrating like a robot, especially without the constant, demanding necessity of having to be totally engaged while doing it. People's minds are not a continuous loop made to supervise automated tasks. Well, at least mine is not. It will be a novel experience of which we will soon tire. Then it turns into something like watching a robot stack blocks. Then not so interesting, then riding into the longleaf pine trees. I don't think I want to be a supervisor of robots. C-3PO, can you take over here?
 
The Tesla Autopilot follows lines. You can find all kinds of online Tesla "oops" videos where AP failed because it could not follow the lines. It fails without visible lane-marking lines that keep going in the direction of the lane. I doubt there were any lines here, so what was it following is the question. Maybe it was following the concrete curb like a line and then lost it when the curb turned. At any rate, if the AP was engaged, then this outcome isn't a surprise to me, because it can't handle driving on a small street like that one. I only call it Autopilot because Tesla calls it that; it's far from a real autopilot.
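
For anyone curious what "following lines" means in the crudest possible terms, here is a classical computer-vision sketch (Canny edges plus a Hough transform on a synthetic image). Tesla's actual perception stack is a neural network and far more involved, so this is only to show why no paint means nothing to lock onto.

[code]
import numpy as np
import cv2  # opencv-python

# Synthetic 200x300 grayscale "road" image with one painted lane line;
# a real system works on camera frames, not hand-drawn pixels.
img = np.zeros((200, 300), dtype=np.uint8)
cv2.line(img, (40, 199), (160, 0), 255, 4)   # the "lane marking"

edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 40,
                        minLineLength=50, maxLineGap=10)
print("lane-like segments found:", 0 if lines is None else len(lines))

# Same pipeline on an unmarked street: no paint, nothing to follow.
blank_edges = cv2.Canny(np.zeros_like(img), 50, 150)
blank_lines = cv2.HoughLinesP(blank_edges, 1, np.pi / 180, 40,
                              minLineLength=50, maxLineGap=10)
print("segments on an unmarked road:", 0 if blank_lines is None else len(blank_lines))
[/code]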
 
Not sure how much of this is Tesla's fault. Musk is saying it's not, that autopilot was not engaged and it did not have FSD (Full Self Driving). So how did it go down this short road with no lines and crash at high speed?

Elon Musk, the CEO of Tesla Motors, tweeted about the crash on Monday, saying that data logs recovered showed that autopilot was not enabled and the car did not purchase FSD.

Your research as a private individual is better than professionals @WSJ!

Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.

Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.

— Elon Musk (@elonmusk) April 19, 2021

----------------------------------------

The Help for this program was created in Windows Help format, which depends on a feature that isn't included in this version of Windows.
 
SAE J3016 is an ENGINEERING document, and it describes what each level is and why it exists. There are clear reasons why Tesla's "Autopilot" is NOT L1, from an engineering perspective. Unfortunately, Tesla and others put marketing spin on things, and the drivers who believe them just don't have the horsepower to understand the nuances and gradations; not that it makes that much difference, since even without the marketing spin, Tesla drivers would still get lulled into a false sense of security, simply because of the perceived behavior of the driving assistant.
[Image: SAE J3016 levels of driving automation chart]
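
As a rough plain-language paraphrase of how J3016 divides responsibility (my summary, not the normative wording of the standard, which has many more caveats):

[code]
# Paraphrased summary of the SAE J3016 levels: who monitors the driving environment.
# Not the standard's wording; read J3016 itself for the actual definitions.
SAE_LEVELS = {
    0: ("No automation",          "human drives and monitors everything"),
    1: ("Driver assistance",      "human drives; system helps with steering OR speed"),
    2: ("Partial automation",     "system steers and controls speed; human must supervise at all times"),
    3: ("Conditional automation", "system drives within limits; human must take over when asked"),
    4: ("High automation",        "system drives within its design domain; no takeover needed there"),
    5: ("Full automation",        "system drives everywhere a human could"),
}

for level, (name, who) in SAE_LEVELS.items():
    print(f"L{level} {name}: {who}")
[/code]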


TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
and because the 'average Joe' is going to use L3... it should not be allowed. The government is remiss.

Rather than think climate change and the corona virus as science, think of it as the wrath of God. Feel any better?

-Dik
 
I see that the EU has restricted what these systems can be called, more in line with reality.

You can stop these fires, if you get them cold enough. Liquid Nitrogen would do it.

= = = = = = = = = = = = = = = = = = = =
P.E. Metallurgy, consulting work welcomed
 
The standard reads more like a taxonomy than a ruling on whether you should or shouldn't have an L3 system. Authorities having jurisdiction can decide that, and refer to the standard when regulating car automation. Murder is both defined and outlawed.

It doesn't look like there was anything left in the back seat to give a ticket to, but if there was, the cops would have ticketed them for some sort of negligent operation.
 
1503-44 said:
So what good is the "L3" rating, other than to provide a false sense of security?

I have claimed in other threads that there are not six levels of automation, but two.

Level 1
The controls are a touch screen, a keyboard and/or a microphone. The passenger(s) tell the car where they want to go. There is no passenger control of the vehicle. If there is an accident, the manufacturer/owner of the vehicle is legally responsible.

Level 0
The driver controls the steering wheel, gearshift (?) and the pedals, and they continuously watch out the windows. A robot may function as a back-seat driver. If there is an accident, the driver is legally responsible.
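
Stated as data, the proposal boils down to who holds the controls and therefore the liability (just a restatement of my two levels above, not anything out of J3016):

[code]
# A restatement of the two-level proposal: the only distinction that matters
# is who operates the controls, because that is who carries the liability.
TWO_LEVELS = {
    0: {"controls": "steering wheel, gearshift, pedals",
        "liable_on_crash": "driver"},
    1: {"controls": "touch screen / keyboard / microphone (destination only)",
        "liable_on_crash": "manufacturer or owner"},
}

print(TWO_LEVELS[0]["liable_on_crash"])   # -> driver
print(TWO_LEVELS[1]["liable_on_crash"])   # -> manufacturer or owner
[/code]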

--
JHG
 
And jeebus, if that's what's left after it hit a tree, I don't want it. I used to work for a guy who said he flipped a Saab 900 end over end, released the seatbelt, fell to the ceiling, and opened the door to get out.
 
May have to bring back that old saw about electric cars, "It's the batteries, st**pid". :)

The problem with sloppy work is that the supply FAR EXCEEDS the demand
 
The primary problem with this whole thing is Tesla's marketing.

Their product, even with FSD, is SAE Level 2. They have marketed and sold their system as SAE level 4. It is not even close to that from a development standpoint, and their vehicles will continue to kill people until either A) Teslas operating at true level 4 capability become available or B) someone finally gets people to realize who is culpable here (Tesla) and their vehicles incorporate more safeguards, eliminate automated features, or both.
 
It's really a matter of expectation; we don't berate GM or Toyota if someone using cruise control runs into another car or kills a pedestrian; Tesla's Autopilot is not much more than a fancier cruise control, and had it been marketed as such, the expectations and abuses would be solely the fault of the drivers.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
Does marketing create culpability, or does the illegible legal fine print (sales contract) erase legal responsibility?

It seems there is a potential lawsuit here.

The result of the fire is every bit as bad as the Ford Pinto gas tank problem.
Grimshaw v. Ford Motor Company, 1981, The American Museum of Tort Law

From https://www.tortmuseum.org/ford-pinto/ :
In 1978, following a damning investigation by the National Highway Traffic Safety Administration, Ford recalled all 1.5 million of its 1971–76 Pintos, as well as 30,000 Mercury Bobcats, for fuel system modification. Later that year, General Motors recalled 320,000 of its 1976 and 1977 Chevettes for similar fuel tank modifications. Burning Pintos had become a public embarrassment to Ford. Its radio spots had included the line: "Pinto leaves you with that warm feeling." Ford's advertising agency, J. Walter Thompson, dropped that line.
 

I'm not quite so sure... not if he's marketing it and making money from implying that it is something better.

Rather than think climate change and the corona virus as science, think of it as the wrath of God. Feel any better?

-Dik
 