
Self Driving Uber Fatality - Thread II

Status
Not open for further replies.

drawoh

Mechanical
Oct 1, 2002
8,878
Continued from thread815-436809

Please read the discussion in Thread I prior to posting in this Thread II. Thank you.

--
JHG
 

Well, now that brings me back to my earlier post, of how do we know this wasn't intentional on the part of the autonomous car?
 
The rise of the machines?

"Whatever can go wrong, will go wrong" - Murphy's Law
 
JStephen said:
...how do we know this wasn't intentional on the part of the autonomous car?

Old rule of thumb, "Never attribute to malice that which can be adequately explained by stupidity."

:)
 
Maybe we need a special forum just for Bench Racing.


Mike Halloran
Pembroke Pines, FL, USA
 
The posting was well crafted, and I assume, not fake news... too much at stake. If the NTSB is critical of Tesla... I can envision a court case to follow... also, too much at stake. Darwin takes another...

Dik
 
Allowing 6 seconds of no hands on the wheel is a very long time, and a very long distance travelled, for a system that needs hands on the wheel at all times.
 
...about 175 yards at 60 mph... a reasonably long distance...

Dik
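The 175-yard figure above can be checked with a quick sketch (Python; the 60 mph speed and 6-second interval are just the numbers from the posts above):

```python
# Back-of-the-envelope check of the distance covered hands-off.
# 60 mph and 6 s are the figures quoted in the thread, not measured data.

MPH_TO_FT_PER_S = 5280 / 3600  # 1 mph = 1.4667 ft/s

def distance_travelled_yards(speed_mph: float, seconds: float) -> float:
    """Distance in yards covered at a constant speed over a time interval."""
    feet = speed_mph * MPH_TO_FT_PER_S * seconds
    return feet / 3  # 3 feet per yard

print(distance_travelled_yards(60, 6))  # 176.0 yards, i.e. "about 175"
```

At 60 mph the car covers 88 ft/s, so six seconds is 528 ft, or 176 yards: consistent with the estimate in the post.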
 
“Today, Tesla withdrew from the party agreement with the NTSB because it requires that we not release information about Autopilot to the public, a requirement which we believe fundamentally affects public safety negatively."

Perhaps it is just me looking at this incorrectly but I have no need to know anything about the Tesla Autopilot and I certainly do not stay on top of any news release from Mr. Musk about his cars and their features. However, I would like to have safe cars on the road and it is becoming rather obvious that there are issues with the Autopilot.

I think it would be appropriate to require that the CEO is on board his vehicle when it is subjected to independent testing on a testing site specifically designed to test the self-driving features of that company's auto. What is happening today is the testing of poorly designed software and hardware on public roads with ordinary people as guinea pigs. If nothing else, it is unethical.

I think there is an issue with the thought process in the companies that develop these systems. They are software companies which appear to have the philosophy that if something is not right, you issue a patch on Tuesday and then all is good. All I can say is that is not how things are done in the chemical engineering world.
 
j_larsen,
I agree that all snake oil salesmen should test their products on themselves.
 
"I think it would be appropriate to require that the CEO is on board his vehicle when it is subjected to independent testing on a testing site specifically designed to test the self-driving features of that company's auto. What is happening today is the testing of poorly designed software and hardware on public roads with ordinary people as guinea pigs. If nothing else, it is unethical."

Musk and Tesla point out that their Autopilot is NOT for autonomous driving. In fact, and from the start, the driver needs to be FULLY engaged and PAYING ATTENTION. In every case, the driver was NOT paying attention, and there are those who go out of their way to disable/defeat the car's mechanisms for forcing the driver to stay engaged. The fatality in Mountain View was on a pathological stretch of freeway under roadwork, where even normal drivers were having problems with the lane divergence. The Tesla did warn the driver, but it's likely that the few seconds of warning were insufficient for the driver to re-engage and figure out what they needed to do, or they were so distracted that they didn't even notice the warnings.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
"...the driver needs to be FULLY engaged and PAYING ATTENTION. In every case, the driver was NOT paying attention..."

Expecting a driver who isn't driving to "be FULLY engaged and PAYING ATTENTION" is far too much to ask. One of the main causes of wrecks now is drivers who are supposed to be driving who aren't paying attention.
 
Tesla's problems are precisely why the big boys won't bother with L2 or L3 AVs in the field. Expecting human intervention /as a situation develops in seconds/ runs counter to the way real people use 'autopilot' like features. I must admit I wish Tesla had its own thread on this rather than messing up the Uber one.


Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
I have some sympathy for Tesla here. By the time due process has taken its time to do all the stuff it needs to do before it can issue a definitive statement exonerating the vehicle completely (let's, just for the sake of argument, assume that the car was blameless) and Tesla is released from their bond of silence, the company is quite likely to have collapsed under the weight of unrefuted rumour. Breaking cover, getting themselves chucked off the investigation, but being able to be part of the open discussion is probably the only choice they had left.

CEOs riding the tests? Wasn't that how the Verrückt testing was done (see other thread)?

A.
 
zeusfaber,

The CEO is not to design the test. Others will determine the testing to be done to fully challenge the design.

The issue I have with Tesla is how they say that the driver has to be fully engaged at all times. It is simply impossible for anyone to figure out, within a second or so, what the Autopilot couldn't figure out and then take corrective action.
 
"The issue I have with Tesla is how they say that the driver has to be fully engaged at all times. It is simply impossible for anyone to figure out, within a second or so, what the Autopilot couldn't figure out and then take corrective action."

If you are fully engaged, then you will see the car in front of you swerving to avoid hitting the barrier, which your Tesla detected, but doesn't seem to know what to do about it. The issue is that when things are going well the temptation to look at other things is too high.


I think "Autopilot" was an engineering disaster, in that it should have never been named that, because crap like these accidents will continue to happen for a long time to come. It took the image recognition community 40 years to get to the point of detecting and identifying objects, and even then the performance is nowhere close to perfect. Likewise, people have worked on collision avoidance for nearly 60 years, and it's still not ready for the real world. We keep trying and we keep adding sensors, and some things work really well, but others, not so much.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
CEOs riding the tests? Wasn't that how the Verrückt testing was done (see other thread)?

Igor Sikorsky is likely the last CEO to have tested his own stuff.


Mike Halloran
Pembroke Pines, FL, USA
 
Derek L Schmidt said:
Investigators obtained video recordings of HENRY and SCHOOLEY in a raft going airborne during their personal test run on the Verrückt prototype.

(here, p10)
 
This Autopilot is just the tip of the 'berg. As it improves, it will become financially viable to make drivers redundant; there will be fleets of driverless vehicles, and laws will be passed to allow this. Just a matter of time, my friends.

Dik
 