Eng-Tips is the largest engineering community on the Internet


Self Driving Uber Fatality - Thread III

Status
Not open for further replies.

I'll tell you something that is completely useless: the modern cruise control that decides to disregard your selected speed when traffic in front triggers its non-speed-holding subroutines.
I can decide for myself when to turn off cruise control; if it decides for itself, I don't need it.
I get that new cars are designed to be safely operated by morons. That's one of the reasons I like old cars.

"Whatever can go wrong, will go wrong" - Murphy's law ("Schiefgehen wird, was schiefgehen kann" - das Murphygesetz)
 
hemi,

I think I've mentioned it in here before, but... the auto-braking on my wife's Ridgeline (and likely similar systems on other cars) does NOT take into account when someone pulls in front of you from the adjoining lane. I learned this fact the hard way.

The system's stock reaction is to brake hard, which basically means the person riding your rear is going to run right up your tailpipe. A human would likely let off the gas to gain some distance.

Dan - Owner
 
IRstuff said:
No; it's more a question of how much power you are willing to throw out. The Velodyne HDL-64 has a horizontal pulse rate of 34.4 kHz, and therefore can accommodate the 2.7 µs TOF for a 400-m range

The Velodyne HDL-64 costs $100,000. For that, I can buy a really cool non-robot car. Their VLP-16 is $8,000, which is more manageable, but it only does 600,000 points per second. I am analysing for a single-laser, single-receiver LiDAR, and I am assuming that the range is limited by the time of flight between laser pulses. Laser power is limited by the need for the system to be Class 1. You can always use a larger-aperture receiver lens and scanner.

--
JHG
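As a sanity check on the numbers above, here is a minimal sketch (my own arithmetic, not a Velodyne specification) of how a single-laser LiDAR's pulse repetition rate bounds its unambiguous range: the echo from one pulse must return before the next pulse fires.

```python
# Time-of-flight constraint for a single-laser LiDAR.
# The pulse rate and range figures come from the posts above;
# the framing of "max unambiguous range" is my own illustration.

C = 3.0e8  # speed of light, m/s

def round_trip_time(range_m: float) -> float:
    """Time for a pulse to reach a target and return, in seconds."""
    return 2.0 * range_m / C

def max_unambiguous_range(pulse_rate_hz: float) -> float:
    """Largest range whose echo returns before the next pulse fires."""
    return C / (2.0 * pulse_rate_hz)

# 400 m round trip: 2 * 400 / 3e8 ~= 2.7 microseconds, as quoted above.
tof_400 = round_trip_time(400.0)

# At a 34.4 kHz pulse rate the inter-pulse period is ~29 us, so a
# 2.7 us echo fits comfortably; the hard ceiling is roughly 4.4 km.
ceiling = max_unambiguous_range(34.4e3)
```

So at these pulse rates, range is limited by laser power and receiver aperture well before the inter-pulse timing becomes the bottleneck.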
 
Not sure why 'slams on brakes' equals getting rear ended. The software needs only to supply sufficient braking to be stopped just before contact. If the answer to the chance of being rear ended is to not do that, then the car will get rear ended anyway, after hitting the object it did not avoid in the first place. It would be an extremely naive system to identify a potential collision 10 seconds out and lock up the tires and leave 200 feet of distance to the target.

Besides, the trailing vehicle operator should be allowing for the possibility anyway and looking ahead one or more vehicles to see what they are doing and what's in front of them rather than measuring the distance to the bumper of the car they are following in single digit centimeters. If it's a matter of being at-fault for running into someone or avoiding that and getting to sue a tailgater, I am in favor of collecting from the tailgater.
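A quick sketch of the arithmetic behind "allowing for the possibility": if the follower can brake as hard as the leader, the braking distances cancel and the minimum safe gap reduces to the distance covered during the follower's reaction time. The 1.5 s reaction time and 30 mph speed are my assumptions for illustration.

```python
# Minimum gap so a follower stops just behind a panic-stopping leader.
# Assumes constant deceleration for both cars; speeds, reaction time,
# and deceleration values are illustrative assumptions, not data.

G = 9.81  # gravitational acceleration, m/s^2

def min_following_gap(speed_mps, reaction_s, lead_decel_g=0.9, own_decel_g=0.9):
    """Gap needed so the follower stops just behind a panic-stopping leader."""
    lead_stop = speed_mps**2 / (2.0 * lead_decel_g * G)
    own_stop = speed_mps * reaction_s + speed_mps**2 / (2.0 * own_decel_g * G)
    return max(own_stop - lead_stop, 0.0)

v = 30 * 0.44704  # 30 mph in m/s
gap = min_following_gap(v, 1.5)  # ~20 m; single-digit centimeters won't do
```

With equal braking capability the required gap is simply speed times reaction time, which is exactly why tailgating at "single digit centimeters" leaves no margin at all.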
 
3DDave said:
Not sure why 'slams on brakes' equals getting rear ended. The software needs only to supply sufficient braking to be stopped just before contact. ...

If you or a robot makes a full panic stop, you are going to be rear-ended by people behind you who are not paying attention, apparently by Teslas, and by people whose brakes and tires are not as good as yours. My assumption is that a robot must see hazards in time to decelerate at a rate of 0.4 to 0.5 g. As I recall from driver training, a teenager paying attention needs about three quarters of a second to hit the brakes. Getting rear-ended is not your fault, but your car is still smashed, and you still have whiplash.

--
JHG
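For what it's worth, the 0.4-0.5 g figure works out like this. This is a back-of-envelope sketch: the 0.75 s reaction time is quoted from the post above, while the 45 mph speed and the 0.9 g panic-stop figure are my own assumptions.

```python
# Stopping distance = reaction-time travel + constant-deceleration braking.
# Speed and deceleration values are illustrative assumptions.

G = 9.81  # m/s^2

def stopping_distance(speed_mps, decel_g, reaction_s):
    """Reaction-time travel plus constant-deceleration braking distance."""
    return speed_mps * reaction_s + speed_mps**2 / (2.0 * decel_g * G)

v = 45 * 0.44704  # 45 mph in m/s, ~20.1 m/s

d_gentle = stopping_distance(v, 0.45, 0.75)  # ~61 m: hazard seen early
d_panic = stopping_distance(v, 0.90, 0.75)   # ~38 m: full panic stop
```

The difference between the two, roughly 23 m at 45 mph, is the margin a robot buys by seeing the hazard early enough to brake gently instead of slamming on the brakes.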
 
"It would be an extremely naive system to identify a potential collision 10 seconds out and lock up the tires and leave 200 feet of distance to the target."

The problem is not a system that locks up the brakes leaving 200 feet; it's a system that waits until it has just enough distance to stop before applying the brakes hard, which is how they are currently programmed, if I understand correctly.
 
Hotrod10, I considered that. Still, it does not guarantee that the following car is forced into a collision. That driver is just as capable of paying attention as anyone else. At this point it's like complaining about how violent an airbag deployment is and saying it's just better not to crash and ditch the airbag completely.
 
"Still, it does not guarantee that the following car is forced into a collision."

If the driver of the following car is maintaining a safe following distance, it won't. How often does that happen, though?

"At this point it's like complaining about how violent an airbag deployment is and saying it's just better not to crash and ditch the airbag completely."

Some research has indicated that passenger side front air bags do little to improve safety for a passenger properly restrained by a lap and shoulder belt, but have caused fatalities in some cases.
 
That research does not account for the fact that the reason for airbags was poor lap-and-shoulder-belt compliance. It also fails to account for side-impact mitigation, which lap and shoulder belts do not help with at all.
 
article said:
But because of the way the software is designed, emergency braking manoeuvres are not enabled while the car is in self-driving mode, in order to reduce what Uber calls "erratic vehicle behaviour."

I find this remarkable. Your robot can see and track objects okay, but cannot identify them reliably. The robot's default behaviour must be to stop or slow down. Any algorithm that determines an object can safely be driven into must be reliable to near certainty.

Robots will be a pain to drive behind.

--
JHG
 
If there is a conflict between the autonomous driving system and the emergency braking system, it should be the autonomous driving system that gets switched off.
 
OK, so at least the sensing system performed exactly as designed, including detecting the pedestrian at over 500 ft distance, as expected.

The rest is a complete fubar; the system allowed nearly 5 seconds to lapse before figuring out that it couldn't brake and needed to warn the driver. No wonder Uber is shutting down that operation; the people who came up with that logic should never be allowed to work on anything safety-related again. This is one of the few times I think licensing such engineers ought to be a requirement.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
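The time budget implied by a 500 ft detection is worth spelling out. This is a sketch: the ~40 mph vehicle speed is my assumption based on press reports of the crash, not a figure from this thread.

```python
# Time and braking budget for an object detected 500 ft ahead.
# The 40 mph speed is an assumption for illustration.

G = 9.81  # m/s^2

def time_to_impact(distance_m, speed_mps):
    """Seconds until impact at constant speed."""
    return distance_m / speed_mps

def required_decel_g(distance_m, speed_mps):
    """Constant deceleration (in g) needed to stop exactly within the distance."""
    return speed_mps**2 / (2.0 * distance_m * G)

d = 500 * 0.3048  # 500 ft ~= 152 m
v = 40 * 0.44704  # 40 mph ~= 17.9 m/s

t = time_to_impact(d, v)    # ~8.5 s of warning at constant speed
a = required_decel_g(d, v)  # ~0.11 g: a gentle, non-"erratic" stop
```

Under these assumptions, stopping within the detection range would have required only about a tenth of a g, which puts "nearly 5 seconds lapsed" into perspective.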
 
"Robots will be a pain to drive behind"

The logical outcome of this incident will be that the default response to ANY detected object, even a piece of paper ahead of the vehicle, will be a maximum-ABS stop.

And the logical outcome of that is that it will only be a matter of time before said robot-vehicle is flattened by a dump truck that can't stop as quickly, or is identified as the originator of a 50-car motorway pile-up because the cars behind swerved every which way and started tagging each other, and then a semi-truck flipped over on top of the whole mess.
 
IRstuff,

It is possible the article is confused. These are journalists, not engineers.

My interpretation of the article is that the default behaviour is to not slow down. They don't want "erratic vehicle behaviour". I think erratic vehicle behaviour will have to be inherent in a robot-controlled vehicle.

In the Tesla thread, someone mentioned a car running through a puddle and kicking up a splash that a following robot interpreted as an object. If it is an object, don't hit it. Unless you have a very reliable algorithm that identifies splashes and puffs of smoke, the thing is an object.

If there is a driver in the vehicle, the alarm must go off the moment the object is identified, simply to provide adequate reaction time.

--
JHG
 