Self Driving Uber Fatality - Thread I

drawoh (Mechanical)
San Francisco Chronicle

As noted in the article, this was inevitable. We do not yet know the cause. It raises questions.

It is claimed that 95% of accidents are caused by driver error. Are accidents spread fairly evenly across the driver community, or are a few drivers responsible for most accidents? If the latter is true, it creates the possibility that there is a large group of human drivers who are better than a robot can ever be. If you see a pedestrian or cyclist moving erratically along the side of your road, do you slow to pass them? I am very cautious when I pass a stopped bus because I cannot see what is going on in front. We can see patterns, and anticipate outcomes.

Are we all going to have to be taught how to behave when approached by a robot car? Bright clothing at night helps human drivers. Perhaps tiny retro-reflectors sewn to our clothing will help robot LiDARs see us. Can we add codes to erratic, unpredictable things like children and pets? Pedestrians and bicycles eliminate any possibility that the robots can operate on their own right of way.

Who is responsible if the robot car you are in causes a serious accident? If the robot car manufacturer is responsible, you will not be permitted to own or maintain the car. This is a very different ecosystem from what we have now, which is not necessarily a bad thing. Personal automobiles spend about 95% of their time (quick guesstimate on my part) parked. This is not a good use of thousands of dollars of capital.

--
JHG
 

A human wouldn't be issued a license (based on poor vision) if they couldn't see an obstacle (or better: lack of empty road) beyond 50 or 75 m.

 
50 m is for an asphalt-type surface, at a guess. I wonder where wool or other natural fabrics in a dark color fall?

Next cab off the rank (haha) is the radar system. Do these vehicles have one, and what is the spec?

Cheers

Greg Locock


 
It will be interesting to read the NTSB report once it is released. What I am missing in all the information that has been published are simple things such as: how many sensors are there of each kind? Are we looking at 2-out-of-3 systems? What determines whether a sensor is not working correctly? Is there a safety system / computer that monitors the computer that operates the car? What kind of redundant power supply exists for the computer(s)?

Many other questions, but did the Uber stop and call 911 after running over the pedestrian?

It is hard enough to make chemical plants safe, but at least they are not moving down the road at 70 mph. On the other hand, I think there are many things from the various safety analyses that are done in chemical plants that could be applied to these robot cars.
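
To put the 2-out-of-3 question above in concrete terms, here is a minimal sketch of what such a voting arrangement looks like. The three boolean "obstacle detected" channels are purely illustrative; nothing here reflects Uber's actual architecture.

# Minimal sketch of 2-out-of-3 (2oo3) voting, the arrangement asked about
# above. The lidar / radar / camera channel names are illustrative only.

def vote_2oo3(lidar: bool, radar: bool, camera: bool) -> bool:
    """Act (e.g. brake) when at least two of the three channels agree."""
    return sum((lidar, radar, camera)) >= 2

def disagreeing_channels(lidar: bool, radar: bool, camera: bool) -> list:
    """Flag any channel that disagrees with the majority, for diagnostics."""
    majority = vote_2oo3(lidar, radar, camera)
    readings = {"lidar": lidar, "radar": radar, "camera": camera}
    return [name for name, value in readings.items() if value != majority]

if __name__ == "__main__":
    print(vote_2oo3(True, True, False))             # True  -> brake anyway
    print(disagreeing_channels(True, True, False))  # ['camera'] -> flag for service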
 
Doing a bit of reading here.

A couple of quotes from this article: "Also on Monday, the auto-parts maker that supplied the radar and camera on the Volvo SUV that struck and killed the woman last week said Uber had disabled the standard collision-avoidance technology in the vehicle.
"'We don't want people to be confused or think it was a failure of the technology that we supply for Volvo, because that's not the case,' Zach Peterson, a spokesman for Aptiv, said by phone. The Volvo XC90's standard advanced driver-assistance system 'has nothing to do' with the Uber test vehicle's autonomous driving system, he said.
"Aptiv is speaking up for its technology to avoid being tainted by the fatality involving Uber, which may have been following standard practice by disabling other tech as it develops and tests its own autonomous driving system. Experts who saw video of the Uber crash pointed to apparent failures in Uber's sensor system, which failed to stop or slow the car as 49-year-old Elaine Herzberg crossed a street pushing a bicycle."
And
"Meanwhile, a top executive for the maker of sensors used on the self-driving Uber vehicle said she was 'baffled' as to why the tech-outfitted vehicle failed to recognize a pedestrian crossing the street and hit the brakes.
"Marta Thoma Hall, president of Velodyne Lidar Inc., maker of the special laser radar that helps an autonomous car "see" its surroundings, said the company doesn't believe its technology failed. But she's surprised the car didn't detect Herzberg.
"'Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation,' Thoma Hall wrote in an email. 'However, our Lidar doesn't make the decision to put on the brakes or get out of her way.
"'In addition to Lidar, autonomous systems typically have several sensors, including camera and radar to make decisions," she wrote. "We don't know what sensors were on the Uber car that evening, if they were working, or how they were being used.'"

And meanwhile, an interesting take on the whole situation:
 
This article contains more detail about the car than anything else I've seen: As has been mentioned a few times now, Uber had at least 3 sensor systems that should have detected the pedestrian. They are basically independent in their operation, and it's up to the collision-avoidance processor to make the decision about doing something, at which it failed miserably.

The Jalopnik article is basically a rant; however the writer feels about Uber's business practices should not be confused with whether its technology is sound.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
Uber had at least 3 sensor systems that should have detected the pedestrian.

It's hard to be enthusiastic about trusting my life to that.
 
I'm simply saying that there was nothing physically preventing the sensors from detecting the pedestrian at their maximum effective ranges. Since the car did not behave as if it detected an obstacle at any time, I can't say for certain that all the sensors didn't fail simultaneously AND the processor failed to detect the fault condition.

Prior to this, I certainly would not have had any doubts about the performance of the sensors, given that scenario. Even a competitor was able to use the crappy video released by the police to detect the pedestrian and the bicycle at the first instant they were fully within the headlight illumination; obviously, there could be gaming of that for other reasons.

I know what I would have flowed down as requirements for the sensors, and at that range the probability of detection would essentially be 99.9999%, since I would have required at least a 99% probability of detection at 300 ft for a pedestrian. That's 5 seconds for a car at 40 mph, which would mean at least 25 frames in which the pedestrian was detected. The number of lidar pixels declaring detections would have been in the hundreds.
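
A rough sketch of that arithmetic; the per-frame detection probability, speed, and frame rate are my assumed requirements, not anything known about the Uber stack:

# Rough arithmetic behind the figures above. The 99% per-frame probability
# of detection at 300 ft, the 40 mph speed, and the ~5 Hz frame rate are
# assumed requirements, not known properties of the Uber sensor suite.

range_ft = 300.0
speed_fps = 40.0 * 5280 / 3600                 # 40 mph ~= 58.7 ft/s
time_to_close_s = range_ft / speed_fps         # ~5.1 s
frames = time_to_close_s * 5.0                 # ~26 frames at 5 Hz

p_frame = 0.99                                 # per-frame probability of detection
p_miss_all = (1.0 - p_frame) ** frames         # probability every single frame misses
p_overall = 1.0 - p_miss_all                   # effectively 1.0, well above 99.9999%

print(f"{time_to_close_s:.1f} s to close, {frames:.0f} frames, "
      f"P(at least one detection) = {p_overall}")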

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
Marta Thoma Hall, president of Velodyne Lidar Inc., maker of the special laser radar that helps an autonomous car "see" its surroundings, said the company doesn't believe its technology failed. But she's surprised the car didn't detect Herzberg.
"Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation," Thoma Hall wrote in an email. "However, our Lidar doesn't make the decision to put on the brakes or get out of her way. In addition to Lidar, autonomous systems typically have several sensors, including camera and radar to make decisions," she wrote. "We don't know what sensors were on the Uber car that evening, if they were working, or how they were being used."
Sometimes execs should just shut their mouths. On the one hand, she doesn't know what sensors were being used or if they were working. Yet she makes the boneheaded move of stating "our Lidar doesn't make the decision to put on the brakes or get out of her way." That statement makes the assumption that her system was in the loop (maybe it wasn't) and that it failed to do its job. Lawyers LOVE that kind of "self-incrimination".

EDIT: On second read, this could be an issue with the writer/editor. Perhaps what she meant was that her system does not have the control (i.e., "say-so") to put on the brakes, rather than "my system didn't recognize the danger". Difficult to say, the way the article is written.

Dan - Owner
 
I interpreted her comment to be 'our systems send the information to the processor, which decides how to react'

The sensors are sensors, not processors.
 
Me too. Sensors just produce data. That data is just an input to the system making the decisions.
 
MacGyverS2000,

An extreme case here is that Velodyne's LiDAR reports an image to the robot driver, and then reports a new image a tenth of a second later. The robot then identifies obstacles and moving objects. A LiDAR will have an on-board computer, and it should be possible to design one that identifies, tracks and reports objects. This may make it more difficult to integrate the output of multiple LiDARs and cameras.

--
JHG
 
"A LiDAR will have an on-board computer and should be possible to design one that identifies, tracks and reports objects. This may make it more difficult to integrate the output of multiple LiDARs and cameras."

Not by design. At the root, a lidar collects a cloud of returns that simply contain range, azimuth, and elevation. A processor might be included that places the returns in their proper place in the world. Almost no lidars do target recognition; that is the province of the system processor, which integrates the radar and video data into the decision making.
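
To make that concrete, here is a minimal sketch of the conversion that sits between raw returns and anything a planner can use, assuming each return is a (range, azimuth, elevation) tuple; the sample values are invented. Recognizing a pedestrian in the resulting points is the downstream processor's job, not the lidar's.

import math

# Sketch: convert raw lidar returns (range, azimuth, elevation) into
# (x, y, z) points in the sensor frame. This is the easy part; classifying
# clusters of points as "pedestrian" happens downstream.

def return_to_xyz(range_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return (x, y, z)

# One sweep is just a list of such returns; the values here are made up.
sweep = [(30.5, -2.0, -5.0), (30.6, -1.8, -4.8), (30.4, -2.2, -4.6)]
cloud = [return_to_xyz(r, az, el) for (r, az, el) in sweep]
print(cloud[0])   # ~ (30.4, -1.06, -2.66)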

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
That's not a theory. It's a smokescreen. The sensor, if mounted with its base level, has a down-angle range of about 25 degrees. From a 5 foot mounting height, that puts the nearest visible spot on the pavement at roughly 11 feet from the sensor. Anything taller would be proportionally visible closer. This is a better view than from the driver's seat, where the hood blocks the near field.
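
The geometry is easy to check. A quick sketch, using the 5 ft mounting height and 25 degree down angle quoted above (assumed figures, not measurements of the actual vehicle):

import math

# Blind-spot geometry described above: a sensor at height h whose steepest
# beam points down at angle theta below horizontal first hits the pavement
# at horizontal distance d = h / tan(theta).

h_ft = 5.0
down_angle_deg = 25.0
d_ft = h_ft / math.tan(math.radians(down_angle_deg))
print(f"Pavement visible beyond about {d_ft:.1f} ft")   # ~10.7 ft

# Anything taller than the pavement is picked up proportionally closer,
# and at 100 ft the pedestrian is far outside any such near-field blind zone.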
 
Yeah, that's a ludicrous argument. Moreover, the blind spot is CLOSE to the car, not far away; at 100 ft there are no blind spots, so what happened to those detections?

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
This narrative that she "came out of the shadows" really bothers me. I heard it repeated on NPR today. That video has been a mixed blessing from a PR standpoint as it pertains to fault for this.
 
"That video has been a mixed blessing from a PR standpoint as it pertains to fault for this."

That's nonsense. I have no doubt that Uber allowed the police to have the video specifically to sway the public into thinking that the accident was unavoidable. The cited article about the settlement describes exactly what Uber had hoped people would think: "when the headlights suddenly illuminated Herzberg in front of the SUV." The person who was thinking on their feet and released that video is going to get a huge bonus at Christmas time.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 