
Self Driving Uber Fatality - Thread I

Status
Not open for further replies.

drawoh

Mechanical
Oct 1, 2002
8,878
San Francisco Chronicle

As noted in the article, this was inevitable. We do not yet know the cause. It raises questions.

It is claimed that 95% of accidents are caused by driver error. Are accidents spread fairly evenly across the driver community, or are a few drivers responsible for most accidents? If the latter is true, it creates the possibility that there is a large group of human drivers who are better than a robot can ever be. If you see a pedestrian or cyclist moving erratically along the side of your road, do you slow to pass them? I am very cautious when I pass a stopped bus because I cannot see what is going on in front. We can see patterns, and anticipate outcomes.

Are we all going to have to be taught how to behave when approached by a robot car? Bright clothing at night helps human drivers. Perhaps tiny retro-reflectors sewn into our clothing will help robot LiDARs see us. Can we add codes to erratic, unpredictable things like children and pets? Pedestrians and bicycles eliminate any possibility that the robots can operate on their own right of way.

Who is responsible if the robot car you are in causes a serious accident? If the robot car manufacturer is responsible, you will not be permitted to own or maintain the car. This is a very different eco-system from what we have now, which is not necessarily a bad thing. Personal automobiles spend about 95% of their time (a quick guesstimate on my part) parked. This is not a good use of thousands of dollars of capital.

--
JHG
 

The Google street view of the Tesla accident location shows it was a quite flat stretch of road. The wheels would not have been hidden as the car was approaching.
 
The police released the video. Two things stand out. One is that all reflectors look to have been removed from the bicycle, including the critical ones on the wheels. The other is that the 'projector' head lamps produce such a sharp cut-off that no light is above the cut-off, meaning the contrast is extreme. Frankly, I would place a lot of blame on the head lamp design, which puts far too much light close up and produces so much contrast that anything outside the beam is practically invisible.

I think that the woman would have been more visible if the headlamps were off and the amplification of the image higher.

Not helping are what looks like a black jacket and the low level of light from the local streetlamps. It also doesn't help that neither the driver nor the pedestrian seems engaged with the situation.

I don't know if an alert driver would have done much better. Even though I know where the victim will be, until less than 1/2 second from impact I can't make out any evidence of her. There was no lighting behind her that was being eclipsed as she went across the road; not even from retro-reflective striping paint on the road.

I expect the NTSB will be interested in this and I look forward to a report as to why the Lidar and radar sensors specifically failed to detect her. There should have been plenty of time for several cycles. It wouldn't even require target path prediction.
 
Slate magazine has linked the video (warning, warning, etc.). In the video, she was not visible until the last split second. I assume the headlights were dimmed, but I would think there should have been more forward visibility than what we see.

It looks like she crossed between areas lit by streetlights. I wonder how well cameras respond to changes in light level. Our (human) eyes cope far better with changing light levels than digital cameras do. This may be part of learning to walk in the vicinity of robot vehicles.

--
JHG
 
SnTMan said:
My experience is, the cars get better and better, the drivers get worse and worse.
A Truism if there ever was one!! [thumbsup2]

"Whatever can go wrong, will go wrong" - Murphy's Law
 
That's a very bad video for UBER. When driving at night you can see objects in adjacent lanes ahead. Combined with the LIDAR there is no excuse for the car's object avoidance to have failed so badly. It didn't even attempt to slow down.
 
I agree it's bad. They're not showing the LIDAR data, which should have detected the person well before they show up in the video.

The LIDAR should be able to detect obstacles out to well past 200 ft, so it had to have detected the person at least 2.5 seconds before the person was visible in the video. This is a major fubar in the systems engineering.
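As a back-of-envelope check on those numbers (the 38 mph speed and the roughly 1.5 s video visibility are figures that come up later in the thread, taken here as given; with detection "well past 200 ft" the margin only grows):

```python
# Back-of-envelope check of the lidar lead time. Assumptions: 200 ft
# detection range (from the post above), 38 mph vehicle speed, and the
# pedestrian becoming visible in the dashcam video ~1.5 s before impact.

def time_to_cover(distance_ft: float, speed_mph: float) -> float:
    """Seconds for a vehicle at speed_mph to cover distance_ft."""
    return distance_ft / (speed_mph * 5280 / 3600)

detection_time = time_to_cover(200, 38)   # time from 200 ft out to impact
lead_over_video = detection_time - 1.5    # head start over the camera

print(f"200 ft at 38 mph is covered in {detection_time:.1f} s")
print(f"Lidar head start over the video: {lead_over_video:.1f} s")
```

That is roughly 3.6 s from first detection to impact, about 2 s before anything shows up in the released video.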

The Uber should also have a radar, which should also have detected the person.

This is again a context issue, since even a non-moving obstacle in what should have been an unoccupied lane is a major deviation from normality. Moreover, given the range and the obvious motion of the person, the sensors should have been able to easily determine that there was going to be an intersection in trajectories.

The Uber supposedly has a "camera array," and at least some of them should have been configured for low-light.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
Didn't say it did. The Uber has at least one camera, as evidenced by the video, but there are also other cameras angled to the side. One would think there would be low-light cameras as well.

Below is something like what the lidar should have seen. I think Musk is crazy. Humans get into accidents precisely because they don't have enough bandwidth and detection capability; this is where lidar or radar could trivially provide additional sensing and processing capability. Hypothetically, the video is what a human driver might have been able to see, but lidar would have and should have detected the pedestrian and provided warning that something was in the adjacent lane coming up. Had it been working correctly, it should have also determined that the anomaly was moving toward the car's own lane.

[Image: lidar-1.jpg]
TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
We were flying a helicopter obstacle avoidance lidar in 1994 with only about 100 kHz pulse rate. We weren't trying to detect collisions against movers, though, but today's processors are more than capable of doing so. And, today, at least 10x higher pulse rate should be possible, particularly for only 1/4 of the range that we were achieving.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
She did cross from the path in the center, meaning she crossed about 3.5 lane widths of open road before getting hit. There was nothing abrupt about her movement, despite what the news reports claim.

That video makes the incident very damning for Uber. The car totally failed to detect an object that was clearly visible to its sensors and was crossing into its path for some time before the impact occurred. I'd guess she would have been on the road and easily detectable for 4-5 seconds before the impact, which is lots of time for the car to react.

Feet appear in the video maybe 1.5 seconds before the collision. At 38 mph, that is about 85 feet. I've never been in a Volvo with HID projector lights, but I've been in other cars with them, and the low beams project enough light to make objects in front of the car visible to a distance much greater than 85'. So I would say the camera taking that video suffers from a contrast limitation that reduces the visible distance compared to what a human could see.
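The 85 ft figure checks out, and a quick sketch shows what braking alone could do from that distance (the 0.7 g deceleration is my assumption for a hard stop on dry pavement, not a figure from the thread):

```python
# Check the 85 ft figure and estimate a hard-braking stopping distance.
# The 0.7 g deceleration is an assumed hard stop on dry pavement;
# g = 32.17 ft/s^2.

speed_fps = 38 * 5280 / 3600                 # 38 mph in ft/s (~55.7)
sight_distance = speed_fps * 1.5             # ft covered in 1.5 s

decel = 0.7 * 32.17                          # ft/s^2
braking_distance = speed_fps**2 / (2 * decel)

print(f"Distance covered in 1.5 s at 38 mph: {sight_distance:.0f} ft")
print(f"Braking distance at 0.7 g: {braking_distance:.0f} ft")
```

So a system that braked the instant the feet appeared could just about stop in time; a human needing a second or so of reaction time could not, which is why steering avoidance matters.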

I'd bet a driver who was paying attention and capable of steering avoidance, instead of just freaking out and slamming on the brakes, could have avoided her. The autonomous driving system should have easily avoided her too.

Blaming the sensors is a non-starter. If that accident was caused by sensor limitations, then better sensing must be developed for these cars.

Musk's argument is hinged on the fact that lidar is an expensive sensor, so the system will be much cheaper without it.
 
Not blaming the sensors; blaming the processing of the data.

Just because it could be cheaper doesn't mean that it's the right answer, particularly if it winds up being no better than a human with terrible night vision.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
@dik, "Does this account for young drivers doing most of the driving?"

The first graph is per million miles so yes I think that it does account for the higher annual mileage of younger drivers.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
One of the factors in driving statistics is that they become heavily skewed by highway miles. There are fewer chances for interactions between pedestrians and vehicles on interstates and rural highways. Even with the following breakdown, a critical piece of missing information is the split between primary and secondary opportunities for collision. For example, this collision was a primary opportunity, where the pedestrian has a clear view, can probably hear the oncoming vehicle, and positions themselves in the lane. Secondary opportunities are when pedestrians are at the edge of the lane or on the sidewalk, in a location that would require a vehicle to leave its position in the lane to strike them.

My supposition is that for most miles driven there are either no pedestrians present at all or only secondary opportunities. One might argue about places like downtown Manhattan, where at some places and times pedestrians would be trapped if they never stepped into traffic lanes; they are cognizant that drivers are less likely to yield, but that is more of a parking-lot situation than a driving one.

In the following, it seems to me that the largest factor is pedestrian behavior. Buses probably rate very high because they attract and disperse pedestrians: lots of people are on foot nearby, and buses operate in the lane alongside sidewalks. Heavy trucks are probably low because they don't operate near pedestrians (for example, fewer people walk near warehouses) and pedestrians can easily identify them.

(reformatted from , which says it is based on 2002 US DOT statistics)
(Edit: RR = Relative Rate)

Passenger cars and light trucks (vans, pickups, and sport utility vehicles) accounted for 46.1% and 39.1%, respectively, of the 4875 deaths, with the remainder split among motorcycles, buses, and heavy trucks.

Compared with cars, the RR of killing a pedestrian per vehicle mile was
7.97 (95% CI 6.33 to 10.04) for buses;
1.93 (95% CI 1.30 to 2.86) for motorcycles;
1.45 (95% CI 1.37 to 1.55) for light trucks, and
0.96 (95% CI 0.79 to 1.18) for heavy trucks.

Compared with cars,
buses were 11.85 times (95% CI 6.07 to 23.12) and
motorcycles were 3.77 times (95% CI 1.40 to 10.20)
more likely per mile to kill children 0–14 years old.

Buses were 16.70 times (95% CI 7.30 to 38.19) more likely to kill adults age 85 or older than were cars.

The risk of killing a pedestrian per vehicle mile traveled in an urban area was 1.57 times (95% CI 1.47 to 1.67) the risk in a rural area.
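For readers unfamiliar with how those 95% confidence intervals are produced, below is the standard log-rate-ratio calculation. The death counts and mileages in the example are hypothetical, chosen only to illustrate the mechanics; they are not the study's actual data.

```python
import math

# Relative rate (rate ratio) with a Wald 95% CI on the log scale,
# treating the death counts as Poisson with vehicle-mile denominators.
# All input numbers below are HYPOTHETICAL illustrations.

def rate_ratio_ci(deaths_a, miles_a, deaths_b, miles_b, z=1.96):
    """Rate ratio of group A vs group B with an approximate 95% CI."""
    rr = (deaths_a / miles_a) / (deaths_b / miles_b)
    se = math.sqrt(1 / deaths_a + 1 / deaths_b)   # SE of log(rr)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 120 bus deaths over 6e9 miles vs 2200 car deaths over 880e9 miles
rr, lo, hi = rate_ratio_ci(120, 6e9, 2200, 880e9)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Note how the width of the interval is driven almost entirely by the smaller death count; that is why the bus CIs above are so much wider than the light-truck ones.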
 
It's taken a long time to get to something that even remotely resembles a true AI, and it's still got a long way to go. AIs aren't going to get drunk, and aren't going to fall asleep, and that latter feature would have been a godsend back when I was driving home from college after pulling a week of all-nighters. Micronaps at 90 mph were scary. But, clearly, the Tesla and Uber incidents show that the AIs still have a long way to go before they're as robust as I think they should be. Neither of those two accidents seems to be a fault of the sensor technology; they seem to be a fault of the systems engineering or the programming, as both look to be well within the possible use cases.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
IRstuff,

Take your digital camera out and try shooting in limited light. Film emulsion, CCDs, and CMOSs do not have the bandwidth of a human eye. That woman was invisible to the camera until the last split second. She would not have been invisible to the driver if he had kept his head up.

--
JHG
 
IRstuff, and they don't take their eyes off the road to text either.
Just this morning, I was on the same road where this accident happened- the car in front of me drifted over the lane divider line and stayed there for close to a quarter mile. Too busy texting to even realize that they were taking up 2 lanes.

I have been driving in the area they have been testing these vehicles for many months. I was skeptical when they started doing this, but I have never seen one make what I would consider a dangerous maneuver.

The more we hear about the Tempe accident, the more it sounds like it was probably the pedestrian's fault. There are many large bushes along this stretch of road, so it seems likely that the sensors didn't even know the pedestrian was there until it was too late. One of the scariest moments I had while driving (only a couple miles from this site) was when a mountain biker darted out from behind some bushes while I was driving the speed limit. He came to a quick stop and almost went over his handlebars just a few feet in front of me. There was no warning that he was approaching (he was not on a trail), and I would have had no chance of stopping if he had continued into traffic.
 
"Take your digital camera out and try shooting in limited light. Film emulsion, CCDs, and CMOSs do not have the bandwidth of a human eye. That woman was invisible to the camera until the last split second. She would not have been invisible to the driver if he had kept his head up."

btw, the safety driver was a woman. But this is not your, or my, digital camera; any intensified camera with the IR cut filter removed can see by starlight alone. Moreover, even the tiny bit of light from the headlights would be more than enough for a moderately intensified camera, or even an HDR camera. It's unimaginable that the engineers wouldn't have at least run HDR, which is available even on cell phones, specifically for this type of use case. HDR, when properly implemented, substantially outperforms the instantaneous dynamic range of the human eyeball. The headlights clearly illuminated the adjacent lanes out to at least 100 ft, so HDR should have picked up the pedestrian in the video.

And, since this is NOT a Tesla, the lidar, as was pointed out earlier, doesn't need any ambient light. If Uber had depended on using just that video for collision avoidance, they should have never gotten authorization for full autonomous driving, and they should be rightly sued for every penny a good lawyer can get from them.
I'm not even sure what you mean by bandwidth; the human eyeball has about a 150-millisecond averaging time, which is why it's typically happy with 24 fps imagery, while even a cheap Epson camera can do 200 frames a second. The pedestrian was WALKING a bicycle across the road, not running and not riding, so bandwidth isn't even that relevant.
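To put numbers on that: using the ~150 ms eye averaging time from above and an assumed 3 mph walking pace (my assumption, not a figure from the thread), the motion within one integration window is small either way:

```python
# How far a walking pedestrian moves during the eye's ~150 ms averaging
# window vs one frame of a 200 fps camera. The 3 mph walking speed is
# an assumption for illustration.

walk_fps = 3 * 5280 / 3600            # 3 mph in ft/s (~4.4)

eye_window = 0.150                    # s, approximate eye averaging time
cam_frame = 1 / 200                   # s, one frame at 200 fps

print(f"Motion in 150 ms (eye): {walk_fps * eye_window * 12:.1f} in")
print(f"Motion in one 200 fps frame: {walk_fps * cam_frame * 12:.2f} in")
```

Roughly eight inches versus a quarter inch; neither is anywhere near fast enough motion for sensor bandwidth to be the limiting factor.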

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
"The more we hear about the Tempe accident, the more it sounds like it was probably the pedestrian's fault. There are many large bushes along this stretch of road, it seems likely that the sensors didn't even know the pedestrian was there until it was too late. "

The pedestrian is clearly not looking, so to that extent they could have avoided the incident, but, again, the pedestrian wasn't moving fast, and the only issue, aside from not paying attention, is that they crossed at the worst possible spot in that section of road, right past where the street light actually lights the pavement. Moreover, they were in the left-hand lane, not hiding behind bushes. The car failed miserably in a number of ways in a foreseeable use case. The pedestrian's feet are clearly visible within the car's lane at about 50 ft from the point of impact. The car should have been braking or swerving well before the impact. Had the car reacted at all in the half-second before the impact, there might be a reasonable argument, but it didn't react, even when the pedestrian was fully illuminated by the headlights.

btw, video such as what is posted on the web doesn't come close to displaying the true dynamic range of even the cheapest camera. There's almost nothing on the market that doesn't digitize at least 12 bits per color, but most video formats are 8 bits per color. Not to mention that display video uses AGC, which suppresses detail that might otherwise be clearly visible. It's certainly in Uber's financial interest NOT to show what the cameras probably did see.
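The bit-depth point in rough numbers (this treats the data as linear; gamma encoding lets 8-bit display video carry more scene range than this naive comparison suggests):

```python
import math

# Linear dynamic range by bit depth: each bit is one photographic stop,
# or about 6.02 dB. This ignores gamma encoding, which complicates the
# comparison for real display video.

def dynamic_range_db(bits: int) -> float:
    """Linear dynamic range of an n-bit quantizer, in dB."""
    return 20 * math.log10(2**bits)

for bits in (8, 12):
    print(f"{bits}-bit: {bits} stops, {dynamic_range_db(bits):.1f} dB")

print(f"Truncating 12-bit to 8-bit discards 4 stops "
      f"({dynamic_range_db(12) - dynamic_range_db(8):.1f} dB) of shadow detail")
```

Four stops is the difference between a shadow that reads as pure black and one with clearly visible detail, which is exactly where the pedestrian was.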

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
Clearly a pedestrian bridge over Mill at this location is needed.
 
drawoh said:
IRstuff,

Take your digital camera out and try shooting in limited light. Film emulsion, CCDs, and CMOSs do not have the bandwidth of a human eye. That woman was invisible to the camera until the last split second. She would not have been invisible to the driver if he had kept his head up.

That vehicle had LIDAR. It should have seen the bike and person regardless. But anything reflective like those shoes or bicycle reflectors would have been screaming at that sensor.

It most likely is a breakdown in processing and programming. Even with our limited vision relative to LIDAR, we can differentiate between a couple of mylar potato chip bags blowing across the road and a pair of tennis shoes. Good drivers even have muscle memory that automatically takes over to avoid those collisions. We can't see animals for anything at night. But those who drive amongst them know that two beady little specks of light low to the shoulder mean to focus our attention if we don't want to kill someone's pet, and that two beady specks of light at chest height mean to slow down immediately if we don't want to wind up in the body shop.
 