
Self Driving Uber Fatality - Thread I


drawoh (Mechanical)

San Francisco Chronicle

As noted in the article, this was inevitable. We do not yet know the cause. It raises questions.

It is claimed that 95% of accidents are caused by driver error. Are accidents spread fairly evenly across the driver community, or are a few drivers responsible for most accidents? If the latter is true, it creates the possibility that there is a large group of human drivers who are better than a robot can ever be. If you see a pedestrian or cyclist moving erratically along the side of your road, do you slow to pass them? I am very cautious when I pass a stopped bus because I cannot see what is going on in front. We can see patterns, and anticipate outcomes.

Are we all going to have to be taught how to behave when approached by a robot car? Bright clothing at night helps human drivers. Perhaps tiny retro-reflectors sewn to our clothing will help robot LiDARs see us. Can we add codes to erratic, unpredictable things like children and pets? Pedestrians and bicycles eliminate any possibility that the robots can operate on their own right of way.

Who is responsible if the robot car you are in causes a serious accident? If the robot car manufacturer is responsible, you will not be permitted to own or maintain the car. This is a very different ecosystem from what we have now, which is not necessarily a bad thing. Personal automobiles spend about 95% of their time (quick guesstimate on my part) parked. This is not a good use of thousands of dollars of capital.

--
JHG
 

"Clearly a pedestrian bridge over Mill at this location is needed."

So long as it's not the one in Florida.

As an example of what HDR can do right now: the first image is comparable to what's in the video, but the camera probably saw something like the second image, and this is without headlights. Note that the second image specifically remaps the dynamic range of the HDR into a standard display dynamic range. I think the Uber's cameras should have seen something like this second image, and not what's in the video on YouTube.
[Image: St. Louis Arch, single exposure (EV -1.82)]

[Image: St. Louis Arch, multi-exposure HDR tone-mapped to standard display dynamic range]
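For anyone curious how that second image is produced, here's a minimal exposure-fusion sketch using OpenCV's stock HDR tools. The filenames are placeholders, and this is only an illustration of remapping dynamic range into display range, not anything to do with Uber's actual pipeline:

[code]
# Exposure-fusion sketch using OpenCV's stock HDR tools.
# Filenames are placeholders; this is not the Uber pipeline.
import cv2

# A bracketed set of exposures of the same scene, darkest to brightest
exposures = [cv2.imread(f) for f in ("ev_minus2.jpg", "ev_0.jpg", "ev_plus2.jpg")]

# Mertens exposure fusion merges the bracket straight into a displayable image,
# i.e. it remaps the scene's wide dynamic range into standard display range
mertens = cv2.createMergeMertens()
fused = mertens.process(exposures)                     # float image, roughly 0..1
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
[/code]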


TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
I just saw the dash cam video from the Uber car- I retract my previous statement. The car was in the right lane, and the pedestrian was traveling left to right. She did not come out from behind bushes.
I now agree that the lidar should have seen her.
 
Pedestrian behavior is decidedly irrational. I have to pass a church every day on my way home. It's a large church, and they must be bad sinners because they have to attend multiple nights a week. Their parking is on the other side of the street from the church. There are two crosswalks at a traffic light, complete with push-to-walk buttons and a complete stoppage of traffic for crossing, that are almost completely ignored. There are two more crosswalks in the middle of the block with signs in the middle of the road and a cop directing traffic. Nevertheless, the vast majority choose to jaywalk. They step out of clusters of pedestrians on the sidewalk or from between parked cars and just head into the middle of the lanes, expecting divine protection from the Almighty to keep them safe.

----------------------------------------

The Help for this program was created in Windows Help format, which depends on a feature that isn't included in this version of Windows.
 
I'm actually surprised there isn't an infrared camera incorporated into their sensors. It would seem that it could easily augment the visible-light sensors and associated programming, and it would have the benefit of cutting through fog and darkness. Finally, it would highlight cars, deer, and people, allowing easy recognition by the computer.

Professional Engineer (ME, NH, MA) Structural Engineer (IL)
American Concrete Industries
 
Uber uses LIDAR and radar sensors that don't depend on visible light. It's most likely that the suite of sensors did detect the pedestrian and bicycle (they should have been able to), or at least detected "something there", but chose to ignore them.
 
JHG mentioned, "Film emulsion, CCDs, and CMOSs do not have the bandwidth of a human eye."

The issue here might be the maximum effective contrast ratio.

The human eye is typically much better than cameras to start with, *and* it can also dart about and quickly peek into the shadows. In the real world, I sometimes hold my hand up to block an overly bright street lamp so I can see into a dark area better.
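To put rough numbers on "effective contrast ratio": each photographic stop is a doubling of luminance, so converting a ratio to stops is just a base-2 log. A quick sketch, using illustrative round-number ratios rather than measured values:

[code]
# Convert a contrast ratio to photographic stops: stops = log2(ratio).
# The ratios below are illustrative round numbers, not measurements.
from math import log2

examples = [
    ("typical video camera, single exposure", 1_000),
    ("human eye, single fixation",            10_000),
    ("human eye, darting/adapting",           1_000_000),
]
for label, ratio in examples:
    print(f"{label}: {ratio:,}:1 = about {log2(ratio):.0f} stops")
[/code]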

The regulators may have to impose some basic Vision Tests on new self-driving vehicles.

Investigators should consider this when reviewing the video. They might need to ask the next questions:
[ul]
[li]How come your cameras couldn't see the pedestrian?[/li]
[li]Who specified the inappropriate cameras?[/li]
[li]Who is your System Safety Engineer?[/li]
[/ul]

edit: But if "Uber uses LIDAR and radar sensors...", then it may not be a primary issue.
 
I think a vision test is a good idea for a self-driving car, as one is required of human drivers.

I have to ask, would this type of car drive at full speed in fog? Most sane humans would not drive full speed if their vision was impaired.

Also, are there any tests of these cars in areas where the roads may not be in the best condition?

It appears the driving might only be in good conditions, so as to improve reliability numbers.
 
The vision test for drivers is more about being able to read signs, as people are often distracted trying to figure out signs. The lighting is a slightly dim office ambient, which in no way resembles the nighttime ambient of the Uber accident. There are lots of people with odd vision artifacts at night, which aren't tested by the DMV. I don't recall if they even still do the depth perception test.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
IRstuff,

Another problem with LiDAR is that when there are a lot of them, they will be seeing each other's signals. This would not have been a problem with this accident on a fairly lonely road, but imagine moving through a downtown intersection.

--
JHG
 
Re drawoh's video link: Interesting footage of the safety driver, or whatever they're called...

The problem with sloppy work is that the supply FAR EXCEEDS the demand
 
I'll never understand the appeal of those things. I don't even like to have my gears shifted for me.
 
Archie264, I kind of look at them the way I do buses. A great thing for other people :)

The problem with sloppy work is that the supply FAR EXCEEDS the demand
 
With my deteriorating vision and limbs I'm actually hoping they get these things working by the time I'm no longer able to drive myself. They've got about a dozen years (I hope).

----------------------------------------

The Help for this program was created in Windows Help format, which depends on a feature that isn't included in this version of Windows.
 
"Basic vision tests" will result in the programmers gaming the test. Designs are tweaked to pass specific tests all the time ... sometimes legally/legitimately, sometimes not.

I have a feeling, but nothing more, that the issue here wasn't whether the pedestrian and bicycle were detected, but that the underlying logic chose to ignore them.

If there is a pothole, or a bump, or a painted road marking, or a small piece of debris, lying on the road directly in front of the car, you don't want the self-driving logic to slam on the brakes or take drastic evasive action.

Somehow the system has to distinguish between something it needs to avoid, and something it can safely ignore and drive over or through.

Select wrongly ... and a situation like this one happens.
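A toy sketch of the kind of "avoid vs. ignore" gate being described; the class names, thresholds, and structure here are entirely hypothetical and are not Uber's software:

[code]
# Toy "avoid vs. ignore" gate for tracked objects. Every class name and
# threshold here is hypothetical, for discussion only.
from dataclasses import dataclass

IGNORABLE = {"pothole", "road_marking", "small_debris", "plastic_bag", "roadside_vegetation"}

@dataclass
class Track:
    label: str          # classifier output, e.g. "pedestrian" or "roadside_vegetation"
    confidence: float   # classification confidence, 0..1
    in_path: bool       # object (or its predicted path) intersects the vehicle's path

def requires_action(t: Track) -> bool:
    if not t.in_path:
        return False    # off to the side: keep tracking, don't brake
    if t.label in IGNORABLE and t.confidence > 0.9:
        return False    # "confidently harmless": drive over or past it
    return True         # everything else gets braking / evasive planning

# The failure mode described above: a pedestrian confidently mislabelled as
# roadside vegetation is waved through even though she is in the lane.
print(requires_action(Track("roadside_vegetation", 0.95, in_path=True)))  # False
[/code]

A safer gate would also consider motion toward the lane before dismissing anything, rather than trusting a single static classification.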
 
VE1BLL,

Thanks. That was the term I was looking for.

--
JHG
 
She was probably classified as an inanimate fixed object, like a bush. Inanimate fixed objects stay on the side of the road and don't end up in the vehicle lane so they're not a threat. Sure, it was an oddly moving bush, but still a bush is nothing to be concerned about.

IRstuff - I didn't say you were blaming the sensors. I was just making a general observation that any excuse about her not being detectable is complete BS. You and I are both on the same page believing this was a complete and utter fail for the AI system.

As I already pointed out, the distance where she becomes visible is much closer than the distance I can see when behind the wheel. So the camera that was filming that video definitely had a contrast issue and did not show what a human could see. It very much works in Uber's favor, at least for the people who are clueless about the capabilities of the camera used to film that video and/or the capabilities of the sensor package being used by the AI driving the car.
 
As I said before, I don't believe that the video that was posted is a reasonable rendering of the actual video data that resides in the car. The actual navigation video is probably as damning as the lidar and radar data.

The issue with the bush theory is that she's in the car's lane for at least one second (from the time the feet are visible to the time of impact), and there are no warnings, no detections, and no braking.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 
Regarding the bush theory, not only is she in the travel lane in which she was hit for nearly 2 seconds (4 steps), she had crossed some 35+' of paved travel lanes to get there. Kind of ironic that there is a "BEGIN RIGHT TURN LANE - YIELD TO BIKES" sign right there.

Here is the link to where she was hit.
She had the potential to be seen for over 300' from where the Volvo came out from under the overpass. The second picture is the streetview from that vantage point.

[Screenshot: Google Maps aerial view of the crash location on N Mill Ave]

[Screenshot: Google Street View from under the overpass, looking toward the crash location]
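For a sense of scale, a quick back-of-envelope on that 300' sight line; the vehicle speed below is an assumption for illustration, not something established in this thread:

[code]
# Back-of-envelope reaction time over a 300 ft sight line.
# The speed is an assumed figure for illustration only.
sight_line_ft = 300.0
assumed_speed_mph = 40.0                        # assumption, not from the thread

speed_fps = assumed_speed_mph * 5280 / 3600     # mph -> ft/s, about 58.7 ft/s
print(f"{sight_line_ft / speed_fps:.1f} s from first possible sighting to impact")  # about 5.1 s
[/code]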
 
drawoh said:
Another problem with LiDAR is that when there are a lot of them, they will be seeing each other's signals.

Oh, will they end up scrambling each other's vision? Like everyone shining torches in each other's faces?
 