Tesla "autopilot" disengages shortly before impact 9

MartinLe (Civil/Environmental)

"On Thursday, NHTSA said it had discovered in 16 separate instances when this occurred that Autopilot “aborted vehicle control less than one second prior to the first impact,” suggesting the driver was not prepared to assume full control over the vehicle."

Were I a legislating body, I would very much demand that safety-critical software be as transparent as any other safety feature, which would limit "AI"/machine-learning applications in these roles.
 
GPS is pretty good, but its accuracy can vary by 5 to 10 m on occasion if it loses a satellite or two or runs through some heavily wooded areas.
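For what it's worth, this is why navigation stacks typically gate GNSS fixes before trusting them. A minimal Python sketch, assuming a receiver that reports an estimated horizontal accuracy and satellite count; the `GnssFix` fields and thresholds are illustrative assumptions, not any particular vendor's API:

```python
from dataclasses import dataclass

@dataclass
class GnssFix:
    lat: float
    lon: float
    h_acc_m: float   # receiver-reported horizontal accuracy estimate, metres
    num_sats: int    # satellites used in the solution

def usable_for_lane_level(fix: GnssFix,
                          max_h_acc_m: float = 3.0,
                          min_sats: int = 6) -> bool:
    """Reject fixes too coarse for lane-level use.

    5-10 m of error spans several lane widths, so a stack that trusted
    such a fix could place the car in the wrong lane entirely.
    """
    return fix.h_acc_m <= max_h_acc_m and fix.num_sats >= min_sats

# A fix degraded by tree cover fails the gate and is demoted to
# route-level guidance only:
print(usable_for_lane_level(GnssFix(37.0, -122.0, h_acc_m=8.5, num_sats=5)))  # False
```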

Sure, that's the subject of a wholly separate rant ;-) Google Maps likewise does not care where you were a second ago, or how you could have possibly slipped from going south to going east on a road that doesn't allow for that sort of movement; I always find it amusing watching it go through gyrations of "Rerouting" because it decided I was on a different road, due to my traveling on the carpool overpass above that road. Most of the time, it works OK, but obviously it's not really "navigating"; it's only instantaneously keeping you on the path to a destination, which isn't quite the same thing. And, it's obviously "letting" you do the actual driving.
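That "doesn't care where you were a second ago" behaviour is essentially memoryless map matching. A toy Python sketch of the difference, with made-up road segments and penalty weights (a real matcher, e.g. an HMM over the road graph, would also check segment connectivity):

```python
# Toy road candidates near a raw GPS fix:
# (name, heading_deg, distance_from_fix_m)
candidates = [
    ("surface road (eastbound)", 90.0, 4.0),        # the road under the overpass
    ("carpool overpass (southbound)", 180.0, 6.0),  # where the car actually is
]

def memoryless_snap(cands):
    """Snap to whichever road is nearest the raw fix, ignoring history.
    A slightly closer road wins even if you could not possibly have
    gotten onto it from where you were a second ago."""
    return min(cands, key=lambda c: c[2])

def history_aware_snap(cands, prev_heading_deg,
                       dist_weight=1.0, heading_weight=0.2):
    """Score candidates by distance plus a heading-continuity penalty."""
    def cost(c):
        _, heading, dist = c
        dh = abs((heading - prev_heading_deg + 180.0) % 360.0 - 180.0)
        return dist_weight * dist + heading_weight * dh
    return min(cands, key=cost)

print(memoryless_snap(candidates)[0])                             # eastbound road: "Rerouting..."
print(history_aware_snap(candidates, prev_heading_deg=180.0)[0])  # overpass
```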

Nevertheless, since the Tesla system is ostensibly some sort of overwatching and overarching function, it SHOULD do full-blown navigation; otherwise, why the extra cost for the electronics and software?

Another nitnoid is the annoying artifacts on certain (most?) roads: seeming forks that Google Maps insists exist in the middle of a straight-line section of the road. Pity the poor souls who have to stay left at the "fork" and then immediately take a right-side exit.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
Not all GPS systems are equal.

Maybe that's why Teslas using Galileo are not having the same issues.
 
One of the big issues with self-driving is anything more than cursory reliance on maps and global positioning to influence driving operation. Navigation and driving have separate requirements and are only slightly interrelated. Roads move, and maps don't always get updated. Conditions change due to roadworks, accidents, weather, etc. Knowing what road the map says you are on should have only very minor influence on the choices made regarding speed and obstacle avoidance. Any automated system that is heavily dependent on global positioning for driving is unlikely to be broadly successful.
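One hedged way to picture "only very minor influence" is a fusion step that hard-caps the weight of any map/GNSS-derived estimate in lane-level control. A sketch with all numbers purely illustrative:

```python
def fuse_lane_offset(camera_offset_m, camera_conf,
                     map_offset_m, map_conf,
                     map_weight_cap=0.2):
    """Blend a camera-measured lane offset with a map/GNSS-derived prior.

    The map's weight is hard-capped regardless of its self-reported
    confidence, so a stale or wrong map can nudge but never dominate
    lane-level control.
    """
    w_cam = camera_conf
    w_map = min(map_conf, map_weight_cap)
    total = w_cam + w_map
    if total == 0:
        raise ValueError("no usable lane estimate; hand control back to driver")
    return (camera_offset_m * w_cam + map_offset_m * w_map) / total

# Even a "confident" but outdated map barely moves the estimate:
print(fuse_lane_offset(camera_offset_m=0.1, camera_conf=0.9,
                       map_offset_m=1.5, map_conf=0.95))  # ~0.35, not 1.5
```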

Again, it should be noted that several other cars on the road have abilities that rival Tesla's, but they don't seem to have the same issues, likely because they impose stronger requirements for driver engagement.

Waymo and Cruise seem to be the leaders in automobile automation, with Cruise taking passengers and operating without drivers on public roads, though speed-limited and geofenced.
 
Any automated system that is heavily dependent on global positioning for driving is unlikely to be broadly successful.

That is probably the philosophy that leads Teslas to crash into gore-point guard rails or slide into semi trucks. Having information that could prevent an accident and ignoring it seems to me a cardinal sin.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
I highly doubt Tesla is using GPS to decide where the car should be on the road. For navigation, as in deciding what route to travel, sure. For positioning, no.

As mentioned already, if you look at the highway, the solid white line shifts right as the new exit lane starts. The car was simply following this solid line and drove off at the exit. The same thing has happened in a number of other incidents with Tesla Autopilot, and has been reported many times by owners who have found spots on their commute highways where the car tries to exit or change lanes when it shouldn't.

Looking at the street view, I'm pretty sure most any newer car could exit into that parking lot at 70 mph without much issue.

The part I still don't get is how it ended up turned into the truck. I'm thinking it saw the angled parking lines and tried to turn left into what it thought was the continuation of the driving lanes. From the picture, I think it's under the truck at an angle, not straight on.

Remember, it is mostly following the lines on the road. You can find lots of videos with it acting stupid trying to follow various lines on the roads.
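A toy sketch of why pure line-following goes wrong at exits: if the controller simply tracks the strongest detected line, a solid edge line peeling off toward a gore point beats a worn dashed divider. The detection data here is invented for illustration, not taken from any real system:

```python
def pick_reference_line(lines):
    """Naively track the most confidently detected line.

    A pure line-follower has no notion of "this solid line is peeling
    off toward an exit"; it just holds a fixed offset from whichever
    line it locked onto.
    """
    return max(lines, key=lambda l: l[1])

# Approaching an exit gore: the solid edge line bends right toward the
# exit but is still the strongest detection, so the car follows it off
# the highway. (label, confidence, lateral_offset_m, heading_delta_deg)
lines = [
    ("solid white edge line", 0.95, 1.8, +4.0),  # diverging toward the exit
    ("dashed lane divider",   0.60, -1.7, 0.0),  # straight ahead, worn paint
]
label, conf, offset, heading = pick_reference_line(lines)
print(f"tracking: {label}, steering {heading:+.1f} deg to hold offset")
```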

Also remember, Tesla is trying to do the autopilot using cameras only. Musk has often spoken out against systems using other types of sensors. Cameras are only as good as the image processing: get the object detection wrong, and the car doesn't even know there is an object in front of it. Case in point: a Tesla attempting to drive under a transport trailer a few years ago. With something like radar you at least get a return saying something is out there blocking the path of the car. I understand the radar return has to be processed too, but that can be done outside of the "AI" engine. Of course, calling the electronics operating these systems AI is as big a misnomer as calling the Tesla driving system autopilot.
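To make that radar point concrete, here is a minimal sketch of a cross-check that lives outside any learned model: even if the vision stack reports nothing, a radar range return closing fast enough can trigger braking via simple time-to-collision arithmetic. The thresholds and the crude stationary-obstacle assumption are illustrative only:

```python
def emergency_brake_needed(camera_objects, radar_range_m,
                           ego_speed_mps, min_ttc_s=2.0):
    """Cross-check vision against a raw radar range return.

    Even when the vision stack classifies nothing ahead (camera_objects
    is empty), a return closing fast enough trips a brake request via
    plain time-to-collision logic, outside any learned model.
    """
    if radar_range_m is None or ego_speed_mps <= 0:
        return False
    ttc_s = radar_range_m / ego_speed_mps  # crude: assumes a stationary obstacle
    return ttc_s < min_ttc_s

# Camera sees nothing (say, a white trailer against a bright sky) but
# radar reports a return 40 m ahead at highway speed:
print(emergency_brake_needed(camera_objects=[], radar_range_m=40.0,
                             ego_speed_mps=27.0))  # True, TTC ~ 1.5 s
```

To be fair, real systems also have to reject returns from overhead signs and bridge decks, which is part of why raw radar alone isn't sufficient either.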
 
I really don't understand Tesla's / Musk's issue with radar or lidar. It's difficult to see what objection they have when it gives you a different input into the decision-making ("this is a solid object, don't drive into it"), versus software deciding it can be ignored or failing to join the dots.

My point about the GPS was that the driving software would only realise it was on the "wrong road" once it had left the main carriageway and then would try to plot a route back to the main road.

Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
I really don't understand Tesla's / Musk's issue with radar or lidar. It's difficult to see what objection they have when it gives you a different input into the decision-making ("this is a solid object, don't drive into it"), versus software deciding it can be ignored or failing to join the dots.

I think the main objection has always been the cost of the lidar, since Tesla does use radar, but Musk hides that by claiming he wants only what a human eyeball/brain would have seen.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
These things really should come with ejection seats…

"If you don't have time to do the job right the first time, when are you going to find time to repair it?"
 
Great until they go off in a tunnel, parking structure, or under a bridge...

Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
...or actually any time you don't desperately need them. Horrible things.

And don't get me started on the HFI nightmare of having a stripy handle between your legs when you're a bit nervous, know you have to keep your hands off the flying controls, and really have no other place to put them.

A.
 
You'd just get ejected into the bottom of the truck trailer as the car goes under it, ensuring you're dead.
 
The current beta software from Tesla is called "Full Self-Driving", so that isn't designed to mislead the public at all when it is only a Level 2 driver assist. [ponder]

Good video here.


Some discussion and a video here, no doubt plenty of other videos you can see too.

Here you can see the Tesla start to stop the car broadside in front of 50 mph traffic because its detection software has falsely detected a box truck magically appearing in space. (You can watch the screen to see what it is detecting.)

The driver didn't give the Tesla time to try to correct its mistake and took over. Who knows how that would have ended up otherwise.
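A standard mitigation for that kind of single-frame phantom is to require a detection to persist before initiating a hard manoeuvre. A minimal sketch; the window and threshold values are illustrative assumptions, and the trade-off is added reaction latency for real obstacles:

```python
from collections import deque

class DetectionDebouncer:
    """Require a detection to persist across frames before acting on it.

    A single-frame false positive (the box truck "magically appearing")
    never reaches the hit threshold, while a real stopped vehicle,
    detected frame after frame, does.
    """
    def __init__(self, window=5, required_hits=4):
        self.history = deque(maxlen=window)
        self.required_hits = required_hits

    def update(self, detected_this_frame: bool) -> bool:
        self.history.append(detected_this_frame)
        return sum(self.history) >= self.required_hits

deb = DetectionDebouncer()
frames = [False, False, True, False, False, False]  # one-frame phantom
print([deb.update(f) for f in frames])  # all False: no hard stop triggered
```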
 
LOL, that's impressive. I see it consistently does two rather bad driving things.
 
We need these videos to make a whole new category of YT entertainment: lithium combustion, auto-driving hits and misses, and (my personal favourite) seats ejecting in tunnels.

"If you don't have time to do the job right the first time, when are you going to find time to repair it?"
 
Did the programmers never attend driver ed? My instructor would have given a failing grade.
 
I love the quote in one of those links: "...but the decision-making is still the equivalent of a 14-year-old who has been learning to drive for the last week and sometimes appears to consume hard drugs..."

Looks about right to me. Think of the AI bit like a new learner driver and you're not far off, I think. I wonder if the technology levels could use some of these analogies...

Remember - More details = better answers
Also: If you get a response it's polite to respond to it.
 
Did the programmers never attend driver ed?

You're assuming the software developers have a driver's license and that it was issued by a government with "rules of the road" similar to ours. Many would not fulfill one or both criteria.
 