Behold... the new Tesla Convertible!

IF the human driver "Constantly supervises dynamic driving task executed by partial automation system", then what's the point of having it? What they're describing is a fantasy; it will never happen that way in real life. Having to be ready to take control from the computer at any time would be more tedious than just driving. It's also a lot more dangerous, even if the person is paying attention, because there will be a natural tendency to hesitate, waiting for the computer to react before the person takes action.
 
That's the definition for Level 2, specifically. Level 3 supposedly "Performs the entire DDT," but the operator still has to monitor for safe driving and be ready to take over. (The SAE J3016 ladder runs from 0, no automation, through 1 driver assistance, 2 partial, 3 conditional, and 4 high, to 5 full automation.) Level 5 is the only level that applies to driverless autos.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
HotRod10 said:
IF the human driver "Constantly supervises dynamic driving task executed by partial automation system", then what's the point of having it? What they're describing is a fantasy; it will never happen that way in real life. Having to be ready to take control from the computer at any time would be more tedious than just driving. It's also a lot more dangerous, even if the person is paying attention, because there will be a natural tendency to hesitate, waiting for the computer to react before the person takes action.
Spot on!
The nearest analogy I can think of is the old-school driver training cars with duplicated controls, including a steering wheel, on the instructor's side. And being a driving instructor with a student at the controls on the open road is a stressful job, I'm sure.

"Schiefgehen wird, was schiefgehen kann" - das Murphygesetz
 
Autopilot on a mountain road. Firstly confused by car, then comprehensively confounded by curve: YouTube.

 
Isn't that one of the roads Tesla says not to use their system on? It would be interesting to put a HUD on there to let the driver see what the car thought the plan was.
 
Several times I could feel myself tensing on that drive up the road. Blazing past that PG&E convoy was a bit unnerving. I've got a whole new resolve: I'm never letting an autopilot drive me.

BTW, there are lots of eye-trackers out there that could easily detect how connected to the road a person behind the steering wheel is. Tesla could use that to bail out of autonomous mode.

Nice eye-tracker example (YouTube)
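A minimal sketch of that bail-out logic (not any vendor's API, and the 2-second tolerance is an assumption): given a per-frame gaze-on-road flag from a hypothetical eye tracker, disengage once the gaze has been off the road too long.

```python
# Sketch of an attention watchdog; the eye-tracker interface is hypothetical.
class AttentionWatchdog:
    def __init__(self, limit_s=2.0):        # assumed tolerance before bail-out
        self.limit_s = limit_s
        self.off_since = None                # time the gaze first left the road

    def update(self, gaze_on_road, now_s):
        """Return True if autonomous mode should disengage."""
        if gaze_on_road:
            self.off_since = None
            return False
        if self.off_since is None:
            self.off_since = now_s
        return (now_s - self.off_since) > self.limit_s

# Feed it one sample per camera frame (timestamps in seconds):
wd = AttentionWatchdog()
for t, gaze in [(0.0, True), (1.0, False), (2.0, False), (3.5, False)]:
    if wd.update(gaze, t):
        print(f"t={t}s: bail out, driver not watching the road")
```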

Keith Cress
kcress -
 
hemi said:
Spot on!
The nearest analogy I can think of is the old-school driver training cars with duplicated controls, including a steering wheel, on the instructor's side. And being a driving instructor with a student at the controls on the open road is a stressful job, I'm sure.
I took driver training in high school; my parents insisted on it to reduce insurance rates. I swear the instructor was on Quaaludes. He would buckle in, wedge himself against the door, and light his pipe. All he ever said was "left" or "right" or "pull over."

Re that Tesla video on the mountain road, all I could think was how slowly the thing was going. I would be driving twice as fast and having fun at the controls. It's hard to say how much the autopilot was at fault in the crash, as the driver seems to have overcorrected after the wheels ran off the right side. Obviously NOT a place to use autopilot. Its reactions seem quite sluggish.

----------------------------------------

The Help for this program was created in Windows Help format, which depends on a feature that isn't included in this version of Windows.
 
Autopilot on a mountain road. Firstly confused by car, then comprehensively confounded by curve: YouTube.

The Tesla system mostly uses camera imaging to follow painted lines on the road. I believe it primarily uses the white line on the right side.

In the first veer-off, at the intersection, the continuing line is not visible in the camera image, but the visibility of the line on the side road is quite good. So the car begins to turn because it decided the side-road white line was the continuation of the line it was following.

As for the accident, it doesn't know what to do when the visibility of all the lines is severely restricted by the crest in the road. It likely picked up on the line turning to the left, but lost track of it enough that it couldn't figure out by how much, so it basically turned a little left as it tried to find the line again.
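A toy illustration of that failure mode (purely hypothetical, not Tesla's actual code): a tracker that simply latches onto the most visible white line will happily switch to the side road's crisply painted line when the through line fades.

```python
# Toy model: candidates are (name, visibility 0..1, heading in degrees).
# A naive tracker takes the most visible line and ignores whether its
# heading matches the route.
def pick_lane_line(candidates):
    return max(candidates, key=lambda c: c[1])

# At the intersection: the through line is worn/occluded past the junction,
# the side road's line is crisp, so the naive tracker steers off the route.
candidates = [
    ("through-road right line", 0.15, 0.0),   # barely visible
    ("side-road right line",    0.90, 35.0),  # clearly painted, wrong heading
]
name, vis, heading = pick_lane_line(candidates)
print(f"Following: {name} (heading {heading} deg)")  # -> side-road line
```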

Overall, the piss-poor performance demonstrated in that video is a clear indicator of how far the Tesla system is from full autonomy, despite Tesla touting it as almost ready for full deployment.
 
What's shown and sold is a Level 2 system, not Level 5, so it shouldn't be considered anything like a full-autonomy system. Whatever Tesla has for full autonomy has no bearing here, particularly since Musk is always in sales mode.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
In somewhat the same way that Andrew Wiles' 129-page proof of Fermat's Last Theorem actually wouldn't fit into the narrow margin of Pierre's old book: by the time they get the A.I. working well enough to provide 'full autonomy' combined with sufficient safety to stay off the Evening News, they'll realize that the existing hardware (i.e. CPU, memory) is insufficient to run the required software. And probably by about the same proof-to-margin ratio.

The good news is that the present crop of vehicles, the ones that are supposedly 'full autonomy' ready, will probably have been recycled into toasters and refrigerators by then. So the liability should be minimal, so long as they don't promise that they'll solve this problem on any defined schedule.

 
I'm not arguing the definitions of the automation levels. I'm just saying Levels 2 through 4 should not be available to consumers to be driven just anywhere. If they want to test those systems, it should be done by people doing it as a job (and considering how stressful and tedious it would be, they should be well compensated), and the vehicles should have stickers on them warning that a computer is driving.
 
I don't really see Level 2 as being problematic, per se; it's not really that different than someone taking a Ferrari on the same road and trying to do 60 mph on curves that can't really be driven at anything past 30 mph. If you intentionally violate the operating envelope, then Darwin has to be applied. The consumers shouldn't be driving just anywhere, PERIOD, regardless of what's available.

Unfortunately, we can't ask the dead drivers, "WTF were you thinking was going to happen?"

That said, Tesla, and others, are still a long way from having even a decent collision-avoidance system, much less decent lane following. Other aspects of Tesla's system are also problematic, since it seems not to care about the actual road it's travelling on; specifically, in the case of the Mountain View crash, the lane following failed, but Tesla has full GPS mapping, and the nav system surely knew that the road didn't go in the direction the lane-following algo was heading. Seems a bit ad hoc from a systems engineering perspective.
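A hedged sketch of that cross-check (threshold and inputs are invented for illustration): compare the heading the lane tracker wants against the mapped heading of the road ahead, and distrust the camera's plan when they disagree badly.

```python
# Hypothetical plausibility check between camera lane plan and map heading.
def heading_error_deg(a, b):
    """Smallest signed difference between two compass headings, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def sanity_check(lane_heading_deg, map_heading_deg, max_disagree_deg=20.0):
    if abs(heading_error_deg(lane_heading_deg, map_heading_deg)) > max_disagree_deg:
        return "distrust camera: lane plan disagrees with the mapped road"
    return "ok"

# Mountain View-style case: the tracker follows a gore-point line that
# diverges from the mapped freeway heading (numbers invented).
print(sanity_check(lane_heading_deg=55.0, map_heading_deg=40.0))  # ok
print(sanity_check(lane_heading_deg=75.0, map_heading_deg=40.0))  # distrust camera
```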

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
HotRod10 - I also feel it's completely asinine to expect an average person to pay full attention to all driving tasks while a vehicle is driving itself. Assists that give you a "nudge" while you're driving are far different from self-driving systems that might require a "nudge" from you.

IRstuff - I don't understand the point of your post. Considering Tesla's love of beta testing on their customers, I expect whatever lane following system was in use during that video is within one update of their current best system to perform that task.

VE1BLL - I would expect that too. "Sorry, we had the hardware requirements wrong and your car will never do that." Of course, given the current state of things a Tesla may never do it anyways.
 
"...it's not really that different than someone taking a Ferrari on the same road and trying to do 60 mph on curves that can't really be driven at anything past 30 mph."

It's completely different. If you want an analogy, it's like putting your 15-year-old behind the wheel on the first day with his learner's permit, putting yourself in the passenger seat, and having him drive on a narrow, winding mountain road or an urban freeway at rush hour, figuring that if something happens you can reach over and take the wheel. Just because I survived that with my dad doesn't make it a good idea.
 
Regarding the GPS - I don't think it has the resolution or refresh rate to provide valuable feedback to the lane-keeping algorithm. It could easily be used, though, to set up geofences for where the system is appropriate to operate (limited-access highways).
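As a sketch of that geofencing idea (the polygon coordinates below are invented for illustration), a simple ray-casting point-in-polygon test is all it takes to refuse engagement outside mapped limited-access highways:

```python
# Ray-casting point-in-polygon test for a geofence; coordinates are made up.
def point_in_polygon(lat, lon, polygon):
    """polygon: list of (lat, lon) vertices. Returns True if point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):          # edge spans the point's lon
            t = (lon - lon1) / (lon2 - lon1)
            if lat < lat1 + t * (lat2 - lat1):    # crossing above the point
                inside = not inside
    return inside

highway_fence = [(37.0, -122.0), (37.1, -122.0), (37.1, -121.9), (37.0, -121.9)]
print(point_in_polygon(37.05, -121.95, highway_fence))  # True: allow engagement
print(point_in_polygon(36.90, -121.95, highway_fence))  # False: refuse to engage
```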

Regarding eye tracking - Not allowing the hands to be off the steering wheel for more than 2 seconds would be a good start.

At the end of the day the biggest problem is deceptive advertising and marketing... IMO, the root of many of the problems facing society today. But I digress.
 
"Just allowing for the hands be off of the steering wheel for no fewer than 2 seconds would be a good start."

The Tesla requires a hand on the steering wheel - that doesn't seem to have helped. The guy in the incident a couple of years ago had his hand on the wheel, but was watching a video (a movie, if I remember correctly).
 
"At the end of the day the biggest problem is deceptive advertising and marketing..."

If they were honest about what was expected of the human passenger in the driver's seat, while the computer is driving, it would not be a selling point, because no one would use it.
 
"Regarding the GPS - I don't think it has the resolution or refresh rate to be able to provide valuable feedback to the algorithm."

The point about GPS is that the system should be aware of where it's going (i.e. map following as one input), and that there's no need to follow the other car that exited towards the right.

The raw GPS data is supposed to be combined with other inputs, such as previous curves and wheel speed sensors, to create a more-trustworthy fix than GPS alone. The usual Kalman filtering, etc.
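For anyone curious what that fusion looks like in miniature, here's a 1-D sketch: wheel-speed dead reckoning drives the predict step and noisy GPS fixes drive the update step. The noise variances are assumptions for illustration, not values from any production system.

```python
# Minimal scalar Kalman filter fusing GPS fixes with wheel-speed odometry.
import numpy as np

def kalman_fuse(gps_fixes, wheel_speeds, dt=1.0, q_odo=0.5, r_gps=25.0):
    """Estimate 1-D position along the road.
    q_odo: odometry process-noise variance added per step (m^2)
    r_gps: GPS measurement-noise variance (m^2)"""
    x, p = gps_fixes[0], r_gps               # initialize from the first fix
    estimates = [x]
    for z, v in zip(gps_fixes[1:], wheel_speeds):
        x, p = x + v * dt, p + q_odo         # predict: dead-reckon forward
        k = p / (p + r_gps)                  # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p  # update: blend in the GPS fix
        estimates.append(x)
    return estimates

# Vehicle at a steady 20 m/s; GPS fixes scattered around the true track.
rng = np.random.default_rng(0)
true_pos = np.arange(0.0, 101.0, 20.0)                # 0, 20, ..., 100 m
gps = true_pos + rng.normal(0.0, 5.0, true_pos.size)  # ~5 m GPS noise
print(kalman_fuse(list(gps), [20.0] * (true_pos.size - 1)))
```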

For comparison: I've seen videos indicating that some Bentley cars will downshift for an upcoming curve, based on a GPS-aided location (and a map, of course) view of the path ahead. Also, some Mercedes will provide predictive illumination into (for example) the correct side of an upcoming roundabout, again using GPS combined with some dead reckoning.

Even my (fairly old) 2008 Mercedes correctly deduced my lane choice, going left or right at a split, in a tunnel under a harbour somewhere (i.e. no GPS signal). Apparently the GPS chip itself (probably uBlox) includes inputs for the left and right wheel-speed sensors, to detect not just motion but the wheel-speed differences that imply direction choices. And the GPS chip itself provides the dead reckoning, built in. So I've read.
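The back-of-envelope version of that wheel-speed trick: differential wheel speeds imply a yaw rate, which is enough to tell the left fork from the right with no GPS signal at all. The track width below is an assumed typical value.

```python
# Yaw rate from differential wheel speeds; track width is an assumed value.
TRACK_WIDTH_M = 1.6  # assumed distance between left and right wheels

def yaw_rate_rad_s(v_left_ms, v_right_ms, track_m=TRACK_WIDTH_M):
    """Positive = turning left (right wheel travels farther)."""
    return (v_right_ms - v_left_ms) / track_m

# In the tunnel split: right wheel slightly faster -> car is bearing left.
print(yaw_rate_rad_s(19.8, 20.2))  # +0.25 rad/s: took the left fork
```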

The Tesla appears to be blissfully unaware of where it is and where it's going - at least in the immediate short term, for planning the next curve. Presumably it has enough map-smarts to know how to get where it's going. Supposedly it'll open your garage door for you as you arrive.

The best we can say is that they're not finished yet.

 
