
Behold... the new Tesla Convertible!

Status: Not open for further replies.
Note also that the image only has the correct perspective from one position on the road; if it were an Uber automated car, it might not even worry about it until it was too late anyway.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert!
 
I'm not sure that teaching drivers to ignore apparent children on the road is necessarily a good move.

Cheers

Greg Locock


New here? Try reading these, they might help.
 
Yeah, it's a case of crying "wolf" too many times; eventually, drivers will catch on and ignore it, or they'll go down some other street and mistake a real child for a fake one. People should know better.

 
The single-point perspective image only looks like a child from a single point above the pavement, and will stop looking like a child immediately past that point. Successive images should be able to determine that the image is flat on the ground. Depending on the speed of the vehicle, there may not even be a braking event.
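The geometry behind that point can be sketched with a toy calculation (all dimensions here are invented for illustration): as the car approaches, the apparent size of a real upright child scales roughly as 1/distance, while the vertical extent of a flat image painted on the road foreshortens differently, so comparing two successive frames is enough to tell them apart.

```python
import math

CAMERA_HEIGHT = 1.2   # assumed camera height above the road, metres
CHILD_HEIGHT = 1.0    # assumed height of a real standing child, metres
DECAL_LENGTH = 4.0    # assumed length of the flat painted image, metres

def apparent_height_upright(d):
    """Vertical angle (radians) subtended by an upright object at distance d."""
    return math.atan2(CHILD_HEIGHT, d)

def apparent_height_flat(d):
    """Vertical angle subtended by a flat decal lying on the road,
    spanning distances d to d + DECAL_LENGTH from the camera."""
    near = math.atan2(CAMERA_HEIGHT, d)
    far = math.atan2(CAMERA_HEIGHT, d + DECAL_LENGTH)
    return near - far

# Closing from 40 m to 20 m, the upright object roughly doubles in
# apparent size, while the flat decal grows noticeably faster.
ratio_upright = apparent_height_upright(20) / apparent_height_upright(40)
ratio_flat = apparent_height_flat(20) / apparent_height_flat(40)
print(round(ratio_upright, 2), round(ratio_flat, 2))
```

The mismatch between the two growth rates is exactly the cue a multi-frame perception system could use to classify the "child" as flat to the ground.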
 
As I said before in an earlier thread on this subject, a self driving car MUST be able to successfully navigate the environment. Not 95% or 98% of the environment, but all of it, or it's a danger to its passengers and everyone else on the road. If the car is doing the driving, you cannot expect the human passenger to be ready to take over if it suddenly becomes necessary, whether it's called "Autopilot" or something else.

I'm all for driver assistance features, but self-driving features, I believe, will cause more dangers than they alleviate. Autonomous emergency braking - awesome! Lane departure warnings - great! Lane keeping assist - I'm leery. I probably won't ever get into a vehicle that is capable of steering itself, even if it's supposedly only to parallel park.
 
Yes, I believe Tesla is approaching this backwards. The navigation and steering on its own should only be added when the collision prevention logic - and that includes the "defensive driving" aspects that seemingly no one has thought of - is virtually bulletproof.

I have no conceptual objection to forward collision mitigation. I'd love to see systems that force drivers to use their turn signals. I wouldn't object to automated stop sign and red light stopping systems, provided they take into account what's behind the vehicle (don't slam on the brakes if the vehicle behind isn't going to be able to stop). And motorcyclists everywhere, myself included, would love to see automated systems that hold the car stopped if the driver attempts to start moving ahead (or turning left) into the path of an approaching vehicle.
 
And, according to Musk, they are going to run out of money in 10 months unless they economize.

Mike McCann, PE, SE (WA)


 
As indicated in the article cited by the OP, the Tesla Autopilot is an SAE Level 2 system, which requires the driver to be responsible for object and event detection and response (OEDR), not the car.

 
Yup. The problem is that the interface with the driver leads the driver to expect it to do things that it cannot reliably and consistently do.
 
Sixty years after being issued a driver's licence, just leave my steering wheel alone and let me be the brains behind the "safe driving" system - it has served me well throughout this time.

It is a capital mistake to theorise before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. (Sherlock Holmes - A Scandal in Bohemia.)
 
"...the Tesla Autopilot is an SAE Level 2 system, which requires the driver to be responsible for object and event detection and response (OEDR), not the car."

I think they've revised the definitions since the last time I saw them. Active steering was considered Level 3 automation last time I looked. Regardless, they can say what they want, but once the car is doing the driving, it needs to be able to do the driving. If you expect the average person to remain alert and ready to take over at any moment for a car that is performing all the normal driving tasks, then you're going to be disastrously disappointed far too often.
 
I'm with some of the others: I am not a fan of self-driving cars nor any other "automation" of the driving task. There are too many idiots not paying attention, causing accidents even with a mostly aware and nominally intelligent (not artificially intelligent) set of drivers at the wheel. We need to make drivers sit up and pay attention - some kind of automated dope slap, or a detector that disables the ignition if your phone is not placed in the glovebox.

A better plan for AI in cars would be in a monitoring role, using it to start actively limiting drivers. I.e., if the car you are driving detects that you are not paying attention to the road and are causing near misses, it should pull over and call the police, who will then take your license away (burn it in front of you) and have your car towed. Given that us oldsters with clean driving records will be the only ones left who can drive (everybody else having forfeited their rights by trying to Facebook at the wheel), and all the millennials (who aren't buying cars anyway) will then need rides, we will be able to absolutely OWN the car-for-hire thing, and charge exorbitant fees for our services.

I have a newish sports car, with some lane departure warning and blind spot monitoring features. These are sort of meh, to me. The lane departure cameras get spoofed by lane markers that were imperfectly removed by road crews during construction. When I make a left turn through a gap in traffic, the side monitoring camera beeps because it sees the distant car I'm turning ahead of. The navigation system always wants to put me on highway routes, even directing U-turns, even when the through road I'm on is known (by me, by reading maps that were printed on paper) to be shorter, and usually less trafficky, since everybody else's nav systems are directing them to the crowded highway.
 
I saw the same, IRstuff. I haven't found the 2014 version of that chart yet, but I believe it was the one posted in an older thread on the same subject. I think I remember that Level 2 in the older version didn't include the "lateral...motion control", i.e. automated steering.
 
That's only Level 2 out of 5 levels of automation too (6 if you count Level 0 as no automation).

Tesla's fatal flaw is to let the marketing wank ("Autopilot") get in the way of engineering this system for safety (of both the Tesla drivers and everyone else on the road). There are simple ways they could monitor the driver's attention to ensure that their BASIC safety features are not being abused. Compared to actually driving the car, that stuff is easy. Are the hands on the steering wheel? Are the eyes watching the road? Does this geofence indicate that this is a controlled-access highway where the system is appropriate to use?
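Those three checks amount to a simple gate before the system is allowed to engage. A minimal sketch, with all names invented for illustration (this is not Tesla's actual logic):

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    hands_on_wheel: bool
    eyes_on_road: bool
    on_controlled_access_highway: bool  # e.g. from a geofence lookup

def autopilot_may_engage(state: DriverState) -> bool:
    """Allow engagement only when every basic attention check passes."""
    return (state.hands_on_wheel
            and state.eyes_on_road
            and state.on_controlled_access_highway)

print(autopilot_may_engage(DriverState(True, True, True)))   # True
print(autopilot_may_engage(DriverState(True, False, True)))  # False
```

The point is that this gating logic is trivial next to the driving task itself; choosing not to enforce it is a product decision, not a technical limitation.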

Instead they say things like:
[ul]
[li]"Full Self-Driving Hardware on All Cars”[/li]
[li]“All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”[/li]
[li]"THE PERSON IN THE DRIVER’S SEAT IS ONLY THERE FOR LEGAL REASONS. HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF.”[/li]
[li]"Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot”[/li]
[/ul]

 
HotRod10, I believe you were referring to this which I posted the last time this happened:
Because no two automated-driving technologies are exactly alike, SAE International’s standard J3016 defines six levels of automation for automakers, suppliers, and policymakers to use to classify a system’s sophistication. The pivotal change occurs between Levels 2 and 3, when responsibility for monitoring the driving environment shifts from the driver to the system.

Level 0 _ No Automation
System capability: None. • Driver involvement: The human at the wheel steers, brakes, accelerates, and negotiates traffic. • Examples: A 1967 Porsche 911, a 2018 Kia Rio.

Level 1 _ Driver Assistance
System capability: Under certain conditions, the car controls either the steering or the vehicle speed, but not both simultaneously. • Driver involvement: The driver performs all other aspects of driving and has full responsibility for monitoring the road and taking over if the assistance system fails to act appropriately. • Example: Adaptive cruise control.

Level 2 _ Partial Automation
System capability: The car can steer, accelerate, and brake in certain circumstances. • Driver involvement: Tactical maneuvers such as responding to traffic signals or changing lanes largely fall to the driver, as does scanning for hazards. The driver may have to keep a hand on the wheel as a proxy for paying attention. • Examples: Audi Traffic Jam Assist, Cadillac Super Cruise, Mercedes-Benz Driver Assistance Systems, Tesla Autopilot, Volvo Pilot Assist.

Level 3 _ Conditional Automation
System capability: In the right conditions, the car can manage most aspects of driving, including monitoring the environment. The system prompts the driver to intervene when it encounters a scenario it can’t navigate. • Driver involvement: The driver must be available to take over at any time. • Example: Audi Traffic Jam Pilot.

Level 4 _ High Automation
System capability: The car can operate without human input or oversight but only under select conditions defined by factors such as road type or geographic area. • Driver involvement: In a shared car restricted to a defined area, there may not be any. But in a privately owned Level 4 car, the driver might manage all driving duties on surface streets then become a passenger as the car enters a highway. • Example: Google’s now-defunct Firefly pod-car prototype, which had neither pedals nor a steering wheel and was restricted to a top speed of 25 mph.

Level 5 _ Full Automation
System capability: The driverless car can operate on any road and in any conditions a human driver could negotiate. • Driver involvement: Entering a destination. • Example: None yet, but Waymo—formerly Google’s driverless-car project—is now using a fleet of 600 Chrysler Pacifica hybrids to develop its Level 5 tech for production.

It's a Level 2 vehicle but they market it like it's a Level 4 or 5.
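The pivotal shift in the taxonomy above - monitoring responsibility moving from driver to system between Levels 2 and 3 - can be captured as a tiny lookup (a sketch of the J3016 summary quoted here, not of the standard's full text):

```python
# Who is responsible for monitoring the driving environment at each
# SAE J3016 level, per the Car and Driver summary quoted above.
MONITORING_RESPONSIBILITY = {
    0: "driver",  # no automation
    1: "driver",  # driver assistance
    2: "driver",  # partial automation - driver still scans for hazards
    3: "system",  # conditional automation - system monitors environment
    4: "system",  # high automation
    5: "system",  # full automation
}

def who_monitors(level: int) -> str:
    return MONITORING_RESPONSIBILITY[level]

print(who_monitors(2), "->", who_monitors(3))  # driver -> system
```

Autopilot sits on the "driver" side of that boundary, which is exactly the point being argued: the marketing implies a level the system does not occupy.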
 
I found the older version quoted and linked (Car and Driver's summary, anyway) in the "Self Driving Uber Fatality - Thread II":


By the new definition, I'll include Level 2 in with Automation Levels 3 and 4 as being far more dangerous than nothing, in my opinion, except in a controlled environment (roadways with controlled access for vehicles and no access for pedestrians, cyclists, animals, etc.). I don't believe Level 5 can be successfully achieved without some type of Terminator/Skynet scenario taking place.

Maybe I misunderstand what they meant by it can "...steer...in certain circumstances." I took that to mean something like lane keeping assist, which bumps you over and warns you if you cross a lane line, not something that is actively steering the vehicle as a normal function.

Please don't misunderstand any of that to say I'm arguing against all automation. I'm all for adaptive cruise control, lane departure warnings, cross traffic warnings, blind spot warnings, automatic emergency braking, etc. that assist the driver. I think those are very helpful.

I was not pleased to find out that Google, Uber, etc. were surreptitiously putting autonomous vehicles out on the road without so much as a sticker that warns other drivers that a computer is driving the car. Cars driven by student drivers are required to have warning stickers; why not an experimental computer driver?
 
Yep, that was it, Spartan5. I don't know whether the technical definition has changed with the revised definitions, but it certainly says something different to me than it did.
 
The first issuance of J3016 was 2014, and Level 2 states: " Executes longitudinal (accelerating, braking) and lateral (steering) dynamic driving task when activated • Can deactivate immediately with request for immediate takeover by the human driver" which is clarified in the 2016 release with, "Lateral vehicle motion control includes the detection of the vehicle positioning relative to lane boundaries and application of steering and/or differential braking inputs to maintain appropriate lateral positioning."
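The "lateral vehicle motion control" clause quoted above - detecting position relative to lane boundaries and applying steering inputs to hold lateral position - reduces, in its simplest form, to a feedback loop. A toy sketch (the gain and time step are assumed values, not anything from the standard):

```python
K_P = 0.5   # assumed proportional gain, 1/s
DT = 0.1    # assumed control time step, s

def simulate(offset_m: float, steps: int = 100) -> float:
    """Very simplified lane centering: the steering command is taken to
    directly set the lateral drift rate back toward the lane centre."""
    for _ in range(steps):
        lateral_rate = -K_P * offset_m  # steer proportionally to the error
        offset_m += lateral_rate * DT
    return offset_m

# Starting 1 m off centre, the offset decays toward zero over 10 s.
print(round(simulate(1.0), 3))
```

Real systems layer lane detection, vehicle dynamics, and differential braking on top of this, but the standard's definition is describing exactly this kind of closed-loop correction - which is why Level 2 now unambiguously includes active steering.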

In all versions of J3016, the driver "Constantly supervises dynamic driving task executed by partial automation system," my emphasis added. So, under no circumstances is a driver supposed to be doing ANYTHING else. If Tesla made that clear to the buyers, then most of the accidents are at least partially "driving while distracted."

THAT doesn't absolve Tesla of delivering crappy obstacle avoidance processing, and makes me wonder if there is yet a circumstance where a Tesla will happily plow into a moving, or stopped, vehicle, with minimal or no warning.



 