
Tesla Autopilot 2


GregLocock (Automotive)
I think it's worth breaking out Tesla from Uber; both the hardware and the software are different.

So, the second crash of a Tesla into a stationary firetruck looks like more than a coincidence. Even if Autopilot wasn't engaged (the police say it was), Automatic Emergency Braking should have stopped this.

From Tesla's website:

Standard Safety Features
These active safety technologies, including collision avoidance and automatic emergency braking, have begun rolling out through over-the-air updates

Automatic Emergency Braking

Designed to detect objects that the car may impact and applies the brakes accordingly


Side Collision Warning

Warns the driver of potential collisions with obstacles alongside the car


Front Collision Warning

Helps warn of impending collisions with slower moving or stationary cars


So the questions that need to be asked are: which of these were fitted to the crash vehicle? AEB is widely available on other cars, but according to Tesla forums it is possible that it was remotely disabled. According to one user you can set AEB to warn only. That is a bizarre choice of UI design.

Anyway, I think so far there have been three AP collisions with large objects on the road in front of the car in good viewing conditions: the idiot with the truck crossing the road, and two fire trucks.


Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 

Spartan5 said:
On a related note to this discussion, I have a car which has a limited emergency braking system (speeds <19 MPH). Above that speed, it will warn me of a collision. While driving the other day, and following at a safe distance, the car in front of me smacked a pothole which created a spray of water and aggregate. Curiously, the radar system interpreted this as an object and warned me of a collision with it. ...

I have been trying to understand the underlying intelligence for this stuff. If the robot detects an object on the road ahead of it, it must take evasive action, which in most cases means stopping. Any other action requires the robot to be extremely reliable at identifying safely hittable objects. Just how good is your radar's resolution of objects?

--
JHG
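To put a rough number on that resolution question, here is a back-of-envelope sketch using assumed figures for a generic 77 GHz automotive FMCW radar; the bandwidth and beamwidth are illustrative guesses, not specs for any particular sensor or for the Tesla hardware.

import math

C = 3.0e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    # Two targets must differ in range by at least c / (2 * B) to be
    # resolved by an FMCW chirp of bandwidth B.
    return C / (2.0 * bandwidth_hz)

def cross_range_resolution(range_m, beamwidth_deg):
    # Approximate lateral width of one angular cell at a given range:
    # R * theta, with theta in radians (small-angle approximation).
    return range_m * math.radians(beamwidth_deg)

bw = 1.0e9    # 1 GHz chirp bandwidth (assumed)
beam = 4.0    # 4 degree azimuth resolution (assumed)
r = 100.0     # distance to the obstacle, m

print(round(range_resolution(bw), 2))             # -> 0.15 m in range
print(round(cross_range_resolution(r, beam), 1))  # -> 7.0 m across the beam

So range is resolved to roughly 15 cm, but at 100 m a stopped fire truck, an overhead gantry and roadside clutter can all land in the same angular cell, which is the discrimination problem being asked about.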
 
There's a risk of over-thinking this. Perhaps these incidents are the outcome of simple and rudimentary design or programming errors. Mistakes of a nature that we could not imagine being involved with ourselves. Occam's razor.

 
That's right. You can't do a 0.7g stop in the middle of traffic just because a plastic bag gets blown across the road (unless everyone is in AVs and they react the same). So the software has to take the world map that has been integrated from the inputs generated by the sensors, and decide which objects are worth ignoring.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
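As a toy illustration of the "decide which objects are worth ignoring" step described above: this is not any production AEB logic, and the class, fields and thresholds are invented purely for the sketch.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    range_m: float            # distance ahead, m
    closing_speed_mps: float  # positive = we are closing on it
    lateral_offset_m: float   # offset from our lane centre, m
    camera_confirmed: bool    # fused camera detection agrees with radar
    stationary: bool          # zero speed over ground

def should_brake(obj, ego_speed_mps, max_decel_mps2=6.0):
    # Ignore anything clearly outside our lane corridor.
    if abs(obj.lateral_offset_m) > 1.5:
        return False
    # Stationary radar-only returns are dominated by bridges, signs and
    # roadside clutter, so systems tend to suppress them; a parked fire
    # truck is exactly the case that suppression gets wrong.
    if obj.stationary and not obj.camera_confirmed:
        return False
    if obj.closing_speed_mps <= 0:
        return False
    stopping_distance = ego_speed_mps ** 2 / (2.0 * max_decel_mps2)
    return obj.range_m <= stopping_distance * 1.2   # 20 % margin

# A radar-only stationary return 60 m ahead, approached at 30 m/s, gets ignored.
truck = TrackedObject(60.0, 30.0, 0.2, camera_confirmed=False, stationary=True)
print(should_brake(truck, ego_speed_mps=30.0))      # -> False

The point of the sketch is only that some filter of this kind has to exist, and that whatever rule suppresses plastic bags and overpasses is the same rule that can suppress a parked fire truck.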
 
VE1BLL said:
There's a risk of over-thinking this. Perhaps these incidents are the outcome of simple and rudimentary design or programming errors. Mistakes of a nature that we could not imagine being involved with ourselves. Occam's razor.

I think it is being overthought, in the other direction. That isn't to say that there aren't programming errors; the software is written by humans, and there are certainly mistakes.

But...

Current technology being what it is, it is not possible to design a solution that is capable of everything everyone in this thread wants it to be capable of AND is inexpensive enough to implement in a mass-production vehicle.

The primary restrictions are $/GHz of processing power and $/pixel of sensor fidelity. Processing is the bottleneck. Programmers have to make compromises and use simpler routines than they would like, because the hardware isn't fast enough to process everything in real time.
 
In that case the technology should not be used on public roads without adequate safety procedures. Any physical test I do either has to be a standard test or I have to do an FMEA on it and get all sorts of people to sign off on it. I don't see why AV operators should not have to do the same.

Cheers

Greg Locock


New here? Try reading these, they might help FAQ731-376
 
GregLocock, are you suggesting that the "operator" (read "passenger") in an AV will have to be ready to assume control of the vehicle at any time? If so, then what's the point of the automation?
 
HotRod10:

I think it is important that we clearly differentiate and remain mindful of the different levels of autonomy as we continue this discussion (a rough summary in code follows the list below).
Because no two automated-driving technologies are exactly alike, SAE International’s standard J3016 defines six levels of automation for automakers, suppliers, and policymakers to use to classify a system’s sophistication. The pivotal change occurs between Levels 2 and 3, when responsibility for monitoring the driving environment shifts from the driver to the system.

Level 0 - No Automation
System capability: None. • Driver involvement: The human at the wheel steers, brakes, accelerates, and negotiates traffic. • Examples: A 1967 Porsche 911, a 2018 Kia Rio.

Level 1 - Driver Assistance
System capability: Under certain conditions, the car controls either the steering or the vehicle speed, but not both simultaneously. • Driver involvement: The driver performs all other aspects of driving and has full responsibility for monitoring the road and taking over if the assistance system fails to act appropriately. • Example: Adaptive cruise control.

Level 2 - Partial Automation
System capability: The car can steer, accelerate, and brake in certain circumstances. • Driver involvement: Tactical maneuvers such as responding to traffic signals or changing lanes largely fall to the driver, as does scanning for hazards. The driver may have to keep a hand on the wheel as a proxy for paying attention. • Examples: Audi Traffic Jam Assist, Cadillac Super Cruise, Mercedes-Benz Driver Assistance Systems, Tesla Autopilot, Volvo Pilot Assist.

Level 3 - Conditional Automation
System capability: In the right conditions, the car can manage most aspects of driving, including monitoring the environment. The system prompts the driver to intervene when it encounters a scenario it can’t navigate. • Driver involvement: The driver must be available to take over at any time. • Example: Audi Traffic Jam Pilot.

Level 4 - High Automation
System capability: The car can operate without human input or oversight but only under select conditions defined by factors such as road type or geographic area. • Driver involvement: In a shared car restricted to a defined area, there may not be any. But in a privately owned Level 4 car, the driver might manage all driving duties on surface streets then become a passenger as the car enters a highway. • Example: Google’s now-defunct Firefly pod-car prototype, which had neither pedals nor a steering wheel and was restricted to a top speed of 25 mph.

Level 5 - Full Automation
System capability: The driverless car can operate on any road and in any conditions a human driver could negotiate. • Driver involvement: Entering a destination. • Example: None yet, but Waymo—formerly Google’s driverless-car project—is now using a fleet of 600 Chrysler Pacifica hybrids to develop its Level 5 tech for production.
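The same levels restated as a lookup, purely for reference in this thread; the names are my own shorthand, not official J3016 wording. The two one-line tests capture the pivotal Level 2/3 handover described above.

from enum import IntEnum

class SAELevel(IntEnum):
    L0_NO_AUTOMATION = 0
    L1_DRIVER_ASSISTANCE = 1
    L2_PARTIAL_AUTOMATION = 2
    L3_CONDITIONAL_AUTOMATION = 3
    L4_HIGH_AUTOMATION = 4
    L5_FULL_AUTOMATION = 5

def system_monitors_environment(level):
    # The pivotal change: from Level 3 up, the system, not the driver,
    # is responsible for monitoring the driving environment.
    return level >= SAELevel.L3_CONDITIONAL_AUTOMATION

def driver_is_fallback(level):
    # Up to and including Level 3, a human must be ready to take over.
    return level <= SAELevel.L3_CONDITIONAL_AUTOMATION

print(system_monitors_environment(SAELevel.L2_PARTIAL_AUTOMATION))  # False
print(driver_is_fallback(SAELevel.L3_CONDITIONAL_AUTOMATION))       # True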
 
"Hotrod-no I didn't say that or think that or mean to imply that."

Ok. Sorry, I apparently misunderstood you. What then were you advocating that "AV operators should...have to do..."?

"L2 and L3 are recipes a disaster in my opinion, as people will treat them as L4s."

I agree completely.
 
Greg, I agree with you about L2 and L3. Hell, even L4 is questionable. Roads change all the time, so how can anyone 100% ensure the road still meets the required conditions for autonomous operation?

But then I also agree with you that if the technology is not yet capable, it shouldn't be operating on public roads.

 
The main intermediate step is gaze detection, a helpful system for even Level 0 operation that alerts the inattentive driver to a lack of attention. This would not be as easily defeated by a water bottle wedged into the steering wheel. It could easily be wired to the four-way flashers to alert other drivers to an out-of-control car.
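A minimal sketch of the escalation logic just described: warn the inattentive driver first, then trip the four-way flashers if attention never returns. The timing thresholds and the 0.1 s tick are assumptions, not taken from any real driver-monitoring system.

def attention_monitor(eyes_on_road, inattentive_s,
                      warn_after_s=2.0, hazards_after_s=10.0):
    # Advance the inattention timer by one 0.1 s tick and return the new
    # timer value plus an action: 'none', 'warn' or 'hazards'.
    if eyes_on_road:
        return 0.0, "none"                 # reset as soon as gaze returns
    inattentive_s += 0.1
    if inattentive_s >= hazards_after_s:
        return inattentive_s, "hazards"    # alert surrounding traffic
    if inattentive_s >= warn_after_s:
        return inattentive_s, "warn"       # chime / seat vibration
    return inattentive_s, "none"

# Driver looks away for three seconds, then looks back.
timer, action = 0.0, "none"
for _ in range(30):
    timer, action = attention_monitor(False, timer)
print(action)   # -> 'warn'
timer, action = attention_monitor(True, timer)
print(action)   # -> 'none'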
 
"Hotrod-no I didn't say that or think that or mean to imply that. L2 and L3 are recipes for disaster in my opinion, as people will treat them as L4s. "

This already happens now; reference the guy in the UK busted for moving to the passenger side of his Tesla mid-drive.

I think the SAE standard for AV 'levels' is reasonable, but I don't think the general public has any real understanding of what those levels mean, or the safety tradeoffs they represent.

For the general public it comes down to 'do I still have to drive or does the car do everything'.

Tesla did themselves no favor in this regard by using the name 'Autopilot' for their system.
 
jgKRI said:
I think the SAE standard for AV 'levels' is reasonable, but I don't think the general public has any real understanding of what those levels mean, or the safety tradeoffs they represent.

If they're meant for a consumer product but the consumers don't understand the levels or what they mean, then they're rather useless.
 
"then they're rather useless"

The consumer does not need to know what they mean; they only need to know clearly what their purchase can and cannot do.

For example, "Autopilot" to a civil pilot means something that maintains course and speed; no pilot would think that it should automatically and reliably dodge buildings and other airplanes. The general public, on the other hand, may, and does, see something different. Clearly, Tesla didn't fully consider the implications of people ignoring the insufficiencies of Autopilot, in addition to not having a sufficiently robust system to start with. Musk's disdain for lidar almost guarantees further failures and collisions.

TTFN (ta ta for now)
I can do absolutely anything. I'm an expert! faq731-376 forum1529 Entire Forum list
 