Model 3 crash on autopilot

Model 3 crash on autopilot

Author
Discussion

mcdjl

5,451 posts

196 months

Thursday 12th December 2019
quotequote all
Lt. Coulomb said:
RobDickinson said:
Car warned me yesterday about a ~4yo kid
Tread carefully, "pedo guy"...
So Musk has designed a car that can spot small kids but not fire trucks.....

anonymous-user

55 months

Thursday 12th December 2019
quotequote all
RobDickinson said:
I posted the data on the previous page.

But yes teslas are safer than other cars if they do crash
Any other car?

I'm not sure that is 100% true.

DonkeyApple

55,407 posts

170 months

Thursday 12th December 2019
quotequote all
kambites said:
Dave Hedgehog said:
even the best systems are still bad compared to a human driver
Compared to which human driver? There's quite a big variety!
There are a few incidents of people driving into parked police cars with their blues running, so clearly it's on a par with the dumbest or most drunk drivers.

The problem with the stats is that there aren't many Teslas on the road yet and they aren't cheap enough to be in the hands of the dumbest people yet, so it's hard to gauge the real benefits at this moment in time with the current level of tech.

The important thing is to not underestimate the cognitive ability of even the most challenged drivers as it is clearly still much higher than some would believe in comparison to the tech that is available at a price that consumers can afford.

kiseca

9,339 posts

220 months

Thursday 12th December 2019
quotequote all
MarkwG said:
kiseca said:
There is. Number of miles travelled per recorded accident. If Teslas on autopilot have enough miles to make a reliable average, and they travel more miles per crash than average, then the system would be proving to be safer overall than an unassisted human.
Ok, so I'll try explaining differently: if a human driver sees a hazard & avoids it, then there will be no record of the event - there has been no collision to record. If he doesn't see, & doesn't avoid, there will be. The first scenario will be happening all the time. Some of the automated systems above acknowledge they cannot handle that first scenario. Therefore, any data that shows they are "safer" that doesn't take that into account leaves me with an uncomfortable feeling about its veracity. Some manufacturers are reluctant to launch the technology until they've resolved that difficulty, others are happy to. I've no axe to grind for or against automation; however, statistics that don't paint the whole picture & are used to push a particular viewpoint bother me.
My point is that the event, i.e. the avoidance of a collision, or what you call your first scenario, is recorded, all the time, by a mile passing uneventfully.

Doesn't matter if it's to avoid a stationary car, fire engine, traffic cone, pothole, debris in the road, or avoiding a car, dog, child or deer running out in front of the car. Or managing not to lose control on an unexpectedly slippery surface. Or driving into a ditch while changing the radio station. Suddenly entering fog. Or any of the thousands of things that make an accident possible.

Your argument is that there are particular conditions where they are not as safe. I'm not disagreeing with that, what I'm saying is that the overall safety needs to be considered if you actually want fewer casualties on the road. Those data are available.

So, unless we are misunderstanding each other, your last sentence for me doesn't align with the concern you've raised. You suggest you are concerned because the statistics don't paint the whole picture, but your reservation about failing to avoid stationary objects is actually not the whole picture. Crashes per mile is. It takes into account all the times an accident was avoided, all the times there was no need to avoid an accident, which is most of the time, and ranks unassisted drivers against assisted drivers, giving an overall view of which is safer.

Once you start going into particular situations where one is safer than the other, and you definitely want to do that to improve safety further still, you're now not looking at the whole picture, you're looking at a detail and working on that. And for that, you don't really need to compare the computer to a driver to see who wins. You just need to look at accidents caused by AP, and work out ways to ensure that a future generation of AP can avoid those as well, as well as educating current operators of the existing weakness. Which the owner's manual does.




Edited by kiseca on Thursday 12th December 11:49
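The crashes-per-mile metric kiseca describes can be sketched in a few lines of Python. Every figure below is invented purely for illustration - none of it is real crash data - but it shows how an uneventful mile implicitly "records" every avoided accident by lowering the rate:

```python
# Illustrative sketch of the "crashes per mile" metric discussed above.
# All figures are invented for the example -- they are not real crash data.

def crashes_per_million_miles(crashes: int, miles: int) -> float:
    """Recorded crashes per million miles travelled. Every mile that
    passes without incident lowers the rate, so avoided accidents are
    counted implicitly rather than needing to be logged one by one."""
    return crashes / miles * 1_000_000

# Two hypothetical fleets with equal mileage
assisted_rate = crashes_per_million_miles(crashes=3, miles=10_000_000)
unassisted_rate = crashes_per_million_miles(crashes=9, miles=10_000_000)

print(f"assisted:   {assisted_rate:.2f} crashes per million miles")
print(f"unassisted: {unassisted_rate:.2f} crashes per million miles")
```

With these made-up inputs the assisted fleet shows 0.3 crashes per million miles against 0.9 unassisted; the comparison is only meaningful once both fleets have enough miles for the averages to be reliable, which is exactly the sample-size caveat raised earlier in the thread.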

Mr Whippy

29,064 posts

242 months

Thursday 12th December 2019
quotequote all
Driving aid. Not a driver.

You don’t expect ABS to brake for you, or TCS to apply the throttle for you.

Expecting this badly named driving aid to not crash is retarded.

kambites

67,587 posts

222 months

Thursday 12th December 2019
quotequote all
DonkeyApple said:
There are a few incidences of people driving into parked police cars with their blues running so clearly it’s on par with the dumbest or most drunk drivers.
There are tens of thousands of instances every year of people driving into stationary cars; perhaps not police cars, but it makes little difference to the severity of the accident whether the car in question has some flashy lights on top. Mostly very much like this one - people simply not looking where they were going.

Anyway ultimately this story is "idiot driver doesn't look where he's going and crashes into stationary object", which happens repeatedly all over the world. The fact he happened to have a sort of advanced cruise control system engaged at the time isn't particularly relevant; idiots are idiots whatever they're driving.

Edited by kambites on Thursday 12th December 12:02

kambites

67,587 posts

222 months

Thursday 12th December 2019
quotequote all
Mr Whippy said:
Expecting this badly named driving aid to not crash is retarded.
How is it "badly named"? It does at least as much, arguably more, than any other "autopilot" system I can think of.

The last thing I used called an "autopilot" had no control over speed at all and couldn't even maintain a relative compass heading, just an angle to the wind.

MarkwG

4,858 posts

190 months

Thursday 12th December 2019
quotequote all
kiseca said:
My point is that the event, i.e. the avoidance of a collision, or what you call your first scenario, is recorded, all the time, by a mile passing uneventfully.

Doesn't matter if it's to avoid a stationary car, fire engine, traffic cone, pothole, debris in the road, or avoiding a car, dog, child or deer running out in front of the car. Or managing not to lose control on an unexpectedly slippery surface. Or driving into a ditch while changing the radio station. Suddenly entering fog. Or any of the thousands of things that make an accident possible.

Your argument is that there are particular conditions where they are not as safe. I'm not disagreeing with that, what I'm saying is that the overall safety needs to be considered if you actually want fewer casualties on the road. Those data are available.

So, unless we are misunderstanding each other, your last sentence for me doesn't align with the concern you've raised. You suggest you are concerned because the statistics don't paint the whole picture, but your reservation about failing to avoid stationary objects is actually not the whole picture. Crashes per mile is. It takes into account all the times an accident was avoided, all the times there was no need to avoid an accident, which is most of the time, and ranks unassisted drivers against assisted drivers, giving an overall view of which is safer.

Once you start going into particular situations where one is safer than the other, and you definitely want to do that to improve safety further still, you're now not looking at the whole picture, you're looking at a detail and working on that. And for that, you don't really need to compare the computer to a driver to see who wins. You just need to look at accidents caused by AP, and work out ways to ensure that a future generation of AP can avoid those as well, as well as educating current operators of the existing weakness. Which the owner's manual does.




Edited by kiseca on Thursday 12th December 11:49
I think I see where you're coming from, & we're probably fairly aligned. I see the big picture data: my difference is perhaps with your opening statement. I'm not sure we are avoiding a collision in every mile - there may be two or more a trip, or maybe none. There are too many variables at play, & the autopilot data set is too small (as someone above has mentioned). A long trip on an empty German autobahn, vs driving through Mumbai rush hour, for example? First case, both probably comparable; second case, I reckon the autopilot would struggle, yet the human "mostly" manages it. My feeling is the data is skewed by the desire to prove it works, & that the desire is driven by money rather than safety. My preference is that the operator is properly trained to use the equipment, rather than relying on them reading a manual (which we know very few ever do - certainly not the second or third owner down the line, running it as an Uber). It's all very well to see any accidents as an unfortunate side effect of the testing process, but that shouldn't be done when third parties are being involuntarily involved in it. I'm not suggesting the way cars themselves developed was perfect, far from it, but the world's moved on, & we can do better than that as a society. Indeed, we have rules & laws in many areas precisely to protect the public from becoming someone else's crash test dummy.

NDNDNDND

2,024 posts

184 months

Thursday 12th December 2019
quotequote all
kambites said:
How is it "badly named"? It does at least as much, arguably more, than any other "autopilot" system I can think of.

The last thing I used called an "autopilot" had no control over speed at all and couldn't even maintain a relative compass heading, just an angle to the wind.
You can argue the semantics all you like, but the truth is the system was deliberately, consciously, cynically and irresponsibly named 'Autopilot' to give a false impression of full self driving. Euro NCAP agrees, and thinks the system has been dangerously mistitled.

If the system had been named 'Advanced Cruise Assist' or similar, this model 3 accident wouldn't have made the news, and the culpability of the driver wouldn't have been questioned.

kambites

67,587 posts

222 months

Thursday 12th December 2019
quotequote all
NDNDNDND said:
You can argue the semantics all you like, but the truth is the system was deliberately, consciously, cynically and irresponsibly named 'Autopilot' to give a false impression of full self driving. Euro NCAP agrees, and thinks the system has been dangerously mistitled.
My question really is why it gives that impression to anyone, given that "autopilot" has never in any other instance been used to imply full autonomy. Tesla have taken a term which is used in many other industries (aviation, shipping, etc.) and used it to mean... largely what it means in those other industries.

Tesla may or may not have given the impression that their driver assist package is capable of more autonomy than it is, I don't know (or care) enough about them to know, but I can't see anything wrong with the name.

Edited by kambites on Thursday 12th December 13:11

kiseca

9,339 posts

220 months

Thursday 12th December 2019
quotequote all
MarkwG said:
I think I see where you're coming from, & we're probably fairly aligned. I see the big picture data: my difference is perhaps with your opening statement. I'm not sure we are avoiding a collision in every mile - there may be two or more a trip, or maybe none. There are too many variables at play, & the autopilot data set is too small (as someone above has mentioned). A long trip on an empty German autobahn, vs driving through Mumbai rush hour, for example? First case, both probably comparable; second case, I reckon the autopilot would struggle, yet the human "mostly" manages it. My feeling is the data is skewed by the desire to prove it works, & that the desire is driven by money rather than safety. My preference is that the operator is properly trained to use the equipment, rather than relying on them reading a manual (which we know very few ever do - certainly not the second or third owner down the line, running it as an Uber). It's all very well to see any accidents as an unfortunate side effect of the testing process, but that shouldn't be done when third parties are being involuntarily involved in it. I'm not suggesting the way cars themselves developed was perfect, far from it, but the world's moved on, & we can do better than that as a society. Indeed, we have rules & laws in many areas precisely to protect the public from becoming someone else's crash test dummy.
thumbup

Yes we pretty much are. My point wasn't about adding up how many accidents are avoided, rather just looking at miles where an accident doesn't happen. It would assume that the number of potential accidents per mile is a constant (or a reliable average) but I agree it would not be constant from one region to the next so those differences definitely need to be filtered out of the dataset. Then, if you find you have hundreds of millions of miles of data for unassisted cars in, say, California, and hundreds of millions of miles for Teslas in California, then you can make a reasonable assessment of whether or not (in California) you are less likely to crash in a Tesla (or Volvo or whatever) than in cars with no autopilot equivalent systems.

That doesn't necessarily tell an individual whether their driving habits, level of skill, or environment (or even some unforeseen and unpredictable hazard they may come across some day) would benefit or suffer safety-wise in a Tesla using AP, but considering all purchasers, the answer will be correct more often than not.

I totally agree that where the system has known weaknesses that can kill you or someone else, it needs more emphasis than a note in a manual, or even a half dozen news stories, as the message doesn't seem to be sinking in to all Tesla drivers.

And it is an easily solvable weakness with driver assistance. Keep your eyes on the road and your hands upon the wheel (just ask Jim) and between your own efforts and those of the AP, you should avoid some accidents that the AP fails to, and the AP should save you from some of the times when you missed something. End result, still fallible but safer than you were without the system.

Edited by kiseca on Thursday 12th December 13:58

NDNDNDND

2,024 posts

184 months

Thursday 12th December 2019
quotequote all
kambites said:
My question really is why it gives that impression to anyone, given that "autopilot" has never in any other instance been used to imply full autonomy. Tesla have taken a term which is used in many other industries (aviation, shipping, etc.) and used it to mean... largely what it means in those other industries.

Tesla may or may not have given the impression that their driver assist package is capable of more autonomy than it is, I don't know (or care) enough about them to know, but I can't see anything wrong with the name.

Edited by kambites on Thursday 12th December 13:11
You're being deliberately ignorant.
The technical definition of 'Autopilot' is irrelevant.
The public perception of 'Autopilot' (especially given the vast majority of people do not encounter or use an autopilot in any setting) is that an autopilot controls the vehicle for you without intervention.
You know this. You're just pretending that you don't.

kambites

67,587 posts

222 months

Thursday 12th December 2019
quotequote all
NDNDNDND said:
The public perception of 'Autopilot' (especially given the vast majority of people do not encounter or use an autopilot in any setting) is that an autopilot controls the vehicle for you without intervention.
That was my question really - if that really is what the public think autopilot means... why? Where on earth does that perception of the meaning come from?

Ignoring subject matter knowledge and focussing on the language, I think "cruise control" implies every bit as much automated control as "auto-pilot". Is it just that people are familiar with one term and not the other? In which case you could argue that any new term for a driver aid is dangerous unless it's extremely specific.

I have no particular subject matter experience of auto-pilots. I've used them on sailing boats a couple of times and vaguely know what they do on aeroplanes, but it had genuinely never occurred to me that the term might imply complete automation until people started complaining about Tesla's name.

Edited by kambites on Thursday 12th December 13:45

DonkeyApple

55,407 posts

170 months

Thursday 12th December 2019
quotequote all
NDNDNDND said:
You can argue the semantics all you like, but the truth is the system was deliberately, consciously, cynically and irresponsibly named 'Autopilot' to give a false impression of full self driving. Euro NCAP agrees, and thinks the system has been dangerously mistitled.

If the system had been named 'Advanced Cruise Assist' or similar, this model 3 accident wouldn't have made the news, and the culpability of the driver wouldn't have been questioned.
Advanced Road Sensing Equipment.

MarkwG

4,858 posts

190 months

Thursday 12th December 2019
quotequote all
kambites said:
NDNDNDND said:
The public perception of 'Autopilot' (especially given the vast majority of people do not encounter or use an autopilot in any setting) is that an autopilot controls the vehicle for you without intervention.
That was my question really - if that really is what the public think autopilot means... why? Where on earth does that perception of the meaning come from?

Ignoring subject matter knowledge and focussing on the language, I think "cruise control" implies every bit as much automated control as "auto-pilot". Is it just that people are familiar with one term and not the other? In which case you could argue that any new term for a driver aid is dangerous unless it's extremely specific.

I have no particular subject matter experience of auto-pilots. I've used them on sailing boats a couple of times and vaguely know what they do on aeroplanes, but it had genuinely never occurred to me that the term might imply complete automation until people started complaining about Tesla's name.

Edited by kambites on Thursday 12th December 13:45
The public perception stems from cultural references, films etc, & is coupled with the media debate about autonomous cars: ask anyone not in the aviation industry whether an aircraft is flown by a human or an autopilot, & they'll generally vote for the latter. Tesla haven't done anywhere near enough to correct that, because it's their USP: the car that will drive itself... just not yet. People don't hear the "just not yet"...

Automatic definition: (of a device or process) working by itself with little or no direct human control.
"an automatic kettle that switches itself off when it boils";
done or occurring spontaneously, without conscious thought or attention.
"automatic physical functions such as breathing".

Edited to add definition.

Edited by MarkwG on Thursday 12th December 14:11

kambites

67,587 posts

222 months

Thursday 12th December 2019
quotequote all
I guess I can accept the argument that Tesla aren't doing enough to save idiots from themselves, which has long been an intrinsic part of the car industry's role. Indeed you can see from the terminology used that the established industry has been very careful to down-play the ability of their systems to "drive the car". I also think Tesla are being ridiculously optimistic about their current hardware ever being able to reach level-5 autonomy, but that's another argument.

Ultimately though the fact remains that statistically, cars with advanced driver aids (even those called Autopilot) are demonstrably safer than those without.

kiseca

9,339 posts

220 months

Thursday 12th December 2019
quotequote all
kambites said:
NDNDNDND said:
The public perception of 'Autopilot' (especially given the vast majority of people do not encounter or use an autopilot in any setting) is that an autopilot controls the vehicle for you without intervention.
That was my question really - if that really is what the public think autopilot means... why? Where on earth does that perception of the meaning come from?

Ignoring subject matter knowledge and focussing on the language, I think "cruise control" implies every bit as much automated control as "auto-pilot". Is it just that people are familiar with one term and not the other? In which case you could argue that any new term for a driver aid is dangerous unless it's extremely specific.

I have no particular subject matter experience of auto-pilots. I've used them on sailing boats a couple of times and vaguely know what they do on aeroplanes, but it had genuinely never occurred to me that the term might imply complete automation until people started complaining about Tesla's name.

Edited by kambites on Thursday 12th December 13:45
I don't think the name is the problem, personally. It's a very advanced - relatively - system that is capable of getting the car from point A to point B all by itself given sufficiently favourable conditions.

It can steer itself, accelerate or brake when needed, change lanes, and react to other vehicles on the road.

So, road testers give it a try, and to demonstrate, they drive down the motorway with their hands clearly off the wheel. The car drives itself. It slows down when the traffic does. It speeds itself up again. Road tester fails to crash and die and comes away impressed. So does the owner. This is the potential of the system, and because it's successful a lot more times than it fails, users build trust.

It's like all those Facebook posts from people who in the past travelled in a car without wearing a seatbelt, or in the load bed of a pickup truck, or ramped their BMX without a helmet, and all that crap, and didn't die. They think it's therefore safe and modern standards are overreacting. I see that argument on PH from time to time too. Unfortunately, all those who were killed by a lack of seatbelt, helmet or whatever, they aren't available to comment and balance the argument.

So people use AP, it works, they learn to trust it, they gradually put more and more faith in it, and as they put more faith in it they put less and less effort in themselves to compensate. I believe that all of us - or the vast majority of us - as drivers do that too. Esp. in the first few years. Then over time we get caught out, usually without crashing, but we learn, piece by piece, that our safety margins or our safety process has holes in it, and we gradually fill most of those holes. That is experience, and I think it battles against complacency caused partly by the fact that driving is, really, not a dangerous thing to do.

I think the system is a victim of its own ability, combined with its youth. Users don't have enough experience of it failing to factor that in to their own actions.

kambites

67,587 posts

222 months

Thursday 12th December 2019
quotequote all
One has to keep a degree of perspective. Accidents like this are extremely rare. Whether that's because most people aren't that stupid or because the huge majority of the time the system is good enough to deal with their stupidity is hard to know.

kiseca

9,339 posts

220 months

Thursday 12th December 2019
quotequote all
MarkwG said:
Tesla haven't done anywhere near enough to correct that, because it's their USP: the car that will drive itself...just not yet. People don't hear the "just not yet"...
To me, the problem is partly here. It WILL drive itself. Usually without crashing or killing anyone. It can do it. It just can't do it very well. If it were a person, it would probably pass its driving test, and would be able to drive itself as long as it could afford what would be an awfully expensive insurance premium because of the "not very well" bit...

NDNDNDND

2,024 posts

184 months

Thursday 12th December 2019
quotequote all
kambites said:
I guess I can accept the argument that Tesla aren't doing enough to save idiots from themselves, which has long been an intrinsic part of the car industry's role. I also think Tesla are being ridiculously optimistic about their current hardware ever being able to reach level-5 autonomy, but that's another argument.

Ultimately though the fact remains that statistically, cars with advanced driver aids (even those called Autopilot) are demonstrably safer than those without.
Extract from Euro NCAP 2018 report into autonomous car technology:

"‘Autopilot’ on the Tesla Model S gives the driver a high level of support with the vehicle primarily in control in both braking and steering scenarios. This results in a risk of over-reliance as, in some situations, the system still needs the driver to instantly correct and override the system.
The name “Autopilot” implies a fully automated system where the driver is not required. However, the limited scenarios tested clearly indicate that is not the case...

As for the oft-repeated 'eight times safer' statistic, this has been shown before not to be credible as it does not compare like with like. It compares Tesla's "autopilot", which is generally only used by the safest demographic of people on the safest roads in the best conditions, with the average, which covers all demographics, cars, types of road and weather conditions.
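The like-for-like objection above is essentially Simpson's paradox: a system can post a better aggregate crash rate while being no better on any individual road type, simply because its miles skew toward the easiest roads. A sketch with invented numbers (none of these figures are real, and the 80/50 motorway split is purely hypothetical) shows how the headline statistic can invert:

```python
# Hypothetical illustration of why an aggregate "X times safer" figure
# can mislead when one fleet's miles skew toward easy motorway driving.
# None of these numbers are real crash data.

def rate(crashes: int, miles: int) -> float:
    """Crashes per million miles travelled."""
    return crashes / miles * 1_000_000

# Per-road-type rates (invented): the average car does better in both.
ap_motorway  = rate(2, 8_000_000)    # 0.25
avg_motorway = rate(2, 10_000_000)   # 0.20
ap_urban     = rate(4, 2_000_000)    # 2.00
avg_urban    = rate(15, 10_000_000)  # 1.50

# Aggregate rates: the assisted fleet now looks safer overall, because
# 80% of its miles are motorway miles versus 50% for the average fleet.
ap_overall  = rate(2 + 4, 10_000_000)   # 0.60
avg_overall = rate(2 + 15, 20_000_000)  # 0.85

assert ap_motorway > avg_motorway and ap_urban > avg_urban
assert ap_overall < avg_overall  # the headline comparison inverts
```

The aggregate figure is only meaningful if the two fleets are compared within matching road types, weather conditions and driver demographics, which is exactly the stratification kiseca suggested earlier for the California example.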