Fatal Tesla crash, software-based issue?
Discussion
RobDickinson said:
The car wasn't pulling out of a junction.
The Tesla was travelling in a straight line, cruising.
The truck pulled across the road in front of it.
Tesla's Autopilot didn't 'kill' anyone; it failed to avoid an accident that wasn't its cause. The question is: should it have?
Yes, the truck was across the road, but I can't understand how the car didn't see the prime mover cross in front of it. Seems odd.
Also, I've just read some more on a US website. The man who died had posted videos on YouTube demonstrating the car's ability to avoid accidents.
Jader1973 said:
Yes, the truck was across the road, but I can't understand how the car didn't see the prime mover cross in front of it. Seems odd.
Also, I've just read some more on a US website. The man who died had posted videos on YouTube demonstrating the car's ability to avoid accidents.
It possibly did see the prime mover cross the road, and treated it like a solo car, not seeing the rest of the trailer.
And it sounds like the driver waited for the wonder of technology to save him rather than following the instructions and taking charge.
Reading their excuses/explanation, it just seems like a bad sensor concept which is adequate for a lot of cases but not actually robust for things that are more unusual, yet not unexpected. It wouldn't be the only bit of Tesla engineering I'd count as more 'it works' than 'it works properly even under stress'.
The fanboy replies to this elsewhere make slightly disturbing reading too; for all the ins and outs of what happened and why, it's never good to see cult-like support for a brand and product. It makes Apple look a bit tame.
Looking at who was involved in the crash, I wouldn't discount that they had far too much faith in the product to sort things out, and when it didn't, they were stuffed.
It's radar adaptive cruise control with auto lane keeping and auto lane change.
It's a very clever system, but it doesn't yet take over the responsibility of being in control of the car.
They've opened an investigation just as they would with any other accident of its type, but because it's Tesla the media latches onto it. If another car's radar cruise system didn't stop the car in time before hitting a truck in similar circumstances and the driver died, would that make headlines too?
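For anyone unfamiliar with how radar cruise control makes its braking call: none of us know Tesla's actual pipeline, but a common textbook approach is to brake when the time-to-collision with the tracked object drops below a threshold. A toy sketch with invented numbers (the 2.5 s threshold is illustrative, not any manufacturer's value):

```python
# Toy sketch of a radar ACC braking decision -- NOT any real system's
# logic. Thresholds are invented for illustration.

def acc_brake_decision(range_m: float, closing_speed_ms: float,
                       ttc_threshold_s: float = 2.5) -> bool:
    """Brake if time-to-collision with the tracked object drops below
    the threshold. Returns False if the object is not closing."""
    if closing_speed_ms <= 0:          # object holding distance or pulling away
        return False
    ttc = range_m / closing_speed_ms   # seconds until impact at current rates
    return ttc < ttc_threshold_s

# An object 100 m ahead closing at 30 m/s gives a TTC of ~3.3 s: no brake yet.
print(acc_brake_decision(100.0, 30.0))   # False
# At 60 m and the same closing speed, TTC is 2.0 s: brake.
print(acc_brake_decision(60.0, 30.0))    # True
```

The whole decision hinges on the radar actually tracking the object in the first place, which is exactly where the questions in this thread come in.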
charltjr said:
If another car's radar cruise system didn't stop the car in time before hitting a truck in similar circumstances and the driver died would that make headlines too?
No, but the other car's system isn't called "Autopilot", wasn't downloaded automatically to the car, and few other manufacturers propagate the image that they are the great industry disrupter and that they are far superior to anyone else's tech.
Because the other manufacturers know the pitfalls.
Jader1973 said:
No, but the other car's system isn't called "Autopilot", wasn't downloaded automatically to the car, and few other manufacturers propagate the image that they are the great industry disrupter and that they are far superior to anyone else's tech.
Because the other manufacturers know the pitfalls.
Yes they do. At least some of them.
There was an ad for one of the Korean manufacturers, I think: half a dozen cars following a lead vehicle, and all the drivers get out and hop onto a trailer. The cars all drive themselves; the lead car does an emergency stop, and all the others do so too.
That's pretty much saying "our cars can't crash/drive themselves".
Granted, they didn't call it Autopilot.
You'd have to think the Tesla legal team would have stress-tested their legal position on the warnings the car gives the driver before engaging Autopilot. It will be interesting to see if their interpretation holds up during the investigation and the (virtually guaranteed?) legal suit.
RobDickinson said:
Jader1973 said:
I suppose US trailers having no under-run bars doesn't help.
Probably. The trailer would likely be 53 ft long, and most of that length is open space at that height. The system could easily be fooled into thinking the front and back are separate objects with nothing in the middle if the radar isn't picking up the trailer itself.
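To make that "two separate objects" idea concrete: radar trackers typically group individual returns into objects, and a big enough spatial gap starts a new group. A toy 1-D illustration with invented numbers (nothing to do with any real sensor pipeline) of how sparse returns off an open-sided trailer could come out as two vehicles:

```python
# Toy illustration (invented numbers, not a real pipeline): sparse radar
# returns with a large gap get clustered into two separate "objects".

def cluster_returns(positions_m, max_gap_m=4.0):
    """Group sorted 1-D radar return positions into clusters; a gap
    larger than max_gap_m starts a new cluster."""
    clusters, current = [], [positions_m[0]]
    for p in positions_m[1:]:
        if p - current[-1] > max_gap_m:   # gap too big: close this cluster
            clusters.append(current)
            current = [p]
        else:
            current.append(p)
    clusters.append(current)
    return clusters

# Strong returns from the tractor unit and the trailer's rear bogie,
# but nothing from the open mid-section at windshield height:
returns = [0.0, 1.5, 3.0, 15.0, 16.0]     # metres along the trailer
print(cluster_returns(returns))           # [[0.0, 1.5, 3.0], [15.0, 16.0]]
```

Two clusters, and the 12 m of trailer between them looks like empty road.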
I work in an industry where it is not sufficient to provide a user with a mechanism to do something and then simply tell them they shouldn't use it under particular circumstances. If there are circumstances under which a user shouldn't be doing something, then controls should be put in place to prevent the mechanism being used in those circumstances. The consequences of not putting such controls in place, should the mechanism then be used in the wrong circumstances, would be pretty severe for the individual who did so, for my company, and for me. I can't see how Tesla can roll out a system and just expect all people to behave sensibly, as that simply won't happen.
So not only did the software not apply the brakes before the impact, but it seems it didn't apply them after the accident either!
Article says:
"The Tesla's windshield hit the bottom of the trailer as it passed underneath, and the car kept going, leaving the road, striking a fence, crossing a field, passing through another fence and finally hitting a utility pole about 100 feet south of the road, according to the report."
akirk said:
Pretty sure Tesla still say that humans must remain in control, but there does sometimes seem to be a disconnect between that and the impression some have that they now own an autonomous car.
I think a lot of people who'd be sold on the idea they could get drunk, watch movies etc. in their autonomous car are in for a rude awakening. They'll realise it just too late, and by then, oh dear, legislation against driving yourself is everywhere! Doh!
It's sad that someone has died in one. On the other hand, at least this will cause a bit of pushback against them. I still want to be able to take manual control whenever I want in 20 years' time.
SuperVM said:
I work in an industry where it is not sufficient to provide a user with a mechanism to do something and then simply tell them they shouldn't use it under particular circumstances. If there are circumstances under which a user shouldn't be doing something, then controls should be put in place to prevent the mechanism being used in those circumstances. The consequences of not putting such controls in place, should the mechanism then be used in the wrong circumstances, would be pretty severe for the individual who did so, for my company, and for me. I can't see how Tesla can roll out a system and just expect all people to behave sensibly, as that simply won't happen.
How do they cope with driving cars? There are many things a car can do, and a whole host of rules saying you shouldn't do particular things. There is very little in the way of controls to stop you.
RobDickinson said:
I thought ECUs already recorded a lot of data like that now.
I also doubt Tesla would fake any data. Too much trouble, and too easy to be seen through. They don't, after all, have the car in question, which likely has its own copy of the records; they only have what it transmitted.
They are, I think, also not claiming it even started to brake, but that its sensor system was fooled. Hardly a good position to be in after trying to fake data...
Car companies wouldn't fake things in order to make their company more successful and as such make more profit?
Tragic. We'll have to wait for the investigation to end but from what I've read I can't really fault the Tesla.
- it's clearly stated as the beta version of a fancy cruise control, not an autonomous driving option
- 130 million miles without a fatal incident is remarkable
- visibility and circumstances were very bad
- press reports that the driver of the truck stated that a Harry Potter movie was playing in the car after the fatal crash
IMV the only thing that needs to happen out of this is that car manufacturers make it much harder to bypass the "presence detection" of cars with ACC and lane assist.
I don't know how the Tesla works, but I've seen a video of a guy attaching a can of soda to the steering wheel of a new S-Class and climbing from the driver's seat to the rear seat while on the Autobahn!
Here's another one like this: https://www.youtube.com/watch?v=6hKIDHisdjc
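The soda-can trick works because (at least in those videos) "presence detection" appears to be nothing more than a steering-column torque check. A hedged sketch with invented thresholds, just to show why a dead weight on the wheel defeats a naive check:

```python
# Hedged sketch (invented threshold, not any manufacturer's actual
# check): a naive hands-on-wheel detector that only looks at steering
# torque. A weight strapped to the wheel applies steady torque and
# passes, which is roughly the trick in those videos.

def hands_detected(torque_samples_nm, min_torque_nm=0.3):
    """Declare hands present if any sample exceeds the torque floor."""
    return any(abs(t) > min_torque_nm for t in torque_samples_nm)

human = [0.0, 0.6, -0.4, 0.1]      # varying torque from a real grip
soda_can = [0.5, 0.5, 0.5, 0.5]    # constant torque from a dead weight

print(hands_detected(human))       # True
print(hands_detected(soda_can))    # True -- the naive check is fooled
```

A harder-to-bypass detector would look at torque variation over time or use capacitive sensing in the rim, which is presumably what the posters above mean by making presence detection harder to defeat.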
Just want to make two points on this. Correct me if I'm wrong, but the legislation states that in any autonomous driving vehicle a driver must remain able to retake full control at all times. Surely this would push blame back onto the driver?
Secondly, this type of thing is bound to happen no matter how advanced the software. I work with very expensive software that has been in development since the late '70s; 11 versions on, and after a lot of money and time, it can still throw up strange bugs that no one has seen before. In my job it's just a matter of running it again, but with something like driving, unforeseen circumstances are infinitely more likely, and the complexity of the sensors makes software glitches more probable. I wouldn't like to trust it with my life!