Fatal Tesla crash, software-based issue?



RobDickinson

31,343 posts

255 months

Friday 1st July 2016
The car wasn't pulling out of a junction.

The Tesla was travelling in a straight line, cruising.

The truck pulled across the road in front of it.

Tesla's Autopilot didn't 'kill' anyone; it failed to avoid an accident that wasn't its cause. The question is: should it have?

Jader1973

4,009 posts

201 months

Friday 1st July 2016
RobDickinson said:
The car wasn't pulling out of a junction.

The Tesla was travelling in a straight line, cruising.

The truck pulled across the road in front of it.

Tesla's Autopilot didn't 'kill' anyone; it failed to avoid an accident that wasn't its cause. The question is: should it have?
Yes, the truck was across the road, but I can't understand how the car didn't see the prime mover cross in front of it. Seems odd.

Also, I've just read some more on a US website. The man who died had posted videos on YouTube demonstrating the car's ability to avoid accidents.

RobDickinson

31,343 posts

255 months

Friday 1st July 2016
Jader1973 said:
Yes, the truck was across the road, but I can't understand how the car didn't see the prime mover cross in front of it. Seems odd.

Also, I've just read some more on a US website. The man who died had posted videos on YouTube demonstrating the car's ability to avoid accidents.
It possibly did see the prime mover cross the road and treated it like a solo car, not seeing the rest of the trailer.

And it sounds like the driver waited for the wonder of technology to save him rather than following the instructions and taking charge.
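
To illustrate the failure mode Rob is suggesting, here's a toy sketch of how a gap-based clustering step could read one long, mostly empty trailer as two separate short objects with clear road in between. The numbers and the 6 m threshold are invented for illustration; this is not Tesla's code.

```python
# Toy illustration of the "saw the tractor, missed the trailer" hypothesis.
# NOT Tesla's algorithm; positions and the 6 m gap threshold are invented.

def cluster_returns(positions_m, max_gap_m=6.0):
    """Group sorted radar returns into objects: a new object starts
    whenever consecutive returns are further apart than max_gap_m."""
    objects, current = [], [positions_m[0]]
    for p in positions_m[1:]:
        if p - current[-1] > max_gap_m:
            objects.append(current)
            current = [p]
        else:
            current.append(p)
    objects.append(current)
    return objects

# A ~53 ft (16 m) trailer mostly reflects at the tractor unit and the
# rear axles; the open middle returns little at bumper height.
returns = [0.0, 1.2, 2.1,    # tractor unit
           15.0, 16.1]       # rear axle group
print(cluster_returns(returns))
# -> [[0.0, 1.2, 2.1], [15.0, 16.1]]: one trailer read as two short
#    "vehicles" with ~13 m of apparently clear road between them.
```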

Jonesy23

4,650 posts

137 months

Friday 1st July 2016
Reading their explanation (excuses?), it just seems like a bad sensor concept which is adequate for a lot of cases but not actually robust for things that are more unusual but not unexpected. It wouldn't be the only bit of Tesla engineering that I'd count as more 'it works' than 'it works properly even under stress'.

The fanboy replies to this elsewhere make slightly disturbing reading too; for all the ins and outs of what happened and why, it's never good to see cult-like support for a brand and product. It makes Apple look a bit tame.

Looking at who was involved in the crash, I wouldn't discount that they had far too much faith in the product to sort things out, and when it didn't, they were stuffed.

anonymous-user

55 months

Friday 1st July 2016
It's radar adaptive cruise control with auto lane keeping and auto lane change.

It's a very clever system, but it doesn't yet take over the responsibility of being in control of the car.

They've opened an investigation just as they would with any other accident of its type, but because it's Tesla the media latches onto it. If another car's radar cruise system didn't stop the car in time before hitting a truck in similar circumstances and the driver died, would that make headlines too?
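
For anyone wondering what a radar cruise system of this type actually does, below is a generic textbook time-gap controller sketch. The gains, limits and 1.8 s gap are made-up illustrative values, not any manufacturer's calibration:

```python
# Generic time-gap adaptive cruise control sketch. Not any manufacturer's
# code; gains, limits and the 1.8 s gap are illustrative values only.

def acc_command(ego_speed, set_speed, lead_dist=None, lead_speed=None,
                time_gap=1.8, kp_speed=0.5, kp_dist=0.2, kv=0.4):
    """Return a commanded acceleration in m/s^2."""
    # Plain cruise: close the gap to the set speed.
    accel = kp_speed * (set_speed - ego_speed)
    if lead_dist is not None:
        # Follow mode: hold time_gap seconds behind the lead vehicle.
        desired_dist = time_gap * ego_speed
        follow = (kp_dist * (lead_dist - desired_dist)
                  + kv * (lead_speed - ego_speed))
        accel = min(accel, follow)     # never accelerate into the lead car
    return max(-3.5, min(accel, 2.0))  # comfort limits, not emergency braking

# The catch: if the radar never reports the truck as a lead object,
# lead_dist stays None and the car simply holds its set speed.
print(acc_command(ego_speed=29.0, set_speed=29.0))             # cruise: 0.0
print(acc_command(29.0, 29.0, lead_dist=20.0, lead_speed=0.0)) # brake: -3.5
```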

Jader1973

4,009 posts

201 months

Friday 1st July 2016
charltjr said:
If another car's radar cruise system didn't stop the car in time before hitting a truck in similar circumstances and the driver died, would that make headlines too?
No, but the other car's system isn't called "Autopilot", wasn't downloaded automatically to the car, and few other manufacturers propagate the image that they are the great industry disruptor whose tech is far superior to anyone else's.

Because the other manufacturers know the pitfalls.

anonymous-user

55 months

Friday 1st July 2016
They should have called it driver assist, as 'Autopilot' implies full control. That big screen is bloody annoying.

ILoveMondeo

9,614 posts

227 months

Friday 1st July 2016
Jader1973 said:
No, but the other car's system isn't called "Autopilot", wasn't downloaded automatically to the car, and few other manufacturers propagate the image that they are the great industry disruptor whose tech is far superior to anyone else's.

Because the other manufacturers know the pitfalls.
Yes they do, at least some of them.

There was an ad for one of the Korean manufacturers, I think: half a dozen cars following a lead vehicle, and all the drivers get out and hop onto a trailer. With the cars all driving themselves, the lead car does an emergency stop and all the others stop too.

That's pretty much "our cars can't crash/drive themselves".

Granted, they didn't call it Autopilot.


vtgts300kw

598 posts

178 months

Friday 1st July 2016
You'd have to think the Tesla legal team would have stress-tested their legal position on the warnings the car gives the driver before engaging Autopilot. It will be interesting to see if their interpretation holds up during the investigation and the subsequent (virtually guaranteed?) lawsuit.

Jader1973

4,009 posts

201 months

Friday 1st July 2016
Report I read says the driver was formerly a Navy SEAL.

Tesla killed a veteran!!!

That won't go down well in the US. Trump will probably promise to shut them down as a result.

biggrin

98elise

26,644 posts

162 months

Friday 1st July 2016
It's a sophisticated cruise control, not an accident-avoidance system.

It will follow lanes and speed up/slow down with traffic. If a truck pulls across you, it's going to be no better than you at stopping, possibly worse. In an emergency you are supposed to take over.


98elise

26,644 posts

162 months

Friday 1st July 2016
RobDickinson said:
Jader1973 said:
I suppose US trailers having no under-run bars doesn't help.
Probably.

The trailer would likely be 53 ft long, and most of that length is empty space at that level. The radar could easily be fooled into thinking the front and back are separate objects with nothing in the middle if it isn't picking up the trailer itself.
I've driven in the US a lot, and the lack of under-run bars on trucks has always puzzled me. It's such a simple thing that you'd have thought it was a no-brainer to fit.

SuperVM

1,098 posts

162 months

Friday 1st July 2016
I work in an industry where it is not sufficient to provide a user with a mechanism to do something and then simply tell them they shouldn't use it under particular circumstances. If there are circumstances under which a user shouldn't be doing something, then controls should be put in place to prevent it from being used in those circumstances. The consequences of not putting such controls in place, if the mechanism were then used in the wrong circumstances, would be pretty severe for the individual who did it, for my company, and for me. I can't see how Tesla can roll out a system and then just expect all people to behave sensibly, as it simply won't happen.
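
To make that concrete, here is the sort of engagement interlock such an industry would expect: refuse to enable the feature unless the documented preconditions hold, rather than just warning the user. Everything here is hypothetical; the preconditions and names are invented, not anything Tesla ships.

```python
# Hypothetical engagement interlock: a warning in the manual becomes a
# control the user cannot ignore. All conditions and names are invented.

from dataclasses import dataclass

@dataclass
class VehicleState:
    on_divided_highway: bool
    lane_markings_visible: bool
    hands_on_wheel: bool

def try_engage_assist(state: VehicleState) -> bool:
    """Engage only when every documented precondition is met."""
    preconditions = {
        "divided highway": state.on_divided_highway,
        "lane markings visible": state.lane_markings_visible,
        "hands on wheel": state.hands_on_wheel,
    }
    failed = [name for name, ok in preconditions.items() if not ok]
    if failed:
        print("Assist refused: " + ", ".join(failed))
        return False
    return True

try_engage_assist(VehicleState(True, True, hands_on_wheel=False))
# -> Assist refused: hands on wheel
```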

mgtony

4,022 posts

191 months

Friday 1st July 2016
So not only did the software not apply the brakes before the impact, but it seems it didn't apply them after the accident either!
Article says:

"The Tesla's windshield hit the bottom of the trailer as it passed underneath, and the car kept going, leaving the road, striking a fence, crossing a field, passing through another fence and finally hitting a utility pole about 100 feet south of the road, according to the report." eek

AH33

2,066 posts

136 months

Friday 1st July 2016
akirk said:
Pretty sure that Tesla still say that humans must remain in control, but there does seem to be a disconnect sometimes between that and the impression that some have that they now have an autonomous car wink
I think a lot of people who'd be sold on the idea that they could get drunk, watch movies, etc. in their autonomous car are in for a rude awakening. They'll realise just too late, and by then, oh dear: legislation against driving yourself is everywhere! Doh!

It's sad that someone has died in one. On the other hand, at least this will cause a bit of pushback against them. I still want to be able to take manual control whenever I want in 20 years' time.

98elise

26,644 posts

162 months

Friday 1st July 2016
SuperVM said:
I work in an industry where it is not sufficient to provide a user with a mechanism to do something and then simply tell them they shouldn't use it under particular circumstances. If there are circumstances under which a user shouldn't be doing something, then controls should be put in place to prevent it from being used in those circumstances. The consequences of not putting such controls in place, if the mechanism were then used in the wrong circumstances, would be pretty severe for the individual who did it, for my company, and for me. I can't see how Tesla can roll out a system and then just expect all people to behave sensibly, as it simply won't happen.
How do they cope with cars at all? There are many things a car can do, and a whole host of rules saying you shouldn't do particular things, yet there is very little in the way of controls to stop you.

eltax91

9,893 posts

207 months

Friday 1st July 2016
RobDickinson said:
I thought ECUs already recorded a lot of data like that now.

I also doubt Tesla would fake any data: too much trouble and too easy to see through. They don't, after all, have the car in question, which likely has its own copy of the records; they only have what it transmitted.

They are also, I think, not claiming it even started to brake, but that its sensor system was fooled; hardly a good position to be in after trying to fake data...
Car companies wouldn't fake things in order to make themselves more successful and as such make more profit? scratchchin

aeropilot

34,666 posts

228 months

Friday 1st July 2016
jontykint said:
Hang on... if a big f-off truck turned across you, why would you either:
a. Not brake
b. Expect the car to brake for you
c. Not brake, expecting the car to brake for you
It's America......


EricE

1,945 posts

130 months

Friday 1st July 2016
Tragic. We'll have to wait for the investigation to end, but from what I've read I can't really fault the Tesla.

- it's clearly stated as the beta version of a fancy cruise control, not an autonomous driving option
- 130 million miles without a fatal incident is remarkable
- visibility and circumstances were very bad
- press reports that the driver of the truck stated that there was a Harry Potter movie playing in the car after the fatal crash

IMV the only thing that needs to happen out of this is that car manufacturers make it much harder to bypass the "presence detection" of cars with ACC and lane assist.

I don't know how the Tesla works, but I've seen a video of a guy attaching a can of soda to the steering wheel of a new S-Class and climbing from the driver's seat to the rear seat while on the Autobahn!

Here's another one like this: https://www.youtube.com/watch?v=6hKIDHisdjc
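
On the "harder to bypass" point, presence detection is usually an escalation scheme rather than a single nag. A toy version follows, with invented timings; it also shows why torque-only sensing is exactly what the soda-can trick defeats:

```python
# Toy driver-presence escalation: warn, then slow and disengage if the
# driver never responds. Timings and the torque threshold are invented.

def presence_monitor(torque_samples, warn_after=10, disengage_after=25):
    """torque_samples: one steering-torque reading (Nm) per second."""
    seconds_idle = 0
    for t, torque in enumerate(torque_samples):
        if abs(torque) > 0.3:          # driver input detected
            seconds_idle = 0
            continue
        seconds_idle += 1
        if seconds_idle == warn_after:
            print(f"t={t}s: visual + audible warning")
        elif seconds_idle == disengage_after:
            print(f"t={t}s: slowing to a stop, hazards on, assist off")
            return
    # A can wedged into the wheel applies constant torque, so this naive
    # check keeps resetting seconds_idle and never escalates at all.

presence_monitor([0.0] * 30)   # no hands on the wheel: warn, then disengage
```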


veccy208

1,324 posts

102 months

Friday 1st July 2016
Just want to make two points on this. Correct me if I'm wrong, but the legislation states that in any autonomous driving vehicle a driver must remain able to retake full control at all times. Surely this would push the blame back onto the driver?

Secondly, this type of thing is bound to happen no matter how advanced the software. I work with very expensive software that has been in development since the late '70s; eleven versions on, and after a lot of money and time, it can still throw up strange bugs that no one has seen before. In my job it's just a matter of doing it again, but with something like driving, unforeseen circumstances are far more likely, and the complexity of the sensors makes software glitches more probable. I wouldn't like to trust it with my life!
Secondly, this type of thing is bound to happen no matter how advanced the software. I work with very expensive software that has been in development since the late 70's, 11 versions on and after a lot of money and time, the software can still throw strange bugs up that no one has seen before. In my job its just a matter of doing it again but with something like driving, unforeseen circumstances are infinitely more likely and the complexity of sensors makes software glitches more probable. I wouldn't like to trust it with my life!