Fatal Tesla crash, software based issue?

Author
Discussion

anonymous-user

55 months

Friday 1st July 2016
quotequote all
rxe said:
Spumfry said:
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?
Feel free to do so.

My point is simply that 1 in 132 million is a great soundbite, but is actually what humans are doing today. American drivers might be a bit worse, I have no idea. If humans were experiencing fatalities at a rate of 1 in a million, 1 in 132 million would be a big deal.
But in the UK we've taken, what, 100 years to get to this level of safety, places like India are still years behind. Tesla have managed to reach similar safety levels to the UK in 6 years. Clearly this technology needs to get better, but equally it shouldn't be written off, particularly as we don't know the full story of the human factors involved in the crash.

Devil2575

13,400 posts

189 months

Friday 1st July 2016
rxe said:
Spumfry said:
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?
Feel free to do so.

My point is simply that 1 in 132 million is a great soundbite, but is actually what humans are doing today. American drivers might be a bit worse, I have no idea. If humans were experiencing fatalities at a rate of 1 in a million, 1 in 132 million would be a big deal.
It's actually meaningless. A single crash is not statistically significant. It could be the safest car in the world driven by the safest driver and still be involved in a crash.

If I roll a dice 4 times and get 4 sixes it doesn't mean that the dice is weighted.

When I read the title of this thread I expected to see the usual detractors with their dick in their hand getting all worked up over this. The reality is that it means very little and anyone claiming that it does or that they have been proved right is deluded and suffering from confirmation bias.
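To put rough numbers on the "one crash proves nothing" point: if we simplify by treating fatal crashes as a Poisson process and take the figures quoted elsewhere in the thread as given (roughly one fatality per 84 million miles for the average car, and about 130 million miles driven on Autopilot), then at least one fatality in that mileage is actually the expected outcome, not a surprise:

```python
import math

# Back-of-envelope check using figures quoted in the thread (treated as
# assumptions, not verified data): ~1 fatality per 84 million miles for
# the average car, ~130 million miles driven on Autopilot so far.
# Model fatal crashes as a Poisson process.
baseline_rate = 1 / 84e6          # fatalities per mile
autopilot_miles = 130e6

expected = baseline_rate * autopilot_miles   # ~1.55 expected fatalities
p_at_least_one = 1 - math.exp(-expected)     # ~0.79

print(f"Expected fatalities at the average rate: {expected:.2f}")
print(f"P(at least one fatality): {p_at_least_one:.2f}")

# And the dice analogy: four sixes in a row from a fair die is unlikely
# but perfectly possible.
print(f"P(four sixes): {(1/6) ** 4:.5f}")    # ~0.00077
```

So at the worldwide average rate you would expect one or two fatalities in 130 million miles; a single fatal crash neither condemns nor vindicates the system.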

culpz

4,884 posts

113 months

Friday 1st July 2016
I've said this in the previous Tesla accident thread and I'll say it again: we do not yet live in a world where cars no longer require any human input. Whatever Tesla or any other company offer, it's simply a feature to be used occasionally, when appropriate.

Tesla can quite happily be investigated over this, as I'm sure they have disclaimers/small print/specific instructions for its use. It's horrible that someone has lost their life, but unfortunately common sense really needed to dictate here.

Edited by culpz on Friday 1st July 13:30

98elise

26,686 posts

162 months

Friday 1st July 2016
Rovinghawk said:
EricE said:
- 130 million miles without a fatal incident is remarkable
What about 130 million miles WITH a fatal accident, as is actually the case?
As opposed to the 84 million for the average car? IIRC that's 130m miles on Autopilot, not the Model S as a whole.

I'm beginning to side with those who say "Autopilot" is the wrong choice of branding. If people are not watching the road ahead, it should be rebranded as cruise control plus lane assist.



Emeye

9,773 posts

224 months

Friday 1st July 2016
TheInternet said:
Emeye said:
Elon Musk replied to a tweet this morning saying that a Tesla cannot scan up high for the side of trucks as it would see overhead traffic gantries and bridges as an obstruction.
I hope you've paraphrased the tweet because the reason above sounds ridiculous.
The exact tweet.

Elon Musk @elonmusk · 16 hours ago
@artem_zin @theaweary Radar tunes out what looks like an overhead road sign to avoid false braking events

ETA: Reading that back, my early-morning sleepy-eyed translation is a bit like translating into Chinese, then French, then back to English. hehe

Edited by Emeye on Friday 1st July 14:33
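Musk's tweet points at a genuine radar trade-off: a flat trailer side seen at distance can produce a return that looks like an overhead sign. A toy sketch of the kind of elevation gating involved (every number and name here is invented for illustration; this is not Tesla's logic):

```python
import math

# Toy sketch of the filtering trade-off described in the tweet above.
# A radar return is discarded as "overhead" if its implied height at its
# measured range exceeds a clearance threshold. Threshold is hypothetical.
CLEARANCE_M = 4.0  # assumed minimum height of signs/bridges worth ignoring

def is_overhead(range_m: float, elevation_deg: float) -> bool:
    """Treat a return as an overhead structure if its implied height
    exceeds the clearance threshold."""
    height = range_m * math.sin(math.radians(elevation_deg))
    return height > CLEARANCE_M

# A gantry 60 m ahead, ~5.2 m up: tuned out, no false braking.
print(is_overhead(60.0, 5.0))    # True

# The side of a high trailer 60 m ahead, ~3.1 m up: kept -- but a return
# near the trailer's top edge can cross the threshold and get discarded,
# which is the failure mode being discussed.
print(is_overhead(60.0, 3.0))    # False
```

The design tension is that lowering the threshold brakes for bridges; raising it ignores trailers. That is why a single threshold on one sensor is fragile without a second sensing modality to arbitrate.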

Emeye

9,773 posts

224 months

Friday 1st July 2016
What about looking at it from the point of view of the truck driver: if the truck had been fitted with some form of autonomous system, maybe it would not have pulled out into the road, as its lidar could have sensed the Tesla coming along.

Consider this: part of the M6 up north has been designated for testing truck convoys, where the lead truck driver is in charge and the other trucks in the "train" just follow the one in front. The drivers get to have a kip or a read or a wk as they have no control, and if the testing videos I've seen are correct, the trucks are so close together that the drivers would not be able to see ahead anyway.

How you get off at your junction if a ten-lorry articulated train is passing by as you approach it could be a challenge...

MarshPhantom

9,658 posts

138 months

Friday 1st July 2016
rxe said:
EricE said:
Tragic. We'll have to wait for the investigation to end but from what I've read I can't really fault the Tesla.

130 million miles without a fatal incident is remarkable
Surely the low number of autonomous cars out there means any stats are pretty much meaningless.

Blakewater

4,311 posts

158 months

Friday 1st July 2016
ILoveMondeo said:
Jader1973 said:
No, but the other car's system isn't called "Autopilot", wasn't downloaded automatically to the car, and few other manufacturers propagate the image that they are the great industry disrupter and that they are far superior to anyone else's tech.

Because the other manufacturers know the pitfalls.
Yes they do, at least some of them.

There was an ad for one of the Korean manufacturers, I think: half a dozen cars following a lead vehicle, all the drivers get out and hop onto a trailer. Cars all driving themselves, lead car does an emergency stop, all the others do so too.

That's pretty much "our cars can't crash/drive themselves".

Granted, they didn't call it autopilot.
Hyundai's Empty Car Convoy.

https://www.youtube.com/watch?v=EPTIXldrq3Q

There are plenty of warnings that people shouldn't try it themselves, and the systems had been modified not to cut out when they detected the driver wasn't in control.

When cars were invented, people were so sure they were dangerous they were limited to walking speed and a man had to walk ahead with a red flag.

The idea of developing self driving technology is that all vehicles on the road will eventually work together in one big network so you won't have the human driven lorry pulling across in front of the autonomous car and there won't be road markings for human eyes on the road that the car might incorrectly follow.

We're in a transitional period between human driven vehicles and self driving vehicles. It's a new technology under development and, so long as people aren't careless with it, it shouldn't kill people. We've worked through deaths in road, rail, air and space travel and we accept there are still risks in using this technology. People are killed every day on the road because of human error and it seems that, while the car's technology failed to a certain extent in this case, both the Tesla driver and the lorry driver failed in their responsibilities as well.

In this video where Joshua Brown commends his car on avoiding an accident with a boom truck, he's listening to an audio book which I would say is potentially so absorbing it shouldn't be done in a car. I suspect he's developed a habit of trusting too much in the car to get him out of situations. He should have seen the truck coming in this instance before it became such a close call. As someone asked him in the comments, what would have happened if there had been something to the right of the car preventing it making a move to the right to avoid the truck?

https://www.youtube.com/watch?v=9I5rraWJq6E

My condolences go to Joshua Brown's family, and I'm not for a minute suggesting his death doesn't matter or count for anything, but it would be very short-sighted and reactionary to dismiss autonomous driving technology on the basis of one case where someone misused early developmental technology. After all, would we be happy to carry on killing each other through our own mistakes or silly bouts of road rage?

anonymous-user

55 months

Friday 1st July 2016
@ the Hyundai video: love the stuntman flail.

MarshPhantom

9,658 posts

138 months

Friday 1st July 2016
98elise said:
As opposed to the 84 million for the average car?
That's worldwide average so those figures take in a lot of spectacularly bad drivers.

Emeye

9,773 posts

224 months

Friday 1st July 2016
I read an article the other day that said there was a plan to remove white road markings and the centre line from roads, as drivers tend to drive slower and more cautiously when there are no white lines to follow.

How does that work with cars that follow the lines?

The Vambo

6,664 posts

142 months

Friday 1st July 2016
Emeye said:
I read an article the other day that said there was a plan to remove white road markings and the centre line from roads, as drivers tend to drive slower and more cautiously when there are no white lines to follow.

How does that work with cars that follow the lines?
Probably hypothetical solutions.

Hackney

6,856 posts

209 months

Friday 1st July 2016
Emeye said:
IKEA are recalling millions of drawer units that can tip over and hurt or kill toddlers, because everyone (myself included) thinks they know better and ignores the instructions to tether them to the wall.

Strangely, IKEA are only recalling them in the US so far, but my point is: can you blame Tesla if people ignore their instructions and kill themselves? IKEA have been forced to recall their product due to user failure, so I could see Tesla being forced to do something similar, like turning the feature off completely.

There is a lot of car industry influence in Washington, and some car companies, or oil, may want to see Tesla get a knock-down.
Hell, people sued McDonalds because the coffee was hot. People are idiots, but they have access to lawyers.

swimd

350 posts

122 months

Friday 1st July 2016
Police confirm the presence of a DVD player...


http://www.reuters.com/article/us-tesla-autopilot-...

The Vambo

6,664 posts

142 months

Friday 1st July 2016
swimd said:
Police confirm the presence of a DVD player...


http://www.reuters.com/article/us-tesla-autopilot-...
Ban DVD players now!

Jader1973

4,024 posts

201 months

Friday 1st July 2016
swimd said:
Tesla is using the semi-autonomous driving feature to collect data which will be used for the development of fully autonomous cars.

For example the "auto pilot" may have a tendency to follow a wrong or misleading road marking when it first encounters a certain point of a road. The driver then slightly nudges the car to the left or right, the algorithm learns this and will take this adjustment into account the next time a Tesla with autopilot encounters this situation. Once enough Tesla drivers have done this, the algorithm will eventually steer on the correct path.

A friend of mine owns a Tesla and told me this. He had a certain spot on his commute where the car always tried to leave the lane. He had to correct the car every time and after the 10th time it started to stay in the lane. Any Tesla owner encountering this spot now will never know that there was a problem.

Musk says Tesla is about 3 years away from fully autonomous cars, and this is why. They're using Tesla owners to build a gigantic, detailed, human-corrected dataset of roads all around the world.


Edited by swimd on Friday 1st July 10:14
But road layouts change all the time.

For example, they are widening the freeway on my way to work. To do that they have realigned the lanes, and every few weeks they tweak it depending on what they are doing. It may not be the same on Monday as it was on Friday.

So, how does a car with a system that has been taught that the road goes one way cope when the white lines go a different way? Does it ignore them? Are there mass Tesla pile-ups for days while the system relearns?

People adapt instantly. Until all cars from all manufacturers are connected to a network and can talk to each other, autonomous driving can't work properly. That brings its own problems though, like what happens when the network goes down.
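The fleet-learning loop swimd's friend describes above, where drivers' nudges at a trouble spot are pooled until the stored path updates, can be sketched roughly like this. The structure is entirely hypothetical and is not Tesla's implementation; the location names and the 10-sample threshold simply echo the anecdote in the thread:

```python
from collections import defaultdict

# Hypothetical sketch of crowd-sourced lane correction: drivers' steering
# nudges at a given spot are pooled, and once enough drivers agree, the
# stored path for that spot is adjusted. Not Tesla's actual code.
MIN_SAMPLES = 10  # the anecdote mentions ~10 corrections before it stuck

class LaneCorrectionMap:
    def __init__(self):
        self.nudges = defaultdict(list)  # location id -> lateral nudges (m)
        self.offsets = {}                # learned lateral offsets (m)

    def report_nudge(self, location, lateral_m):
        """Record that a driver corrected the car by lateral_m here."""
        self.nudges[location].append(lateral_m)
        samples = self.nudges[location]
        if len(samples) >= MIN_SAMPLES:
            # Adopt the average correction once enough drivers agree.
            self.offsets[location] = sum(samples) / len(samples)

    def planned_offset(self, location):
        return self.offsets.get(location, 0.0)

fleet = LaneCorrectionMap()
for _ in range(10):
    fleet.report_nudge("junction-42", 0.5)   # ten drivers nudge right 0.5 m

print(fleet.planned_offset("junction-42"))   # 0.5
print(fleet.planned_offset("elsewhere"))     # 0.0 (no data yet)
```

Jader1973's objection maps onto this directly: nothing here expires stale samples, so when the lane markings change, the learned offset stays wrong until enough new nudges outvote the old ones. A real system would need to time-stamp and down-weight or discard old data.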

The Vambo

6,664 posts

142 months

Saturday 2nd July 2016
Jader1973 said:
But road layouts change all the time.

For example, they are widening the freeway on my way to work. To do that they have realigned the lanes, and every few weeks they tweak it depending on what they are doing. It may not be the same on Monday as it was on Friday.

So, how does a car with a system that has been taught that the road goes one way cope when the white lines go a different way? Does it ignore them? Are there mass Tesla pile-ups for days while the system relearns?

People adapt instantly. Until all cars from all manufacturers are connected to a network and can talk to each other, autonomous driving can't work properly. That brings its own problems though, like what happens when the network goes down.
Jesus H.

You have to have the slightest idea of how the fecker works before having a go at it.

Mr Tidy

22,476 posts

128 months

Saturday 2nd July 2016
jontykint said:
Hang on... if a big f-off truck turned across you why would you either
a. Not brake
b. Expect the car to brake for you
c. Not brake, expecting the car to brake for you

Or just not see it coming, so it wasn't yours or Tesla's fault??
Not a difficult answer surely - because you were busy on ttter or FaceArse FFS!

Just get a train, bus or a taxi if you can't be A*sed with driving!

saaby93

32,038 posts

179 months

Saturday 2nd July 2016
Emeye said:
I read an article the other day that said their was a plan to remove white road markings and the centre line from roads as drivers tend to drive slower when their are no white lines to follow due to more caution.

How does that work with cars that follow the lines?
Do you get out much hehe
It's not a plan that may happen, it's something that's in operation already.
They don't need to be removed: they naturally wear out and are not replaced. Have a look around you to see how many villages and urban streets no longer have lane markings.

anonymous-user

55 months

Saturday 2nd July 2016
Watching a DVD, and six speeding tickets in eight years, if the reports are to be believed.


BMW are also developing their equivalent of Autopilot - and if they do it, pretty soon everyone else will too.