Fatal Tesla crash, software based issue?

swimd

350 posts

120 months

Friday 1st July 2016
quotequote all
aeropilot said:
So......given all of the above, what's the point of it?
Tesla is using the semi-autonomous driving feature to collect data which will be used for the development of fully autonomous cars.

For example the "auto pilot" may have a tendency to follow a wrong or misleading road marking when it first encounters a certain point of a road. The driver then slightly nudges the car to the left or right, the algorithm learns this and will take this adjustment into account the next time a Tesla with autopilot encounters this situation. Once enough Tesla drivers have done this, the algorithm will eventually steer on the correct path.

A friend of mine owns a Tesla and told me this. He had a certain spot on his commute where the car always tried to leave the lane. He had to correct the car every time and after the 10th time it started to stay in the lane. Any Tesla owner encountering this spot now will never know that there was a problem.

Musk says Tesla is about 3 years away from fully autonomous cars, and this is why: they're using Tesla owners to build a gigantic, detailed, human-corrected dataset of roads all over the world.
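A toy sketch of that crowd-correction loop (every name and number here is hypothetical; Tesla's actual pipeline isn't public): log each manual nudge against a road segment, and only apply the averaged correction once enough drivers have made the same one.

```python
from collections import defaultdict

class LaneCorrectionStore:
    """Toy model: pool driver steering corrections per road segment."""

    def __init__(self, min_samples=10):
        self.min_samples = min_samples        # nudges needed before trusting a spot
        self.corrections = defaultdict(list)  # segment_id -> list of offsets in metres

    def record(self, segment_id, offset_m):
        """Log one manual nudge (negative = driver steered left)."""
        self.corrections[segment_id].append(offset_m)

    def adjustment(self, segment_id):
        """Averaged correction once enough drivers agree, otherwise no change."""
        samples = self.corrections[segment_id]
        if len(samples) < self.min_samples:
            return 0.0
        return sum(samples) / len(samples)

store = LaneCorrectionStore(min_samples=10)
for _ in range(10):                  # ten drivers nudge half a metre left at one spot
    store.record("A1-km42", -0.5)
print(store.adjustment("A1-km42"))   # -0.5: the spot is now corrected for everyone
print(store.adjustment("M25-km3"))   # 0.0: unseen segment, no adjustment yet
```

This matches the anecdote above: nothing changes until the 10th correction, after which every later "driver" gets the fix.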


Edited by swimd on Friday 1st July 10:14

RobDickinson

31,343 posts

253 months

Friday 1st July 2016
quotequote all
Someone want to guess a percentage of people who actually enjoy spending their time doing the work commute twice a day?

2%? 3%?

I love driving, but most people's driving, most of the time, is drudgery that can and will one day be automated.

aeropilot

34,302 posts

226 months

Friday 1st July 2016
quotequote all
swimd said:
aeropilot said:
So......given all of the above, what's the point of it?
Tesla is using the semi-autonomous driving feature to collect data which will be used for the development of fully autonomous cars.

For example the "auto pilot" may have a tendency to follow a wrong or misleading road marking when it first encounters a certain point of a road. The driver then slightly nudges the car to the left or right, the algorithm learns this and will take this adjustment into account the next time a Tesla with autopilot encounters this situation. Once enough Tesla drivers have done this, the algorithm will eventually steer on the correct path.

A friend of mine owns a Tesla and told me this. He had a certain spot on his commute where the car always tried to leave the lane. He had to correct the car every time and after the 10th time it started to stay in the lane. Any Tesla owner encountering this spot now will never know that there was a problem.

Musk says Tesla is about 3 years away from fully autonomous cars, and this is why: they're using Tesla owners to build a gigantic, detailed, human-corrected dataset of roads all over the world.


Edited by swimd on Friday 1st July 10:14
Gawd 'elp us.........



TheInternet

4,703 posts

162 months

Friday 1st July 2016
quotequote all
Emeye said:
Elon Musk replied to a tweet this morning saying that a Tesla cannot scan up high for the side of trucks as it would see overhead traffic gantries and bridges as an obstruction.
I hope you've paraphrased the tweet because the reason above sounds ridiculous.

Rovinghawk

13,300 posts

157 months

Friday 1st July 2016
quotequote all
EricE said:
- 130 million miles without a fatal incident is remarkable
What about 130 million miles WITH a fatal accident, as is actually the case?

The Wookie

13,909 posts

227 months

Friday 1st July 2016
quotequote all
Rovinghawk said:
EricE said:
- 130 million miles without a fatal incident is remarkable
What about 130 million miles WITH a fatal accident, as is actually the case?
Fag packet Wikipedia stasticamalising gives me about 75% of the distance an average UK driver would have to cover before being killed

Jader1973

3,946 posts

199 months

Friday 1st July 2016
quotequote all
swimd said:
aeropilot said:
So......given all of the above, what's the point of it?
Tesla is using the semi-autonomous driving feature to collect data which will be used for the development of fully autonomous cars.

For example the "auto pilot" may have a tendency to follow a wrong or misleading road marking when it first encounters a certain point of a road. The driver then slightly nudges the car to the left or right, the algorithm learns this and will take this adjustment into account the next time a Tesla with autopilot encounters this situation. Once enough Tesla drivers have done this, the algorithm will eventually steer on the correct path.

A friend of mine owns a Tesla and told me this. He had a certain spot on his commute where the car always tried to leave the lane. He had to correct the car every time and after the 10th time it started to stay in the lane. Any Tesla owner encountering this spot now will never know that there was a problem.

Musk says Tesla is about 3 years away from fully autonomous cars, and this is why: they're using Tesla owners to build a gigantic, detailed, human-corrected dataset of roads all over the world.


Edited by swimd on Friday 1st July 10:14
And if a few puny humans get killed in the process of creating his master database then surely there is no issue.

Has anyone checked he isn't a Terminator sent back from the future to start the war?

David87

6,648 posts

211 months

Friday 1st July 2016
quotequote all
Have read that the driver was apparently watching Harry Potter just before the accident happened and that, I think, is the problem with these systems (especially one called Autopilot). Yes, they're very good, but they aren't perfect and you do need to keep an eye on what's going on. Chances are you could drive down the road with your eyes closed day after day and you'd be fine, but then one day, without warning, you might not be.

dave_s13

13,813 posts

268 months

Friday 1st July 2016
quotequote all
Emeye said:
IKEA are recalling millions of drawer units that can fall over and hurt or kill toddlers, because everyone thinks they know better (myself included) and ignores the instructions to tether them to the wall.
A job for me this weekend.

AH33

2,066 posts

134 months

Friday 1st July 2016
quotequote all
s3fella said:
It should be banned, it is absurd. And if you don't want to actively steer or drive your car, take the fking bus.
A million times this.

AH33

2,066 posts

134 months

Friday 1st July 2016
quotequote all
RobDickinson said:
I love driving, but most people's driving, most of the time, is drudgery that can and will one day be automated.
The bus? The train? A taxi? Carpooling? Uber?

rxe

6,700 posts

102 months

Friday 1st July 2016
quotequote all
EricE said:
Tragic. We'll have to wait for the investigation to end but from what I've read I can't really fault the Tesla.

- it's clearly stated as the beta version of a fancy cruise control, not an autonomous driving option
- 130 million miles without a fatal incident is remarkable
- visibility and circumstances were very bad
- press reports that the driver of the truck stated that there was a harry potter movie playing in the car after the fatal crash

IMV the only thing that needs to happen out of this is that car manufacturers make it much harder to bypass the "presence detection" of cars with ACC and lane assist.

I don't know how the Tesla works, but I've seen a video of a guy attaching a can of soda to the steering wheel of a new S-Class and climbing from the driver's seat to the rear seat while on the Autobahn!

Here's another one like this: https://www.youtube.com/watch?v=6hKIDHisdjc
One fatality per 130 million miles sounds good, but in the UK there are 311 billion miles covered per annum with 1,700 fatalities (2014 figures), so humans (in the UK at least) are running at one fatality per roughly 183 million miles. And remember that the humans are driving on more dangerous rural roads rather than cherry-picking the safer motorway miles.

The technology is clever, but there is a fundamental issue of control here. We already know that humans will do stupid things: they will text while driving, even though they should be 100% in control of the car. This technology gives them the opportunity to do even more daft things (e.g. watch Harry Potter movies), and in the main they will get away with it. Until, of course, they don't.
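The fag-packet arithmetic in the comparison above can be reproduced in a few lines, using the 2014 UK figures quoted (the Autopilot number is Tesla's own claim):

```python
uk_miles = 311e9         # UK vehicle miles driven per annum (2014)
uk_fatalities = 1700     # UK road deaths (2014)
autopilot_miles = 130e6  # miles per fatality claimed for Autopilot

uk_miles_per_fatality = uk_miles / uk_fatalities
print(f"UK average: one fatality per {uk_miles_per_fatality / 1e6:.0f} million miles")
# -> one fatality per 183 million miles

share = autopilot_miles / uk_miles_per_fatality
print(f"Autopilot's figure is {share:.0%} of the UK average distance per fatality")
# -> 71%, close to The Wookie's 'about 75%' fag-packet estimate earlier
```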

trickywoo

11,706 posts

229 months

Friday 1st July 2016
quotequote all
rxe said:
One fatality per 130 million miles sounds good, but in the UK there are 311 billion miles covered per annum with 1,700 fatalities (2014 figures), so humans (in the UK at least) are running at one fatality per roughly 183 million miles. And remember that the humans are driving on more dangerous rural roads rather than cherry-picking the safer motorway miles.
Good post.

Devil2575

13,400 posts

187 months

Friday 1st July 2016
quotequote all
98elise said:
SuperVM said:
I work in an industry where it is not sufficient to provide a user with a mechanism to do something and then simply tell them they shouldn't use it under particular circumstances. If there are circumstances under which something shouldn't be done, then controls should be put in place to prevent it from being done in those circumstances. The consequences of not putting such controls in place, if the mechanism were then used in the wrong circumstances, would be pretty severe for the individual who did so, my company and me. I can't see how Tesla can roll out a system and then just expect everyone to behave sensibly, as it simply won't happen.
How do they cope with driving cars? There are many things a car can do, and a whole host of rules saying you shouldn't do particular things. There is very little in the way of controls to stop you.
I was just thinking the same thing.

lost in espace

6,136 posts

206 months

Friday 1st July 2016
quotequote all
I just went for a test drive in a P90D at Essendon Country Club. Tried autopilot on the A1, worked fine and I am not dead.

xjay1337

15,966 posts

117 months

Friday 1st July 2016
quotequote all
lost in espace said:
I just went for a test drive in a P90D at Essendon Country Club. Tried autopilot on the A1, worked fine and I am not dead.
Naturally

anonymous-user

53 months

Friday 1st July 2016
quotequote all
trickywoo said:
rxe said:
One fatality per 130 million miles sounds good, but in the UK there are 311 billion miles covered per annum with 1,700 fatalities (2014 figures), so humans (in the UK at least) are running at one fatality per roughly 183 million miles. And remember that the humans are driving on more dangerous rural roads rather than cherry-picking the safer motorway miles.
Good post.
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?

rxe

6,700 posts

102 months

Friday 1st July 2016
quotequote all
Spumfry said:
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?
Feel free to do so.

My point is simply that 1 in 130 million is a great soundbite, but it's roughly what humans are already managing today. American drivers might be a bit worse, I have no idea. If humans were experiencing fatalities at a rate of 1 in a million, 1 in 130 million would be a big deal.

944fan

4,962 posts

184 months

Friday 1st July 2016
quotequote all
RobDickinson said:
Someone want to guess a percentage of people who actually enjoy spending their time doing the work commute twice a day?

2%? 3%?

I love driving but most peoples driving, most of the time, is drudgery that can and will one day be automated.
It used to be a drudge, then I bought a V10 S6. Now it's fking epic.

944fan

4,962 posts

184 months

Friday 1st July 2016
quotequote all
rxe said:
Spumfry said:
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?
Feel free to do so.

My point is simply that 1 in 130 million is a great soundbite, but it's roughly what humans are already managing today. American drivers might be a bit worse, I have no idea. If humans were experiencing fatalities at a rate of 1 in a million, 1 in 130 million would be a big deal.
Also, how many fatalities involving human drivers are down to stupidity and recklessness?

A computer following a specific set of instructions will not suddenly decide to overtake on a blind corner or boot it round a roundabout in the wet.

I don't think AI will ever completely remove all fatalities from driving, because the human brain is far more advanced than any computer (well, not everyone's), but it will remove the knob-head factor from these things.