Fatal Tesla crash, software based issue?
Discussion
aeropilot said:
So......given all of the above, what's the point of it.
Tesla is using the semi-autonomous driving feature to collect data which will be used for the development of fully autonomous cars. For example, the "auto pilot" may have a tendency to follow a wrong or misleading road marking when it first encounters a certain point of a road. The driver then slightly nudges the car to the left or right, the algorithm learns this and takes the adjustment into account the next time a Tesla with Autopilot encounters that situation. Once enough Tesla drivers have done this, the algorithm will eventually steer on the correct path.
A friend of mine owns a Tesla and told me this. There was a certain spot on his commute where the car always tried to leave the lane. He had to correct the car every time, and after the 10th time it started to stay in the lane. Any Tesla owner encountering that spot now will never know there was a problem.
Musk says Tesla is about 3 years away from fully autonomous cars, and this is why: they're using Tesla owners to build a gigantic, detailed, human-corrected dataset of roads all over the world.
Edited by swimd on Friday 1st July 10:14
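The "fleet learning" idea described above can be sketched in a few lines. This is purely illustrative — the location name, the offset units and the 10-report threshold (taken from the anecdote in the post) are all made up, and none of it reflects Tesla's actual implementation:

```python
# Toy sketch of crowd-sourced lane corrections: each manual nudge at a known
# road location is logged, and once enough drivers have corrected the same
# spot, the averaged offset is applied. Not Tesla's real system.
from collections import defaultdict

MIN_REPORTS = 10  # assumed threshold; the post mentions ~10 corrections

corrections = defaultdict(list)  # road location -> list of steering offsets (metres)

def log_correction(location, offset_m):
    """Record one driver's manual nudge at a given location."""
    corrections[location].append(offset_m)

def planned_offset(location):
    """Return the learned lane offset, or 0.0 until enough data exists."""
    samples = corrections[location]
    if len(samples) < MIN_REPORTS:
        return 0.0
    return sum(samples) / len(samples)

# Ten drivers nudge the car ~0.5 m left at the same misleading marking:
for _ in range(10):
    log_correction("hypothetical-junction", -0.5)

print(planned_offset("hypothetical-junction"))  # -0.5 once the threshold is met
print(planned_offset("somewhere-else"))         # 0.0, no data yet
```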
Rovinghawk said:
EricE said:
- 130 million miles without a fatal incident is remarkable
What about 130 million miles WITH a fatal accident, as is actually the case?
swimd said:
Musk says Tesla is about 3 years away from fully autonomous cars and this is why. They're using Tesla owners to build a gigantic, detailed and humanly corrected dataset of all roads over the world.
Has anyone checked he isn't a Terminator sent back from the future to start the war?
I've read that the driver was apparently watching Harry Potter just before the accident happened, and that, I think, is the problem with these systems (especially one called Autopilot). Yes, they're very good, but they aren't perfect and you do need to keep an eye on what's going on. Chances are you could drive down the road with your eyes closed day after day and you'd be fine, but then one day, without warning, you might not be.
EricE said:
Tragic. We'll have to wait for the investigation to end but from what I've read I can't really fault the Tesla.
- it's clearly stated as the beta version of a fancy cruise control, not an autonomous driving option
- 130 million miles without a fatal incident is remarkable
- visibility and circumstances were very bad
- press reports say the driver of the truck stated that a Harry Potter movie was playing in the car after the fatal crash
IMV the only thing that needs to happen out of this is that car manufacturers make it much harder to bypass the "presence detection" of cars with ACC and lane assist.
I don't know how the Tesla works but I've seen a video of a guy attaching a can of soda to the steering wheel of a new S-Class and climbing from the driver's seat to the rear seat while on the Autobahn!
Here's another one like this: https://www.youtube.com/watch?v=6hKIDHisdjc
One fatal per 130 million sounds good, but in the UK there are 311 billion miles covered per annum with 1700 fatalities (2014 figures) - so the humans (in the UK at least) are running at one fatality per 182 million miles. And remember that the humans are driving on more dangerous rural roads rather than cherry-picking motorway miles, which are safer.
The technology is clever, but there is a fundamental issue of control here. We already know that humans will do stupid things: they will text while driving, even though they should be 100% in control of the car. This technology gives them the opportunity to do even more daft things (e.g. watch Harry Potter movies), and in the main they will get away with it. Until, of course, they don't.
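The rate comparison above can be sanity-checked directly. All figures come from the post itself (UK 2014 mileage and fatalities, and the widely reported ~130 million Autopilot miles); the exact quotient comes out nearer 183 million than the post's rounded 182 million:

```python
# Checking the fatality-rate arithmetic quoted above (figures from the post).
uk_miles = 311e9                  # miles driven per year in the UK (2014)
uk_fatalities = 1700              # road deaths in the same year
tesla_miles_per_fatality = 130e6  # one fatal crash in ~130 million Autopilot miles

uk_miles_per_fatality = uk_miles / uk_fatalities
print(round(uk_miles_per_fatality / 1e6))  # ~183 million miles per fatality
print(uk_miles_per_fatality > tesla_miles_per_fatality)  # True: the human baseline is better
```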
rxe said:
One fatal per 130 million sounds good, but in the UK there are 311 billion miles covered per annum with 1700 fatalities (2014 figures) - so the humans (in the UK at least) are running at one fatality per 182 million miles. And remember that the humans are driving on more dangerous rural roads rather than cherry picking motorway miles which are safer.
Good post.
98elise said:
SuperVM said:
I work in an industry where it is not sufficient to provide a user with a mechanism to do something and then simply tell them they shouldn't use it under particular circumstances. If there are circumstances under which a user shouldn't be doing something, then controls should be put in place to prevent it from being used in those circumstances. The consequences of not putting such controls in place, and the mechanism then being used in the wrong circumstances, would be pretty severe for the individual who did so, for my company and for me. I can't see how Tesla can roll out a system and then just expect all people to behave sensibly, as it simply won't happen.
How do they cope with driving cars? There are many things a car can do, and a whole host of rules saying you shouldn't do particular things. There is very little in the way of controls to stop you.
trickywoo said:
rxe said:
One fatal per 130 million sounds good, but in the UK there are 311 billion miles covered per annum with 1700 fatalities (2014 figures) - so the humans (in the UK at least) are running at one fatality per 182 million miles. And remember that the humans are driving on more dangerous rural roads rather than cherry picking motorway miles which are safer.
Good post.
Spumfry said:
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?
Feel free to do so.
My point is simply that 1 in 132 million is a great soundbite, but it is actually what humans are doing today. American drivers might be a bit worse, I have no idea. If humans were experiencing fatalities at a rate of 1 in a million, 1 in 132 million would be a big deal.
RobDickinson said:
Someone want to guess a percentage of people who actually enjoy spending their time doing the work commute twice a day?
2%? 3%?
I love driving, but most people's driving, most of the time, is drudgery that can and will one day be automated.
It used to be a drudge, then I bought a V10 S6. Now it's fking epic.
rxe said:
Spumfry said:
But why not compare it to Swedish fatality rates, or to Indian or Chinese fatality rates? Or worldwide rates as a whole?
Feel free to do so.
My point is simply that 1 in 132 million is a great soundbite, but it is actually what humans are doing today. American drivers might be a bit worse, I have no idea. If humans were experiencing fatalities at a rate of 1 in a million, 1 in 132 million would be a big deal.
A computer following a specific set of instructions will not suddenly decide to overtake on a blind corner or boot it round a roundabout in the wet.
I don't think AI will ever completely remove all fatalities from driving, because the human brain is far more advanced than any computer (well, not everyone's), but it will remove the knob head factor from these things.