Uber driverless car in fatal accident


The Selfish Gene

5,516 posts

211 months

Tuesday 20th March 2018
DoubleD said:
The Selfish Gene said:
wow - what was the idiot monitoring behind the wheel doing?

it was only a matter of time.
How do you know that he is an idiot? Has some evidence come out about the crash?
erm, because his entire job was to sit there, do nothing unless there was a risk, and avert an accident.

He didn't perform his basic function, therefore an idiot.

I'd fire one of my testers on the spot for a lot less than that.

Edited to add - this "passive driver" wasn't driving. He was monitoring. He should be a professional tester. He should have the capacity to be looking everywhere (as he doesn't need to concentrate on driving the car at all).

Obviously though - it's Uber - so they probably found one of their usual drivers to test it... and frankly, that is negligent when the cars aren't self-driving.

It will never happen whilst we have humans in the mix.




Edited by The Selfish Gene on Tuesday 20th March 08:39

DonkeyApple

55,419 posts

170 months

Tuesday 20th March 2018
juice said:
Bit confused by this, as by all accounts there was a driver on board... How come he never intervened?
I imagine it’s actually quite hard to do so. I imagine that by the time you realise the car hasn’t actually seen the obstacle then there is little time to do much. Plus, the odds are arguably greater that the passenger was spending less time monitoring the environment outside the vehicle than if they were driving it.

I like the idea of autonomous cars. Personally I don’t believe we are anywhere near as close to cracking the complexity of driving as believers claim, but more importantly, I know that people will die during this period of evolution and I struggle to believe it is worth that. Uber also know that, but have made the decision that the profits from making all their remote staff redundant far outweigh the lives they will take getting there. But I guess they believe that self-driving taxis will ultimately kill fewer people than human-driven ones.

DonkeyApple

55,419 posts

170 months

Tuesday 20th March 2018
HTP99 said:
I wonder where the blame lies and who is charged: the person in the car? Uber? Will it be a huge corporate blame game, with some faceless director thrown under the bus by Uber?

Could be a massive can of worms.
Pedestrian's fault. They weren't where the rules said they should be; the article states several times that they weren't on the crosswalk. And that's the can of worms: outside of Canada or fanatical dictatorships, where is it that people follow all the rules all the time?

The Selfish Gene

5,516 posts

211 months

Tuesday 20th March 2018
stuttgartmetal said:
If these Uber cars were speed-limited to 10mph in residential streets, the chances of a fatal accident would be infinitesimally small.
Plus, it doesn't matter what they look like; they're utensils.
They can be covered in airbags driven by micro sensors.
It'll happen.
Uber with humans are already a total liability - stopping everywhere, checking their phone the entire time, not watching where they are going, slowing down and speeding up.

To have a self-driving car that does 10mph, getting in everyone's way, will cause more accidents as people impatiently try to get past it.

I for one would simply sabotage anything I see driving at 10mph without a driver in it.

What is the point of creating something automated that is worse, and slower, than what we have now?


Bungleaio

6,337 posts

203 months

Tuesday 20th March 2018
It's a sad loss of life but they can't give up the testing. Globally I bet there were hundreds killed on the road yesterday by human drivers.

Tuna

19,930 posts

285 months

Tuesday 20th March 2018
The Selfish Gene said:
erm because his entire job was to sit there and do nothing unless there was a risk, and avert an accident.

He didn't perform his basic function, therefore an idiot.

I'd fire one of my testers on the spot for a lot less than that.
Glad I don't work for you then.

NASA have done some interesting tests on how quickly a 'passenger' can respond if the thing doing the driving makes an incorrect decision... response time drops off massively because humans are just not able to keep attention up when someone else is doing the work. It's a serious issue (one of many) with autonomous cars, and believed to be partly to blame for the death that occurred last year.

g7jhp

6,969 posts

239 months

Tuesday 20th March 2018
DonkeyApple said:
Pedestrians fault. They weren’t where the rules said they should be. The article states several times that they weren’t on the crosswalk. And that’s the can of worms. Outside of Canada or fanatical dictatorships where is it that people follow all the rules all the time?
So if a human has to move into the road, it's OK for an autonomous car to kill them because 'normally' you shouldn't be there?

Life isn't black or white. Technology works until we hit anything out of the norm, and then it often fails badly.

gmaz

4,414 posts

211 months

Tuesday 20th March 2018
It's a big limitation of driverless cars that they cannot read body language and determine a person's intentions before they act, e.g. making eye contact and registering "I've seen you see me". Similarly, if a football rolls out from between parked cars, would a driverless car expect a child to run out to retrieve it?

RobDickinson

31,343 posts

255 months

Tuesday 20th March 2018
What death last year?


IMO, Level 4 should be the legal responsibility of the company making it. In this case the software, car and driver are all Uber's, so it's not that tricky - but what liability is the safety driver exposed to?

Level 5 doesn't even have physical controls.

It could well have been unavoidable due to the actions of the pedestrian, but that will come out at some point soon.

phil4

1,217 posts

239 months

Tuesday 20th March 2018
Tuna said:
Glad I don't work for you then.

NASA have done some interesting tests on how quickly a 'passenger' can respond if the thing doing the driving makes an incorrect decision... response time drops off massively because humans are just not able to keep attention up when someone else is doing the work. It's a serious issue (one of many) with autonomous cars, and believed to be partly to blame for the death that occurred last year.
From personal experience I agree... and as someone has already said, we can't expect to sit someone in a seat, give them nothing to do except "watch out" for something happening, and have them react promptly when it does.

I think the issue is less something we need to solve technically, and more something we should ensure is understood. In everyday life there are lots of humans used as supervisors for tasks of all sorts; the less interaction they have, the worse they'll be at it. We're at the extreme end of no interaction. If we expect a supervising human to eliminate problems, we're simply mistaken; we need to accept that and have the tech stand on its own two feet... or not.

forzaminardi

2,290 posts

188 months

Tuesday 20th March 2018
It's sad obviously for the lady involved, but there's nothing newsworthy about someone getting run over by a taxi other than the fact it was an autonomous driving vehicle. As someone earlier posted, the issue is that much as AI is amazing and the technology deployed in these vehicles will be amazingly sensitive and clever, they will always be subject to the vagaries of human behaviour - and short of having everyone in the world sign some sort of release form agreeing to obey every trivial rule of the road, the legalities of who is responsible for events like this can only be massively difficult to resolve.
I personally think the legal, insurance and practical difficulties of autonomous driving will mean it is either so regulated (i.e. the car can drive autonomously, but must be subject to override by a conscious, sober, qualified driver at any time) or so restricted in its application (i.e. may only drive autonomously on, say, smart motorways) that for 99% of people it's pointless 99% of the time. I enjoy driving, even in heavy traffic, but I can see that having the chance to dash off some work emails or play a game while the car drives itself to work, or have the car drive me home from the pub after a few drinks, would be useful. But I can't see the law ever letting me do so, even if the technology makes it genuinely possible.

kambites

67,593 posts

222 months

Tuesday 20th March 2018
I wonder if the "driver" will be prosecuted for causing death by dangerous driving?

DonkeyApple

55,419 posts

170 months

Tuesday 20th March 2018
g7jhp said:
DonkeyApple said:
Pedestrians fault. They weren’t where the rules said they should be. The article states several times that they weren’t on the crosswalk. And that’s the can of worms. Outside of Canada or fanatical dictatorships where is it that people follow all the rules all the time?
So if a human has to move into the road it's OK for an Autonomus car to kill them because 'normally' you shouldn't be there.

Life isn't black or white. Technology works until we have anything out of the norm and then if often fails badly.
Yup. That’s the point I was making. A robot works by a set of programmed rules; humans don’t. Humans aren’t black or white, robots are. And that’s the mismatch that will make autonomous cars safer in some regards and less safe in others.

Personally, I’ll only believe autonomous cars actually work when they can drive through the West End and the City without either being held up by pedestrians who know the vehicle will give way to them, or injuring a pedestrian who isn’t living in the black-and-white world of law or binary code.

Vanden Saab

14,139 posts

75 months

Tuesday 20th March 2018
peterperkins said:
Let's say for argument by 2040 driverless cars reduce road deaths by 50%, is that ok?


The liability question is of course interesting and might take legislation to prevent class action suits crippling the industry before it has a chance to show long term benefits. People are naturally short sighted and emotionally attached when it's one of their own who has bought it in an accident. The greater good might be limited liability and money for research.
Only if we accept that we are happy for machines to kill humans for the greater good....are you?

So not only can machines kill humans but they can do so with no sanctions or financial penalties....

My whole point is that as a human you can accept another human accidentally killing someone, and you can see that it is different if it is deliberate to some degree, i.e. drunk driving. In the case of death or serious injury, at least the relatives are compensated for their loss.
Maybe I am just old, but I do not accept that it is OK for a machine to kill a human without penalty because 3, 5 or even 5,000 have been saved. For me this is the Hiroshima argument and a very slippery slope.

aeropilot

34,680 posts

228 months

Tuesday 20th March 2018
cptsideways said:
Along with an observer/driver, does make you wonder what they were observing.
Observer was probably sitting playing with their phone.....

So, no different to most US drivers that are supposedly in charge of all the non-autonomous cars over there..... rolleyes

Eric Mc

122,058 posts

266 months

Tuesday 20th March 2018
All of that is academic.

Is not the whole point of an autonomous car the fact that it liberates the humans on board from responsibility?

If they don't do that, what is their purpose?

I had visions of people travelling on long overnight journeys in their autonomous motor homes, safely asleep in their bunk bed whilst the car safely conducts its journey smile.

Or popping down the back to watch telly smile

akirk

5,395 posts

115 months

Tuesday 20th March 2018
RobDickinson said:
An AI driver making a mistake will have all the data available and it will be fixed so none of those AI drivers ever make that mistake again.
Possibly, and for some scenarios, but not all...

code is written by a human. there is some ridiculously naive thinking around autonomous cars being computers and therefore better than humans - any machine is only coded by a human, or worse still by a set of humans, each of whom makes assumptions / forgets to consider things / overlooks bits / leaves something on their todo list / etc.! They are not truly AI-capable - they can only learn programmatically and within the confines of how they are coded - and having been in coding and IT for 30+ years, I have yet to see a single set of code without issues...

if companies such as microsoft and apple are unable to put out code without bugs after their decades of experience - what hope is there for new companies?

and there is a big assumption that it is fixable - this was a basic scenario - a pedestrian crossing the road, with one complication: night-time. If the coders have not already dealt with that scenario and played out every possibility then they should be fired! If they have, then they will look at code which should have spotted the situation and didn't - it may not be easy to fix, and if assumptions were made initially, you can guarantee that more assumptions will be made in the fix - and like any big coding project, there is a strong chance that in fixing one issue, others will be created...

if planes can not be put into commercial service autonomously yet (and their journey is simple - point A to point B with precisely known parameters, no random pedestrians and no conflicting traffic) then how will a car be expected to succeed?

if trains can not be fully autonomous without oversight / guarantees of absolutely zero issues (and their journey is even easier as they are tied to a rail!), then how can we expect cars to succeed...

the blunt truth is that we are still a long way from full autonomous motoring capabilities - with huge corporates wanting to make a lot of money using their PR machines to tell us differently - and governments scared to miss out, taking that PR at face value. We have systems in place that are active and live, yet do not work to the claimed 100% (e.g. auto-braking / adaptive cruise control / etc.), we are being told that everything is sorted and perfect, that humans are not as good as machines, yet there is no evidence to actually support that...

people defend the autonomous car having killed someone by saying:
- a driver wouldn't have been able to react as quickly
- how can you expect the person behind the wheel to stay engaged

well - a good driver should have been looking ahead, spotting pedestrians and asking what if - and potentially slowing down, i.e. they should be predictive, not reactive - as should the autonomous car - but clearly there are issues at times with humans, and here with the computer... If the person behind the wheel can't stay engaged, then the testing should not take place - otherwise it is no different to sending out an autonomous car with no-one behind the wheel - and the technology is not ready for that yet...

a clear case of corporate manslaughter
and an even clearer case of a market puffing up the emperor's new clothes...
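The "predictive, not reactive" driving described above can be sketched as a toy heuristic: slow down pre-emptively when a pedestrian near the road edge *could* reach the lane before the car passes, rather than braking only once they step out. All names, numbers and thresholds here are hypothetical illustration, not anything Uber actually runs:

```python
# Toy sketch (hypothetical names and thresholds): predictive speed
# control for one tracked pedestrian near the carriageway.

def precautionary_speed_ms(speed_ms: float,
                           ped_lateral_offset_m: float,
                           ped_distance_ahead_m: float) -> float:
    """Return a (possibly reduced) speed in metres/second."""
    # Time until the car draws level with the pedestrian at current speed.
    time_to_reach_s = ped_distance_ahead_m / max(speed_ms, 0.1)
    walking_speed_ms = 1.5  # assumed brisk walking pace (an assumption)
    # If the pedestrian could plausibly reach our lane in that time,
    # ease off now instead of waiting to react to them stepping out.
    if ped_lateral_offset_m < walking_speed_ms * time_to_reach_s:
        return speed_ms * 0.5
    return speed_ms
```

A reactive controller would only brake once `ped_lateral_offset_m` hit zero; the point of the sketch is that the slowdown happens while the conflict is still merely possible.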

J4CKO

41,637 posts

201 months

Tuesday 20th March 2018
Every technological leap humans make involves loss of life - space exploration, aviation etc. Inevitably, as things are improved, some people will be killed, but eventually the work done will save and improve lives.

I wonder how many road deaths there were in Arizona on the 19th of March? A quick Google shows that in 2016 there were a total of 856, or 2.34 fatal accidents per day on average.

The tech isn't there yet, as this proves, but what annoys me is the "there, I told you so, it's a bad idea" mentality, as so many believe they can do it better. Maybe some can, right now, but as time and technology move on the bar will get higher and higher, and human drivers won't be able to out-drive the autonomy. It is a bit scary for those of us who have never experienced it, and we like driving - I get that - but personally I think it will be of major benefit for a lot of people who can't drive, don't like driving, are tired, or are otherwise happy for the car to take over.

I suspect, eventually, in a lot of cases it will be like ESP and ABS, where you drive and the autonomy keeps a watching brief and only intervenes when it reckons you are about to screw up.

Obviously it is tragic for the lady killed, but then, in 2016 it was tragic for 856 other people and their families.

However, because it is different and new, I suspect it may well be treated as something to be vilified despite huge amounts of stats indicating human drivers are creating carnage daily. I draw parallels with other debates where perceived and actual risk are massively skewed and the answer may be "ban it", whilst death and destruction continues unabated for other reasons that everyone seems comfortable with.


BMWBen

4,899 posts

202 months

Tuesday 20th March 2018
gmaz said:
It's a big limitation of driverless cars that they cannot read body language and determine a person's intentions before they act, e.g. making eye contact and registering "I've seen you see me". Similarly, if a football rolls out from between parked cars, would a driverless car expect a child to run out to retrieve it?
The first part is something that humans often get wrong and is only an indicator. I've had people pull out in front of me when I'm looking them right in the eye. Regardless, there's no reason why a computer with sensors couldn't do this better than a human by tackling it differently.

You use "have they looked at me" as a proxy measurement for "are they going to pull out". You could equally look at the situation from their perspective: what other vehicles are coming from which directions, how long they have been stationary, how long they have had to observe their surroundings - and work out whether it's likely that they've correctly assessed the situation.

On the second part, that's exactly what computers are good at. Set of inputs (road speed, type of road, amount of peripheral visibility etc), logical test, output.
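That "set of inputs, logical test, output" shape really is a natural fit for code. A toy sketch of the football scenario (every name and threshold here is hypothetical, purely to illustrate the shape of the rule, not any real vehicle's logic):

```python
# Toy sketch (hypothetical names/thresholds): a rule-based hazard check
# of the kind described above - inputs in, logical test, output out.

def ball_hazard_speed_mph(current_speed_mph: float,
                          ball_detected: bool,
                          parked_cars_nearby: bool,
                          residential_road: bool) -> float:
    """Return a target speed given simple hazard inputs.

    A rolling ball near parked cars on a residential road is treated
    as a proxy for "a child may follow it", so the car slows sharply.
    """
    if ball_detected and parked_cars_nearby and residential_road:
        return min(current_speed_mph, 5.0)   # crawl past the hazard
    if ball_detected:
        return min(current_speed_mph, 15.0)  # slow, but less drastically
    return current_speed_mph                 # no hazard inferred

# e.g. ball_hazard_speed_mph(30.0, True, True, True) -> 5.0
```

Real systems would obviously use learned models over many more inputs, but the point stands: the inference "ball implies possible child" can be encoded explicitly, and the machine will then apply it every single time, which a distracted human won't.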

Eric Mc

122,058 posts

266 months

Tuesday 20th March 2018
It is a flawed and wasteful project that will not be feasible until vehicles as we know them no longer exist and all wheeled traffic runs on rails or in sealed routes where humans (pedestrians, cycles etc) are not allowed.

Only then will full autonomy for wheeled transport be realistic.