RE: PH Blog: blind faith


renrut

1,478 posts

205 months

Monday 19th December 2011
quotequote all
Only one thing to ask in this thread - in a fully automated car, who is responsible when something does go wrong (and as MaxTorque points out, it always will at some point) and it runs over five kids at a pedestrian crossing?

Is it the human 'driver', the car itself, the programmers who wrote the software, or the directors of the company that manufactured the car?

I suspect no one will dare answer this until it ends up in court. The manufacturers will probably try to cover themselves with weasel words in the owner's manual while at the same time trying to convince everyone in adverts that it's perfectly safe to let the car get on with the driving while you get pissed up and sleep in the back on a long journey.

Blown2CV

28,819 posts

203 months

Monday 19th December 2011
quotequote all
Very interesting point, especially as there will be a time when there is a mixture of automated and human-driven cars on the roads. Humans do an infinite number of illogical and unpredictable things on the road, and largely this is what makes driving dangerous. The media are always banging on (when it suits) that speed kills, but often it's the reduced time to react to someone else's stupid behaviour that causes the accident when going fast, e.g. the idiot in front pulling out without looking or indicating.

There will be some accidents that an automated system (or a person) would never be able to avoid. Imagine the car driving itself, bowling along at 70mph with your hands behind your head or while you read a paper, and the car in front gets a blowout. Then what? You could argue that the system should account for every possible thing that can go wrong on the roads, but I liken this to the falling piano - you can't predict everything. Sometimes you come round a bend doing an acceptable speed and there's a tramp bumming a sheep in the middle of the road (or whatever).

I can only imagine we will all need to have black box data recorders to help judge whether an accident was avoidable or not, or whether there was a fault or something.
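
For what it's worth, the recorder part is trivial - a rough sketch in Python (the names, fields and the 30-second window are all my own invention, not any real EDR spec):

    from collections import deque
    import time

    class BlackBoxRecorder:
        """Keep only the last window_s seconds of driving data, so the
        moments leading up to a crash can be handed to investigators."""

        def __init__(self, window_s=30.0, sample_hz=10):
            # Ring buffer: old samples fall off the front automatically.
            self.samples = deque(maxlen=int(window_s * sample_hz))

        def record(self, speed_mph, steering_deg, brake_pct, throttle_pct):
            self.samples.append({
                "t": time.time(),
                "speed_mph": speed_mph,
                "steering_deg": steering_deg,
                "brake_pct": brake_pct,
                "throttle_pct": throttle_pct,
            })

        def dump(self):
            # Called on airbag deployment or heavy deceleration:
            # freeze the buffer and hand over the recent history.
            return list(self.samples)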

ploz

89 posts

229 months

Monday 19th December 2011
quotequote all
WRT blame in the case of an accident - the strange thing is that it will be easier for the manufacturers/programmers/operators to avoid blame, because the reactions of the autonomous system to any eventuality will be predictable and modellable. Therefore, in court, it will be the word of a fallible human against a printout of repeatable and demonstrable facts that even the best lawyer will have difficulty tearing apart. The likelihood of 'knock for knock' or 'no blame' claims is much reduced.

It is not beyond the realms of possibility that manufacturers will be able to insure against system failures causing accidents (and don't forget how clever they are at avoiding warranty claims), allowing them to take the risk of putting autonomous cars on the road.

All the analysis we have done so far suggests that large trucks will be the first to go autonomous, as the potential operating cost savings (absolutely optimal driving technique, and the ability to drive all night at a calculated speed so as to arrive exactly on time) will outweigh the additional risk and costs in a large fleet.
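
The 'calculated speed' bit is simple arithmetic - a toy illustration (the function name and the 40mph floor are my own assumptions; 56mph is the usual UK HGV limiter):

    def cruise_speed_mph(miles_remaining, hours_until_slot,
                         min_mph=40.0, max_mph=56.0):
        # Pick the slowest sensible speed that still hits the delivery
        # slot, capped at the limiter.
        if hours_until_slot <= 0:
            return max_mph  # already late: run at the limiter
        needed = miles_remaining / hours_until_slot
        return min(max(needed, min_mph), max_mph)

    # 180 miles to cover with 4 hours in hand: cruise at 45mph rather
    # than 56mph, saving fuel and arriving exactly on time.
    print(cruise_speed_mph(180, 4))  # 45.0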

Scuffers

20,887 posts

274 months

Monday 19th December 2011
quotequote all
I think the problem here is having automated cars sharing road space with people-driven cars; that's just never going to work.

Google (and others) have been working on automated cars for years and they still have a shedload of work to do, and the rules of the road are going to have to change to make this happen. Overtaking will be the first thing to go (no way can an automated car deal with the permutations of both having to overtake and having cars overtaking around it).

Yes, we have autopilots, but their problem is nothing like as hard: air traffic is well defined and nothing like as crowded.

ploz

89 posts

229 months

Monday 19th December 2011
quotequote all
Overtaking could become significantly safer! An autonomous car (or lorry) overtaking another is likely to be able to network with, and use, the leading vehicle's sensors - providing considerably more information on which to base the overtake decision. And by simple reference to digital maps, the autonomous vehicle is unlikely to decide to overtake on a road where it knows its sensors will have blind spots (dips or a blind curve; plus it will be able to factor in junctions), which many a human driver fails to take into account. Sure, it will be slightly more difficult until there are a significant number of autonomous vehicles networking and sharing information on the move, but experience seems to show that real advantages start to appear once penetration reaches no more than about 15% - for trucks on trunk roads, that could be quite an easy target to hit.
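
To make that concrete, the core decision might boil down to something like this (purely a sketch - every name and number here is invented, and a real system would be vastly more involved):

    def safe_to_overtake(own_clear_m, lead_clear_m, map_blind_spot,
                         required_clear_m=400.0):
        # Refuse outright where the digital map says our sensors will
        # go blind (dips, blind curves, junctions ahead).
        if map_blind_spot:
            return False
        # Networking with the leading vehicle means we effectively see
        # as far as whichever of us sees further.
        visible_m = max(own_clear_m, lead_clear_m)
        return visible_m >= required_clear_m

    # Our own radar sees 250m, but the truck we're passing reports
    # 500m of clear road ahead of it:
    print(safe_to_overtake(250.0, 500.0, map_blind_spot=False))  # True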

renrut

1,478 posts

205 months

Monday 19th December 2011
quotequote all
ploz said:
WRT blame in the case of an accident - the strange thing is that it will be easier for the manufacturers/programmers/operators to avoid blame, because the reactions of the autonomous system to any eventuality will be predictable and modellable. Therefore, in court, it will be the word of a fallible human against a printout of repeatable and demonstrable facts that even the best lawyer will have difficulty tearing apart. The likelihood of 'knock for knock' or 'no blame' claims is much reduced.

It is not beyond the realms of possibility that manufacturers will be able to insure against system failures causing accidents (and don't forget how clever they are at avoiding warranty claims), allowing them to take the risk of putting autonomous cars on the road.

All the analysis we have done so far suggests that large trucks will be the first to go autonomous, as the potential operating cost savings (absolutely optimal driving technique, and the ability to drive all night at a calculated speed so as to arrive exactly on time) will outweigh the additional risk and costs in a large fleet.
But it's not a case of 'insuring' against claims - if you kill someone in an accident you end up in court and possibly in jail. If a car does it, no matter what the insurance says, who ends up in court? When a company or its products kill someone, it often comes down to the sign-off procedures of that company: e.g. the engineer says it's not ready yet so won't sign it off; the boss says we can't wait, it needs to be at market next week, so sign it off. Whoever signs the paper is seen as responsible.

But currently for cars it's still down to the driver to ensure their vehicle is safe unless it suffers a serious fault, and I don't have the kit or expertise to do a full diagnostic and verification on an automated car - I doubt many owners would. Now they could put some clever disclaimer in saying that the driver is still responsible, but what use would the car then be to a driver who actually keeps to that responsibility? It would be worse than just driving yourself, as you'd have nothing to do. Bear in mind that it would only take a small error in the programming to cause a dangerous incident - perhaps confused units (the classic metric vs imperial story) between programmers, or a bad interface between different bits of code.
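
(Incidentally, the standard defence against the units problem is to stop bare numbers crossing module boundaries - a toy illustration in Python, names entirely my own:)

    class Metres(float):
        """A distance type, so one programmer's inches can't silently
        be fed into another programmer's metres."""
        pass

    def metres_from_inches(inches):
        return Metres(inches * 0.0254)

    def gap_is_safe(gap, stopping_distance):
        if not (isinstance(gap, Metres) and isinstance(stopping_distance, Metres)):
            raise TypeError("distances must be Metres, not bare numbers")
        return gap > stopping_distance

    # Module A measured in inches, module B thinks in metres - the
    # conversion is now explicit rather than a silent factor-of-39 bug.
    print(gap_is_safe(metres_from_inches(2000), Metres(40.0)))  # True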

Obviously they've sorted these issues for large airliners, but the manufacturing numbers and the quality checks are a world apart from normal automotive, e.g. part traceability and redundancy. I don't think the car industry is currently capable of handling this. A 'Friday car' might be quite a dangerous proposition - from the threads we get on here, they still happen from time to time - and let's not forget even Toyota wrecked their perfect reliability record in the last few years due to software problems.

Now, I'm not a legal expert, so I wonder if any of our legal bods have any idea how it would play out?

Scuffers

20,887 posts

274 months

Monday 19th December 2011
quotequote all
renrut said:
Toyota wrecked their perfect reliability record in the last few years due to software problems.
??

what software problem?

they had a quality issue with the drive-by-wire throttle pedal assemblies (made in China, along with a load of other OEMs'), nothing to do with software (although they have now also changed the DBW software to include throttle drop on brake application)

renrut

1,478 posts

205 months

Monday 19th December 2011
quotequote all
Scuffers said:
renrut said:
Toyota wrecked their perfect reliability record in the last few years due to software problems.
??

what software problem?

they had a quality issue with the drive-by-wire throttle pedal assemblies (made in China, along with a load of other OEMs'), nothing to do with software (although they have now also changed the DBW software to include throttle drop on brake application)
I had heard there was a software element to it; I must have been mistaken. Regardless, the point still stands - the cars didn't work as intended. Could that happen again? Very likely. Would you want that in cars that are driving themselves?

The Wookie

13,948 posts

228 months

Monday 19th December 2011
quotequote all
ploz said:
We will probably go through the same process with autonomous systems - in fact, we already are with some of London's tube trains, which are perfectly capable of operating autonomously, and yet it has been deemed that the public is not yet ready for a driverless train (the DLR notwithstanding, for some reason).
Given the Tube drivers' union, I think most Londoners are more than ready for a driverless train...

ploz

89 posts

229 months

Monday 19th December 2011
quotequote all
renrut said:
Bear in mind that it would only take a small error in the programming to cause a dangerous incident - perhaps confused units (the classic metric vs imperial story) between programmers, or a bad interface between different bits of code.
Don't forget that, should it occur, that small error will be traceable and correctable. With the present situation - millions of individual driver errors on our roads each day - there is nothing that can be done. It is also useful to remember that 'glitches' that might prove fatal will, for the most part, be picked up in real-world use as 'potentially hazardous incidents' before there are any serious consequences, as was the case with Toyota's dodgy throttle.
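
In effect the fleet runs a permanent monitoring programme. Crudely, something like this (thresholds and field names invented purely for illustration):

    def is_hazardous_incident(entry, decel_threshold_g=0.8):
        # Flag near-misses in routine telemetry so a fault can be
        # found and fixed fleet-wide before it actually hurts anyone.
        return (entry.get("peak_decel_g", 0) >= decel_threshold_g
                or entry.get("driver_overrides", 0) >= 1)

    telemetry = [
        {"peak_decel_g": 0.3, "driver_overrides": 0},  # routine braking
        {"peak_decel_g": 0.9, "driver_overrides": 0},  # emergency stop
        {"peak_decel_g": 0.2, "driver_overrides": 2},  # driver grabbed the wheel
    ]
    print([is_hazardous_incident(e) for e in telemetry])  # [False, True, True]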

renrut

1,478 posts

205 months

Monday 19th December 2011
quotequote all
ploz said:
renrut said:
Bear in mind that it would only take a small error in the programming to cause a dangerous incident - perhaps confused units (the classic metric vs imperial story) between programmers, or a bad interface between different bits of code.
Don't forget that, should it occur, that small error will be traceable and correctable. With the present situation - millions of individual driver errors on our roads each day - there is nothing that can be done. It is also useful to remember that 'glitches' that might prove fatal will, for the most part, be picked up in real-world use as 'potentially hazardous incidents' before there are any serious consequences, as was the case with Toyota's dodgy throttle.
But who is responsible should that error cause a death? It's foolish to assume you will always catch it before it happens. It's also foolish to assume that a small error in a large software program and system will have small consequences.

Blown2CV

28,819 posts

203 months

Monday 19th December 2011
quotequote all
renrut said:
Scuffers said:
renrut said:
Toyota wrecked their perfect reliability record in the last few years due to software problems.
??

what software problem?

they had a quality issue with the drive-by-wire throttle pedal assemblies (made in China, along with a load of other OEMs'), nothing to do with software (although they have now also changed the DBW software to include throttle drop on brake application)
I had heard there was a software element to it; I must have been mistaken. Regardless, the point still stands - the cars didn't work as intended. Could that happen again? Very likely. Would you want that in cars that are driving themselves?
If it is drive-by-wire then there will be a software element to it.

renrut

1,478 posts

205 months

Monday 19th December 2011
quotequote all
Blown2CV said:
renrut said:
Scuffers said:
renrut said:
Toyota wrecked their perfect reliability record in the last few years due to software problems.
??

what software problem?

they had a quality issue with the drive-by-wire throttle pedal assemblies (made in China, along with a load of other OEMs'), nothing to do with software (although they have now also changed the DBW software to include throttle drop on brake application)
I had heard there was a software element to it; I must have been mistaken. Regardless, the point still stands - the cars didn't work as intended. Could that happen again? Very likely. Would you want that in cars that are driving themselves?
If it is drive-by-wire then there will be a software element to it.
Likely, but not necessarily - it could be a dodgy potentiometer or something like that.

ploz

89 posts

229 months

Monday 19th December 2011
quotequote all
renrut said:
But who is responsible should that error cause a death? It's foolish to assume you will always catch it before it happens. It's also foolish to assume that a small error in a large software program and system will have small consequences.
Surely we can't apply different standards to human drivers and autonomous cars? If we applied the levels of acceptable risk you seem to be asking for to human-driven cars, we simply wouldn't let anyone drive - at all!

Scuffers

20,887 posts

274 months

Monday 19th December 2011
quotequote all
renrut said:
Blown2CV said:
renrut said:
Scuffers said:
renrut said:
Toyota wrecked their perfect reliability record in the last few years due to software problems.
??

what software problem?

they had a quality issue with the drive-by-wire throttle pedal assemblies (made in China, along with a load of other OEMs'), nothing to do with software (although they have now also changed the DBW software to include throttle drop on brake application)
I had heard there was a software element to it; I must have been mistaken. Regardless, the point still stands - the cars didn't work as intended. Could that happen again? Very likely. Would you want that in cars that are driving themselves?
If it is drive-by-wire then there will be a software element to it.
Likely, but not necessarily - it could be a dodgy potentiometer or something like that.
Far more basic than that: bad design made the pedal bearing seize up, hence the sticking pedal.



renrut

1,478 posts

205 months

Monday 19th December 2011
quotequote all
ploz said:
renrut said:
But who is responsible should that error cause a death? It's foolish to assume you will always catch it before it happens. It's also foolish to assume that a small error in a large software program and system will have small consequences.
Surely we can't apply different standards to human drivers and autonomous cars? If we applied the levels of acceptable risk you seem to be asking for to human-driven cars, we simply wouldn't let anyone drive - at all!
That might seem the case, but if I crashed my car into yours and killed you, I'd end up in court for death by dangerous driving - who would if it was a computer driving?

Additionally, it's one thing for a person to kill another - that's a fact of life - but currently computers don't kill people; if they started to, how would people react?

These are similar questions to those about the armed forces using UAVs etc, and as far as I know no one has come up with a good answer.

Scuffers

20,887 posts

274 months

Monday 19th December 2011
quotequote all
renrut said:
These are similar questions to those about the armed forces using UAVs etc, and as far as I know no one has come up with a good answer.
Not quite the same thing though; current 'active' UAVs are all remotely manned.


TdM-GTV

291 posts

217 months

Monday 19th December 2011
quotequote all
One of the problems is having a mix of automated and non-automated cars on the roads. A road full of automated cars will flow very nicely; however, if you start dumping humans into the equation, the traffic suddenly snarls up and gets potentially dangerous, as the human-controlled cars add hugely random movements, cutting up etc.

One in isolation probably isn't too much of an issue, but put several of them on one road and you have a big traffic jam at the least. I also, despite being a geek, HATE being in a car that is controlled by a computer - cruise control gives me the willies on its own! This could be a bit of a control-freak thing, though.

As for driver aids, they are good for normal drivers; if you have a driver who is going to push the limits, they can be dangerous. Take, for example, Evos, Skylines etc. They are brilliant, they cling on like nothing else, but when they DO let go (which, contrary to popular belief, they do), all the driver aids have done is raise the speed at which the accident happens. That isn't a fault of the electronics; it's a fault of the drivers. Does this mean that the best thing to do is to get rid of the driver, or to get rid of the aids? Who knows!

ploz

89 posts

229 months

Monday 19th December 2011
quotequote all
renrut said:
That might seem the case, but if I crashed my car into yours and killed you, I'd end up in court for death by dangerous driving - who would if it was a computer driving?

Additionally, it's one thing for a person to kill another - that's a fact of life - but currently computers don't kill people; if they started to, how would people react?

These are similar questions to those about the armed forces using UAVs etc, and as far as I know no one has come up with a good answer.
If you ended up in court for causing the death of someone else by dangerous driving, the court's job would be to determine whether you caused that death on purpose, through wilful neglect, or through some other cause that you were not in control of. If it were an automated system driving the car, a court might have to establish whether the controller or manufacturer was wilfully negligent, but it could be established as a matter of fact whether the system operated as designed or not. In that sense it is, legally, a clearer-cut case than had you been driving, though it suits the legal system to be able to introduce doubt into the deliberations. Presuming no wilful neglect or actual intent were discovered on the part of the controller or manufacturer, there might be a case for a charge of corporate manslaughter - but that charge has yet to stand up in court, so the situation is unlikely to change.

On a more philosophical level, if an electronic system were found to be at fault for causing a death, other examples of that system can be altered to prevent the occurrence happening again (potentially saving many lives). If a person is found responsible, all that can be done is to punish that person (OK, you can take away his driving licence - which he probably never had in the first place - and tell him he can't apply for another one when he does it again!). Retribution doesn't really solve the problem (except, maybe, legally) - redesigning the system does.

loomx

327 posts

225 months

Monday 19th December 2011
quotequote all
I am assuming the Merc traction control is like it is in the BMWs: with DSC fully on it doesn't let much happen, and if anything does start, it snaps it back in quickly. In practice mode (DTC) it lets the back end kick out quite a bit if you're aggressive, but you have to apply opposite lock, otherwise it's obviously going to go wrong. If the computers are limited in their reining-in, and you have a boot full of throttle and you're still steering the wrong way, it's obvious it's going to go wrong.
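
In other words, the two modes are probably just different intervention thresholds. A guess at the shape of the logic, with made-up numbers:

    def stability_intervention(slip_angle_deg, countersteering, mode="DSC"):
        # DSC clamps down almost immediately; DTC tolerates a bigger
        # slide, but only while the driver is applying opposite lock.
        limit_deg = 3.0 if mode == "DSC" else 12.0
        if slip_angle_deg <= limit_deg:
            return "no intervention"
        if mode == "DTC" and countersteering and slip_angle_deg <= 2 * limit_deg:
            return "gentle correction"  # let the driver play, within reason
        return "cut throttle and brake individual wheels"

    print(stability_intervention(15.0, countersteering=True, mode="DTC"))
    # gentle correction - the same slide with DSC fully on gets snapped back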