Why driverless cars are a LONG way off.



Hackney

6,853 posts

209 months

Wednesday 1st June 2016
EnglishTony said:
I think the major stumbling block is going to be customer resistance. Why would anybody want one?
For the same reason most people want a washing machine instead of taking their clothes to the river to bash with a rock.

You and I may not want autonomous cars, but many people aren't interested in driving, just in getting from A to B, so they'll go for it. In fact, for my commute I'd be interested, as long as I can get there quickly and easily and I'm still allowed to buy a "manual" car for the weekend.

Hackney

6,853 posts

209 months

Wednesday 1st June 2016
TbirdX said:
skyrover said:
Here's another dilemma.

Autonomous car malfunctions and pitches itself into a wall/pedestrian/vehicle/etc, resulting in fatalities.

Who is liable? Will anyone be prosecuted? Are the engineers who wrote the software now guilty of manslaughter?
Test case, precedent set.

Next.
And then all autonomous cars recalled as no software engineer / corporation wants to go to prison.
A very expensive experiment then.

CrutyRammers

13,735 posts

199 months

Wednesday 1st June 2016
kambites said:
In the short term (as in the next twenty years) yes, but if we can get to the point where we can get statistically significant data to show that they're safer than human drivers, that will change.

They will never be perfect but they have the potential to be vastly safer than the average human driver. The difficulties that need to be overcome are political and sociological, not technological. [b]There is a huge political motivation for adopting them - they can massively increase effective road capacity at zero cost to the state; the question is how long it will take for the public to accept them.[/b]
Edited by kambites on Tuesday 31st May 22:01
Said this in another thread, I think our politicians are banking on this to get them out of the infrastructure hole they have allowed to happen.

Emeye

9,773 posts

224 months

Wednesday 1st June 2016
technodup said:
Emeye said:
We are definitely a long way off being able to trust a self driving vehicle.
You presumably trust humans; the wife, for example? And even if you don't trust her to drive you, if you drive anywhere you're trusting drivers you don't even know not to crash into you.

Some of them will be drunk. Some will be borderline blind. Some will be reading a tablet or applying make-up. Some will be half asleep and some will simply be poor drivers.

I'm pretty sure I'd rather trust a billion-pound computer system than take pot luck with the general public, but it might just be me. (I know it's not just me).
Can you name a fully automated human transportation system, currently in daily commercial use, that travels at a decent speed and doesn't require a human to be available to take over in the event of a failure? There may be some rail systems, but no ships, aircraft or cars roaming free that I know of. I'm happy to be enlightened.

I don't trust all humans, and technology just isn't at the stage where I trust it yet - if a computer system I can trust is going to cost a billion pounds then I'm not sure I'll be able to afford the monthly PCP repayments.


Edited by Emeye on Wednesday 1st June 10:12

Mr Snrub

24,990 posts

228 months

Wednesday 1st June 2016
Hackney said:
For the same reason most people want a washing machine instead of taking their clothes to the river to bash with a rock.
Difference being a washing machine is unlikely to kill anyone if it goes wrong

technodup

7,584 posts

131 months

Wednesday 1st June 2016
Emeye said:
Can you name a fully automated human transportation system, currently in daily commercial use, that travels at a decent speed and doesn't require a human to be available to take over in the event of a failure? There may be some rail systems, but no ships, aircraft or cars roaming free that I know of.
I think the key word there is 'require'. That's quite subjective. There's the technological 'require' and the moral, ethical or union 'require' for a start.

But it doesn't really matter whether you trust it yet; it's not ready yet.

But when it is, don't underestimate the power of government, money and peer pressure to change your mind.



rxe

6,700 posts

104 months

Wednesday 1st June 2016
Said this on a similar thread.

We need to be very clear what "driverless" is - for me it is me asleep on the back seat, pi55ed or sober, and the car is doing everything. Anything that requires me to take over is not driverless, it's an annoying toy that I'm not willing to pay for. I am more than happy to pay for real driverless.

There is a huge gulf between a technology demonstrator and a 100% working product. Google has technology demonstrators - they drive in ideal conditions, can't handle arbitrary road works, won't stop for the police and rely on very precisely mapped roads. The LIDAR and software can't tell the difference between a plastic bag blowing across the road and a boulder. As of 2015, these problems were scheduled to be fixed by 2020. They may achieve that, they may not.

Anything released to the public has to get it right 100% of the time. It may make fewer mistakes than a human, but if there are situations that confuse it and make it stop, it is useless. It needs to handle motorway roadworks in the pouring rain, realise that a field is a temporary car park for an event, respond to marshalling by humans, know what a policeman is, see when a ford is too deep to pass through, understand that the movement of a plastic bag indicates that it is empty... and a million other things as well.

The next problem that needs solving is total reliability, because any failure will stop the car. It needs to be solid enough to hand it to a person and say "see you in 12,000 miles for a service". No overnight fiddling, no engineers updating software. The security nerd in me is terrified by "overnight updates" - what happens when a company gets it wrong, and all their cars set off one morning with all the obstacle avoidance code screwed up?
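That fleet-wide-update fear maps onto two standard mitigations: cryptographically verifying a build before installing it, and rolling it out to only a slice of the fleet at first. A minimal sketch in Python (the function names and the percent-slice scheme are illustrative assumptions, not any carmaker's actual process):

```python
import hashlib
import hmac

def verify_update(payload: bytes, signature: bytes, key: bytes) -> bool:
    """Install an over-the-air update only if its HMAC-SHA256 signature
    checks out, so a corrupted or tampered build is rejected outright."""
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

def in_rollout_stage(vehicle_id: int, stage_percent: int) -> bool:
    """Canary rollout: only a deterministic slice of the fleet receives
    the new build at first, so a bad update is caught before it reaches
    every car overnight."""
    return vehicle_id % 100 < stage_percent
```

Neither step makes bad software impossible; they just turn "all their cars set off one morning" into "a small, monitored fraction of them did".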

With a bit of luck, these will be available by the time that I become too decrepit to drive myself - hopefully 20 - 30 years off....



EnglishTony

2,552 posts

100 months

Wednesday 1st June 2016
Have we discussed the insurance issues yet?

skyrover

Original Poster:

12,674 posts

205 months

Wednesday 1st June 2016
This is very true...

Look at the problems Nvidia has had with its latest graphics card drivers cooking people's hardware.

This is from a company with YEARS of development experience and a very mature driver codebase/team, and yet they still get it wrong, with wide-scale consequences.

Autonomous cars are orders of magnitude more complex, and the stakes are a lot higher than some toasted silicon.

Charles Sweeney

105 posts

96 months

Wednesday 1st June 2016
Autonomous cars are long off because nobody wants them.

ajcj

798 posts

206 months

Wednesday 1st June 2016
Jader1973 said:
I read an article a couple of months ago about a test in the US: the self driving car had to join a freeway from a slip road, cross 4 lanes of traffic to exit from another slip road on the other side a few hundred metres further on.

It couldn't do it because all the cameras, sensors etc were picking up enough traffic for it to decide it wasn't safe. A human would just have stuck the indicator on and gone for it.

Perfect example of the difficulty of having normal and self driving cars that can't communicate with each other on the roads, and one reason why it is years away, if it ever happens at all.
This is the most difficult part, I understand. I had a conversation last year with a chap who is working on the technology. I asked him what the biggest obstacles were to getting autonomous vehicles onto the roads. "Roundabouts" he said, and shuddered....


lostkiwi

4,584 posts

125 months

Wednesday 1st June 2016
CrutyRammers said:
Said this in another thread, I think our politicians are banking on this to get them out of the infrastructure hole they have allowed to happen.
Politicians are between a rock and a hard place here.
Autonomous cars have the potential to make our roads more efficient by reducing traffic and congestion. That in turn will reduce the amount of fuel burned, which in turn will reduce taxation revenue.
Also, it's only to be expected that autonomous vehicles will be wholly electric. The power generation requirements of a million electric vehicles coursing the streets of London bring another infrastructure challenge all of their own in the generation and distribution of electricity.

culpz

4,884 posts

113 months

Wednesday 1st June 2016
Read about the incidents regarding Tesla's Autopilot feature if you haven't already. We're a long way off. It's obviously a huge deal leaving something as dangerous as driving in the hands of a computer system, all things considered. I know technology is advancing greatly, but to remove human input completely it would need to be absolutely faultless. We haven't even settled on a replacement for the internal combustion engine yet.

otolith

56,206 posts

205 months

Wednesday 1st June 2016
EnglishTony said:
Have we discussed the insurance issues yet?
You will have to indemnify your car for third party risks if you wish to use it on the road. If it injures someone or damages something, your insurer will have to pay up. They might then seek to recover the costs from the manufacturer if they can show negligence. In practice, they will probably just set the premium to a level whereby they can just pay up. That premium is likely to be substantially lower than many current drivers have to pay to cover the terrible driving record of their demographic.

otolith

56,206 posts

205 months

Wednesday 1st June 2016
I think the most likely route of introduction, by the way, is by incremental improvement in the autonomous abilities of ordinary cars. It won't take long before the German luxury car manufacturers have a production rival to Tesla's autopilot functionality (they've already got active cruise, and they've already demonstrated their versions of the technology) and then it's just a technology war. The range of circumstances the systems can cope with will grow, the technology will filter down to cheaper models, people will get to trust and understand it. Eventually there will be a tipping point, and we will get cars which don't have manual override and an appropriate legal framework for them. Once that's in place, you can start to get the real benefits. We've got an ageing population who want to stay mobile.

Emeye

9,773 posts

224 months

Wednesday 1st June 2016
technodup said:
Emeye said:
Can you name a fully automated human transportation system, currently in daily commercial use, that travels at a decent speed and doesn't require a human to be available to take over in the event of a failure? There may be some rail systems, but no ships, aircraft or cars roaming free that I know of.
I think the key word there is 'require'. That's quite subjective. There's the technological 'require' and the moral, ethical or union 'require' for a start.

But it doesn't really matter if you trust it yet, it's not ready yet.

But when it is don't underestimate the power of government, money and peer pressure to change your mind.
That's what I said in the section of my post you didn't quote. Technology isn't at the stage where we can trust it yet, and it will be a long time until it is ready.

My new Passat has Active Cruise Control. It works reasonably well in certain situations. But it doesn't trust itself to set off in start-stop traffic after a few seconds stationary - very frustrating - is this a technology limitation or a fear-of-litigation issue?

One thing that baffles me is that the government was discussing removing white lines from some roads, because a lack of clear markings showing the way makes humans drive more responsibly and slow down - yet how does that fit with autonomous vehicles that use the road markings to indicate lane separation?

Is the issue that we cannot mix human and computer drivers on the same roads?

There are so many other issues than just safety - Uber are very keen to get rid of human drivers, purely for commercial gain. How are all those truck and taxi drivers going to earn a living? Though that is the same for all technological advances that remove humans from the process.

youngsyr

14,742 posts

193 months

Wednesday 1st June 2016
kambites said:
MG CHRIS said:
Which wont happen because you will have to be in charge if things go wrong which makes them pointless.
In the short term (as in the next twenty years) yes, but if we can get to the point where we can get statistically significant data to show that they're safer than human drivers, that will change.

They will never be perfect but they have the potential to be vastly safer than the average human driver. The difficulties that need to be overcome are political and sociological, not technological. There is a huge political motivation for adopting them - they can massively increase effective road capacity at zero cost to the state; the question is how long it will take for the public to accept them.
My bet is: "Not long".

In my experience, most people consider driving to be a chore and will be more than happy to hand it over to a machine to do for them.

blearyeyedboy

6,305 posts

180 months

Wednesday 1st June 2016
It's inevitable. It happens in little increments, and those increments will answer all of your questions, OP.

This will seem weirdly off topic, but bear with me. When creationists try to say that they don't believe in evolution, they often use the eyeball as an example of why evolution can't be true. "But it's so complex!", they say. "Unless everything's spot on, it can't possibly work, so evolution can't be true!" This description by Richard Dawkins about the evolution of the eyeball in all its complexity takes the objections of creationists and destroys them.

Sorry if that seems weirdly off topic but in my opinion, you're making the same logical mistake as those creationists, OP.

In the same way as the eyeball evolved, autonomous driving is on its way. We have most of the elements, we just need the fine tuning and legislation. Machines make decisions every day and with more success than humans. A little radar cruise control here, a little blind spot warning there... a little lane departure warning here, a little proximity emergency braking there... a little pattern recognition here, a little decision making algorithm there...

Computers don't need to account for every possibility; they simply need ethical decision algorithms (as already discussed), legal frameworks and broad decision-making frameworks (i.e. "Is it safe to proceed along my intended route? Yes or no?").
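That "Yes or no" gate can be written down almost literally. A toy sketch, assuming just three made-up checks (a real system would fuse far more inputs):

```python
from dataclasses import dataclass

@dataclass
class Perception:
    path_clear: bool       # no obstacle detected on the intended route
    sensors_healthy: bool  # all required sensors reporting plausibly
    rules_satisfied: bool  # the manoeuvre is legal (signals, markings, signs)

def safe_to_proceed(p: Perception) -> bool:
    """The broad decision framework: one conservative boolean gate,
    rather than an enumeration of every possible situation."""
    return p.path_clear and p.sensors_healthy and p.rules_satisfied

def next_action(p: Perception) -> str:
    # Fail safe: any failed check leads to slowing to a stop.
    return "proceed" if safe_to_proceed(p) else "slow_and_stop"
```

The design choice is that the default answer is always the conservative one; the car never needs a reason to stop, only a complete set of reasons to go.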

All the elements already exist. Google, Tesla and Apple are building up tens of millions of miles of experience faster than you or I, and the evolution of autonomous driving is well underway and extremely rapid. It's easy to come up with grossly optimistic predictions in the fashion of Tomorrow's World, but I would be astonished if considerable numbers of autonomous cars aren't on our roads within 15-20 years.

The main sticking point will be legislation. For once, the inefficiency of governments might be the saving grace of people who wish to continue to drive themselves.

FredClogs

14,041 posts

162 months

Wednesday 1st June 2016
skyrover said:
I've seen plenty of posts recently describing the imminent demise of the human driver in place of autonomous car's and how our leadership will soon look to ban human piloted car's altogether on the grounds of "safety".

I am going to list the reasons why this simply is not the case, at least for the near and somewhat distant foreseeable future.

1. Humans are actually very good at driving with around 1 death for every 100 million miles driven on average, something a machine will find incredibly difficult to match for the following reasons.

2. Machines are incapable of dealing with tasks that have not been foreseen by the software engineers, with each possible eventuality programmed with acceptable solutions. It will take billions of miles driven and an absolutely enormous variety of situations and hazards for the software engineers to understand and solve.

3. Machines are incapable of making philosophical decisions. If a crash is unavoidable, who do you hit, the lady crossing the street or the oncoming car? How does a machine anticipate hazards, i.e. is the child on a bicycle along the pavement more likely to veer into the road than the adult?

4. How does a machine react to faulty sensors.. or sensors giving inaccurate information?

5. Someone is about to drive into you... do you continue on or stop for the red light in front of you?

6. How does the machine behave with damage or neglect? etc etc

Even Google - arguably the most successful implementer of driverless technology so far - admits it's going to be a long time before these vehicles are a common sight on our roads.

http://thenextweb.com/opinion/2016/03/18/great-tec...

Then there are legal, legislative and consumer acceptance barriers to get through, not to mention inter vehicle software communication/update standards to decide upon.

So sleep easy folks, ignore the hype, your steering wheel and pedals are safe for a long while yet.
Aside from your misuse of apostrophes, you've made a couple of errors...

1) Any accidental human death is treated as a tragedy, but driverless cars offer much more than safety improvements; most importantly they offer control over traffic flow and congestion.

2) That's nonsense; you clearly don't understand how complex software works. Even if you had driven billions of miles in your life (which you haven't), an intelligent software system which compiles and uses metadata from all the cars within that system would have gathered and shared with all users more information in the first 10 minutes of going live than you could in a lifetime.

3) How would you evaluate that "philosophical" decision, and how would you live with it afterwards? If you think it's preferable that individuals decide who and when they're going to kill pedestrians, rather than a computer avoiding being in that position in the first place, then I think you're wrong.

4) Backup. We have some pretty complicated machines - aircraft, spacecraft etc. - with multiple redundant safety systems. If you think radar systems will fail because of "a faulty sensor" you've misunderstood the technology, and if you really don't believe the technology can work then I'd suggest you never fly.

5) A driverless car wouldn't drive into you. I accept that a mix of driverless and non-driverless cars will present the same risks as at present, but a truly autonomous system would be just that. There is also no reason (I expect it will be mandated) that the driverless function couldn't be immediately overridden by driver input anyway, just as cruise control on your car is now.

6) I don't know; how does a human behave when they're eating a burger, trying to text and scratching their balls whilst doing 85mph on the motorway? Technology can be made resilient to its environment, and electronics require very little care or maintenance - when was the last time you "serviced" your laptop?
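The redundancy point in (4) can be illustrated with a simple sensor-voting routine - a hypothetical sketch, not any real vehicle's implementation. The median of three redundant readings out-votes one faulty sensor, and the system only falls back to a degraded safe mode when the sensors can't form a majority:

```python
from statistics import median

def fused_reading(readings, tolerance):
    """Fuse redundant sensor readings by majority vote around the median.

    Returns the agreed value, or None when no majority of sensors agree -
    the cue to fall back to a degraded, safe driving mode rather than
    trusting any single possibly-faulty sensor.
    """
    m = median(readings)
    agreeing = [r for r in readings if abs(r - m) <= tolerance]
    if len(agreeing) < 2:
        return None  # sensors disagree too much to trust any of them
    return median(agreeing)
```

This is the same triple-redundancy idea used in aircraft: a single faulty sensor is out-voted rather than being a single point of failure.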


FWIW you're quite correct about the other issues of market acceptance, legal responsibility and the requirement for a single mandatory protocol and common system, but they're all achievable; we have very clear standards for all sorts of other areas of business and technology. I suspect the only real hold-up to mass acceptance within a couple of decades will be the legal issue of who is culpable in the event of a failure of the system - I imagine it will always have to lie with the driver.

98elise

26,645 posts

162 months

Wednesday 1st June 2016
Mr Snrub said:
Hackney said:
For the same reason most people want a washing machine instead of taking their clothes to the river to bash with a rock.
Difference being a washing machine is unlikely to kill anyone if it goes wrong
You mean like catch fire and burn your house down?