Will driverless cars save lives? (more than 130 collisions)

Dave Finney

Original Poster:

431 posts

148 months

Thursday 15th February
"...more than 130 autonomous vehicle collision reports..." in California in a single year:
https://www.msn.com/en-gb/cars/news/two-driverless...

We're told that driverless cars will reduce collisions but,
as they do more miles, the collisions are mounting up.
And there are reported fatalities already.

Currently they are tested in the easiest environments but, as they drive more difficult roads, collisions may increase.
Conversely, as the tech improves, collisions ought to reduce.

One great advantage of driverless cars is that, when collisions occur, new software can fix each issue and updates can prevent other cars from causing the same problem.

The equivalent of a software update for human drivers is to change behaviour by prosecuting the behaviour that causes collisions (and altering the driving test).
The problem with that is you have to accurately determine the factors that cause collisions,
and ensure that there are no negative side effects from your enforcement.

Mod note: no need to promote your YouTube video again.

gazza285

9,844 posts

210 months

Thursday 15th February
I listened to a report on these vehicles in San Francisco yesterday. An unfortunate woman was in a collision with one of them. She wasn’t badly hurt in the initial incident, but the car then moved itself to the side of the road, first running over the prone woman and then dragging her under the rear wheel to the kerb, where it parked itself with the poor woman still trapped.

vonhosen

40,298 posts

219 months

Thursday 15th February
Clicks are good.

Panamax

4,179 posts

36 months

Thursday 15th February
This has been a known problem since the early 1990s and there's clear evidence from footage at that time,
https://www.youtube.com/watch?v=eWgrvNHjKkY

ScoobyChris

1,726 posts

204 months

Thursday 15th February
A few things to consider, but in the scheme of things 130 accidents (in apparently 3.3 million miles) seems fairly low. I also wonder how many of those accidents were the "fault" of a decision the car made.
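A quick back-of-envelope check, taking the figures quoted above (130 collision reports over roughly 3.3 million autonomous miles) at face value:

```python
# Rough collision-rate arithmetic using the figures quoted above:
# 130 collision reports over roughly 3.3 million autonomous miles.
collisions = 130
miles = 3.3e6

per_million_miles = collisions / (miles / 1e6)
miles_per_collision = miles / collisions

print(f"{per_million_miles:.1f} collision reports per million miles")  # ~39.4
print(f"one collision report every {miles_per_collision:,.0f} miles")  # ~25,385
```

Whether roughly 39 reports per million miles is actually low depends on which human collision statistics you compare it with (police-reported crashes only, or every minor scrape), which is where that "fault" question comes in.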

Interesting article here...

https://www.warpnews.org/transportation/self-drivi...

Chris

untakenname

4,976 posts

194 months

Thursday 15th February
Driverless cars are still novel and thus are usually remotely monitored anyway, with a kill switch. Once they become standard there will be issues with the sensors as the cars age, and also with the sheer number of vehicles on the road.

Uber had some bad press five years ago when a member of staff was watching a show on their phone whilst in their autonomous car and ran down a pedestrian
https://www.ntsb.gov/investigations/AccidentReport...

The driver got away with it as they claimed they were the victim of a hate campaign against them.

Dave Finney

Original Poster:

431 posts

148 months

Thursday 15th February
untakenname said:
Uber had some bad press five years ago when a member of staff was watching a show on their phone whilst in their autonomous car and ran down a pedestrian
https://www.ntsb.gov/investigations/AccidentReport...
That's another driverless car fatality that I did not know about.
I think it's too early to say if driverless cars will be safer than humans.

If there were an affordable driverless car now, I'd definitely be interested.
Set the destination, read a book, catch a nap. Unfortunately, though, I don't think there will be an affordable driverless car in my lifetime.

ScoobyChris

1,726 posts

204 months

Thursday 15th February
untakenname said:
Uber had some bad press five years ago when a member of staff was watching a show on their phone whilst in their autonomous car and ran down a pedestrian
Maybe the AI learning model is based on human Uber drivers whistle

Chris

spookly

4,035 posts

97 months

Thursday 15th February
I think eventually driverless cars could be safer. But what's happening at the moment is that manufacturers (especially that twunt baby Elon) are trying to put them in the real world long before they're proven even close to being ready.
I'm amazed that some states have allowed Tesla to use their populace as guinea pigs.

FMOB

1,064 posts

14 months

Thursday 15th February
I doubt it. The only thing that will change is that the Windows blue screen of death will be real.

The simple fact is that human beings just can't write software that is reliable. It is guaranteed that at some point it will crash (the software), or some other problem will occur that renders it broken, and in those few moments there will be carnage.

NFT

1,324 posts

24 months

Thursday 15th February
Panamax said:
This has been a known problem since the early 1990s and there's clear evidence from footage at that time,
https://www.youtube.com/watch?v=eWgrvNHjKkY
Great scenes.

I always thought they should have included a Jane Limo: ladies in the back, a female driver up front. Just slide in for a joyful ride. Ride in a Jane Limo. laugh

tribbles

3,984 posts

224 months

Friday 16th February
ScoobyChris said:
Maybe the AI learning model is based on human Uber drivers whistle

Chris
Nah - it's because I've been deliberately telling the web sites that the squares with motorcycles in don't have any, and there's never any traffic lights.

ScoobyChris

1,726 posts

204 months

Friday 16th February
tribbles said:
Nah - it's because I've been deliberately telling the web sites that the squares with motorcycles in don't have any, and there's never any traffic lights.
Ha! You reminded me of this video

biggrin

Chris

CLK-GTR

808 posts

247 months

Friday 16th February
They will undoubtedly be safer eventually. No tiredness, no loss of concentration, no mistakes, but there will be many crashes and fatalities before we get them to that point. What we have now is nowhere near the level where they should be allowed on the road unmonitored.

galro

776 posts

171 months

Friday 16th February
spookly said:
I'm amazed that some states have allowed Tesla to use their populace as guinea pigs.
Tesla do not offer autonomous vehicles, and none of the autonomous services use Teslas. They are instead using I-Paces, Bolts and Chrysler cars fitted with their own autonomous technology.

Electro1980

8,439 posts

141 months

Friday 16th February
FMOB said:
The simple fact is that human beings just can't write software that is reliable.
Yes they can.
FMOB said:
it is guaranteed that at some point it will crash (the software)
No it isn’t.
FMOB said:
or some other problem will occur that renders it broken, and in those few moments there will be carnage.
See above.

Safety critical software is not the same as your home PC. Lots of things run without fault. Airplanes for example.

monkfish1

11,165 posts

226 months

Friday 16th February
Electro1980 said:
FMOB said:
The simple fact is that human beings just can't write software that is reliable.
Yes they can.
FMOB said:
it is guaranteed that at some point it will crash (the software)
No it isn’t.
FMOB said:
or some other problem will occur that renders it broken, and in those few moments there will be carnage.
See above.

Safety critical software is not the same as your home PC. Lots of things run without fault. Airplanes for example.
Go on then. Where is this software that NEVER crashes?

Aeroplanes, not airplanes. But no matter, they do go wrong. Not often, but they do.

FMOB

1,064 posts

14 months

Friday 16th February
Electro1980 said:
Safety critical software is not the same as your home PC. Lots of things run without fault. Airplanes for example.
Just look at Boeing and their Dreamliner: how many dead? Or the NATS software failure last September that brought UK skies to a standstill. There are many examples of what should be ultra-reliable software just not living up to the description.

Safety-critical software has other bits of software designed to stop the most severe consequences occurring when the main bit of software has a problem, i.e. it fails safe. I'm not sure you can apply the same approach to driverless cars, as the software might have failed safe but people can still be in danger.

You can analyse a bit of software for failure modes etc. until you are old and grey and still find unexpected modes.
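For illustration, a minimal sketch of that fail-safe idea, assuming a hypothetical watchdog that monitors heartbeats from the main driving software and commands a controlled stop when they stop arriving (all names and thresholds here are invented for the sketch):

```python
import time

# Hypothetical fail-safe watchdog sketch: if the main driving software
# stops sending heartbeats, command a controlled stop. Names and
# thresholds are invented for illustration only.

HEARTBEAT_TIMEOUT_S = 0.5  # the main software must check in at least this often


class Watchdog:
    def __init__(self, timeout_s: float = HEARTBEAT_TIMEOUT_S) -> None:
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self) -> None:
        # Called by the main driving software on every healthy control cycle.
        self.last_heartbeat = time.monotonic()

    def check(self) -> str:
        # Decide what the vehicle should do right now.
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            return "FAIL_SAFE_STOP"  # e.g. hazards on, brake gently to a halt
        return "NORMAL_OPERATION"


if __name__ == "__main__":
    wd = Watchdog()
    wd.heartbeat()
    print(wd.check())   # NORMAL_OPERATION
    time.sleep(0.6)     # simulate the main software hanging
    print(wd.check())   # FAIL_SAFE_STOP
```

The catch, as above, is that "stop safely" is itself a judgement about the road around the car, so a fail-safe state that works for a factory machine may still leave people in danger in a live lane.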

Terminator X

15,215 posts

206 months

Friday 16th February
I will never order a car with it or get in one that has it. Yet more inventions that we don't need.

TX.

FiF

44,311 posts

253 months

Friday 16th February
galro said:
spookly said:
I'm amazed that some states have allowed Tesla to use their populace as guinea pigs.
Tesla do not offer autonomous vehicles and none of the autonomous services uses Teslas. They are instead using I-paces, Bolts and Chrysler cars fitted with their own autonomous technology.
Probably a good job they don't use Teslas.

From a recent road test of the latest Model 3. "There’s also a lane-keeping assistant, which Tesla claims is a self-driving system (it isn’t) and in practice the camera-based system misses rather too much for comfort. It failed to spot a horse right in front of the bonnet, but did spot a wheelie bin beside it, so it’s a neigh from the equine population, then.

It also failed to detect a pedestrian in a dark coat and an unlit van parked half on the pavement and was preparing to drive straight into it."

Yet they still want to charge £6k+ to prepare the "self driving system".