ICE ban clouds on the horizon. Are you out?


321boost

1,253 posts

71 months

Thursday 25th February 2021
rxe said:
Max_Torque said:
I'm astonished at how many people seem to think that "self driving" cars have to be perfect! They don't. All they have to be is better than the average human driver. And the average human driver is, ime, absolutely terrible.

Look at automation in aircraft: it has pretty much reduced air accidents by ten times or more in the last 30 years. Yes, planes still crash, and yes, they now crash because the automation fk'ed up, but guess what, unlike a human crash, after an automation crash the system learns and generally doesn't make the same mistake again, because once one "node" has learned, they all learn, near instantaneously.

When you think about it, as a human, you are pretty much not suited to driving. We get bored easily, we either fail to react in time or completely over-react, and in fact we pretty much never do the same thing twice unless we have done a lot of specific training. When, say, my neighbour learns to speak Spanish, that doesn't help me, because we cannot transfer learning specifically, only the idea of learning. And of course we only see in the visible spectrum, and our communication is short distance only and slow.

By comparison, a computer can learn as a single entity despite being made up of many nodes, it can "see" across a wide range of frequencies, for example using IR to see in the dark or in fog, it can react IDENTICALLY in a millionth of a second (or less), and it can accurately diagnose its own faults.

And once multiple automated cars are on our roads, all communicating and sharing information at the speed of light, then "coming round a blind corner and finding a tractor in the road", something a human can ONLY mitigate against by driving really slowly, is a thing of the past, because beyond-line-of-sight comms mean the tractor can tell your car, all cars in fact, that it is indeed just round the corner in the middle of the road!

Yes, today, the systems for machine learning and automation are, relatively speaking, still in their infancy, but don't be mistaken: as a human, you are very much "the weak link" in the chain.....
Your view of aviation automation being in some way self-learning and replicating across a network is simply wrong. As we have seen with Boeing's recent experiences, what happens is:

- A plane crashes. Humans scratch their heads and blame sloppy maintenance.
- Another plane crashes in suspiciously similar circumstances. Humans start paying a lot of attention.
- Humans work out why it happened and ground the entire fleet.
- Humans beaver away for years trying to fix the software.
- Humans eventually determine a fix, and apply it, but by that time most of those humans will never fly on a 737 again.

Now, the thing that adjusts the pitch of an aeroplane is computationally a piece of piss compared to something looking at a video stream and trying to work out where that child on a bike may be going, so the fix in the latter case will be a lot harder. In a car, it also has to work in ALL situations, because the automated car can’t suddenly decide “I don’t fancy that multi-storey today” or “I’m not sure that driveway is actually a road” or “the roadworks guys have messed with the M4 so badly it’s really hard to work out what to do”.

Your vision of wonderfully connected cars all learning together would give any security person nightmares. First rule of hardware: once you’ve sold it, you can’t trust it. The last thing anyone wants in the morning is a ruleset update that might have originated from Igor the hacker down the road, who has spoofed his car’s sensors and has convinced the autopilot that driving on two wheels is a valid strategy for beating traffic jams.

Someone else has pointed out the grave dangers in assuming that just because another automated car is not coming round the blind bend, nothing is.

IMO we’re going to see loads of Level 3 and 4 automation. It may or may not be as good as a human, it will be flawed and it will have to hand over regularly. Full-on Level 5, asleep on the back seat? 15-20 years away would be my guess.
I would like to know: which aviation system is self-learning?

321boost

1,253 posts

71 months

Thursday 25th February 2021
DonkeyApple said:
swisstoni said:
Max_Torque said:
DonkeyApple said:
In the 1980s a self driving car drove into a wall. In 2020 quite a few self driving cars drove into walls. That's a lot of progress in 40 years. wink
I'm astonished at how many people seem to think that "self driving" cars have to be perfect! They don't. All they have to be is better than the average human driver. And the average human driver is, ime, absolutely terrible.

Look at automation in aircraft: it has pretty much reduced air accidents by ten times or more in the last 30 years. Yes, planes still crash, and yes, they now crash because the automation fk'ed up, but guess what, unlike a human crash, after an automation crash the system learns and generally doesn't make the same mistake again, because once one "node" has learned, they all learn, near instantaneously.

When you think about it, as a human, you are pretty much not suited to driving. We get bored easily, we either fail to react in time or completely over-react, and in fact we pretty much never do the same thing twice unless we have done a lot of specific training. When, say, my neighbour learns to speak Spanish, that doesn't help me, because we cannot transfer learning specifically, only the idea of learning. And of course we only see in the visible spectrum, and our communication is short distance only and slow.

By comparison, a computer can learn as a single entity despite being made up of many nodes, it can "see" across a wide range of frequencies, for example using IR to see in the dark or in fog, it can react IDENTICALLY in a millionth of a second (or less), and it can accurately diagnose its own faults.

And once multiple automated cars are on our roads, all communicating and sharing information at the speed of light, then "coming round a blind corner and finding a tractor in the road", something a human can ONLY mitigate against by driving really slowly, is a thing of the past, because beyond-line-of-sight comms mean the tractor can tell your car, all cars in fact, that it is indeed just round the corner in the middle of the road!

Yes, today, the systems for machine learning and automation are, relatively speaking, still in their infancy, but don't be mistaken: as a human, you are very much "the weak link" in the chain.....
Hard to believe sometimes but the human brain is the most incredible thing.
To think that it’s straightforward to create a machine that comes close to its ability to negotiate a typical town centre, for instance, and do it without taking all day, or flattening things, seems like plain hubris to me. One of the human brain’s less desirable little foibles.
Indeed. The reality is that to date autonomous driving is nowhere near as competent as the simplest human. And that's still completely ignoring the fundamental flaw of autonomous cars, which is that a box programmed to give way to humans will be made to give way by humans. You'd need a totalitarian police state for autonomous cars to move in an urban environment. It's the dream of nerds who never got out much in the real world and misunderstand the human race.

Even someone who doesn't regularly move around a complex and busy urban environment, and so might not immediately grasp the very simple problem, only has to go to their local retail car park and watch how pedestrians will walk past the back of vehicles that are reversing these days. They do so because modern sensors have reduced the risk and therefore delivered a reward for simply not waiting.

It's fear that keeps pedestrians where pedestrians are supposed to be. Not manners or civility, as they have none. It's fear of injury. Remove that fear from the motor car and, for that car to be able to move, another fear must be put into that pedestrian. That's what the dreamers are unable to comprehend.

Edited by DonkeyApple on Wednesday 24th February 22:04
I think Max's opinion of humans is so low because, subconsciously, that is how they see themselves.

bigothunter

11,297 posts

61 months

Thursday 25th February 2021
rxe said:
IMO we’re going to see loads of Level 3 and 4 automation. It may or may not be as good as a human, it will be flawed and it will have to hand over regularly. Full-on Level 5, asleep on the back seat? 15-20 years away would be my guess.
Agreed yes

Level 4 autonomy available to customers by 2030, with full Level 5 by 2040, are reasonable targets given the significant development needed. This video illustrates some of the challenges ahead:


bigothunter

11,297 posts

61 months

Thursday 25th February 2021
321boost said:
I think Max's opinion of humans is so low because, subconsciously, that is how they see themselves.
What an unfortunate remark rolleyes

Evanivitch

20,132 posts

123 months

Thursday 25th February 2021
rxe said:
Your view of aviation automation being in some way self-learning and replicating across a network is simply wrong. As we have seen with Boeing's recent experiences, what happens is:

- A plane crashes. Humans scratch their heads and blame sloppy maintenance.
- Another plane crashes in suspiciously similar circumstances. Humans start paying a lot of attention.
- Humans work out why it happened and ground the entire fleet.
- Humans beaver away for years trying to fix the software.
- Humans eventually determine a fix, and apply it, but by that time most of those humans will never fly on a 737 again.

Now, the thing that adjusts the pitch of an aeroplane is computationally a piece of piss compared to something looking at a video stream and trying to work out where that child on a bike may be going, so the fix in the latter case will be a lot harder. In a car, it also has to work in ALL situations, because the automated car can’t suddenly decide “I don’t fancy that multi-storey today” or “I’m not sure that driveway is actually a road” or “the roadworks guys have messed with the M4 so badly it’s really hard to work out what to do”.
Probably best not to use an example like the 737 MAX if you don't understand the issue. Suggesting it was just a software issue for adjusting the pitch is a complete misrepresentation of why it went wrong and why it has taken so long to re-certify the aircraft.

DonkeyApple

55,407 posts

170 months

Thursday 25th February 2021
Max_Torque said:
DonkeyApple said:
Indeed. The reality is that to date autonomous driving is nowhere near as competent as the simplest human.
"In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 1.76 million miles driven."


Well, sorry, but it ALREADY is safer than the simplest human, and in fact safer than the average one too, even with non-autonomous safety features enabled.......

Sure, that's for the basic autopilot, but really the writing is on the wall for us meatware when it comes to safe driving.


Consider how a human drives a car.

We see, hear, then actuate our muscles, using our brain's learning, and when we are inexperienced we crash a LOT, as all the statistics prove (try insuring a 21-year-old on a Ferrari 812 and see how you get on....). With automation, every single car can drive, on every trip, like the most advanced part of the network.

And yes, there will be crashes, mistakes, dodgy software, people dying, but crucially, just like in the aero industry, fewer people will be hurt and killed by automation than by us fairly hopeless human drivers.
Tesla user data? That's for a system where a human only uses it in selected conditions filtered by safety.

Max you're suffering from woods and trees syndrome as per usual. It's not helped by your general disdain for humans.

I haven't even said that autonomy needs to be faultless; that's a construct you've created for the purpose of trying to have an argument. Again.

We've been able to pilot objects along simple paths for decades, and for decades that has not been good enough to implement a complete change of use upon. In order for that to happen, the technology needs to be able to function at a greater average competence than a human in a complex environment and situation, and despite what you and Musk claim, both of you know that we are not anywhere near that point. In simple terms, the tech cannot yet 'see' whether a dog is on a lead, so it cannot even begin to run an accurate risk calculation, regardless of whether Einstein created the algorithm.

A problem that many engineers and tech specialists have is often too great a confidence in inanimate objects, combined with a general fear, even to the point of loathing, of people dissimilar to themselves, although it's also been my experience that they tend to consider their peers idiots too, because of some need to belittle those around them or, conversely, to worship someone. Anyway, there is a common trend not just of belittling others but of simply not understanding, or possibly not wanting to understand. Hence why you'll always find committee-designed functionality in cars with weird flaws.

Anyway, the inability of an engineer to properly understand how humans factor in risk and amend their behaviour accordingly is hardly a surprise. Especially if someone's natural state is to bow down before technology. But most humans don't bow to technology. For most humans a bit of technology that is programmed to be subservient to them will be treated as such.

Why would any pedestrian wait to cross at a light that's just turned green? This isn't about people suddenly walking out in front of fast-moving vehicles but about humans instantly adapting to take advantage, which is what we do. It's hard to see humanity and how it works when you sit indoors and think of humanity as being inferior to you.

You might get vehicles capable of autonomy, and laws that permit that autonomy in fixed and regulated environments, through this decade, but you aren't going to see autonomous pods bumbling around the open urban environment, which, surprise surprise, is where the majority of money, people and commerce is.
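For anyone who wants to sanity-check the figures being thrown around above, here's a rough sketch of the arithmetic the quoted Tesla numbers imply. Illustrative only: the per-mile figures come straight from the quote, and the usual caveat applies that Autopilot miles are self-selected (mostly easy motorway-type miles), so it isn't a like-for-like comparison with general driving.

```python
# Rough, illustrative arithmetic only - figures taken from the Tesla quote above.
# Caveat: Autopilot miles are self-selected, so this is NOT a like-for-like
# comparison with ordinary human driving across all conditions.

miles_per_accident_autopilot = 2.87e6  # "one accident for every 2.87 million miles" (Autopilot engaged)
miles_per_accident_no_ap = 1.76e6      # "one accident for every 1.76 million miles" (active safety only)

accidents_per_million_autopilot = 1e6 / miles_per_accident_autopilot
accidents_per_million_no_ap = 1e6 / miles_per_accident_no_ap

print(f"Autopilot engaged : {accidents_per_million_autopilot:.3f} accidents per million miles")
print(f"No Autopilot      : {accidents_per_million_no_ap:.3f} accidents per million miles")
print(f"Ratio             : {miles_per_accident_autopilot / miles_per_accident_no_ap:.2f}x")
```

All that shows is the raw ratio in the quoted figures (roughly 1.6x fewer accidents per mile with Autopilot engaged); it says nothing about where those miles were driven, which is exactly the selection problem being pointed out here.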

DonkeyApple

55,407 posts

170 months

Thursday 25th February 2021
ThrottleLine30 said:
I'll be driving petrol until they physically stop selling it. Wouldn't catch me dead in Electric.
That's just a bizarre state of mind to be honest. What actual, viable benefit would that even serve you?

Surely the logic is to opt for whichever fuel best suits your needs whether in usage terms, tax terms or simple cost terms?

Why would you doggedly stick to something if there is an alternative that is cheaper and easier?

At the moment the cheaper and easier option for most people is petrol, but it's clear that for some who are able to take advantage of the benefits that exist to date, the EV is already the cheaper and easier option. More importantly, we can also clearly see that the switch point between the two forms of fuel is moving on a monthly basis, ever expanding the group for which the EV is the cheaper, easier option.

The big step change in the UK that will arguably see the number of pure EVs double over the next 12-18 months (albeit from a very small base) is the BIK saving. It's suddenly rather obvious that someone who has a driveway, has another car, has pretty predictable usage patterns and whose job grants them a car allowance/company car in the value region where EVs sit would switch and take those savings.

How would doggedly insisting on paying much more money than you need to, for no material gain, be a sane lifelong choice to make?

What is going to happen when the EV product eventually becomes viable in the segment where you buy your cars? What happens when your peers are all taking advantage of EVs? Are you genuinely going to be there doggedly refusing to take advantage, forcing yourself to spend ever greater amounts of money for a product that is blocked from more and more destinations? That would be mental. Are you mental?

Kawasicki

13,093 posts

236 months

Thursday 25th February 2021
I know a few engineers working in autonomous driving. All of them say the same thing: it's not ready, and don't expect it (Level 5) anytime soon. Some of them say a whole lot more negative stuff than that...

Terminator X

15,105 posts

205 months

Thursday 25th February 2021
Max_Torque said:
DonkeyApple said:
Indeed. The reality is that to date autonomous driving is nowhere near as competent as the simplest human.
"In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 1.76 million miles driven."


Well, sorry, but it ALREADY is safer than the simplest human, and in fact safer than the average one too, even with non-autonomous safety features enabled.......

Sure, that's for the basic autopilot, but really the writing is on the wall for us meatware when it comes to safe driving.


Consider how a human drives a car.

We see, hear, then actuate our muscles, using our brain's learning, and when we are inexperienced we crash a LOT, as all the statistics prove (try insuring a 21-year-old on a Ferrari 812 and see how you get on....). With automation, every single car can drive, on every trip, like the most advanced part of the network.

And yes, there will be crashes, mistakes, dodgy software, people dying, but crucially, just like in the aero industry, fewer people will be hurt and killed by automation than by us fairly hopeless human drivers.
How many people are killed, though, in accidents today vs what it might be like with autonomous cars, and is it worth spending £billions to get fatalities down from an already low figure? Someone in power clearly thinks yes, or has a lot of mates in that industry wink

TX.

PS I haven't been involved in an own-fault accident since the '80s; does that make me safer than an autonomous car?

rxe

6,700 posts

104 months

Thursday 25th February 2021
Evanivitch said:
Probably best not to use an example like the 737 MAX if you don't understand the issue. Suggesting it was just a software issue for adjusting the pitch is a complete misrepresentation of why it went wrong and why it has taken so long to re-certify the aircraft.
I know perfectly well that the main issue was stumpy wheels and big engines, requiring an engine position that led to a larger moment around the CoG than was desirable. They “solved” this with software. The plane is not inherently dangerous, other than some software that makes bad decisions when fed incomplete data. In the absence of the software, the plane could have had “interesting but unlikely to be fatal” pitch characteristics when power was applied. The software made it fly more like a normal 737, removing the need for training.

bigothunter

11,297 posts

61 months

Thursday 25th February 2021
Terminator X said:
How many people are killed, though, in accidents today vs what it might be like with autonomous cars, and is it worth spending £billions to get fatalities down from an already low figure? Someone in power clearly thinks yes, or has a lot of mates in that industry wink

PS I haven't been involved in an own-fault accident since the '80s; does that make me safer than an autonomous car?
Safety apart, removing the tedious task of driving to work and back for millions of commuters is reason enough to go autonomous. And there are many other advantages for general car users, who are not enthusiasts.

Would you rather use a manual or automatic washing machine? Would an automatic washing sorter/loader be even better? Or just leave that boring task to the wife?

Why get any closer to the mechanics of driving the car or washing your clothes than absolutely necessary? scratchchin


DonkeyApple

55,407 posts

170 months

Thursday 25th February 2021
bigothunter said:
Safety apart, removing the tedious task of driving to work and back for millions of commuters is reason enough to go autonomous. And there are many other advantages for general car users, who are not enthusiasts.

Would you rather use a manual or automatic washing machine? Would an automatic washing sorter/loader be even better? Or just leave that boring task to the wife?

Why get any closer to the mechanics of driving the car or washing your clothes than absolutely necessary? scratchchin
Yup. Huge advantages, especially the commercial innovation that would appear. This doesn't mean that those in receipt of investment funds are correct with their pitches regarding how close viability actually is. Nor does it negate the simple yet significant issue of humans outside of the pod that is programmed to defer to them.

otolith

56,204 posts

205 months

Thursday 25th February 2021
As someone recently pointed out on here, though, the thought stopping people stepping out in front of cars in cities is less that the driver might not manage to avoid them, and more that the driver will then get out and kick their head in.

bigothunter

11,297 posts

61 months

Thursday 25th February 2021
DonkeyApple said:
Yup. Huge advantages, especially the commercial innovation that would appear. This doesn't mean that those in receipt of investment funds are correct with their pitches regarding how close viability actually is. Nor does it negate the simple yet significant issue of humans outside of the pod that is programmed to defer to them.
The Vision Zero safety target is often quoted; Europe is committed to it by 2050. Proper physical segregation of cars, pedestrians and cyclists will be essential (eg barriers). Motorcycles are clearly alien and unacceptable.


DonkeyApple

55,407 posts

170 months

Thursday 25th February 2021
otolith said:
As someone recently pointed out on here, though, the thought stopping people stepping out in front of cars in cities is less that the driver might not manage to avoid them, and more that the driver will then get out and kick their head in.
There will be some of that, and there are high streets in London where you see the pedestrian, typically a 'hard man', check the driver to ensure that they are an inferior fighting machine and that they will not only stop for them but stay in their car. For the most part, though, the scenario has nothing much to do with that. Very few people will ever get out of their cars; from a pedestrian perspective it's pretty close to zero risk.

It's to do with that basic human instinct where an object that starts to move towards them will cause them to not step out, step back or speed up out of the way. We've all had the pedestrian who nips across and speeds up their walk as the car approaches.

And just consider the impact on traffic flow at busy lights if the cars are unable to edge forward behind push bikes setting off etc.

The reality is that in complex road systems humans will adapt to take advantage of benefits available to them. That's what we do as a species. Vehicles programmed to stop and wait will simply be made to do so. It's human nature. Remove the fear and you dynamically change how humans interact.

Few urban environments are like Vancouver where pedestrians are the autonomous objects rigidly following a set of programmed protocols. Few urban environments are as incredibly simple as a country lane or motorway.

Even Graeme Smith at Oxbotica holds the view that functioning autonomy is a decade away, and given that his job only exists because of third-party funding of the Driven program, one can safely assume 10 years is optimistic.

Oxbotica are trying to do it without GPS because of the inherent issues over the accuracy of GPS, i.e. there is none at the micro level within which cars need to operate. But their system is reliant on them creating their own map of a city, as they are doing in London. A city that has roadside furniture that changes more than people think. Their Oxford trials weren't exactly brilliant, and that's with cyclists with above-average IQs wink and the Stratford event was a triumph of marketing.

Getting a machine to get remotely close to what a human brain is capable of doing almost subconsciously remains a long way off. The best we might get over the next decade is autonomous buses due to their fixed routes and motorway use.

The dream of having a pod to take us to the pub and back, which is the only true and relevant goal, is sadly a long way off. Unless you live above a pub, in which case Stannah already have an autonomous solution. biggrin

otolith

56,204 posts

205 months

Thursday 25th February 2021
I do think it's coming, but agree, still some time away.

I think once it's here, the countdown to us not being allowed to drive ourselves starts ticking. Once people dying on the roads becomes an optional extra for the sake of manual control, it becomes pretty morally indefensible. Add to that, kids already don't care about learning to drive; take away any practical benefit to it and being able (and wanting) to drive manually rapidly becomes the pursuit of a tiny minority.

I think probably there will be autonomous cars by the time I'm too old to drive myself and that probably there will still be self driven ones in first world economies by the time I'm dead.

NDA

21,615 posts

226 months

Thursday 25th February 2021
ThrottleLine30 said:
I'll be driving petrol until they physically stop selling it. Wouldn't catch me dead in Electric.
Really?

I like petrol cars too and have had some very fast ones - I still have a pair of V8s to keep our autumns warm. But I think it would be rather narrow-minded to say 'you wouldn't catch me dead in one'. You should try one - they're quick, easy to own and cheap to run. I doubt you'd catch me in one. If you see what I mean.

DonkeyApple

55,407 posts

170 months

Thursday 25th February 2021
otolith said:
I do think it's coming, but agree, still some time away.

I think once it's here, the countdown to us not being allowed to drive ourselves starts ticking. Once people dying on the roads becomes an optional extra for the sake of manual control, it becomes pretty morally indefensible. Add to that, kids already don't care about learning to drive; take away any practical benefit to it and being able (and wanting) to drive manually rapidly becomes the pursuit of a tiny minority.

I think probably there will be autonomous cars by the time I'm too old to drive myself and that probably there will still be self driven ones in first world economies by the time I'm dead.
I agree. There's too much commercial benefit in many areas for it not to eventually be cracked but pub pods will be something my children benefit from at best.

Evanivitch

20,132 posts

123 months

Thursday 25th February 2021
rxe said:
Evanivitch said:
Probably best not to use an example like the 737 MAX if you don't understand the issue. Suggesting it was just a software issue for adjusting the pitch is a complete misrepresentation of why it went wrong and why it has taken so long to re-certify the aircraft.
I know perfectly well that the main issue was stumpy wheels and big engines, requiring an engine position that led to a larger moment around the CoG than was desirable. They “solved” this with software. The plane is not inherently dangerous, other than some software that makes bad decisions when fed incomplete data. In the absence of the software, the plane could have had “interesting but unlikely to be fatal” pitch characteristics when power was applied. The software made it fly more like a normal 737, removing the need for training.
Great, so you've realised that the use of software in this instance has no parallels with autonomous driving. Fantastic.

otolith

56,204 posts

205 months

Thursday 25th February 2021
DonkeyApple said:
I agree. There's too much commercial benefit in many areas for it not to eventually be cracked but pub pods will be something my children benefit from at best.
Stuck with plague-vectoring Uber drivers for now.