Tesla and Uber Unlikely to Survive (Vol. 2)

TOPIC CLOSED

lothianJim

2,274 posts

57 months

Monday 26th October 2020
You are discussing hypotheticals. For the foreseeable future, any driver who buys FSD is fully responsible for supervising the car and correcting its errors.

As long as FSD + supervisor is safer (which is probably true even with this bug-fest), there is no logical reason to ban it.

NDNDNDND

2,429 posts

198 months

Monday 26th October 2020
lothianJim said:
You are discussing hypotheticals. For the foreseeable future, any driver who buys FSD is fully responsible for supervising the car and correcting its errors.

As long as FSD + supervisor is safer (which is probably true even with this bug-fest), there is no logical reason to ban it.
Judging by the videos I've seen, the only way this FSD could be safer is that it's so nerve-wracking to use that no one can risk looking at their phone, because it might do something completely unpredictable at any moment...

As ever, it's the psychology of it that'll be dangerous. It's the Joshua Browns of the world, whose utter and unfounded faith in the technology means they're simply not present when it all goes wrong.

If anything, the better the system gets, the less safe it will be. As mistakes become less frequent, complacency creeps in, meaning the car manager is even less engaged with what's going on and is even less able to mash the brake in a panic when it decides to drive straight at a parked car.
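
To put toy numbers on it (every figure below is invented purely to illustrate the shape of the problem): an accident needs both a system error and a supervisor who misses it, so if misses grow faster than errors shrink, the middle of the improvement curve is the most dangerous place to be.

```python
# Toy model: supervised-FSD accident rate = system error rate x chance the
# human supervisor misses the error. All numbers invented for illustration.
scenarios = [
    ("clumsy beta, alert driver", 10.0, 0.01),
    ("decent system, complacent",  1.0, 0.20),
    ("near-perfect, asleep",       0.1, 0.90),
]
for label, errors_per_10k_miles, p_miss in scenarios:
    accidents = errors_per_10k_miles * p_miss
    print(f"{label:28s} -> {accidents:.2f} accidents per 10k miles")
```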

Edited by NDNDNDND on Monday 26th October 13:33

ZesPak

25,485 posts

211 months

Monday 26th October 2020
I've driven a LOT of km with Autopilot.
In the end it's something you learn to trust and use as a piece of technology.

Just as I trust my car to brake from 160 to 0 in a certain distance, or the accelerator to respond in a certain way.

lothianJim

2,274 posts

57 months

Monday 26th October 2020
NDNDNDND said:
Judging by the videos I've seen, the only way this FSD could be safer is that it's so nerve-wracking to use that no one can risk looking at their phone, because it might do something completely unpredictable at any moment...

As ever, it's the psychology of it that'll be dangerous. It's the Joshua Browns of the world, whose utter and unfounded faith in the technology means they're simply not present when it all goes wrong.

If anything, the better the system gets, the less safe it will be. As mistakes become less frequent, complacency creeps in, meaning the car manager is even less engaged with what's going on and is even less able to mash the brake in a panic when it decides to drive straight at a parked car.

Edited by NDNDNDND on Monday 26th October 13:33
As you say, beta testers are on high alert, which is likely a big reason why the accident rate is (presumably) lower than average. I say presumably because otherwise it would be banned in no time.

I'm curious why you are so confident that the accident rate will increase as the 'mix' between human alertness and system performance shifts.

More likely, I think, the accident rate will stay stable or drop as the balance gradually shifts. If the accident rate increases, it's game over for public FSD, so I don't imagine that being allowed to happen.

gangzoom

7,351 posts

230 months

Monday 26th October 2020
NDNDNDND said:
The Tesla nearly drives straight into a parked car, though.

The idea that the general public might have access to this in a couple of months' time is fking terrifying, frankly. I don't want to share the roads with these. I sincerely hope Tesla 'FSD' is not permitted in the UK.
It's actually pretty amazing Tesla can get their cars to even attempt a STOP junction with no prior HD mapping (I presume). I would say the logic of even simple STOP junctions in the US is as hard to program as anything we see on UK roads.

What's more amazing is that our 2017 car will be able to do the same, and yet every day - even today - people are going out and buying brand new cars that will never gain any functionality once they leave the showroom.

This is clearly very early-adopter tech, and Tesla haven't even released it to the general public except for a few hand-picked people in the US - most of whom seem to have had a phone call from Tesla explaining what they were getting themselves into. No doubt there was/is plenty of legal disclaimer to get through before Tesla pushed the software to their cars.

How, or if, the system can improve is the next step - it would be interesting to see how well a Waymo robotaxi would have done on the same stretch of road.

Edited by gangzoom on Monday 26th October 13:57

NDNDNDND

2,429 posts

198 months

Monday 26th October 2020
ZesPak said:
I've driven a LOT of km with Autopilot.
In the end it's something you learn to trust and use as a piece of technology.

Just as I trust my car to brake from 160 to 0 in a certain distance, or the accelerator to respond in a certain way.
You'll trust it, at least until the next update when it might start doing something completely different:

https://teslamotorsclub.com/tmc/threads/the-cancer...

Given Tesla's track record on this, it doesn't bode well for FSD. It might drive down a certain street a hundred times, and then a software update means it might do anything...

Edited by NDNDNDND on Monday 26th October 14:10

Brother D

4,168 posts

191 months

Monday 26th October 2020
lothianJim said:
NDNDNDND said:
Judging by the videos I've seen, the only way this FSD could be safer is that it's so nerve-wracking to use that no one can risk looking at their phone, because it might do something completely unpredictable at any moment...

As ever, it's the psychology of it that'll be dangerous. It's the Joshua Browns of the world, whose utter and unfounded faith in the technology means they're simply not present when it all goes wrong.

If anything, the better the system gets, the less safe it will be. As mistakes become less frequent, complacency creeps in, meaning the car manager is even less engaged with what's going on and is even less able to mash the brake in a panic when it decides to drive straight at a parked car.

Edited by NDNDNDND on Monday 26th October 13:33
As you say, beta testers are on high alert, which is likely a big reason why the accident rate is (presumably) lower than average. I say presumably because otherwise it would be banned in no time.

I'm curious why you are so confident that the accident rate will increase as the 'mix' between human alertness and system performance shifts.

More likely, I think, the accident rate will stay stable or drop as the balance gradually shifts. If the accident rate increases, it's game over for public FSD, so I don't imagine that being allowed to happen.
Running Autopilot, they are already 10 times safer than normal cars on average:

https://cleantechnica.com/2020/08/01/tesla-autopil...

Tuna

19,930 posts

299 months

Monday 26th October 2020
SWoll said:
This. It's all a numbers game; it doesn't need to be perfect, just better than the large number of inattentive and poor drivers on the road.
You are choosing who gets involved in an accident - the classic trolley problem talks about exactly this issue.

What if FSD is particularly bad at spotting small children? Sure, you could say it's reduced the number of accidents, but it may have increased the number of children being hit. Is that acceptable?

It's not just a question of better overall accident rates. Are some accident types more likely now? Is the outcome of accidents worse for those that are involved? Are the accidents that Teslas do get involved in more avoidable by human drivers?

It's really far more complicated than "the software just has to get good enough" - we have to have agreement on what "good enough" actually means, and some way of proving that the software consistently delivers that, in all conditions, in all locations, and that it continues to do so every time they change or update it.
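
To put rough numbers on "some way of proving that" - a sketch only, using standard Poisson error bars and an assumed baseline crash rate - the statistics alone are brutal:

```python
import math

# Crashes are rare events, so treat counts as Poisson: the relative error
# on an estimated crash rate is roughly 1.96 / sqrt(n_crashes) at 95%.
target_rel_err = 0.20                                # want the rate known to +/-20%
n_crashes = math.ceil((1.96 / target_rel_err) ** 2)  # ~97 observed crashes needed
miles_per_crash = 2_000_000                          # assumed baseline rate
miles_needed = n_crashes * miles_per_crash
print(f"Need ~{n_crashes} crashes, i.e. ~{miles_needed / 1e6:.0f}M miles,"
      f" per software version, per condition set")
```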

"Oh we'll just give it out and see what happens" is not a reassuring approach.

lothianJim

2,274 posts

57 months

Monday 26th October 2020
Human drivers don't perform consistently in all conditions. Yet insurance companies still have to estimate risk for different occupations, ages, engine sizes...

I got a quote for my performance car that was nearly twice the price I was quoted a few days earlier. Turns out buying a 500bhp car with one day's notice was interpreted to mean I was more likely to make impulsive and unpredictable errors than someone who planned weeks ahead.

If anything, I would guess actuaries have an easier time working out premiums for cars that are mostly driven on Autopilot, as even with all its flaws, it's probably a more predictable kind of unpredictability than the average human driver.
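
A sketch of what I mean, assuming a naive "expected loss plus uncertainty margin" pricing rule (all claim figures invented):

```python
import statistics

# Invented annual claim costs for ten comparable policies. Same expected
# loss, but the human book is dominated by one rare big crash while the
# autopilot book fails in smaller, more uniform ways.
human =     [0, 0, 0, 0, 0, 0, 0, 0, 0, 8000]
autopilot = [700, 900, 800, 750, 850, 820, 780, 760, 840, 800]

for name, losses in [("human", human), ("autopilot", autopilot)]:
    mean, sd = statistics.mean(losses), statistics.stdev(losses)
    premium = mean + 0.5 * sd   # naive risk loading on uncertainty
    print(f"{name:>9}: expected loss {mean:.0f}, sd {sd:.0f}, premium {premium:.0f}")
```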

Cheeses of Nazareth

789 posts

66 months

Monday 26th October 2020
lothianJim said:
Human drivers don't perform consistently in all conditions. Yet insurance companies still have to estimate risk for different occupations, ages, engine sizes...

I got a quote for my performance car that was nearly twice the price I was quoted a few days earlier. Turns out buying a 500bhp car with one day's notice was interpreted to mean I was more likely to make impulsive and unpredictable errors than someone who planned weeks ahead.

If anything, I would guess actuaries have an easier time working out premiums for cars that are mostly driven on Autopilot, as even with all its flaws, it's probably a more predictable kind of unpredictability than the average human driver.

But you are a single risk, and assessed as such.

And when they find the flaw (not that they'll have to look hard), it won't be the driver that pays, it will be whoever is on the hook for the software.

Let's assume 1,000 people drive a car into a wall... 1,000 different reasons... nobody cares: 1,000 different claimants, 1,000 different people.

Tesla drives 1,000 cars into a wall... one set of software, one reason... Try mitigating that.

off_again

13,893 posts

249 months

Monday 26th October 2020
Brother D said:
Running Autopilot, they are already 10 times safer than normal cars on average:

https://cleantechnica.com/2020/08/01/tesla-autopil...

That is very misleading. Since Autopilot can't be used on certain roads or in certain situations and conditions, and since the vast majority of users just turn it on for longer journeys on comparatively safe freeways and interstates, the comparison completely ignores the higher-risk miles and paints the system as far safer than it is.

This is positioning from Tesla to try to sell this from a regulatory point of view and get it approved on the basis of safety. It's misleading, frankly incorrect, and hopefully the legislative and regulatory bodies see through this spinning of statistics to support their marketing efforts.
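
To show how large the mix effect alone can be, here's an illustrative calculation (every rate and mileage share below is invented):

```python
# Invented crash rates per million miles by road type.
MOTORWAY_RATE, URBAN_RATE = 0.2, 2.0

def blended_rate(motorway_share: float) -> float:
    """Fleet-wide crash rate for a given motorway share of total miles."""
    return motorway_share * MOTORWAY_RATE + (1 - motorway_share) * URBAN_RATE

autopilot_fleet = blended_rate(0.95)  # engaged almost only on motorways
average_fleet = blended_rate(0.40)    # typical car sees far more urban miles
print(f"Autopilot fleet: {autopilot_fleet:.2f} crashes per million miles")
print(f"Average fleet:   {average_fleet:.2f} crashes per million miles")
print(f"Apparent safety factor: {average_fleet / autopilot_fleet:.1f}x,"
      " from road mix alone")
```

Identical per-road skill, yet the naive headline number makes one fleet look four times safer.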

Tuna

19,930 posts

299 months

Monday 26th October 2020
lothianJim said:
Human drivers don't perform consistently in all conditions. Yet insurance companies still have to estimate risk for different occupations, ages, engine sizes...

I got a quote for my performance car that was nearly twice the price I was quoted a few days earlier. Turns out buying a 500bhp car with one day's notice was interpreted to mean I was more likely to make impulsive and unpredictable errors than someone who planned weeks ahead.

If anything, I would guess actuaries have an easier time working out premiums for cars that are mostly driven on Autopilot, as even with all its flaws, it's probably a more predictable kind of unpredictability than the average human driver.

Insurance companies don't come into it - this is about the law and who is legally in charge of a vehicle. By law, we have to have a test that takes into account how human beings react, and tests the 'edge cases' for our behaviour - can we see, can we control the vehicle, can we react and can we behave responsibly. Note that the test has baked into it the idea that a human driver with reasonable eyesight can tell the difference between a stop sign and an advertising hoarding.

Now how do we 'test' the software? We don't know if it suffers from 'digital' optical illusions (actually, we have a strong idea it does - phantom braking is one example, and researchers have shown how even tiny changes in a scene, equivalent to a cigarette packet lying in the road, can fool neural nets into completely misinterpreting it). We don't know how it prioritises options (if a child steps out into the street, but a bus is in the opposite lane, which way will it swerve?), and we have no mechanism for checking whether a software update changes those priorities.
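
For the curious, that class of attack is simple to sketch. A minimal FGSM-style example, assuming you have some PyTorch image classifier to hand - purely illustrative, not how any particular car's network is attacked:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model: torch.nn.Module,
                 image: torch.Tensor,   # shape (1, 3, H, W)
                 label: torch.Tensor,   # shape (1,)
                 eps: float = 0.01) -> torch.Tensor:
    """Fast Gradient Sign Method: shift every pixel by +/-eps in the
    direction that most increases the classifier's loss. A perturbation
    this small is invisible to a human but can flip the predicted class."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    return (image + eps * image.grad.sign()).detach()
```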

Software can absolutely speed up response times, and that will certainly reduce a particular type of accident. However, there are just as many situations where the software has to make a choice - and that is where we have a huge can of worms that doesn't just go away because cars with "fast brakes" are better at avoiding rear-end shunts. Basic things like reliable positioning in the road are key not only to avoiding your own accidents, but also not causing other people to have accidents avoiding you.

Tuna

19,930 posts

299 months

Monday 26th October 2020
off_again said:
That is very misleading. Since Autopilot can't be used on certain roads or in certain situations and conditions, and since the vast majority of users just turn it on for longer journeys on comparatively safe freeways and interstates, the comparison completely ignores the higher-risk miles and paints the system as far safer than it is.

This is positioning from Tesla to try to sell this from a regulatory point of view and get it approved on the basis of safety. It's misleading, frankly incorrect, and hopefully the legislative and regulatory bodies see through this spinning of statistics to support their marketing efforts.
"People who've just bought brand new expensive cars have fewer accidents than average shocker" rolleyes

gangzoom

7,351 posts

230 months

Monday 26th October 2020
How anyone can say this latest FSD beta build isn't a massive step up from anything else available to the public is beyond me.

Again, it would be fascinating to see how Waymo would deal with oncoming traffic and parked cars across half the lane.

https://twitter.com/kimpaquette/status/13207562610...

https://twitter.com/WholeMarsBlog/status/132059302...

Edited by gangzoom on Monday 26th October 19:04

Gandahar

9,600 posts

143 months

Tuesday 27th October 2020
gangzoom said:
How anyone can say this latest FSD beta build isn't a massive step up from anything else available to the public is beyond me.

Again, it would be fascinating to see how Waymo would deal with oncoming traffic and parked cars across half the lane.

https://twitter.com/kimpaquette/status/13207562610...

https://twitter.com/WholeMarsBlog/status/132059302...

Edited by gangzoom on Monday 26th October 19:04
It certainly is a setup, to keep the share price up and get Elon more money ... biggrin

Let's see how it goes. The interesting thing will be if someone makes a mistake whilst driving the car, then blames it on the FSD and sues Tesla, even when it's not Tesla's fault. I can see Tesla getting snowed under by lawsuits (you know how much people love those nowadays) from third parties and Tesla owners.

Meanwhile this is interesting,

https://finance.yahoo.com/news/toyota-panasonic-ba...

The Japanese are aiming to get more efficient and catch up.


gangzoom

7,351 posts

230 months

Thursday 29th October 2020
Now it's slowing down for speed bumps at night, managing roundabouts (slowly), and has tried to crash into a train!

Apparently this FSD beta build isn't actually running on Tesla's latest NN (maybe HW4 is needed??), so the software clearly has lots of scope for improvement.

Cannot wait to see a wider release of this at some point smile.

https://youtu.be/RN5Qoei7v1k

Smiljan

11,632 posts

212 months

Thursday 29th October 2020
It still has such a long way to go; even with the relatively simple roads in that video, the driver has to take control so many times. It's an impressive bit of kit, but at the moment it seems to increase the driver's workload, keeping an eye on what it's doing, rather than being a useful tool.

There's a lot of talk on YouTube videos of data being fed back to the neural net to help it learn, but no one has actually shown that this is happening or how it is possible. The data transfer from the cars would be enormous; does every Model 3 really update Tesla with its driving data in this way?
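
Back-of-envelope on why I doubt it (camera count and resolution are rough public figures for the current hardware; everything else is assumed):

```python
# Rough raw video data rate for a multi-camera car. Figures approximate.
cameras = 8
width, height = 1280, 960     # ~1.2 MP per camera
fps = 36
bytes_per_pixel = 1.5         # raw-ish sensor data, before compression

bytes_per_second = cameras * width * height * fps * bytes_per_pixel
gb_per_hour = bytes_per_second * 3600 / 1e9
print(f"~{gb_per_hour:,.0f} GB per hour of raw footage")   # ~1,900 GB/hour
```

So whatever goes back, it can't be everything - presumably short, triggered clips and metadata at most.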

They still have the issue that it'll cut out completely in heavy rain, snow, dirt on the cameras etc., but I guess that's why they release the beta to owners in typically dry, desert-like states.

This still seems so far away from the self-driving robotaxis that were promised this year; perhaps it's getting better slowly, but progress is just that, slow.

Order66

6,739 posts

264 months

Thursday 29th October 2020
gangzoom said:
How anyone can say this latest FSD beta build isn't a massive step up from anything else available to the public is beyond me.
I agree with what you're saying, but that still doesn't mean it's good. In many ways it drives like a bad learner - hesitant, unpredictable, inappropriate actions - and there has to be a fully licensed driver alongside, poised to take control at any second. A long way from "Full Self Driving".

NDNDNDND

2,429 posts

198 months

Thursday 29th October 2020
gangzoom said:
Now it's [...] tried to crash into a train!

Cannot wait to see a wider release of this at some point smile.
Ummm...

Gandahar

9,600 posts

143 months

Thursday 29th October 2020
gangzoom said:
Now it's slowing down for speed bumps at night, managing roundabouts (slowly), and has tried to crash into a train!

Apparently this FSD beta build isn't actually running on Tesla's latest NN (maybe HW4 is needed??), so the software clearly has lots of scope for improvement.

Cannot wait to see a wider release of this at some point smile.

https://youtu.be/RN5Qoei7v1k
As a Tesla critic I think it would be unfair to get too wound up about this beta software having a few issues. As you say, there's lots of room for improvement, and Tesla have at least taken that step, unlike others so far. Pushing the envelope.

Having said that (critic hat back on), what version of the hardware and software are these beta testers running? The hardware needs to be HW4, as you rightly said, along with the new "ground up" beta software that Elon mentioned.

Anyone else with a Tesla who doesn't have those two things, even though they paid a lot for it, is going to have to wait for upgrades...

You know there is going to be an HW5 as well... for the promised land.

I get the feeling Musk thought he had to do this because of all his past promises for 2020. He'd be better off sticking to trying to get Chinese people to buy Chinese-made Teslas rather than excess shipping to Europe...

TOPIC CLOSED