The reason I ask is that I've been seeing a lot of news and cases of Tesla's self-driving acting up and being a point of contention. But back in 2016-17 my ex's uncle and aunt got a Model X when they first dropped, and they "auto-drove" us like 50 miles without any noticeable issue.
Was I just gambling my life, or has the tech somehow gotten worse?
My wife's car is 6 years old and is Level 2. Nothing amazing now, but kinda cool in 2018.
Since then expectations have increased dramatically, and the problems you're hearing about are cars expected to have the higher levels of automation but failing to achieve that.
It seems like this is one of those technical problems that gets exponentially more difficult to solve the closer we get to solving it. What I mean is, suppose a human averages 100,000km per "incident". It was easy to make a car do 90,000km per incident, less so to have it do 95,000km per incident, but we're finding it very, very difficult to get that last 5% of performance.
Adding to this, most cars with relatively high degrees of automation bounce out of it when they encounter something they feel they can't handle adequately. Most of the time that works, but it would be like if a human were somehow able to only drive when everything was routine. (90% of our driving is routine. It's that last ten percent that isn't, and that's when we crash. Well, barring things like drunk driving or distracted driving. Humans are dumb. Autonomous driving was designed by humans and isn't any smarter.)
You're not wrong, but that's not really what I meant, although perhaps I didn't explain it very well.
Another way to say the same thing: if you group together all the various components or aspects of "driving", 95% of them might be solved relatively easily, but getting the last 5% right is extraordinarily difficult.
It's deceptive because, the first time you saw a Level 2 car in 2018, it was natural to think that if they'd made so much progress seemingly overnight, then surely in the next few years we'd have Level 5 cars.
I do take your point that humans are also good drivers 95% of the time and mistakes only occur within 5% of situations. The issue there is the imperative that autonomous cars must be better than a human in all circumstances. If a human makes, on average, 5 serious mistakes every 500,000km, but an autonomous car makes 6, you'd probably not want to put your family in that autonomous car.
Well, to be fair, I'm pretty ardently opposed to self-driving cars; I wouldn't put my family in one even if it was better than humans.
The reason is that all it takes is one bug in the code and your entire family is fucked; and with the way corporations are now handling updates, I frankly don't trust them to maintain it properly at all. (AKA forced updates, with shit-for-testing CrowdStrike, for example: it could have been prevented, but they just had to push that update globally, all at the same time, to everyone. Imagine a malicious admin doing a terror attack by making all your cars crash, or some intern pushing the wrong code and suddenly your car is bricked. I'll keep my dumb car, thank you very much.)
Being able to drive without killing someone is only one aspect of an autonomous vehicle, and security is one that I'm not confident about in the least.
I've noticed that my wife's Level 2 car is just hopeless outside of the city. Sure, that's where most people live, so it's fine for most people.
Driving on country roads, it spends more time with its autonomous features self-disabled than not, simply because it can't see the road or what have you.