Move fast, break shit.
Fake it till you sell it, then move the goalposts lower. Shift human casualties onto individual responsibility, a core libertarian theme.
Profit off the lies because it's too late, money already in the bank.
“If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.
That's a very problematic claim - and it might only be true if you compare completely unassisted vehicles to L2 Teslas.
Other brands also have a plethora of L2 features, but they are marketed and designed in a different way. The L2 features are active, but designed in a way that keeps the driver engaged in driving.
So L2 features are for better safety, not for a "wow we live in the future" show effect.
For example, the lane keeping in my car: you don't notice it while driving, it sits just below your level of attention. But when I lose concentration for a moment, the car simply stays in the lane, even on curving roads. It's designed to steer a bit later than I would. (Even before that, the wheel turns slightly more easily in the direction that keeps the car centered in the lane than in the other direction; the effect is just below what you notice, unless you deliberately concentrate on it.)
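Roughly what I mean by "steer a bit later than I would", as a toy sketch. The deadband, gain, cap, and function name here are all made up for illustration; they're not from any real ECU or spec:

```python
# Illustrative only: a toy "intervene late, intervene gently" lane-keep rule.
# All thresholds and gains below are invented for the example.

DEADBAND_M = 0.30      # do nothing while the car is within ~30 cm of lane center
MAX_TORQUE_NM = 1.5    # cap assist torque so it stays below what the driver clearly feels
GAIN = 3.0             # proportional gain on the offset beyond the deadband

def assist_torque(lateral_offset_m: float) -> float:
    """Corrective steering torque (Nm); sign points back toward lane center."""
    error = abs(lateral_offset_m) - DEADBAND_M
    if error <= 0.0:
        return 0.0                            # driver is handling it; assist stays invisible
    torque = min(GAIN * error, MAX_TORQUE_NM) # gentle nudge, never a hard takeover
    return -torque if lateral_offset_m > 0 else torque

# e.g. drifting 0.5 m right of center -> a small nudge back to the left:
print(assist_torque(0.5))   # about -0.6 Nm
```

The point of the deadband is exactly the design choice described above: the assist only acts after the point where an attentive driver would already have corrected, so it never trains you to stop steering.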
Adaptive cruise control is just sold as adaptive cruise control. I did notice once that it uses radar AND the cameras, since it considers my lane free as soon as the car in front of me clears the lane markings with its wheels (when changing lanes).
It feels like the software in my car could do a lot more, but its features are undersold.
The combination of a human driver and the driver assist systems makes driving a lot safer than relying on either the human or the machine alone.
In fact, the braking assistant once stopped my car in tight traffic before I could even react, because the guy in front of me suddenly slammed on their brakes. If the system had failed to detect the situation, it would have been my job to react in time. (I did react, but I can't say whether my reaction time would have been fast enough.)
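For a sense of how much that margin matters, a quick back-of-envelope. The speed and reaction time are just assumed, textbook-ish values, not measurements from my car:

```python
# Rough back-of-envelope: how far a car travels during human reaction time alone.
# Speed and reaction time are assumed typical values, not measured ones.

speed_kmh = 50          # tight city traffic
reaction_time_s = 1.5   # commonly cited human perception-reaction time

speed_ms = speed_kmh / 3.6
distance_m = speed_ms * reaction_time_s
print(f"{distance_m:.1f} m travelled before the driver even touches the brake")
# ~20.8 m of travel before braking starts; an automatic emergency brake that
# fires a fraction of a second earlier can be the difference between a scare
# and a rear-end collision.
```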
What Tesla does with the technology is impressive, but I feel the system could be so much better if they didn't compromise safety in the name of marketing and hyperbole.
If Tesla's Autopilot were designed from the ground up to keep the driver engaged, I believe it would really be the safest car on the road.
Instead, I feel it is designed to be able to show off "cool stuff".
VERGE articles seem to be getting worse over the years; they've almost reached Forbes level. Yes, this does raise some valid safety concerns. No, Tesla isn't bad just because it's Tesla.
It doesn't really give us the full picture. For starters, there's no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla's tech actually measures up.
Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla's systems are compared to good old human driving.
We're left in the dark (probably on purpose) about how Tesla compares in scenarios like drunk, distracted, or tired driving, the common issues that automation aims to mitigate.
It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.
I feel like jumping on the Elon hate wagon at every opportunity is getting tiresome (and yes, I hate Elon too).
I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn't need to be perfect, just as good as or better than humans.
Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last five years is actually a very small number.
I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.
Because while it's clear by now that Teslas aren't the perfect self-driving machines we were promised, there is no doubt at all that humans are bad drivers.
We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by something like 100 to get the total number of car accidents.
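Purely back-of-envelope, to show what that per-100,000 comparison might look like: the crash count and Tesla fleet size are the rough figures from this thread, the US fleet size is an assumed ballpark, and the two rates are not directly comparable (one counts only Autopilot-linked crashes, the other all crashes), which is exactly why the article should have done this properly:

```python
# Back-of-envelope only: figures are the rough numbers from this thread plus
# assumed ballpark values, not anything from the article or NHTSA's report.

tesla_crashes = 1_000            # Autopilot-linked crashes in the investigation
tesla_fleet   = 5_000_000        # Teslas sold over ~5 years
years         = 5

us_fatalities_per_year = 40_000  # ~40k US road deaths a year
crash_multiplier       = 100     # "multiply by like 100" for total crashes
us_fleet               = 280_000_000  # assumed: roughly the US registered fleet

tesla_rate = tesla_crashes / tesla_fleet / years * 100_000
human_rate = us_fatalities_per_year * crash_multiplier / us_fleet * 100_000

print(f"Autopilot-linked crashes: ~{tesla_rate:.0f} per 100,000 Teslas per year")
print(f"All US crashes:           ~{human_rate:.0f} per 100,000 vehicles per year")
# Apples to oranges: the 1,000 are only the crashes NHTSA tied to Autopilot,
# not all crashes involving Teslas, so the gap here proves nothing by itself.
```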
Obviously the time to react to the problem was before the system told you about it, that's the whole point, THE SYSTEM IS NOT READY. Cars are not ready to drive themselves, and obviously the legal system is too slow and backwards to deal with it so it's not ready either. But fuck it let's do it anyway, sure, and while we're at it we can do away with the concept of the driver's license in the first place because nothing matters any more and who gives a shit we're all obviously fucking retarded.
Is it linked to excess deaths? Technically, it could be saving lives at a population scale. I doubt that's the case, but it could be. I'll read the article now and find out.
Edit: it doesn't seem to say anything regarding "normal" auto-related deaths. They're focusing on the bullshit designation of an unfinished product as "Autopilot", and a (small) subset of specific cases that are particularly egregious, where there were 5-10 seconds of lead time into an incident. In those cases, a person who was paying attention wouldn't have been in the accident.
In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation that published today.
The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.
NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.
Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.
Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.
The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.
And the pedestrian emergency brake on Tesla cars, and on many other cars with that feature, will sometimes malfunction, causing the people behind you to rear-end you.
It's just a dozen! You know how many people COVID took? And everyone wanted COVID! ...it spreads through the air? Where's my fabric, non-filtering 😷 mask with added holes, baby!? So, you know... how cool would it be if you're riding in an ordinary car and someone else is driving it into a wall or a semi, except it's actually not a sentient being but an algorithm? It would be pretty cool, right?
These span from the earliest adopters up until August of last year. Plenty of idiots using a cruise-control system and trusting their lives to beta software. Not the same as the current FSD software.
Your own car insurance isn't based on your driving skill when you had your learner's permit. When Tesla takes on the liability and insurance for CyberCab, you'll know it's much safer than human drivers.
The same people who are upset over self driving cars are the ones who scream at the self checkout that they shouldn't have to scan their own groceries because the store isn't paying them.
32% of all traffic crash fatalities in the United States involve drunk drivers.
I can't wait until the day this kind of technology is required by law. I'm tired of sharing the road with these idiots, and I absolutely trust self-driving vehicles more than I trust other humans.
There are some real Elon haters out there. I think the cars are ugly as sin, but I'm happy to see more people driving vehicles with all the crazy safety features, even if they aren't perfect.
You're in control of a massive vehicle capable of killing people and destroying property; you're responsible for it.