Tesla knew Autopilot caused death, but didn't fix it
Software's alleged inability to handle cross traffic central to court battle after two road deaths
It's time to give up the Tesla FSD dream. I loved the idea of it when it came out, and believed it would get better over time. FSD simply hasn't. Worse, Musk has either fired or lost all the engineering talent Tesla had. FSD is only going to get worse from here and it's time to put a stop to it.
The article isn't talking about FSD, these accidents are from 2019 and 2016 before public availability of FSD. Of course, "Full Self Driving" ain't great either...
The whole article is kind of FUD. It's saying engineers didn't "fix" the issue, when the issue is people using Autopilot, essentially advanced lane keep, on roads it shouldn't be used on. It doesn't give a shit about intersections, stop signs, or stop lights. It just keeps you in your lane and prevents you from rear-ending someone. That's it. It's a super useful tool in its element, but shouldn't be used outside of freeways or very simple roads at reasonable speeds. That said, it also shouldn't be fucking called "Autopilot". That's purely marketing and it's extremely dangerous, as we can see.