The false promises of Tesla’s Full Self-Driving

It's robots in cars getting coffee (or getting confused by roundabouts) for the latest episode of Land of the Giants: The Tesla Shock Wave.

Imagine naming a feature "Full Self-Driving," and yet you can't take your attention away from the road and must be ready to take over at a moment's notice.
I remember reading a post claiming that Tesla's safety rating was granted because a bunch of their crashes were attributed to human error, since the self-driving feature would automatically disengage right before a crash it couldn't avoid.
It's ok, it's in beta, so some features may not be complete just yet. But hey, let's release it to the public anyway.
And charge a shit load for it
I feel like even with fully autonomous cars, there are going to be laws requiring that the main driver always stay alert. That would be the case unless our cars become their own independent drivers, like a cab.
Honestly, there should be laws against full self driving modes unless they can be proven to be good enough to not require driver intervention at all, and the manufacturer can be legally considered as the driver in case of an incident.
Requiring a driver to be alert and attentive to the road while not doing anything to operate the car runs contrary to human psychology. People cannot be expected to maintain focus on the road for extended periods while the car drives itself.
I don’t know exactly where the line should be drawn between basic cruise control and full self driving, but either the driver should be kept actively involved in driving or the car manufacturer should be held liable for whatever the car does.
It's just a driving assistant, like in any other car. As far as I know, Mercedes is currently the only one that has implemented autonomous driving, and even that is limited to some specific areas. But at least it's real. So much so that, legally, Mercedes (the company) is considered the driver of such cars if anything happens on the road.
Depends on your definition of autonomous driving, which mainly depends on your ODD (operational design domain), but they're not the only ones. Honda, Volvo, and GM have something, and others (e.g. BMW) have systems coming next year, but they're all going with more accurate names: CoPilot, Pilot Assist, Super Cruise, Traffic Jam Pilot. That makes it clear these are driver assists, not driver replacements.
Are there any truly autonomous machines which don't require a human to monitor?
Lots. Toasters, refrigerators, robot vacuums, thermostats, smart home lights, etc.
The reason why self-driving cars are extra tricky is both because they have a much more complex task and the negative consequences are sky high. If a robot vacuum screws up, it's not a big deal. This is why it's totally irresponsible to advertise something as having "full" autonomy when the stakes are so high.
Yeah, there's a small delivery car in my country that drives the streets fully autonomously. It's used to deliver groceries to a distribution point.
It was kind of surreal to see it drive past, since the car has a sort of cockpit, but it's too narrow to seat any human.
It's currently limited to 25 km/h, and someone supervises it remotely at all times and can intervene, just to be on the safe side. Although that rarely happens.
The main reason it can do this is that it always drives the same route.
https://press.colruytgroup.com/collectgo-tests-unmanned-vehicle-in-londerzeel#
You're absolutely right, it can be quite misleading to name a feature "Full Self-Driving" when it still requires constant attention and intervention from the driver. The expectations set by such a name may not align with the reality of the technology's current limitations.
Let's be fair. It could be called Driver Assistant Plus and you people would still be complaining because this isn't about Tesla
I complained because it absolutely sucked. Only Tesla would release this garbage in such a fraudulent manner; no other company would risk the lawsuits. Tesla's been killing people with Autopilot since 2016, and with FSD since it was released to the public. That should make you think, but that seems to be hard for some people when it comes to a Musking.
You're absolutely right, it can be quite misleading to name a feature "Full Self-Driving" when it still requires the driver's constant attention and readiness to take control. The expectation that the vehicle can handle all driving tasks autonomously is not aligned with the current reality. It's important for automakers to be transparent and accurate in their naming conventions to avoid any false expectations.
Without LIDAR, this is a fool's endeavor.
I wish this was talked about every single time the subject came up.
Responsible, technologically progressive companies have been developing excellent, safe, self-driving car technology for decades now.
Elon Musk is eviscerating the reputation of automated vehicles with his idiocy and arrogance. They don't all suck, but Tesla sure sucks.
Even with LIDAR there are just too many edge cases for me to ever trust a self-driving car built on current-day computing technology. Just a few situations I've been in that I think an FSD system would have trouble with:
Just like that cheaper non-lidar Roomba with room mapping technology, it will get lost.
I don't know why people are so quick to defend the need for LIDAR when it's clear the challenges in self-driving are not with data acquisition.
Sure, there are a few corner cases where it would perform better than visual cameras, but a new array of sensors won't solve self-driving. Likewise, the lack of LIDAR doesn't preclude self-driving; otherwise we wouldn't be able to drive either.
challenges in self driving are not with data acquisition.
What?!?! Of course it is.
We can already run all this stuff through a simulator and it works great, but only because the computer knows the exact position, orientation, and velocity of every object in the scene.
In the real world, the underlying problem is that the computer doesn't know what's around it, or what the things around it are doing or about to do.
It's 100% a data acquisition problem.
Source? I do autonomous vehicle control for a living. In environments much more complicated than a paved road with accepted set rules.
Yes, self-driving is not computationally solved at all. But the reason people defend LIDAR is that visible-light cameras are very bad at depth estimation. Even with parallax, a lot of software has a very hard time accurately calculating distance and motion.
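To make the depth-estimation point concrete, here's a minimal sketch of the pinhole stereo model (the focal length and baseline numbers are illustrative assumptions, not any vehicle's specs), showing how badly a one-pixel matching error hurts as distance grows:

```python
# Pinhole stereo model: depth Z = f * B / d, where
#   f = focal length (pixels), B = baseline between cameras (meters),
#   d = disparity (pixel shift of the same point between the two images).
# f and B below are assumed values for illustration only.

def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point computed from its stereo disparity."""
    return f_px * baseline_m / disparity_px

f_px = 1000.0     # assumed focal length in pixels
baseline = 0.3    # assumed 30 cm between the two cameras

for true_depth in (10.0, 50.0, 100.0):
    d = f_px * baseline / true_depth               # exact disparity at this depth
    z_est = stereo_depth(f_px, baseline, d - 1.0)  # same point, matched 1 px off
    print(f"{true_depth:5.0f} m -> {z_est:6.1f} m with a 1 px matching error")
```

With these numbers, a one-pixel error barely matters at 10 m but turns 100 m into 150 m, since disparity shrinks with distance; that growing sensitivity at range is the gap lidar's direct time-of-flight measurement avoids.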
Do you have lidar on your head? No, yet you're able to drive with just two cameras on your face. So no, lidar isn't required. Not that driving in a very dynamic world isn't very difficult for computers; it's not a matter of if, just a matter of time.
Would lidar allow "super-human" driving abilities, like seeing through fog and in every direction in the dark? Sure. But it's not required for the job at hand.
You have eyes that are way more amazing than any cameras that are used in self driving, with stereoscopic vision, on a movable platform, and most importantly, controlled via a biological brain with millions of years of evolution behind it.
I'm sorry, but you can't attach a couple of cameras to a processor, add some neural nets, and think it's anything close to your brain and eyes.
I remember watching a video asking whether any camera can see as well as the human eye. The conclusion was that some cameras come close, but they're very big and expensive, and the human brain filters much of what the eye takes in without you realizing it. I think it could be done with a camera or two, but I don't think we're close to that technology in the near future.
Do you have CCDs in your head? No? This argument is always so broken it's insane to see it still typed out as anything but sarcasm.
A lot of LIDAR fans here for some reason, but you're absolutely right.
There's just not much good evidence that the accurate depth perception only LIDAR provides is required for self-driving, and it also won't solve the complex navigation of real-world scenarios. A set of visible-spectrum cameras can, over time, reconstruct a 3D environment well enough for navigation, and that's quite literally what Tesla's FSD does.
I don't know why someone would still say it's not possible when we already have an example running in production.
"But Tesla FSD has a high disengagement rate" - for now, yes. But these scenarios are more often possible to be solved by high definition maps than by LIDAR. For anyone that disagrees, go to youtube, choose a recent video of Tesla's FSD and try to find a scenario where a disengagement would have been avoided by LIDAR only.
There are many parts missing for a complete autonomous driving experience. LIDAR is not one of them.
The elephant in the room is that the NHTSA still doesn't have a director, and hasn't had a long-term director since 2017.
Steven Cliff was the director for 2 months in 2022. Aside from that, this important safety organization has been... erm... on autopilot (see what I did there?) and leaderless.
How are we supposed to keep tabs on car safety if the damn agency in charge of automobile safety doesn't even have a leader?
So, when are we changing this forum's name from Technology to its actual purpose of late: "every clickbait and ragebait post about Tesla and Musk so people can circlejerk worse than Reddit"?
It's literally nothing but bullshit about Tesla and Twitter. All day long. No one cares!
I want to know about some actual tech, not the drama.
You don't care. If no one cared, there wouldn't be so many posts and extremely active discussions about them. If you want different content, post it.
Elon Musk is a scammer. He's good at that and it's the only thing he's good at
Can we go now and talk about technology?
Unfortunately, all the current tech news is either people running naked scams or people debunking them.
The tragedy of our modern era is how much money we've invested in selling people a box labeled "Newest Life Changing Gadget" that's just full of rocks.
Check out the podcast TrashFuture. They do a bit about a shitty tech enterprise every episode, sometimes twice a week. From Juicero to Neom, the list of awful tech bullshit is limitless.
The only way to fix it is to post more interesting stuff yourself. Me too, tbh.
It seems like a new anti Tesla article hits lemmy every day. It's boring at this point.
Hopefully soon after the garbage copy/paste press release "articles" about "AI", fake superconductors, and other nonsense stops being posted.
Wow. Impressive collection.
Somehow reminds me of Jehovah's Witnesses and the end of the world :-)
Lmaooo too real
TBF, we have achieved an FSD that is safer than one human this year. But we took away grandma's driver's license, so now we have to find another human who's worse than FSD.
I live in a small town with a large college. The students just came back for fall semester. I believe we have quite a few candidates for your list.
Tesla's software is not safe:
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
I wonder how much impact there might have been on code quality when Elon forced lead devs from their projects at Tesla to work on Twitter. I've never seen a situation like that turn out well for either party.
I wonder how this statistically compares to non-Tesla crashes?
Edit: quick Google/math shows an average rate of lethal automobile crashes of 12 per 100,000 drivers. Tesla has supposedly sold 4.5 million cars; 4.5 million divided by the 17 deaths from the article comes out to roughly 1 death per 265,000 Tesla drivers.
This isn't exactly apples-to-apples, and I'd love for someone to "do the math" more accurately, but it seems like Tesla is much safer than the average driver.
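Redoing that back-of-envelope math (all figures are the ones claimed above; note the units don't really match, since the general figure is a per-driver rate while the Tesla figures are multi-year totals, which is exactly why the comparison is shaky):

```python
# Back-of-envelope comparison using the figures from the comment above.
# This is a unit-mismatched sketch, not a study: 12 per 100k is a rate per
# driver, while 17 deaths over 4.5M cars pools every year of FSD/Autopilot use.
general_rate = 12 / 100_000     # claimed lethal-crash rate per driver
tesla_deaths = 17               # deaths cited by the article
tesla_cars = 4_500_000          # claimed total Teslas sold

tesla_rate = tesla_deaths / tesla_cars
print(f"general: {general_rate * 100_000:.1f} per 100k drivers")
print(f"tesla:   {tesla_rate * 100_000:.2f} per 100k cars (all years pooled)")
print(f"i.e. roughly 1 death per {tesla_cars // tesla_deaths:,} cars sold")
```

The arithmetic gives about 1 death per 265,000 cars, but as the replies point out, it ignores exposure time, weather, and the fact that Autopilot disengages in exactly the conditions where many fatal crashes happen.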
The other confounding factor is we don't know how many of these drivers were abusing autopilot by cheating the rules (it requires hands on the wheel and full attention on the road)
It is not a valid comparison. Many deaths are in bad weather or in bad roads. Tesla self driving will not even turn on in these conditions. I do not believe apples to apples data exists.
this math is too sloppy to draw any conclusions
Whatabout what your mom does, down by the docks at night?
This is the best summary I could come up with:
Back in 2016, Tesla CEO Elon Musk stunned the automotive world by announcing that, henceforth, all of his company’s vehicles would be shipped with the hardware necessary for “full self-driving.” You will be able to nap in your car while it drives you to work, he promised.
But while Musk would eventually ship an advanced driver-assist system that he called Full Self-Driving (FSD) beta, the idea that any Tesla owner could catch some z’s while their car whisks them along is, at best, laughable — and at worst, a profoundly fatal error.
Since that 2016 announcement, hundreds of fully driverless cars have rolled out in multiple US cities, and none of them bear the Tesla logo.
His supporters point to the success of Autopilot, and then FSD, as evidence that while his promises may not exactly line up with reality, he is still at the forefront of a societal shift from human-powered vehicles to ones piloted by AI.
You’ll also hear from a former Tesla employee who was fired after posting videos of FSD errors, experts who compare the company’s self-driving efforts to its competitors, and even from the competitors themselves — like Kyle Vogt, CEO of the General Motors-backed Cruise, who is unconvinced that Musk can fulfill his promises without rethinking his entire hardware strategy.
Listen to the latest episode of Land of the Giants: The Tesla Shock Wave, a co-production between The Verge and the Vox Media Podcast Network.
The original article contains 497 words, the summary contains 236 words. Saved 53%. I'm a bot and I'm open source!
I've been ranting about this since 2016.
Having consumer trust in developing AI vehicles is hard enough without this asshole's ego and lies muddying the water.
Lol, ok. Your anecdotal experience can totally be believed over all the data gathered over years. Great. Thanks.
Yeah perfect! No more debates this guy settled it
Counter-counterpoint: I've been using it since 2019. I think you're exaggerating.
The only time I trust FSD is when it's stop-and-go traffic. Beyond that I have to pay so much attention to the thing that I might as well just drive myself. The "worst thing it can do" isn't just detour; it's "smash into the thing that it thought wasn't an issue".
I only drove a Tesla for a few days in 2022, but I fully agree with you. I specifically wanted to test FSD, and I had so many incidents: it tried to move into a newly appearing turn lane even though it should have gone straight, and it slowed to 10 km/h in a tunnel with a 50 km/h limit and blind corners because of "bad vision conditions". Even the cruise control was annoying. It felt like my steering input was basically just a "suggestion" that I sometimes really had to force through against the will of the car, because otherwise bad things would have happened. Sport mode steering made that only slightly better in the dual-motor Model Y.
Overall I actually enjoyed driving the ID3 more. At least it had solid, responsive steering that felt, compared to the Tesla, like driving a sports car, and I drove the ID3 directly after the Tesla.
The only good thing about the Tesla was the acceleration.
It doesn’t read turn signals
It does in the FSD beta (somewhat). It even brakes and lets cars in if it detects that they have a signal on. It doesn't understand merges as well, but it's still better than regular Autopilot. All your other points are pretty valid. I'm constantly taking it out of AP and putting it back in during a city drive, even though I have "FSD".
The worst it will do is pick the wrong lane and detour a bit to get back on track.
https://www.cbsnews.com/amp/news/tesla-car-crash-nhtsa-school-bus/
Here is an alternative Piped link(s): https://piped.video/xvqQ4F7Yf2o?si=3tMs2_aHpnfqKCBu
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I'm open-source, check me out at GitHub.