If the cars run over people at 30 km/h because they rely on cameras, and a bug hitting the camera sent the car haywire, that is not acceptable, even if the cars crash "less than humans".
Self-driving needs to be heavily regulated by law, with a mandated bare minimum of sensors, including radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can't see in snow or the dark or whatever. Anyone who owns a phone knows how fucky a camera can get under specific light exposures, etc.
No one but Tesla is doing camera-only "self driving", and they're only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his Bioshock über-capitalist dream. Who cares if a few people die in the process of developing vision-based self-driving.
What are you? Some kind of lidar shill? Camera only should obviously be the endgame goal for all robots.
Also, this article is not even about camera only.
Because that's expensive and can be done with a camera. And once you figure the camera stuff out - you gucci. Now you can do all kinds of shit without needing a lidar on every single robot.
Because that’s expensive and can be done with a camera.
Expensive, as in probably less than $600? Compared to the $35,000 cost of a Tesla?
(Comparing the price of the iPhone 12 (without lidar) to the iPhone 12 Pro (with lidar), we can guess the sensor probably costs less than $200, so three of them (for left, right, and front) would probably cost less than $600.)
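To make that back-of-envelope estimate explicit, here is a minimal sketch. The launch prices are the US list prices I recall for those models; the price gap overstates the sensor cost since the Pro adds other hardware too, so treat it as a rough upper bound, not a real bill-of-materials figure.

```python
# Back-of-envelope lidar cost estimate from the comment above.
# Prices are approximate US launch prices, used only for illustration.
IPHONE_12_PRO = 999  # launch price with lidar (USD)
IPHONE_12 = 799      # launch price without lidar (USD)

# The price gap is an upper bound on the sensor cost, since the Pro
# also adds other hardware beyond the lidar unit.
sensor_upper_bound = IPHONE_12_PRO - IPHONE_12

# Three units to cover left, right, and front.
total = 3 * sensor_upper_bound

print(f"per sensor: under ${sensor_upper_bound}, three sensors: under ${total}")
```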
Lidar can actually be very cheap and small. Unfortunately, Apple bought the only company that seems to make sensors like that (besides some other super-high-end models).
There have been a lot of promising research papers on the technology lately though, so I expect higher-resolution, cheaper lidar sensors to be available relatively soon (probably within the next couple of years).
Yeah, that's not even remotely the same type of sensor used in robotics and autonomous cars. Yes, lidar is getting cheaper, but for high-detail, long-range detection it's much more expensive than your iPhone example suggests. The iPhone "lidar" is less than useless in an automotive context.
Perhaps. Idk, maybe I'm wrong. But it sure seems like it would be so much better if we achieved the same shit with a cheaper, simpler sensor.
To get the same resolution and image quality in all lighting scenarios, cameras are actually going to be more expensive than LiDAR. Cameras suffer in low-light, low-contrast situations due to the physical limitations of bending light. More light = bigger lenses = higher cost, while LiDAR works better and costs less.
My eyes are decent, but if I had a sixth sense that gave me full accurate 3D 360 spatial awareness regardless of visibility, I would probably not turn it off just to use my eyes. I’d use both.
I wasn't attempting sarcasm, so maybe I'm a moron, idk. Fair, it's likely I'm uninformed. I just know my daddy Elon said something about how solving shit with cameras only is probably the best path and will pay off.
I've heard Elon Musk (or was it Karpathy?) talking about how camera should be sufficient for all scenarios because humans can do it on vision alone, but that's poor reasoning IMO. Cars are not humans, so there's no reason to confine them to the same limitations. If we want them to be safer and more capable than human drivers, one way to do that is by providing them with more information.
No it doesn't. Every life lost matters, and if it could be shown that Tesla skipped industry best practice that would have saved lives just so they could sell more cars, then that is on them.
A human can be held accountable for their failure; bet you a fucking emerald mine Musk won't be held accountable for these and all the other foolish self-driving fuckups.
Nothing was misguided and if anything your tone deaf attempt to double down only proves the point I'm making.
This stopped being about human deaths for you a long time ago.
Let's not even bother to ask the question of whether or not this guy could ultimately be saving lives. All that matters to you is that you have a target to take your anger out on the event that a loved one dies in an accident or something.
This stopped being about human deaths for you a long time ago.
Nope, it's about accountability. The fact that you can't see how important accountability is just says you're a Musk fanboy. If Musk would shut the fuck up and do the work, he'd be better off. Instead he's cheaping out left and right on literally life-dependent tech so Tesla's stock gets a bump. It's ridiculous, like your entire argument.
I don't give a fuck about Musk. I think his Hyperloop is beyond idiotic and nothing he makes fucking works. In fact, I never even said I necessarily think the current state of Tesla Autopilot is acceptable. All I said was that categorically rejecting autopilot (even for future generations, where the tech could be much better) for the express purpose of being able to prosecute people is beyond empty and shallow.
If you need to make up lies about me and strawman me to disagree you only prove my point. You stopped being a rational agent who weighs the good and bad of things a long time ago. You don't care about how good the autopilot is or can be. All you care about is your mental fixation against the CEO of the company in question.
Your political opinions should be based on principles, not whatever feels convenient in the moment.
This is 100% correct.
Look at the average rate of crashes per mile driven with autopilot versus a human. If the autopilot number is lower, they're doing it right and should be rewarded and NHTSA should leave them be. If the autopilot number is higher, then yes by all means bring in the regulation or whatever.
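The comparison described above can be sketched as a per-mile rate calculation. All counts and mileages below are made-up placeholder numbers, not real NHTSA or Tesla figures; the point is only the shape of the comparison.

```python
# Hypothetical sketch of the "crashes per mile" comparison.
# All numbers are invented for illustration, not real statistics.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by miles driven."""
    return crashes / miles * 1_000_000

# Placeholder figures: a small fleet on autopilot vs. all human drivers.
autopilot_rate = crashes_per_million_miles(crashes=273, miles=5_000_000_000)
human_rate = crashes_per_million_miles(crashes=6_000_000, miles=3_000_000_000_000)

if autopilot_rate < human_rate:
    print("autopilot crashes less often per mile (in this made-up data)")
else:
    print("autopilot crashes at least as often per mile (in this made-up data)")
```

The design point: raw counts alone can't be compared across fleets of different sizes; only after normalizing by miles driven do the two numbers live on the same scale.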
Humans are extremely flawed beings, and if your standard for leaving companies alone to make as much money as possible is that they're at least marginally better than extremely flawed, I don't want to live in the world you want to live in.
Having anything that can save lives over an alternative is an improvement. In general. Yes, we should be pushing for safer self driving, and regulating that. But if we can start saving lives now, then sooner is better than later.
I'm not sure if that was supposed to be in agreement or countering what I said.
Over the past few decades, some people have looked at the enormous death toll from our reliance on driving and the vast number of hours spent on our roads and said that that amount of death is unacceptable. Nothing has ever come of it, because of that same reliance on driving our society has. Human nature cannot be the thing that changes; we can't expect humans to suddenly behave differently, or to improve their ability to focus and drive safely.
But this moment in time, when the shift from human to machine drivers is happening, when we move from beings incapable of doing better on a global scale to machines precise enough to avoid the current death tolls, is the time to reduce that toll.
If we allow companies to get away with removing sensors from their cars, reducing safety just to pad their bottom line, I consider that unacceptable, even if the resulting death toll is slightly lower than that of human-driven cars when it could be greatly lower.
One company says they can build FSD with 15 sensors and sensor fusion. Another company says they can build FSD with just cameras. As I see it, the development path doesn't matter, it's the end result that matters.
It is not my place or yours or the governments to tell people how to spend their money or not.
It IS our place to ensure that companies aren't producing products that kill people.
Thus money doesn't matter here. What matters is whether or not FSD is more dangerous than a human. If it is, it should be prohibited or used only under closely monitored conditions. If it is equal to or better than a human, i.e. the same or fewer accidents/fatalities per mile driven, then Tesla should be allowed to sell it, even if it is imperfect.
In the US we have a free market. Nobody is obligated to pay for FSD or use it. People can vote with their wallet whether they think it's worth the money or not, THAT is what determines if Tesla makes more money or not. It's up to each individual customer to decide if it's worth it. That's their choice not mine or yours.
As I see it, in a free market what Tesla has to prove is that their system doesn't make things worse. If they can prove they're not making roads more dangerous, i.e. there's no need to ban it, then it's a matter between them and their customers.
Tesla's self driving appears to be less safe and causes more accidents than their competitors.
NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”
Can you link me the data showing that Tesla's competitors' self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.
I don't quite understand what they mean by this. It tracks drivers with a camera and the steering wheel sensor and literally turns itself off if you stop paying attention. What more can they do?
Separated into two sections – voluntary guidance and technical assistance to states – the new guidance focuses on SAE international levels of automation 3-5, clarifies that entities do not need to wait to test or deploy their ADS, revises design elements from the safety self-assessment, aligns federal guidance with the latest developments and terminology, and clarifies the role of federal and state governments.
The guidance reinforces the voluntary nature of the guidelines and does not come with a compliance requirement or enforcement mechanism.
(emphasis mine)
The U.S. has operated on a "states are laboratories for laws" principle since its founding. The current situation is in line with that principle.
No one else has the same capability in as wide a geographic range. Waymo, Cruise, Blue Cruise, Mercedes, etc are all geolocked to certain areas or certain stretches of road.
That's not how rates work, though. A larger sample size doesn't correlate with a higher rate of accidents, and a rate, not raw numbers, is what any such study implies. Your bullshit rationalization is funny. In fact, a larger sample size tends to give a more reliable estimate of the true rate, since there's less chance that a single error/fault makes an outsized impact on the data.
No one's talking about rates. The article itself, and all the articles linked in these comments, are talking about counts: numbers of incidents. I'm not justifying anything, because I'm not injecting my opinion here. I'm only pointing out that without context, counts don't give you enough information to draw a conclusion; that's just math. You can't even derive a rate without that context!
That's not my point though. We both know that the government agency doing this work is primarily interested in the rates, whether or not reports from the media are talking about the total numbers or not. The only reason they started the process of investigation was because of individual incidents, yes, but they're not looking for a few cases, but a pattern.
Once more, I'm literally not injecting an opinion here or arguing for or against anyone's point. All the articles here talked about counts of individual accidents with zero context about sample size, something that is absolutely crucial to establishing exactly what you're talking about: rates. You can shit all over that and then pretend you didn't, but I'm only pointing out that the math doesn't work unless that context is there.
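The counts-versus-rates point above reduces to simple arithmetic: two fleets with identical incident counts can have wildly different per-mile rates once exposure is included. The fleet sizes below are invented purely to illustrate that.

```python
# Why raw counts mislead without exposure data: two hypothetical
# fleets with the same incident count but different miles driven.
# All numbers are invented for illustration.
def per_million_miles(count: int, miles: int) -> float:
    """Incident rate normalized to incidents per million miles."""
    return count * 1_000_000 / miles

fleet_a = per_million_miles(count=100, miles=1_000_000)    # small fleet
fleet_b = per_million_miles(count=100, miles=100_000_000)  # large fleet

# Identical counts, but fleet B is 100x safer per mile driven.
print(f"fleet A: {fleet_a} per million miles, fleet B: {fleet_b} per million miles")
```

Without the miles-driven denominator, the two fleets look equally dangerous, which is exactly why counts alone can't support a conclusion.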
My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If self-driving cars kill 500 people a year but humans kill 1,000 people a year, which one is better? Logic clearly isn't your strong suit; maybe sit this one out...