Safe Streets Rebel's protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said...
You make it sound like it's a 50/50 split between human drivers and autonomous vehicles, which is definitely not the case.
There are way more human drivers than autonomous vehicles. So, when an autonomous vehicle runs your child or pet over or whatever, who do you blame? The company? The programmers? The DMV for even allowing them on the road in the first place?
What does an autonomous vehicle do if it gets a flat? Park in the middle of the interstate like an idiot and phone home for a mechanic, instead of pulling over?
It's not a strawman argument, it is a fact. Without the ability to audit the entire codebase of self-driving cars, there's no way to know whether a manufacturer has knowingly hidden something in the code that caused accidents and fatalities, too numerous to recount but too important to ignore, linked to a fault in the self-driving technology.
I was actually trying to find an article I'd read about Tesla's self-driving software reverting to manual control moments before impact, but I was literally flooded by fatality reports.
Strawman arguments can be factual. The entire point is that you're responding to something that wasn't the argument. You're putting words in their mouth to defeat them instead of addressing their words at face value. It is the definition of a strawman argument.
We can't audit the code for humans, but we still let them drive.
If the accident rate for computer drivers is lower than for humans, and the computer designers are forced to be as financially liable for car crashes as humans are, why shouldn't we let computers drive?
I'm not fully in either camp in this debate, but fwiw, the humans we let drive generally suffer consequences if there is an accident due to their own negligence
And I'm not denying it. However, it takes a very high bar to get someone convicted of vehicular manslaughter and that usually requires evidence that the driver was grossly negligent.
If you can show that a computer can drive as well as a sober human, where is the gross negligence?
Because there's no valid excuse to prevent us from auditing their software, and it could save lives. Why the hell should we allow them to use the road if they won't even let us inspect the engine?
A car isn't a human. It's a machine, and it can and should be inspected. Anything less than that is pure recklessness.
Why the hell should we allow them to use the road if they won't even let us inspect the engine?
How do you think a car gets approved right now? Do we take it apart? Do we ask for the design calculations of how they designed each piece?
That isn't what happens. There is no "audit" of parts or the whole. Instead, there is a series of tests to determine road worthiness that everything in a car has to pass. We've already accepted a black box for the electronics of a car. You don't need to get approval of your code to show that pressing the brake pedal causes the brake lights to turn on; they just test it to make sure that it works.
We already don't audit the code for life-critical software. It is all liability taken on by the manufacturers and verified via government testing of the finished product. What is an audit going to accomplish when we don't do it anywhere else?
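For what it's worth, here's a minimal sketch of that black-box idea, with a made-up `VehicleUnderTest` interface standing in for the real system:

```python
# Minimal sketch of black-box acceptance testing, as described above:
# assert on observable behavior, never on the implementation.
# `VehicleUnderTest` and its methods are hypothetical stand-ins.

class VehicleUnderTest:
    """Stand-in for the real system; internals are opaque to the tester."""
    def __init__(self):
        self._brake_pressed = False

    def press_brake_pedal(self):
        self._brake_pressed = True

    def brake_lights_on(self) -> bool:
        return self._brake_pressed


def test_brake_pedal_activates_brake_lights():
    car = VehicleUnderTest()
    car.press_brake_pedal()
    # No code audit involved: we only check that the output is right.
    assert car.brake_lights_on()


if __name__ == "__main__":
    test_brake_pedal_activates_brake_lights()
    print("road-worthiness check passed")
```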
It is most definitely a strawman to frame my comment as considering the companies "infinitely altruistic", no matter what lies behind the strawman. It doesn't refute my statistics but rather tries to make me look like I'm making an extremely silly argument I'm not making, which is the definition of a strawman argument.
The data you cited comes straight from manufacturers, who've repeatedly been shown to lie and cherry-pick their data to intentionally mislead people about driverless car safety.
So no it's not a straw man argument at all to claim that you're putting inordinate faith in manufacturers, because that's exactly what you did. It's actually incredible to me how many of you are so irresponsible that you're not even willing to do basic cross-checking against an industry that is known for blatantly lying about safety issues.
It may be the case that every line of code of all self-driving vehicles is not available for a public audit. But neither is the instruction set of any human driver on the road today.
I would hope that through protesting and new legislation we will see the industry become safer over time, which is something we will simply never be able to achieve with human drivers.
What do you mean, I'm sure the industry whose standard practices include having the self-driving function turn itself off nanoseconds before a crash to avoid liability is totally motivated to spend the time and money it would take to fix the problem. After all, we live in a time of such advanced AI that all the news sites and magazines tell me we're on the verge of the Singularity, and they've never misled me before.
I feel like I'm taking crazy pills because no one seems to know or give a shit that Tesla was caught red-handed doing this. They effectively murdered those drivers.
You don't need to put faith into companies beyond the faith that is put into humans. Make companies just as financially liable as humans are, and you'll still see a decrease in accidents.
How is that different from the current system of large vehicular insurance companies spending a fraction of their wealth to make their lawsuits disappear?
Ok, but in the context of letting computers drive, I feel like people want to enforce a perfect system of liability on automated systems, when the existing criminal and civil legal system holds humans to nowhere near the same standard.
Why are we willing to say it is unacceptable for a computer to ever kill someone on the road, when almost 43,000 people die in the USA every year due to humans driving?
This part is bogus to me as well. My friend who used to work in self-driving said that once self-driving can be "just" better than human driving, the technology has won. In statistical terms, that means slightly fewer fatalities than humans (under ~43k per year, adjusted for the number of drivers).
Now it's up for debate how much lower exactly: just a 5% reduction, or a 50% reduction. If we insist on a 99% reduction, we should stop building self-driving tech altogether.
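Roughly, the arithmetic behind that threshold (using the ~43k figure from this thread and a ballpark ~3.2 trillion US vehicle-miles per year, which is my assumption, not from the article):

```python
# Ballpark math for the "just better than humans" bar.
# 43,000 deaths/year and 3.2e12 vehicle-miles/year are rough US-level
# estimates, used only to illustrate the thresholds.

human_deaths_per_year = 43_000
human_miles_per_year = 3.2e12

human_rate = human_deaths_per_year / (human_miles_per_year / 1e8)
print(f"human baseline: ~{human_rate:.2f} deaths per 100M vehicle-miles")

for cut in (0.05, 0.50, 0.99):
    target = human_rate * (1 - cut)
    print(f"{cut:.0%} reduction -> {target:.3f} deaths per 100M vehicle-miles")
```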
Uh, because software can be fixed and those deaths can be prevented? How the hell can you ask this question seriously? I can't believe how many people are willing to blatantly shill for these companies, even if it gets people fucking killed.
And no, you can't claim to be saving lives, because these driverless cars very often kill people in situations that a human driver would easily navigate.
We're talking about autonomous vehicles here, no driver, company owned.
So is Alphabet responsible?
Do your homework, these vehicles are owned by the parent company of Google and Apple, Alphabet. These vehicles have no private owner. So again, who TF is responsible?
That's not a good example. Courts move slowly, that only just happened, and AFAIK it is still being investigated (plus, from searching, the participants signed waivers, though waivers don't give immunity from legal negligence).
There are plenty of examples of companies being punished for negligence. It happens all the time: when their poorly constructed building collapses, when cutting corners causes an oil spill in the Gulf of Mexico, when they falsify their vehicle emissions reports, or when they abuse their market dominance.
Corporations totally do get away with a lot, but I don't see why you'd expect self driving cars to be a place where that would happen, especially since manually driven cars are already so regulated and require insurance. And everyone knows that driving is dangerous. Nobody is under any false impressions that self driving cars don't have at least some of that same danger. I mean, even if the AI was utterly perfect, they'd still need insurance to protect against cases that aren't the AI's fault.
I'll take your word on that. I've edited my comment to reflect that, but last research I did a few years ago, both companies were under the umbrella of Alphabet.
AI-driven cars are just as prone to mechanical issues as well. Is the AI smart enough to deal with a flat tire? Will it pull over to the side of the road before phoning in for a mechanic, or will it just ignorantly hard-stop right in the middle of the interstate?
What does AI do when there's a police officer directing traffic around an accident or through a faulty red-light intersection? I've literally seen videos of that before; the AI couldn't give two shits about a cop's orders as to which way to drive the vehicle.
Are there actual datasets to look at, and info regarding how the data was collected? All the sources on that page are just domain links and don't appear to point to the data behind the claims.
4.7 accidents per million miles doesn't mean much if the cars are limited to specific roads, or if the figure includes test tracks that give them an advantage. The degree of variance across different environments would also need to be measured, such as weather effects, road conditions and traffic patterns.
I'm all for autonomous driving, but it's not like companies don't fudge numbers all the time for their benefit.
I once had a crazy accident driving only 15-20 MPH or so down a side road, when about 20 feet ahead some idiot backed out of his parking spot right in front of me.
Broad daylight, overcast skies, no other vehicles blocking his view even. Dude just backed up without looking, like a freaking idiot.
I responded in a split second. I did not hit the brakes, as I knew I didn't have enough time or distance to stop. If I had hit the brakes, his car would have had more time to back out further and I would have smacked straight on into the passenger side of his car.
Instead of hitting the brakes, I quickly jerked the steering wheel hard and fast to the left. See, I knew an impact was inevitable at that point, so I made that move to clip his bumper instead of smacking into the passenger side and ruining both vehicles.
These systems tend to work on basic sensors and simplified logic. They don't tend to consider forward momentum and a vehicle pulling out perpendicular to your path, as the sketch below illustrates.
I believe half the programmers of autonomous vehicles have never even driven a vehicle in their lives.
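As a toy illustration of the "simplified logic" point above (my own sketch, not anything from a real vehicle), a naive forward-axis time-to-collision check barely registers a car entering your lane from the side:

```python
# Toy forward-axis time-to-collision (TTC) check; a hypothetical example
# of simplified logic, not any manufacturer's actual implementation.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact, assuming straight-line motion dead ahead."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing; no threat registered
    return gap_m / closing_speed_mps

# A car backing out perpendicular to your path has near-zero closing speed
# along the forward axis until it is already in your lane:
print(time_to_collision(gap_m=6.0, closing_speed_mps=0.1))  # -> 60.0 s, "safe"
print(time_to_collision(gap_m=6.0, closing_speed_mps=8.0))  # -> 0.75 s, too late
```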
DARPA figures out how to safely drive cars using LIDAR. Musk asks for a self-driving car. Engineers come back with the LIDAR solution. Musk fires them, says if humans can drive with two eyes, then so can computers; cameras are cheaper than LIDAR. A second group tries it with cameras, can't get it to work, asks why they can't use LIDAR. The second group of engineers is fired. A third group comes up with something that 'kind of works'. People die. Big companies avoid self-driving altogether, even though we have a perfect solution with LIDAR, all because Musk wanted to save a buck and can't get out of the way of his engineers.
I've worked on serious projects involving LiDAR. The LiDAR you need at these speeds and at this resolution costs almost as much as an electric car; it's too expensive to reach wide adoption. But video processing with CNNs/RNNs has proven you can build the same level of data with cameras. You don't even need binocular cameras now: if objects are moving, you can generate binocular data by combining IMU data with time-series imagery.
As I understand it, Tesla’s delays aren’t related to image capture (which is where LiDAR could help). They’re related to trying to find universal actions to take against an almost infinite number of possible scenarios (mostly actions by human drivers).
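For the curious, the "binocular data from motion" idea reduces to ordinary triangulation once the IMU gives you the translation between frames. A toy sketch with made-up camera numbers (a real pipeline adds feature tracking, IMU integration, and learned refinement):

```python
import numpy as np

# Toy "temporal stereo": with camera translation known from the IMU, two
# frames taken at different times act like a stereo pair, so depth falls
# out of the standard triangulation formula. All numbers are invented.

focal_px = 1400.0   # focal length in pixels (hypothetical camera)
baseline_m = 0.9    # lateral translation between frames, from IMU integration

def depth_from_motion(disparity_px: np.ndarray) -> np.ndarray:
    """Depth of static points via triangulation: Z = f * b / d."""
    return focal_px * baseline_m / disparity_px

# Pixel shifts of a few tracked features between the two frames:
disparities = np.array([90.0, 45.0, 15.0])
print(depth_from_motion(disparities))  # -> [14. 28. 84.] metres
```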
The really funny thing here is how the USA has the most lax driving-test standards in the developed world, resulting in crazy amounts of road traffic accidents and really high mortality rates, but instead of dealing with shitty driving at the source, there's a billion-dollar industry in autonomous driving.
The testing doesn't do anything, and there is no enforcement. Hell, you're legally allowed to push every car out of your way on highways as long as you're willing to drive faster than the fastest vehicle.
When a for-profit company is deciding how much time/energy/funds it wants to invest in pedestrian safety, you get LOUD and you stay that way forever.
Your comment is blind to the reality we live in and to the broken, out-of-touch people deciding whether human lives are a business's priority, and at what percentages, as these types of vehicles scale.
When humans get in an accident, there were choices and mistakes made, but there are things we can understand in certain situations, and we often find closure. When elon's failed experiment decapitates your grandmother by driving her under a semi and shearing the top off the car, you'll probably never settle with that image as long as you live, and you'll see elon in the news each day being a tool and never facing justice for that moment.
There's a distinction with a difference in this conversation.
Imagine your dog gets run over; you rush them to the vet, but ultimately they die and you're thousands out of pocket. You call the corporate helpdesk to log a claim because there isn't anyone else to contact, and they offer you $300 in credit for immediate resolution, or you can dispute it. You become upset because your dog was more than a credit refund; the call centre drone says that you've become aggressive, tells you to call back during business hours, and hangs up.
Oh ok, I didn't realize a person's life was worth less if they're killed by the mistake of another person instead of the mistake of a computer. Since it'll be easier for their loved ones to blame a person and just get over it, then that's better. Thanks for explaining that!
Did you read the article? The protests are in favour of affordable public transit, instead of using 'surveillance pods' as a way to build even MORE roads. The accidents are probably the least of their concerns, although still on the list
90 accidents a year is a LOT, if you stop to think that there are only a few dozen of them out there, versus more than a hundred million human drivers.
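Back-of-the-envelope version of that point (the fleet size is a guess from "a few dozen", not a reported figure):

```python
# Per-vehicle accident rate implied by the numbers above.
# A fleet of ~36 AVs is an assumption standing in for "a few dozen".
av_accidents, av_fleet = 90, 36
per_av = av_accidents / av_fleet
print(f"~{per_av:.1f} accidents per AV per year")

# If 100M+ human drivers had the same per-driver rate:
human_drivers = 100_000_000
print(f"equivalent: {per_av * human_drivers:,.0f} accidents per year")
```

Of course, this still ignores how many miles each vehicle actually drives, which is the exposure point raised further down the thread.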
They stop for no reason, cause gridlocks that require a human to come out and pilot them, they've run over fire hoses in active use, and they don't always get out of the way for emergency service vehicles. Nice statistic, though.
If sarcasm could make the cars drive better I'd send you right out, but maybe you should leave the issue to people who at least understand the actual problem.
I'm literally telling you what they say in the articles about why they're doing this, and all you guys wanna do is joke and pretend there's no issues with an unproven technology because you saw some statistics about it. So compared to the other commenters in this chain, I'm Secretary Buttigieg.
Articles are not as factual as you seem to be making out. Every stupid thing you've mentioned has been done by humans a hundred times over. The difference is we can fix the issue in self-driving vehicles, while humans will continue to make the same mistakes.
I'm not denying they have problems. I'm just saying that it's early tech and it's already better than humans in a lot of ways. People are working day in and day out fixing and improving the tech. I'm not responsible for fixing it.
You're responsible for what you say and advocate for, especially in a democracy.
Advocating for throwing unproven and unverified technology onto the road, because you like tech and blindly trust data from a bunch of greedy corporations, absolutely makes you partially responsible for every death that occurs due to this technology.
I've watched 5+ hours of unedited footage of Tesla's self-driving mode driving in all kinds of places. It's really amazing what it can do. It's getting better at such a rapid rate that I have no doubt it will surpass the top 1% of human drivers. It's already surpassed the average driver.
Me talking about this does not make me responsible for every death that occurs from the technology. The only people responsible are the lawmakers who allow it on the road. It seems to me that a lot of journalists are exaggerating the dangers because they don't like Musk. This in turn convinces a lot of people that these cars are on the loose causing huge numbers of accidents. I am happy for AI to cause some road accidents while the technology grows. It's not like the roads have zero deaths at the moment.
Comparing these two requires the number of cars with human drivers and the amount of time humans spend driving per year, versus the number of autonomous vehicles and the amount of time they spend driving per year. I am not saying that you are wrong; I am just saying that comparing these numbers directly is like comparing apples with oranges.
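Something like this is the normalization being asked for; every input below is a placeholder, purely to show the shape of the comparison:

```python
# Exposure-adjusted comparison: incidents per million vehicle-miles,
# rather than raw counts. All figures below are placeholders.

def rate_per_million_miles(incidents: int, miles_driven: float) -> float:
    """Incidents per million vehicle-miles driven."""
    return incidents / (miles_driven / 1e6)

# Hypothetical exposure numbers, for illustration only:
human_rate = rate_per_million_miles(incidents=6_000_000, miles_driven=3.2e12)
av_rate = rate_per_million_miles(incidents=90, miles_driven=4.0e6)

print(f"humans: {human_rate:.2f} per million miles")
print(f"AVs:    {av_rate:.2f} per million miles")
```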
I agree completely. My original post was just a stupid meme. I don't really think putting cones on the hoods of the cars is helping, and it's kind of dumb to do that and act smug about it. I'd rather people were suing or something. I'm sure there is precedent for stopping manufacturers from making their vehicles more dangerous just to save a small percentage of money. I guess we do live in a capitalist utopia, though, so maybe I'm wrong, but it seems like court might be more effective than trying to make these cars even more dangerous by adding a cone to the hood.