A driverless car in San Francisco drove right into wet concrete and got stuck after seemingly mistaking it for a regular road: 'It ain't got a brain' / The site had been marked off with constructio...
The site had been marked off with construction cones and workers stood with flags at each end of the block, according to city officials.
Every time one of these things happens, there's always comments here about how humans do these things too. Two responses to that:
First, human drivers are actually really good at driving. Here's Cory Doctorow explaining this point:
Take the much-vaunted terribleness of human drivers, which the AV industry likes to tout. It's true that the other dumdums on the road cutting you off and changing lanes without their turn-signals are pretty bad drivers, but actual, professional drivers are amazing. The average school-bus driver clocks up 500 million miles without a fatal crash (but of course, bus drivers are part of the public transit system).
Even dopes like you and me are better than you may think – while cars do kill the shit out of Americans, it's because Americans drive so goddamned much. US traffic deaths are a mere one per 100 million miles driven, and most of those deaths are due to recklessness, not inability. Drunks, speeders, texters and sleepy drivers cause traffic fatalities – they may be skilled drivers, but they are also reckless.
There's like a few hundred robot taxis driving relatively few miles, and the problems are constant. I don't know of anyone who has plugged the numbers yet, but I suspect they look pretty bad by comparison.
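For anyone who wants to "plug the numbers," the comparison is just rate normalization: divide deaths by miles driven, on the same scale for both fleets. A minimal sketch, where the human baseline comes from the figure quoted above (~1 death per 100 million miles) and the AV fleet numbers are placeholders I made up, not real published data:

```python
# Compare fatality rates normalized to miles driven.
# Human baseline from the quote above: ~1 death per 100 million miles.
HUMAN_RATE_PER_100M_MILES = 1.0

def deaths_per_100m_miles(deaths: float, miles: float) -> float:
    """Normalize a raw fatality count to deaths per 100 million miles."""
    return deaths / miles * 100_000_000

# Hypothetical AV fleet figures -- placeholders, NOT real data.
av_deaths = 1
av_miles = 5_000_000  # e.g. a small fleet's cumulative mileage

av_rate = deaths_per_100m_miles(av_deaths, av_miles)

print(f"Humans: {HUMAN_RATE_PER_100M_MILES:.1f} deaths per 100M miles")
print(f"AVs (hypothetical): {av_rate:.1f} deaths per 100M miles")
```

The point of putting both on a per-100-million-mile basis is that a small fleet with even one fatality can look far worse than the human baseline, which is exactly why raw counts alone tell you nothing.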
Second, when self-driving cars fuck up, they become everyone else's problem. Emergency service personnel, paid for by the taxpayer, are suddenly stuck having to call corporate customer service or whatever. When a human fucks up, there's also a human on the scene to take responsibility for the situation and figure out how to remedy it (unless it's a terrible accident and they're disabled or something, but that's an edge case). When one of these robot taxis fucks up, it becomes the problem of whoever they're inconveniencing, be it construction workers, firefighters, police, whatever.
This second point is classic corporate behavior. Companies look for ways to convert their internal costs (in this case, the labor of taxi drivers) into externalities, pushing down their costs but leaving the rest of us to deal with their mess. For example, plastic packaging is much, much cheaper for companies than collecting and reusing glass bottles or whatever, but the trash now becomes everyone else's problem, and at this point, there is microplastic in literally every place on Earth.
I'm not sure your second point is as strong as you believe it to be. Do you have a specific example in mind? I think most vehicle problems that would require an emergency responder will have easy access to a tow service to deal with the car with or without a human being involved. It's not like just because a human is there that the problem is more easily solved. For minor-to-moderate accidents that just require a police report, things might get messy but that's an issue with the law, not necessarily something inherently wrong with the concept of self driving vehicles.
Also, your first point is on shaky ground, I think. I don't know why the metric is accidents with fatalities, but since that's what you used, what do you think having fewer humans involved does to the chance of killing a human?
I'm all for numbers being crunched, and to be clear (as you were, I think) the numbers are the real deciding metrics here, not thought experiments.
And I think it's 100% true that autonomous transportation doesn't have to be perfect, just better than humans. Not that you disagree with this, but it is probably what people are thinking when they say "humans do this too".
I’m not sure your second point is as strong as you believe it to be. Do you have a specific example in mind? I think most vehicle problems that would require an emergency responder will have easy access to a tow service to deal with the car with or without a human being involved. It’s not like just because a human is there that the problem is more easily solved. For minor-to-moderate accidents that just require a police report, things might get messy but that’s an issue with the law, not necessarily something inherently wrong with the concept of self driving vehicles.
The fire department in SF has made it very clear that these cars are a PITA for them. They are actively driving through emergency situations, cannot follow verbal instructions, drive over fire hoses, etc.
Also, your first point is on shaky ground, I think. I don’t know why the metric is accidents with fatalities,
Fatalities is just the number we have to compare. Self-driving car companies have been publishing a simulated fatality metric for a while now. I totally agree there are other ways to think about it. My point is that AV companies have a narrative that humans are actually bad at driving, and I think this comparison pokes a hole in that story.
but since that’s what you used, what do you think having fewer humans involved does to the chance of killing a human?
I'm not sure, actually. The vast majority of driving is solo trips, so I'd expect not that much? There are some studies suggesting that people might actually use cars more if self-driving cars become a reality.
And that really gets to the heart of my problem with the self-driving cars push. When faced with complex problems, we should not assume there is a technological solution. Instead, we should ask ourselves to envision a better world, and then decide what technologies, if any, we need to get there. If self-driving cars are actually a good solution to the problem, then by all means, let's make them happen.
But I don't think that's what's happening here, and I don't think they are. American cities are a fucking disaster of planning. They are genuinely shameful, forcing their inhabitants to rely on cars, an excessively wasteful mode of transportation, all in a climate crisis. Instead of coming together to work on this problem, we're begging our technological overlords to solve them for us, with an added drawback of privatizing our public infrastructure.
Also, almost all safety numbers for transportation are meaningless unless normalized to miles driven. They also commented that these issues are "everywhere," then went on a long diatribe against self-driving cars. I rarely see anything about them, likely because of the media I consume. They clearly have a bias, and the media they consume has likely been tailored to support it. Them seeing many articles on crashes or accidents is anecdotal at best (as is me not having seen many).
I'm not sure I see the point. AVs will end up being better than humans, if the cutting-edge tech isn't already.
There is nothing a human does that a computer can't do faster. An AV has a 360° view and can react within a millisecond, while being just as precise as if it had all the time in the world to make the maneuver.
Plus, and this is the part I truly don't understand about the quote: a computer can't get drunk, sleepy, or distracted. So by nature it fixes what the quote considers to be the biggest problem with human drivers.
There's another difference between humans and computers you forgot to mention. Once a computer "learns" something (like avoiding driving into wet concrete), it will never make that mistake again. People, on the other hand, continue making the same error over and over.
You are using an argument that is not new: pilots have used it for decades (and some still do) to complain about automation on the flight deck. Yet every day tens of thousands of airliners fly to their destination (and sometimes land there as well) with no pilot intervention. Pilots could easily be eliminated from airplanes; the reason they are still flying has more to do with PR and a public not willing to fly without a human up front. But automation has made air travel safer by an order of magnitude. It will do the same for cars.
Planes fly with significant distance between them, well above any major obstacles, along routes with very few turns. Cars on the other hand are close together, traveling along poorly marked routes that have significant amounts of turns, and need to dodge a lot of obstacles. It's quite rare for a plane to hit a cat.
That is such a massive oversimplification of how computer learning works that it's neither here nor there.
Also, automation might work in some cases and not others. Sometimes two things are genuinely similar, and sometimes they only look similar. Just because similar arguments have been made before about different things doesn't mean you get to discount them now in a different situation.
It's a software update away from getting better. Humans will forever be a risk to other humans when driving. I'm not saying it's good yet, but people in 2020 thought "driverless cars will forever be 5 years away."
Yet here we are, talking about how bad they are. That's an improvement from only limited testing a few short years ago.
No, people in 2014 kept saying driverless cars will be 5 years away. And they kept pushing "no, wait, in 5 more years". It's 2023, and they still say it's "5 more years", or they pretend that it's already here and these cars have no problems whatsoever.
Actual, real, Level 5 automated driving is not here and will take at least 20 years to get there. Probably 50 years, realistically. What Cruise is doing is the same thing Elon Musk does with Teslas: call it a "self-driving car" when it's anything but.
These cars can follow lines and pretend to drive. They can't actually drive. They can't handle any of the edge cases. Their handlers completely ignore all of the accidents and mistakes they make on a daily basis. They brush aside the fatalities, and blame it on everything else except themselves, practicing a healthy dose of whataboutism when they compare their mistakes to humans.
I suspect the guys blocking the road could have prevented it by staying there, but assumed it wasn't really their problem after all and kinda wanted to see this shit happen.
Damn. I'm imagining the absolute ass-ton of litigation for a slow-moving, self-driving car injuring a construction worker on the job who purposefully stood in the way of it to try to force it to stop.
When only the bad aspects of something are reported, the coverage is biased by default. Better to have a more balanced story, which will probably be more boring as a result. A lot of people would then lose interest because of the lack of controversy.