Cruise robotaxi collides with fire truck in San Francisco, leaving one injured::A crash between a Cruise robotaxi and a San Francisco Fire Department truck occurred last night in the Tenderloin. The incident happened a week after the California Public Utilities Commission (CPUC) approved 24/7 autonomous taxi ride services.
I’ve spent my entire career working in industrial automation, and I see the value AI and automation bring to the world.
I do not see the value in allowing private companies to playtest autonomous driving with human lives as potential collateral.
The argument keeps getting made — “how many humans make that same mistake daily?” — but it’s not equivalent; if autonomous vehicles cannot reach 100% safety and accuracy, they should not be allowed to risk human lives.
The difference is that autonomous vehicles could reach 100% safety by removing all non-autonomous vehicles from the road and imposing a communication standard between vehicles, so every vehicle knows what the others are doing at all times.
That only applies to regions of the world with no snow, because autonomous driving in a snowstorm will probably never be solved.
The "autonomous vehicle" in the article can't handle basic shit like an emergency vehicle approaching. I've spent enough years in automotive engineering to know that all of this autonomous driving bullshit is ADAS with a few gimmicks and shouldn't be anywhere near full control of the car. But this got out of hand, and somehow this shit is on public roads.
You're arguing that even if autonomous vehicles are safer drivers than humans, we should choose to make ourselves less safe by disallowing them? Fuck that. Nobody should have to die because AI makes you squeamish.
Unnecessarily hostile comment; too bad that attitude didn’t stay on Reddit.
AI doesn’t make me squeamish at all. Ignoring the context in which I stated my background in automation was a choice, but the rub is that they are using the general public to beta test hazardous equipment. Humans make errors and can be held responsible; corporations putting people at risk with no accountability is reckless.
A passenger riding inside the Cruise self-driving vehicle suffered “non-severe injuries” and was transported in an ambulance, according to an official company post on X (formerly Twitter) this morning.
“We are investigating to better understand our AV’s performance, and will be in touch with the City of San Francisco about the event,” Cruise’s post reads.
The incident comes less than a week after the California Public Utilities Commission voted to allow paid 24/7 robotaxi services in San Francisco, handing companies like Cruise and Alphabet-owned Waymo a huge victory.
City officials and residents have pleaded with the state to slow down the efforts, citing incidents in which self-driving cars have interfered with emergency vehicles.
Since Cruise began testing in San Francisco, its vehicles have obstructed traffic on multiple occasions, including a situation where 10 autonomous vehicles halted traffic in a busy intersection during a music festival.
And a cement mason’s worst nightmare occurred on Tuesday when a Cruise vehicle reportedly got stuck in wet concrete.
Allowing autonomous vehicles to operate in high-risk situations should come with stipulations for major fines and restitution for anyone hurt by a misbehaving vehicle. It's not the fire truck's responsibility to give way; Waymo and Cruise had better figure it out. They have smart people on their teams.