The evidence from robotaxis in San Francisco seems to show self-driving cars are already safer than human-driven ones. They'll keep getting better at driving; people won't.
Humans are taught to do their best to obey the law, even and especially when there are flashing lights around.
No, that doesn't mean humans are perfect, but at least they're taught what to do around emergency vehicles. Throw some flashing lights and shiny reflectors around an autonomous vehicle, though, and it spazzes out and does something stupid.
At least you can charge a human for that kind of negligence. But who do you hold accountable when the vehicle itself doesn't obey the rules of the road, especially when first responders are on the scene and sometimes literally trying to tell the car what to do?
If an officer is trying to direct the car around or through a detour, autonomous cars aren't trained for that, so they don't even listen. Humans have ears and brain goop to follow directions and change course as needed.
I mean, let's get real, do you trust bits of cryptic silicon over your own brain? When did humans start placing more trust in computers than their own noodles? God knows we've all seen our share of error messages and BSODs...
They are right now because the focus is on getting greenlit to grow this as a business. At some point, companies will start to chip away at whatever regulation is in place and optimize for lower-cost cars with fewer sensors, or prioritize, say, travel speed over safety, and that will likely put them right back where humans are for much of this decision-making.