As it turns out, the impact wasn't too severe. The Waymo cab, to its credit, hit the brakes immediately and avoided knocking the poor little thing over. And moments later, while the robotaxi is still in a daze, the Serve robot drives away like nothing happened.
It appears the video may have been sped up just before the impact. I would like to see the unedited clip; I bet it was way less severe than it looked.
Notice the bias they even throw in to humanize the delivery robot: "still in a daze," personifying it. No mention of who manufactured the robot that crossed against a "don't walk" sign and then stopped in the road at the end for no reason.
Aside from the obvious I, Robot movie allusion, this idea doesn't really work in the real world because robots have to be able to detect the presence and anticipate the actions of non-robots anyway. Unless you're willing to ban all the actual people from the street, which is unreasonable, robot-to-robot communication doesn't actually help you.
Little robot crossed against the signal and couldn't navigate the curb; big robot California-rolled the right turn and didn't yield to the pedestrian walkway. What a shit show. Glad it wasn't someone in a wheelchair.
I don't know about the equipment on Waymo cars, but I would be surprised if they didn't have LIDAR or some other form of distance-based environment detection.
And that should be sufficient to implement basic obstacle detection. You don't need to use machine learning if you can use sensors telling you that "something is too close".
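For what it's worth, here's a toy sketch of that "something is too close" idea. The range readings, angles, and thresholds are all made up; it's only meant to illustrate that a bare distance cutoff can trigger braking without any classification step, not how Waymo's stack actually works.

```python
# Toy sketch of distance-threshold obstacle detection (illustrative only).
# Assumes we get (angle_deg, distance_m) range readings from a LIDAR-like
# sensor each tick, with angle 0 meaning straight ahead.

STOP_DISTANCE_M = 2.0   # hypothetical hard-brake threshold
SLOW_DISTANCE_M = 6.0   # hypothetical "start slowing" threshold
FORWARD_CONE_DEG = 30   # only consider obstacles roughly ahead of the car

def brake_command(readings):
    """Return 'stop', 'slow', or 'cruise' from raw range readings."""
    ahead = [d for a, d in readings if abs(a) <= FORWARD_CONE_DEG]
    if not ahead:
        return "cruise"
    nearest = min(ahead)
    if nearest <= STOP_DISTANCE_M:
        return "stop"   # something is too close, regardless of what it is
    if nearest <= SLOW_DISTANCE_M:
        return "slow"
    return "cruise"

# Example: a small object 1.5 m ahead triggers a stop with no classifier.
print(brake_command([(-5, 1.5), (40, 10.0)]))  # -> "stop"
```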
The car collided after hitting the brakes; it seems there wasn't any real damage.
It seems the system is designed to only lessen the impact when it detects the obstacle as non-human.
If it had recognized the robot as human, it would probably have acted differently.
Better to hit the object and lessen the impact than to fully brake/avoid and risk worse.
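To make that speculation concrete, a toy version of such a policy might look like the sketch below. The class labels, distances, and action names are all hypothetical; this is a guess at what that trade-off could look like, not Waymo's actual logic.

```python
# Sketch of the decision policy speculated about above: react differently
# depending on whether the obstacle is classified as human. Illustrative only.

def mitigation_action(obstacle_class, distance_m, stopping_distance_m):
    """Pick a response given a (hypothetical) classifier output and ranges."""
    if obstacle_class == "human":
        return "full_emergency_brake"      # always brake hard for people
    if distance_m < stopping_distance_m:
        return "brake_to_lessen_impact"    # can't stop in time, just scrub speed
    return "full_emergency_brake"          # enough room to stop, so stop

# A non-human obstacle inside the stopping distance gets impact mitigation
# rather than a futile full stop.
print(mitigation_action("robot", distance_m=3.0, stopping_distance_m=5.0))
```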
Well, they should try to avoid any object in the road, to be honest. Imagine a new toy comes out that a child rides on. "Sorry we killed that child, we didn't train it on that new toy."
The Waymo did exactly what it was supposed to do. The robot ran a red light (really a "don't walk" signal at a crosswalk) and unexpectedly stopped around a corner before exiting the road as it normally would. The Waymo spotted it, braked, and got down to about 4 mph. Then the robot drove away, and nothing was really damaged, so no one is talking about it. All of that information was in the article, but the title is ragebait meant to point anger at self-driving cars. It's easy to tell because we don't even know the name of the delivery robot or what company it was working for...
"Delivery robot causes car accident." would be a more accurate title.
But to your point, if a child were playing in the road on a crosswalk around a corner while the signal says don't cross, yes, it would be a problem. (But in the U.S. the insurance company of the human driver would claim no fault in killing that child, and the parent would be charged for damages and possibly worse for letting their kid play in traffic.) <- not what I like, but that's what happens