

So do you expect self-driving tech to override human action, or do you expect human action to override self-driving tech?
I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited, experimental implementations that are not road-legal nationwide. I kind of expect human input to override the robot, given the fear of robots making mistakes, despite the humans behind them getting in drunk and holding down the throttle until they turn motorcyclists into red mist. But that's my assumption.
With the Boca one specifically, the guy got in his car inebriated. That was the first mistake, the one that caused a problem that should never have happened. If the car were truly self-driving and automated, with no user input, this wouldn't have happened. It wouldn't have gone nearly 2.5x the speed limit. It would have braked long before hitting someone in the road.
I have a Ninja 650. We all know the danger comes from things we cannot control, such as other people. I'd always trust a truly automated car over a human driver, even with today's limited tech. The second the user gets an input, though? Zero trust.
Pichai kissed the ring. He's colluding with the person who tried to overturn the election and install himself as a ruler.
All of the billionaires who were at the inauguration are in the same boat. I'm at the point where I believe the crimes of any one of them should be tried and convicted, with the punishments doled out collectively to all colluders, but that's me.
I'd be going for the death penalty from the prosecution side, since that seems to be what we do now to people who cause one or more deaths, no matter how unethical the victim(s) were.