> “They are going to attempt to try to find who was possibly at fault and how we can better be safe, whether it’s pedestrians or whether it’s the vehicle itself.”
Autonomous vehicles should automatically be 100% liable in every accident in which a person is injured, with the compensation paid for by a levy on every vehicle.
A levy is fair because the per-vehicle risk is much more uniform than it is with a driver behind the wheel.
100% liability is fair because the software should be written so that the vehicle predicts the path of every object within its range and adjusts its speed/trajectory so that there is negligible risk of collision given the worst-case behaviour of the object. If an object doesn't show up on the car's sensors, then it's still the car's fault, as the sensors should be built so that they do pick up all objects. Otherwise there is a complete power imbalance: software cannot be injured in a collision but people can, so all people should be 100% covered to give the software an incentive to be the best it can be.
Summary: a car is governed by predictable physics, hence there is no such thing as an accident involving an autonomous vehicle, only mistakes.
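The "negligible risk given worst-case behaviour" rule above amounts to a speed cap: never go faster than you could stop within the closest distance at which something could enter your path. A minimal sketch, with braking figures I've assumed for illustration (0.2 s sensing/actuation latency, 8 m/s² hard braking), not numbers from the thread:

```python
import math

def max_safe_speed_ms(gap_m, latency_s=0.2, decel_ms2=8.0):
    """Largest speed v (m/s) such that travel during sensing latency plus
    braking travel fits inside gap_m, the worst-case distance at which an
    object could enter the car's path.
    Solves v*latency + v**2/(2*decel) <= gap for v."""
    a, t = decel_ms2, latency_s
    return -a * t + math.sqrt(a * a * t * t + 2 * a * gap_m)

# Under these assumed figures, an object that could step into the path
# 21 m ahead caps the car at roughly 60 km/h.
print(max_safe_speed_ms(21) * 3.6, "km/h")
```

The interesting design question is what counts as the worst-case gap: the same formula gives very different caps depending on whether the planner assumes a pedestrian could lunge from the kerb or only from where people are actually detected.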
Given current road and vehicle design, that's an incredibly impractical goal. There simply isn't enough separation between pedestrians and roadways; to keep injury risk negligible you'd also have to keep speed incredibly low (way below speed limits) on all but the best limited access highways.
If the vehicle can only crawl at 5 mph because you never know when a pedestrian might dive in front of you... that's just impractical for real vehicle operation.
It's one thing to hit the brakes because somebody decided to jaywalk without looking; it's another to expect that there is always enough space to stop in those situations.
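Rough stopping-distance arithmetic shows why speed is the whole question here. A sketch with assumed figures (0.2 s sensing latency, 8 m/s² hard braking; my guesses, not from the thread):

```python
def stopping_distance_m(speed_kmh, latency_s=0.2, decel_ms2=8.0):
    """Distance covered during sensing latency plus braking to a halt."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * latency_s + v ** 2 / (2 * decel_ms2)

for kmh in (8, 30, 60, 100):
    print(f"{kmh:>3} km/h needs about {stopping_distance_m(kmh):4.1f} m to stop")
```

At walking pace the car stops within a metre, but at 60 km/h it needs on the order of 20 m of guaranteed-clear path, which is why "never hit anyone" and "drive at normal speeds near pedestrians" pull in opposite directions.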
I disagree. Pedestrian density is generally low enough that the car could be doing 60 km/h and still not hit a "worst case" pedestrian.
For example, how many people do you see walking along freeways? If a person is on a freeway then cars should be slowing down for them. It's a rare enough event that cars could be programmed to avoid all collisions with minimal impact on average speeds.
If a car is around people then it should be going at a low speed. If there are enough people around that it can't get up to speed, then that car is effectively in a "shared" or "local traffic" area and should be going slowly.
I'd argue that the effect on average speeds will be similar to having low speed limits on residential streets: a slower period at the journey's beginning/end, but the majority of the journey would be at speed on sparsely populated arterial roads.
Here's another argument: intrinsically safe programming might even increase average speeds. Why do we even need traffic lights and pedestrian crossings with autonomous vehicles? If vehicles were 100% safe, we could do away with dedicated crossings, and there would be no need for vehicles to waste time at red lights; they would merely slow down on the relatively rare occasions when it is necessary.