Hacker News

Yes. It certainly will. (Credentials: I've worked in the industry).

There are a lot of comments below which are variations on the theme of "this is a stupid scenario; just apply the brakes". These people have a terribly naive understanding of the reality in which cars operate.

The truth is that self-driving cars are part of a dynamic, open-ended system which includes roads of arbitrary quality, and pedestrians and cyclists who will never be under automated control. Such a system is fundamentally unsafe by design. Its operation relies on probabilistic assumptions of safety rather than secondary safety systems, as you would have in a system which is designed to be safe (e.g., airplanes and airports). These probabilistic assumptions include assuming that a tire will not blow out, or that a large pothole will not exist on the other side of a hump in a country road, or that a pedestrian will not trip laterally into the path of a high-speed vehicle. When such events do occur, property damage and/or loss of life is a certainty. These are low-probability events, but ones to which we have so utterly habituated that many people -- such as many of the commenters on this article -- are no longer consciously aware that they are even possible.

Nonetheless, it certainly is possible, and happens all the time. Currently, cars kill more than a million people per year, worldwide. Many of those fatalities are due to drunkenness or lapses in human attentiveness -- problems which a competently-designed self-driving car will not suffer from -- but many are due to scenarios like those I mention above. Those will continue. Unless we radically change the entire system -- that is, change the way we build and maintain roads, and the way we integrate or segregate cars from other users of the road -- this system will remain unsafe by design. Even if the world converts entirely to self-driving cars which operate perfectly, they will still, with absolute certainty, kill hundreds of thousands of people per year.

Seriously, y'all need to stop being in denial about this.

Hundreds of thousands of fatalities per year is still a tremendous improvement over millions of fatalities per year. It's an unalloyed good and we should doubtless do it. Nonetheless, it represents a hell of a liability problem for the manufacturers. When a human being has a blowout at speed, we never question the correctness of their actions in the milliseconds immediately afterwards. Of course they don't have an opportunity to respond in an appropriate fashion. Human brains can't do that. Instead, we assign liability based on conditions before the crash. If the driver was speeding, we say that the consequences of the blowout (which can easily be fatal) are their fault. If they weren't speeding, it's the tire's fault. Liability ends somewhere between the tire and the driver. As long as the car company has not sold faulty tires, they have nothing to worry about.

When the driver is both created by the car company and has the ability to react in the milliseconds following an incident, it's a different matter. Lawsuits will be inevitable, and car companies will do everything they can to minimise their exposure.

The scenario where your self-driving car needs to choose between sending you over a cliff or forcing a schoolbus over a cliff may seem terribly contrived. But in reality, on a worldwide basis, this general class of scenario happens hundreds of times every day, creating a level of liability that manufacturers must take damn seriously. If they are able to choose between lawsuits from your family, or lawsuits from every family of every kid on that bus, they'll choose the former every time.
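To make the incentive concrete, here is a toy sketch of that expected-liability calculus. Every number in it is invented for illustration -- real actuarial models are vastly more complicated -- but the asymmetry it demonstrates is the whole point: one plaintiff versus thirty.

```python
# Toy model of a manufacturer's expected-liability comparison.
# All probabilities and dollar figures are made up for illustration.

def expected_liability(p_suit: float, claimants: int, cost_per_claim: float) -> float:
    """Expected payout: chance of being sued, times number of claimants,
    times the average cost of settling each claim."""
    return p_suit * claimants * cost_per_claim

# Option A: the car sacrifices its single occupant.
occupant_cost = expected_liability(p_suit=0.9, claimants=1, cost_per_claim=5_000_000)

# Option B: the car forces the schoolbus off the road instead.
schoolbus_cost = expected_liability(p_suit=0.9, claimants=30, cost_per_claim=5_000_000)

# A purely cost-minimising planner picks whichever option is cheaper.
choice = "occupant" if occupant_cost < schoolbus_cost else "schoolbus"
print(choice)  # prints "occupant"
```

Note that the conclusion doesn't depend on the particular numbers: as long as the per-claim figures are comparable, thirty potential plaintiffs will outweigh one.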

And this is why your car will be programmed to kill you.


