
Note that the statistics we have to work with are relatively terrible. For example: Waymo's favorite statistic is "x miles driven," which is nearly useless, because it treats all miles equally. It fails to note that the most complex driving is often concentrated in short distances (intersections, merges, etc.), and it doesn't account for the fact that most of those miles were driven repeatedly on a very small number of roads. But it looks good in marketing copy because it's a big number.
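A toy sketch of the point, with entirely made-up per-mile risk numbers (these are illustrative assumptions, not real data): two fleets with the same total mileage can have very different expected incident counts depending on how those miles split between easy and complex driving.

```python
# Hypothetical per-mile incident rates, invented purely for illustration.
RISK_PER_MILE = {"highway": 1e-7, "urban_intersection": 5e-6}

def expected_incidents(miles_by_type):
    # Weight each mile by the riskiness of the driving it represents.
    return sum(miles * RISK_PER_MILE[kind] for kind, miles in miles_by_type.items())

# Fleet A: mostly repeated easy miles. Fleet B: same total, more complex miles.
fleet_a = {"highway": 9_900_000, "urban_intersection": 100_000}
fleet_b = {"highway": 5_000_000, "urban_intersection": 5_000_000}

print(expected_incidents(fleet_a))  # 1.49
print(expected_incidents(fleet_b))  # 25.5
```

Both fleets report "10 million miles driven," yet under these assumed rates fleet B faces roughly 17x the expected incidents, which is exactly the distinction the headline number hides.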

Additionally, the self-driving car statistics we see today also tend to ignore the presence of test drivers and how frequently they intervene. As long as a test driver can intervene, the safety record of self-driving cars is inflated by the presence of a second decision-maker in the vehicle.
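One way to see the inflation, again with invented numbers (the mileage, crash, and intervention counts below are assumptions for illustration): if each human takeover is counted as a potential incident the second decision-maker prevented, the per-mile rate looks very different from the reported one.

```python
# All figures hypothetical. A "disengagement" is a moment where the
# safety driver took over; we treat each as a potential incident.
miles = 1_000_000
reported_crashes = 1
interventions = 200

naive_rate = reported_crashes / miles
pessimistic_rate = (reported_crashes + interventions) / miles

print(naive_rate)        # 1e-06
print(pessimistic_rate)  # 0.000201
```

The true autonomous-only rate lies somewhere between the two bounds, since not every intervention would have ended in a crash, but the naive rate is clearly a floor, not an honest estimate.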

EDIT: Oh, and human driving statistics are also bad: a lot of accidents never get reported, and those that are get reported through different insurance companies. That's before we get into the fact that nobody centrally tracks "miles driven," which is why most statistics on human driving safety are more or less educated guesses.



I think it’s fine to attribute miles to autonomous vehicles which have drivers that can intervene... as long as that is how they are used outside of tests as well.

Just a guess, but I doubt having a test driver that can intervene will help safety statistics for autonomous vehicles much. I think we’ll find test drivers will usually be unable to notice and react fast enough when they are needed.


My biggest concern is that test drivers can and have done the 'hard parts' of a lot of test routes, making the overall miles driven statistic kind of useless as a representation of the driving ability of the car.

But yeah, I'd agree there are real difficulties in expecting a test driver to immediately take over in a sudden event like a bike entering the roadway.


I agree the statistics we have are pretty terrible. However, in this case, I think Waymo's statistic is actually quite useful. It's likely that Waymo's x miles driven statistic is largely driven by the fact that Waymo has tested their cars on a small number of roads, in fairly safe settings. But that paints Uber in an even worse light. Waymo is supposedly ahead of, or at least on par with, Uber in self-driving technology, and they have chosen to limit their testing to a small number of safer roads. Uber has not. That seems to underscore the fact that Uber has pushed beyond their tech's capabilities even though their competitors have determined the tech isn't there yet.

Also, if someone had posted a poll a day ago as to which company's self driving cars were likely to be the first to kill somebody, I think the vast majority of people would have predicted Uber. I don't think that's a coincidence.


Note that Waymo has been the loudest in arguing that they shouldn't be forced to have a steering wheel in their cars, and that they're also massively ramping up in Arizona because of the nonexistent regulations there. In Arizona, Waymo won't be forced to disclose statistics on disengagements, for example, which California does require them to hand over.


Waymo is already using autonomous vehicles in AZ without a safety driver behind the wheel. There are fully autonomous Waymo vehicles driving around in parts of AZ.


Which is why I am only a little bit surprised Uber beat Waymo to killing a pedestrian. Waymo is way too arrogant about its capabilities, moving way faster than is reasonable or safe, and they use misleading statistics to insinuate their system is more capable than it is.

Note that they already know their cars will still need help driving, which is why they've assembled a call center of remote drivers who can take over their cars in Arizona. Of course, those remote drivers likely can't intervene at nearly the speed of an onboard safety driver.



