
On the first self-driving car fatality

Sympathy, and cold statistics. A Kiwi buyer reacts. With special feature audio.

Sat, 02 Jul 2016

Joshua Brown, a 40-year-old from Canton, Ohio, has become the first self-driving car fatality.

On May 7, his Tesla Model S electric car, set to Autopilot mode, ploughed its windshield into the underside of a tractor-trailer that pulled out in front of it.

In a company blog post, Tesla says drivers are told to keep hands on the wheel at all times when Autopilot is engaged.

In this instance, Tesla says, "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky."

That seems a bit of an assumption. We'll likely never know whether Joshua Brown had his hands on the wheel, as advised, or did not, costing him crucial reaction time even if he saw the truck that Tesla's sensors missed. One report says he might have been watching a Harry Potter movie while his car drove itself. Tesla proponents counter that the truck may have pulled out too fast and too unexpectedly for any man or machine to take evasive action.

Sympathy - and cold statistics
Tesla's blog post offers sympathies, but opens with a statistic to back Autopilot: "This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles."
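For context, those mileage figures can be put on a common footing. Here is a rough back-of-the-envelope sketch in Python that uses only the three numbers Tesla quotes, converting miles-per-fatality into fatalities per 100 million miles:

    # Rough comparison of the figures quoted in Tesla's blog post.
    # Inputs are miles driven per fatality; output is fatalities per 100 million miles.
    miles_per_fatality = {
        "Autopilot (one known fatality to date)": 130_000_000,
        "All US vehicles": 94_000_000,
        "Worldwide": 60_000_000,
    }

    for label, miles in miles_per_fatality.items():
        rate = 100_000_000 / miles
        print(f"{label}: {rate:.2f} fatalities per 100 million miles")

On those numbers Autopilot comes out ahead (roughly 0.8 deaths per 100 million miles against 1.1 in the US and 1.7 worldwide), but a single fatality over 130 million miles is a very small sample, so the comparison carries limited statistical weight.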

All Tesla models since October 2014 have had hardware support for self-driving, which allows the cars to steer themselves, change lanes, avoid obstacles and park themselves.

It was only enabled — as a $US2500 option — after an October 2015 upgrade.

Mr Brown, an avid Tesla fan, was an early Autopilot adopter who often posted videos to YouTube. In April, just a fortnight before his death, one was even cited by Tesla founder Elon Musk as evidence the self-driving feature can help avoid accidents.

Xero and Orion Health shareholders don't have to worry about key-man risk; Tesla fans Rod Drury and Ian McCrae would actually be safer in an Autopilot model than in an average car.

The US National Highway Traffic Safety Administration (NHTSA) has opened an investigation into (as Tesla describes it) whether the Autopilot function performed as expected, or was at fault.

Even assuming Tesla is cleared, and its statistics are correct, Mr Brown's death is a psychological blow to nascent driverless car technology.

Local buyer reacts
NBR asked one of the first New Zealanders to order a Tesla, Datacom's Rob Purdy, whether the accident gave him second thoughts about his vehicle (due for delivery later this year).

"None at all. I've not ordered autopilot anyway, but my colleague at work who's ordering one with me is getting the option. No pause from him either."

He added, "I think [it] is safe to a point and they've been exceptionally clear that it's in beta [a test phase] and drivers have to stay alert."

He also notes that a recall would not be necessary if a fault were found with Autopilot: Tesla could simply turn the feature off through an over-the-air software update.
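Tesla hasn't detailed how such a switch-off would work, but as a purely hypothetical sketch, a server-controlled feature flag of the kind many connected products use might look something like this (all the names below are invented for illustration, not Tesla's actual software):

    # Hypothetical illustration only: not Tesla's real software or update mechanism.
    # The idea: gate the feature behind a flag that an over-the-air config push can clear,
    # so no physical recall is needed to disable it.

    remote_flags = {"autopilot_enabled": True}  # stand-in for a server-synced config store

    def apply_ota_update(update):
        """Apply a configuration update received over the air."""
        remote_flags.update(update)

    def engage_autopilot():
        """Refuse to engage if the manufacturer has remotely disabled the feature."""
        if not remote_flags.get("autopilot_enabled", False):
            print("Autopilot unavailable: disabled by a manufacturer update.")
            return False
        print("Autopilot engaged. Keep your hands on the wheel.")
        return True

    # If a fault were confirmed, a fleet-wide update could simply flip the flag:
    apply_ota_update({"autopilot_enabled": False})
    engage_autopilot()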

Those unpredictable humans
There have now been a number of crashes involving self-driving cars, including a collision between a Google self-driving test car and a bus in Mountain View, California (no one was hurt; like all vehicles in Google's pilot programme, it had a human driver onboard).

That February accident was billed as the first time a self-driving car had caused a crash. 

But there have been an unexpectedly high number of scrapes. The problem hasn't been the computerised cars, which meticulously follow the road rules, but the humans who don't. We drink, we get angry, we check our phones, we sneak around a corner as the light goes red simply so we can get to the supermarket 90 seconds earlier, we make an exaggerated arc into the next lane as we pass a cyclist just because we're in the mood for a little passive-aggression.

Hit a pedestrian to save the driver's life?
And then there's the emerging ethics debate.

If your self-driving car has to choose between running over two pedestrians and ploughing its sole occupant into a wall, which option does it take?

Does it put its lone passenger's life at risk to save the two pedestrians?

What if there's only one pedestrian, but their height indicates they're a young child? Does that mean their life is worth more than that of an ageing passenger in the self-driving car?

I'm not quite sure how this is handled, but it might involve the University of Auckland's philosophy department as much as the NZTA.
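Purely to illustrate why this is as much a philosophy problem as an engineering one, the dilemma can be reduced to a crude headcount rule. The sketch below is invented for this article and reflects no manufacturer's actual software:

    # Invented "trolley problem" illustration: not real vehicle software.
    # The point is that someone has to choose the rule, and that choice is ethical, not technical.

    def choose_manoeuvre(pedestrians_at_risk, occupants_at_risk):
        """Decide which group a hypothetical planner would endanger, by simple headcount."""
        if pedestrians_at_risk > occupants_at_risk:
            return "swerve into the wall, endangering the occupant(s)"
        return "brake in a straight line, endangering the pedestrian(s)"

    print(choose_manoeuvre(pedestrians_at_risk=2, occupants_at_risk=1))

A headcount ducks the harder questions above, such as whether a child's life should weigh more than an ageing passenger's, and those questions can't be settled by code alone.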

Some difficult arguments lie ahead. And with the first wave of New Zealanders due to take delivery of Tesla vehicles with an Autopilot option later this year (Tesla has taken 400,000 orders for its new, $US35,000 Model 3 worldwide) and more and more new cars featuring proximity sensors, automatic braking, self-parking and other semi-automated safety features, it's not that far ahead.
