Driver Killed: Tesla Model S Autopilot Raises Serious Concerns

Tesla Motors, the US-based manufacturer of electric and self-driving cars, is in the news for the wrong reasons this time. One of its Model S cars was recently involved in an accident that claimed the life of the driver.

The car got into the accident while operating in its semi-autonomous Autopilot mode. The driver of the 2015 Tesla Model S was killed in the crash on May 7 in Williston, Florida.


Federal authorities in the US have opened a preliminary investigation into what is the first fatal crash of its kind involving Tesla’s self-driving technology.

Tesla Motors’ Reaction to the Tragedy

Tesla Motors said in a blog post titled “A Tragic Loss” that the crash occurred because of “extremely rare circumstances.”

It stated that the mishap involved a tractor trailer crossing the highway in front of the ill-fated Tesla Model S, which was operating in the semi-autonomous ‘Autopilot mode’ at the time.

The blog post said:

“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”

Company Downplays the Incident

The company tried to downplay the incident, noting that this was the first fatality in 130 million miles driven in vehicles with Autopilot engaged. By comparison, in the US there is on average one death every 94 million miles driven, and about one every 60 million miles worldwide.

Tesla also suggested that had the point of impact been slightly different, the car’s advanced crash safety system would likely have kicked into action and saved the driver’s life.

Tesla added about the Autopilot system:

“(Tesla) disables Autopilot by default and requires explicit acknowledgement [by the driver] that the system is new technology and still in a public beta phase before it can be enabled.”

It should be mentioned here that the Autopilot feature requires the driver to keep their hands on the steering wheel at all times and maintain control and responsibility for their vehicle, even if the vehicle is operating in a semi-autonomous capacity.

Concluding Thoughts

With the future holding more promise for self-driving transportation solutions, it’s exactly these “extremely rare circumstances” that serve as a cause for concern. Does this mean that self-driving cars aren’t safe in conditions that involve heavy rain, blinding glare, etc.?

Tesla CEO Elon Musk and the makers of other self-driving vehicles have a long way to go before the public can be convinced of the safety standards of self-driving cars. At the same time, considering the massive number of casualties in driving accidents each year, would we be willing to accept a lower number but shift the responsibility to an autonomous system?

Via Forbes

A techie, Overwatch and Street Fighter enthusiast, and Editor at ProPakistani.

  • If you watch Harry Potter while driving, this is likely to happen. As the trailer driver recounted, when he approached the Tesla after the accident he heard Harry Potter being played; whether it was on the car’s screen or on a laptop has to be investigated further.

    • The accident might have messed with the multimedia controls. Or the trailer driver is just plain lying to save his own behind.

  • So we’ve had the first ever “algorithmic failure leads to death” tragedy in Homo sapiens history as well. Way to go, Humanity.

    • Nope, not the first algorithmic failure. Haven’t you heard of autopilots failing in planes and causing them to crash?

      • Oh yeah! That didn’t come to mind. Let me rephrase then: “Algorithmic failure leads to road kill.”
