By Patrick Lin
Tesla’s Autopilot had its first fatality, the company announced yesterday. Statistically, this was bound to happen. The self-driving car had broadsided a truck that its sensors didn’t detect, and the driver didn’t see it either.
Does Tesla have any responsibility for the accident, even if the driver was supposed to be watching the road at all times?
The argument for why the driver, not Tesla, was responsible is that the driver had agreed to monitor the road at all times, precisely in case of emergencies like this one that the car cannot handle. This is part of the company’s standard agreement before it allows customers to use the Autopilot feature, which has been in beta-testing mode since its introduction last October. (Beta-testing is the process of working out the last bugs in a product before its official release to the public.)
Read the full article at www.forbes.com