11/16/2022 / By Arsenio Toledo
A major manslaughter trial is about to begin over a fatal crash involving a Tesla that was operating on autopilot. The trial is expected to help answer who will be held responsible for fatalities caused by self-driving cars.
The main person on trial is Kevin George Aziz Riad, a 27-year-old limousine driver. In 2019, Riad exited a freeway in Southern California while driving his Tesla Model S, ran a red light and crashed into a Honda Civic, killing Gilberto Lopez and Maria Guadalupe Nieves-Lopez.
Investigations showed that the autopilot system in Riad’s Tesla, which can control steering, speed and braking, was engaged at the time of the crash that killed the couple. (Related: Self-driving vehicle legislation held up by the question of who to blame in a crash.)
Tesla itself does not face charges in Riad’s case. But the trial and its proceedings could shape public perception of the company and serve as a test case for whether legal standards need to catch up with how quickly the technology is advancing.
“Who’s at fault, man or machine?” asked Edward Walters, an adjunct professor at Georgetown University’s law school who specializes in the law governing self-driving vehicles. “The state will have a hard time proving the guilt of the human driver because some parts of the task are being handled by Tesla.”
Riad’s lawyer said his client should not have been charged with a crime. But prosecutors argue that Riad could have slowed the car down and applied the brakes.
In addition to the criminal trial, the family of Gilberto Lopez is suing Tesla.
“I can’t say that the driver was not at fault, but the Tesla system, autopilot and Tesla spokespeople encourage drivers to be less attentive,” said Donald Slavik, an attorney whose firm is representing Lopez’s family. He added that Tesla understands the risks of its system but consistently fails to manage them. “Tesla knows people are going to use autopilot and use it in dangerous situations.”
The trial comes as Tesla faces growing scrutiny and criticism over its autopilot feature, amid mounting claims that it makes drivers inattentive and contributes to accidents and deaths.
Last year, the National Highway Traffic Safety Administration opened an investigation into more than a dozen incidents, spanning four years, in which Tesla vehicles crashed into parked emergency vehicles. These Teslas had their driver-assist features engaged, and the crashes resulted in multiple injuries and one death.
The Department of Justice itself is investigating whether Tesla should face criminal charges over its self-driving claims.
On its website, Tesla claims that the driver-assistance systems “require active driver supervision and do not make the vehicle autonomous,” strongly suggesting that the company believes that any crashes that occur while autopilot is engaged are not the fault of the technology.
But a class action lawsuit has already been filed in a San Francisco court alleging that Tesla’s marketing of its autopilot features and its Full Self-Driving technology is deceptive, misleading drivers into thinking they can pay less attention while on the road.
Riad’s lawyers are likely to point to this class action suit as well as the Justice Department’s ongoing investigation as evidence that their client does not bear sole responsibility for the fatal 2019 crash.
“It should not be assumed that Riad was blindly relying on autopilot simply because he was driving a Tesla,” said criminal defense attorney Cody Warner, who specializes in cases involving autonomous vehicles. “But it’s hard to escape the conclusion that Tesla’s recent reputation for moving quickly and breaking things – even at the expense of public safety – has been imputed to Riad.”
More news about malfunctioning technology can be found at Glitch.news.
Watch this clip of a Tesla on autopilot going out of control and crashing into people and other cars.
This video is from the FalconsCafe channel on Brighteon.com.
More related stories:
Tesla deploys in-car cameras to spy on its own drivers.
Two killed as Tesla car with no one in driver’s seat crashes in Texas.
Video proves that Tesla autopilot can get you killed.