Federal safety regulators are investigating Tesla electric cars’ Autopilot and “Full Self Driving” (FSD) systems after a series of crashes left dozens of people injured or dead.
The National Highway Traffic Safety Administration (NHTSA) has an investigation team studying 26 Autopilot-related crashes since 2016 that involved at least 11 deaths. The agency is also looking into 12 accidents in which Teslas using self-driving features crashed into emergency vehicles that had responded to earlier accidents. Those emergency-vehicle crashes resulted in 17 injuries and one death.
Statements issued by the NHTSA suggest that the language used in marketing the Autopilot and Full Self Driving technology may be partly to blame. While Tesla states on its website that Autopilot and FSD do not currently allow cars to drive themselves, the misleading names of these technologies have convinced some drivers that they own fully autonomous vehicles. Drivers tend to be overconfident about the capability of these systems and so may be more likely to take their eyes off the road. The NHTSA and the automotive press have tried to get the message out that these systems don’t make vehicles autonomous. For example, Motor Trend’s July 2021 issue featured a column stating, “A public service announcement: There are no self-driving cars for sale – anywhere – today. You don’t own a self-driving car, no one you know does, and anyone who tells you differently is wrong. And dangerous.”
In reality, Autopilot and FSD are types of Advanced Driver Assistance Systems (ADAS), which allow cars to “drive themselves” for short periods under ideal conditions (such as in good weather, during the daytime and on roads with clear lane lines). However, even the best ADAS must be monitored by the driver at all times. Drivers may be able to take their hands off the wheel, but they must be ready to take control at any moment, or a car accident with serious injuries can result.
These technologies also can be used to enable other features that could be dangerous. For example, Tesla issued an over-the-air software update that implemented a “rolling stop” capability into FSD, which intentionally programmed cars to slowly roll through stop signs under some conditions. In February 2022, under pressure from the NHTSA, Tesla recalled all 53,822 cars with FSD and agreed to disable the rolling stop feature.
At Gesmonde, Pietrosimone & Sgrignari, L.L.C., our experienced Connecticut lawyers are ready to help anyone who has been injured in a self-driving vehicle accident to obtain the compensation they deserve. We know how to investigate these crashes and determine whether a self-driving feature may have been involved. To talk to a lawyer in our Hamden or East Haven office about your car accident, please call 203-745-0942 or contact us online anytime.
Gesmonde, Pietrosimone & Sgrignari, L.L.C. is located in Hamden, CT and serves clients in and around North Haven, Hamden, Waterbury, Bethany, Milford, Wallingford, Prospect, Woodbridge, Northford, Madison, Beacon Falls, Branford, Cheshire, North Branford, East Haven, Naugatuck, Meriden, Ansonia and New Haven County.
Attorney Advertising. This website is designed for general information only. The information presented at this site should not be construed to be formal legal advice nor the formation of a lawyer/client relationship.