The U.S. National Highway Traffic Safety Administration (NHTSA) is launching an investigation into Tesla's Autopilot driver-assistance system after a series of crashes.
The NHTSA, according to the Associated Press, is looking into 11 crashes that have taken place since 2018 in which a total of 17 people were injured and one was killed.
In these instances, the Tesla vehicles were in Autopilot mode, a driver-assistance mode that's meant to partly steer and navigate the car while recognizing surrounding traffic, lane lines and other potential hazards.
It's not the type of futuristic autopilot that lets drivers fall asleep at the wheel while the car takes over the driving entirely, a distinction that can be confusing given the feature's name.
Most of these Tesla crashes took place after dark, with some involving flashing lights on emergency vehicles, lit road flares or illuminated arrow boards — all things that could conceivably confuse a high-tech car's navigation systems.
"The National Highway Traffic Safety Administration is committed to ensuring the highest standards of safety on the nation’s roadways," an NHTSA spokesperson told Tom's Guide.
"In keeping with the agency’s core safety mission and to better understand the causes of certain Tesla crashes, NHTSA is opening a preliminary evaluation into Tesla Autopilot systems and the technologies and methods used to monitor, assist, and enforce the driver’s engagement with driving while Autopilot is in use."
According to the AP, there are 765,000 Teslas on U.S. roads with Autopilot capabilities activated. (Autopilot comes standard on all new Teslas, and owners of cars built after September 2014 can pay to have it activated, according to the Tesla website.) The NHTSA is investigating all Tesla Model S, Model 3, Model X and Model Y vehicles produced between 2014 and 2021.
"NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves," said the NHTSA spokesperson. "Every available vehicle requires a human driver to be in control at all times, and all State laws hold human drivers responsible for operation of their vehicles."
Autopilot should only be used under ideal conditions
At the moment, the National Transportation Safety Board (NTSB), a different government agency that investigates transportation accidents, is recommending that Tesla drivers limit Autopilot use to areas where they know it can safely be operated.
The NTSB is also calling on Tesla to create better systems to ensure drivers are paying attention. Earlier this year, Consumer Reports found that it was very easy to fool Tesla's Autopilot. Tesla has since announced that the cabin camera above the rearview mirror will be used to make sure drivers are paying attention.
In 2019, a driver in Delray Beach, Florida was killed when his Tesla Model 3, operating on Autopilot, struck a semi-truck that was crossing the road. Neither the car nor the driver braked. Another crash earlier this year, also involving a crossing semi-truck, was likewise blamed on Tesla's Autopilot.
Interagency squabbling
Earlier this year, the NTSB called out the NHTSA for lax rules regarding autonomous-driving technology, saying the NHTSA had failed to put safeguards in place or pressure automakers to ensure these systems work properly.
"We believe that the Department of Transportation (DOT) and NHTSA must act first to develop a strong safety foundation that will support the framework envisioned for automated vehicles (AVs)of the future," said the NTSB in a letter.
"The foundation should include sensible safeguards, protocols, and minimum performance standards to ensure the safety of motorists and other vulnerable road users."
The AP attempted to contact Tesla for comment, but the company disbanded its PR department late last year.
Tesla and CEO Elon Musk say that anyone enabling Autopilot must remain fully engaged, because the software is still in development and there are many real-world variables the system may not be able to fully account for.
Even so, that hasn't stopped people from posting videos online of themselves acting irresponsibly with Autopilot engaged. In one video, a TikTok star sleeps comfortably with blankets and a pillow while his Tesla drives on Autopilot.
Some of the cases the NHTSA is investigating involve Teslas crashing into emergency vehicles parked on the side of the road.
There's no word on whether the NHTSA will also look into Tesla's newer Full Self-Driving mode, which takes over even more duties from the driver, including changing lanes, making turns and parking.