It’s terrifyingly easy to trick Tesla’s Autopilot


Tesla’s Autopilot driver assist system is alarmingly easy to trick, and can be made to drive with nobody in the driver’s seat, according to a report from Consumer Reports.

This comes amid renewed concern over the safety of Autopilot, following the deaths of two passengers after a Tesla crashed in Texas. Local authorities say nobody was in the driver’s seat at the time of the crash.

Consumer Reports’ researchers were able to get a 2020 Tesla Model Y to pull away from a complete stop and then travel around a closed test track for “several miles”.

The researchers managed this by hanging a weighted chain on the steering wheel to simulate the pressure of a driver’s hands, keeping the driver’s seat belt buckled, and never opening a door during the test.

One of the researchers engaged Autopilot while in the driver’s seat, then moved over to the passenger seat without the system deactivating. From there he was able to speed the car up using the dial on the Tesla’s steering wheel.

According to the report, the car never issued a warning or indicated that the driver’s seat was empty. Because the ‘driver’ had been sitting on top of the buckled seatbelt, the Tesla didn’t register that he had moved; had he been wearing the belt normally, Autopilot would have disengaged the moment he unclipped it.

Jake Fisher, Consumer Reports’ senior director of auto testing, says the testing shows Tesla is falling behind other automakers. The likes of Ford and GM have developed driver assist systems that verify the driver is actively watching the road, meaning situations like this shouldn’t happen.

“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher says. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.” 

Following the Texas crash, Elon Musk tweeted that data logs “so far show” that Autopilot wasn’t enabled at the time of the crash. What’s more, he claimed Autopilot wouldn’t have activated on that stretch of road, because it lacked painted lane lines.

Musk has also been a staunch defender of Autopilot, even in the face of criticism that the name is misleading. He has repeatedly insisted that Autopilot is a perfectly accurate name because, much like the autopilot in a commercial plane, it’s designed to lessen the driver’s workload rather than offer full automation.

But an argument can be made that ordinary people may not know that, and instead associate the term autopilot with full automation. In fact, just last year a German court ruled that Tesla’s claims about Autopilot were misleading, because its marketing material and the term ‘autopilot’ suggested fully autonomous driving. Musk has called similar criticisms “idiotic”.

Consumer Reports has made it clear that nobody should try driving a Tesla this way. The site’s testing was done on a closed track, never exceeded 30 miles per hour, and had crash teams at the ready. Attempting it on a public road would be incredibly dangerous to you, your passengers and everyone else around you.

There are no fully autonomous cars that regular people can buy. So all those smart driver assist features are just that: they’re designed to assist the driver, not give you an excuse to take your eyes off the road.

Tom Pritchard
UK Phones Editor

Tom is Tom's Guide's UK Phones Editor, tackling the latest smartphone news and vocally expressing his opinions about upcoming features or changes. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He’s usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining about how terrible his Smart TV is.