It's not just Tesla: Report claims all semi-autonomous cars are easily fooled

Tesla Model S Plaid (Image credit: Tesla)

Tesla gets a lot of flak for how easy it is to trick its Autopilot driver-assistance system into thinking there’s an attentive driver when there isn’t. As deserving as that criticism is, it’s worth remembering that Tesla isn’t the only automaker with this problem.

Car and Driver put 17 different cars, all equipped with semi-autonomous driving systems, to the test to see just how easy it was to fool them. As it turns out, they can all be fooled pretty easily, which suggests the entire auto industry has some explaining to do.

It’s all very concerning, especially since Car and Driver’s first two tests were simply to unbuckle the seat belt and take their hands off the steering wheel while adaptive cruise control and lane centering were switched on.

A number of cars, including two Teslas and a Cadillac, immediately slowed to a gentle halt. Others, like a Subaru, warned the driver to cut that out. Most of the cars did absolutely nothing.

Car and Driver did find that cars would eventually slow down if the driver took their hands off the wheel for a certain amount of time. However, that time differed from car to car. In the most extreme case, a Hyundai drove for 91 seconds (covering a mile and a half at highway speeds) without hands on the wheel.

The hands-off test was repeated using 2.5-lb. ankle weights strapped to the steering wheel to try to trick the car into thinking a driver was touching the wheel. That's a trick we’ve heard about before. 

In most cases the cars were easily fooled, because they gauge driver attentiveness by measuring torque on the steering wheel, which a weight hanging from the rim can mimic. A few cars, including a BMW and a Mercedes, weren’t fooled, because their systems rely on touch sensors instead; those cars also couldn’t be tricked with tape or zip ties attached to the wheel.

GM’s Super Cruise, considered by many to be the best driver-assistance system on the market, does allow hands-free driving on pre-mapped stretches of road for extended periods of time. You can’t trick a system that’s designed to let you sit there with your hands off the wheel. Or can you?

Super Cruise uses an infrared camera pointed at the driver’s face to check whether the driver is actually looking at the road ahead. So Car and Driver adjusted its test, with the driver deliberately looking away from the road, and the car responded by coming to a stop.

However, the testers found that Super Cruise could easily be fooled with a pair of novelty glasses with eyes printed on the lenses. That’s right: the kind of thing you’d expect to see in a Simpsons gag can fool one of the most advanced driver-assistance systems around, even when there’s no one in the driver’s seat.

In fact, every car tested was able to drive around by itself without a human in the driver’s seat. Some of them required a weight on the steering wheel to keep driving, but they would otherwise continue indefinitely without an actual driver.

So what does this mean?

We’ve already seen instances in which drivers turn on a semi-autonomous system and treat it as an excuse not to pay attention to the road. People keep getting arrested for doing just this in Teslas, and some have even been caught literally sleeping in the passenger seat while the car drove around by itself.

As Car and Driver points out, drivers are willing to share tips on how to fool their car’s safety systems, a wave of irresponsibility that is beyond idiotic. The fact that these systems are so easily fooled is a problem that needs addressing, especially because, as Car and Driver’s video points out, a lot of basic precautions are missing.

Not a single one of the 17 cars tested had any sort of sensor to detect whether someone was actually in the driver’s seat or not. This is weird, considering that passenger seats have to include a weight sensor so the car knows whether to deploy an airbag in the event of a crash. 

A weight sensor in the driver's seat wouldn’t stop drivers from intentionally disregarding basic road safety, but it would at least stop them from climbing out of the seat while the car is moving.

Plus, as Jalopnik points out, the fact that some cars brake to a halt when the driver takes their hands off the wheel is dangerous in itself. Stopping in the middle of the highway is a pretty bad idea, especially if it happens because the driver fell asleep.

Jalopnik argues that teaching a system to pull over to the side of the road and then come to a halt would be a much safer option, although that may not be possible until Level 3 autonomous systems become more commonplace.

This is clearly an issue facing the entire auto industry rather than just one or two manufacturers. But the fact that everyone is doing it doesn’t excuse the likes of Tesla, which gets the most negative attention.

Just remember that the limited autonomy your car has is designed to assist the driver, not take over. You can’t buy a fully autonomous car just yet, and for your own safety and that of everyone around you, it’s essential to familiarize yourself with the limitations of your own car’s autonomous-driving capabilities.

Tom Pritchard
UK Phones Editor

Tom is Tom's Guide's UK Phones Editor, tackling the latest smartphone news and vocally expressing his opinions about upcoming features or changes. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He’s usually found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining about how terrible his Smart TV is.