Hackers Tricked Self-Driving Teslas Into Accelerating 50 MPH With a Piece of Tape

A two-inch piece of tape was all that was needed to manipulate a 2016 Model X and Model S.

Hackers have figured out a way to fool Tesla’s self-driving Autopilot system, and all it took was a piece of tape.

On Wednesday, the MIT Technology Review published a story outlining how hackers manipulated two different Teslas into accelerating by 50 mph. According to the publication, McAfee researchers Steve Povolny and Shivangee Trivedi were able to trick the cars’ Mobileye EyeQ3 camera by slightly altering a 35 mph speed-limit sign.

Povolny and Trivedi were able to manipulate a 2016 Model X (pictured above) and Model S by affixing a thin strip of black tape to the sign, extending the middle line of the number three. When the cars drove past the altered 35 mph sign with self-driving cruise control engaged, their cameras misread the number as 85, and both sped up by 50 mph.

McAfee says that it informed both Tesla and Mobileye of the findings. Tesla did not respond to the MIT Technology Review’s request for comment but did acknowledge the research and said it had no plans to fix that generation of the technology. Mobileye, meanwhile, dismissed the findings, saying a human eye could have made the same mistake. Tesla’s vehicles are now outfitted with proprietary cameras, while Mobileye has released several new versions of its cameras in the years since.

Tesla did not respond to a request for comment from Robb Report.

McAfee’s altered 35 mph sign (McAfee)

The research is just the latest experiment to expose vulnerabilities in autonomous driving systems, according to the publication. Last year, a UC Berkeley professor used stickers to trick an autonomous vehicle into mistaking a stop sign for a 45 mph sign, while hackers made a Tesla veer into the wrong lane of traffic by placing stickers on the road. These flaws aren’t limited to self-driving cars, either; a recent study showed that medical machine-learning systems are also at risk.

News of the Tesla experiment comes at something of an inflection point for makers of autonomous vehicles. These kinds of flaws raise serious questions about the safety of driverless vehicles at a time when automakers are racing to pour time and money into autonomous technology. Time will tell whether these issues get resolved before automakers’ patience and resources wear thin. For now, it’s clear that our robot driving overlords will need to keep us around for just a little bit longer.
