Simple trick fools Tesla’s autopilot

A CHINESE cyber security firm has demonstrated the fallibility of Tesla's Autopilot by placing three small stickers on the road.

The Tencent Keen Security Lab performed "experimental security research" on a Tesla Model S on a closed road, including testing of the lane recognition system that is a crucial component of the self-steering functionality.

Researchers found the car could be fooled by placing three small "interference stickers" on the road, something that sent the car into the wrong lane.

"Based on the research, we proved that by placing interference stickers on the road, the Autopilot system will capture this information and make an abnormal judgement, which causes the vehicle to enter into the reverse lane," said Tencent Keen in publishing its findings.

The Tesla Model S’ electric driveline and driver aids shook up the car industry.

The security firm was scathing about how easily the Tesla Autopilot could be tricked, suggesting low-level attacks would be easy to deploy.

"This kind of attack is simple to deploy, and the materials are easy to obtain."

The company suggests Tesla should do more with its software and controls to identify fake scenarios, as a human would when driving.

"In the scene we build, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident."

During the same test Tencent managed to control the Tesla using a wireless gamepad controller, something Tesla says it previously fixed with "a robust security update in 2017, followed by another comprehensive security update in 2018".

But Tesla clearly doesn't believe there is a problem with Autopilot and doesn't appear to have plans to address the potential pitfall.

"The researchers adjusted the physical environment (for example placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use," Tesla said in a statement to Tencent Keen. "This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times."

Tesla has been the most aggressive car maker in promising full self-driving functionality.

Since late 2016 Tesla has said all cars it produces are fitted with the hardware needed for full self-driving capability, raising eyebrows among rival car makers, who believe more hardware - including lidar, a laser-based sensor - is required.

In 2017 Tesla promised that one of its cars would drive itself unaided from Los Angeles to New York, a trip that has yet to take place.

Tencent Keen was the company that hacked and remotely operated a Tesla in 2016 by connecting it to a malicious Wi-Fi hotspot.

The vulnerability prompted Tesla to provide a software update that was implemented in 10 days.

Tencent Keen has also previously unearthed security concerns with some BMWs.
