On Wednesday of this week, an Israeli firm called Regulus Cyber issued a press release stating that "spoofing attacks on the Tesla GNSS (GPS) receiver could easily be carried out wirelessly and remotely." In the firm's demonstration attack on a Model 3, "the car reacted as if the exit was just 500 feet away — abruptly slowing down, activating the right turn signal, and making a sharp turn off the main road," Regulus said. Tesla's official response could best be described as brusque:
" These marketing claims are simply a for-profit attempt To use Tesla's name to mislead the public into thinking there is a problem that would require the purchase of this company's product.
Tesla official spokesperson
So, a company most of us have not heard of tells us that it has demonstrated disturbing vulnerabilities in a Tesla. Tesla, in effect, says the company is just looking for a buck and there's no problem, but it doesn't really provide any details. Where does the truth lie? That question necessitates a look at the merits of this specific Regulus-vs-Tesla claim — and then a broader glance into the history, technology, and possibilities of GNSS spoofing itself.
A closer look at the Regulus demo
If you read the opening paragraph of this article and thought that evil hackers took remote control of a car and made it go violently off-road, no strings attached, don't feel bad.
We'll get into some of the hairy technical details later, but GNSS spoofing is typically a broadcast attack that can be expected to affect a large area. Putting an antenna directly on the roof of the Model 3 allowed Regulus to use less power than would otherwise be required, so the company avoided accidentally impacting other, unrelated GPS devices nearby. That said, I don't mind giving them a pass on this one; presumably real bad guys would have fewer constraints and would not need to physically place an antenna and wiring on someone's car in order to attack it. The real problem is a little less obvious, and you are unlikely to spot it unless you find Regulus Cyber's actual blog post on the experiment, which is much more detailed, and conspicuously not linked directly from the press release.
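The roof-mounted antenna makes sense in radio terms: free-space path loss grows with the square of distance, so a transmitter inches from the target receiver needs vastly less power, and spills far less signal onto bystanders, than one broadcasting from down the street. A back-of-the-envelope sketch using the standard free-space path loss formula (the distances here are illustrative, not Regulus's actual setup):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB between isotropic antennas."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# GPS L1 carrier frequency
f_l1 = 1575.42e6

loss_1m = fspl_db(1, f_l1)      # antenna on the roof, ~1 m from receiver
loss_100m = fspl_db(100, f_l1)  # hypothetical remote broadcast, ~100 m away

print(f"Path loss at 1 m:   {loss_1m:.1f} dB")
print(f"Path loss at 100 m: {loss_100m:.1f} dB")
print(f"Extra power a remote attacker needs: {loss_100m - loss_1m:.1f} dB")
```

Each factor-of-ten increase in distance costs 20 dB, so spoofing from 100 m instead of 1 m takes 40 dB (10,000 times) more power — power that also blankets every GPS receiver in the neighborhood.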
This video from an earlier experiment is an excellent example of the kind of "Pied Piper" attack that Regulus successfully carried off against the Model 3. It's entirely possible — even somewhat trivial, if you don't mind becoming an instant felon — to use GNSS spoofing to convince an autonomous or semi-autonomous car that it is somewhere it isn't, and that it should turn onto the wrong road.
Think of it as handing a human driver the wrong map on a family vacation: sure, you might get lost, but the wrong map won't plow the car into a tree. Just like the human driver in our example, an autonomous or semi-autonomous automotive application only uses GPS to decide which road to take; what is or is not a road at all is decided by local sensors. In a human driver's case, "local sensors" are mostly a pair of good old-fashioned Mk I Eyeballs; in the Tesla's case, they're radar, ultrasonic sensors, and a suite of eight cameras enabling full-time 360-degree visual coverage. I reached out to spokespersons from Tesla, Uber, and Cruise, and all made similar statements. Essentially, these companies say GPS helps cars decide which road to take, but it has nothing to do with a car's decision about what is or is not a road in the first place.
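That division of labor can be sketched in a few lines. This is a purely illustrative toy, assuming a simple planner interface of my own invention — not any vendor's actual control logic:

```python
from dataclasses import dataclass

@dataclass
class Perception:
    """What the car's local sensors (cameras, radar, ultrasonics) report."""
    drivable: bool        # is the surface ahead actually a road?
    obstacle_ahead: bool  # tree, barrier, pedestrian, etc.

def plan_step(gps_says_turn: bool, local: Perception) -> str:
    """Toy planner: GPS may suggest a turn, but local sensors hold a veto."""
    if local.obstacle_ahead or not local.drivable:
        return "stay in lane"  # local perception always wins
    return "turn" if gps_says_turn else "continue"

# A spoofed GPS fix can request a wrong (but real) turn...
print(plan_step(True, Perception(drivable=True, obstacle_ahead=False)))   # prints "turn"
# ...but it cannot make the planner drive where no road exists.
print(plan_step(True, Perception(drivable=False, obstacle_ahead=False)))  # prints "stay in lane"
```

The point of the sketch: a spoofed position can steer the routing layer onto the wrong (legitimate) road, but the decision about whether the pavement ahead is drivable never consults GPS at all.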
Listing image by Regulus Cyber