You can confuse self-driving cars by altering street signs

It doesn't take much to throw off an autonomous car's sign recognition.


While car makers and regulators are mostly worried about the possibility of self-driving car hacks, University of Washington researchers are concerned about a more practical threat: defacing street signs. They've learned that it's relatively easy to throw off an autonomous vehicle's image recognition system by strategically placing stickers on street signs. If attackers know how a car classifies the objects it sees (for example, from target photos of signs), they can generate stickers that trick the car into believing a sign means something else entirely. For instance, the researchers' "love/hate" sticker pattern made a computer vision algorithm believe a stop sign was really a speed limit notice.
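The underlying technique is an adversarial perturbation: with access to the classifier (or a close copy), an attacker nudges the input pixels in the direction that most increases the score of a wrong class. Here is a minimal sketch of the targeted Fast Gradient Sign Method on a hypothetical two-class linear "sign classifier" — the weights, classes, and two-pixel image are stand-ins, not the researchers' actual model.

```python
import numpy as np

# Hypothetical linear "sign classifier" with two classes:
# class 0 = stop sign, class 1 = speed limit.
W = np.array([[1.0, 0.0],
              [0.0, 1.0]])
b = np.zeros(2)

def predict(x):
    logits = W @ x + b
    e = np.exp(logits - logits.max())
    return e / e.sum()          # softmax class probabilities

# A two-"pixel" stop-sign image the model classifies correctly as class 0.
x = np.array([1.0, 0.0])
assert predict(x).argmax() == 0

def fgsm_targeted(x, target, eps):
    """Targeted FGSM: step each pixel by eps along the sign of the
    cross-entropy gradient that favors the target class."""
    p = predict(x)
    onehot = np.eye(len(p))[target]
    grad = W.T @ (p - onehot)   # d(loss toward target)/dx for a linear model
    return x - eps * np.sign(grad)

x_adv = fgsm_targeted(x, target=1, eps=0.6)
print(predict(x_adv).argmax())  # -> 1: the "stop sign" now reads as "speed limit"
```

A sticker attack constrains the same idea spatially: instead of perturbing every pixel slightly, the optimization concentrates large changes into a few printable patches.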
It's easy to see the potential problems. You could make these stickers using a printer at home, so anyone from dedicated attackers to pranksters could try this. It might lead to a crash the moment someone alters the sign, but it could also produce long-term chaos -- picture your city closing a road until maintenance crews can scrape the stickers off a sign.
There are ways to fight this. The research team suggests using contextual information to verify that a sign is accurate. Why would you have a stop sign on the highway, or a high speed limit on a back road? We'd add that local governments could also install signs that use an anti-stick material, or put them out of reach. Whatever happens, something will have to change if passengers are going to trust self-driving cars' sign-reading abilities.
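The contextual check the researchers suggest can be as simple as cross-referencing a classified sign against map data for the current road. A minimal sketch, with entirely hypothetical sign labels and road categories:

```python
# Hypothetical contextual sanity check: flag classifications that
# contradict map data about the road the car is on.
EXPECTED_SIGNS = {
    "highway": {"speed_limit_65", "exit", "merge"},
    "residential": {"stop", "yield", "speed_limit_25"},
}

def plausible(sign: str, road_type: str) -> bool:
    """Return True if this sign is expected on this road type."""
    return sign in EXPECTED_SIGNS.get(road_type, set())

print(plausible("stop", "highway"))      # False: a stop sign on a highway is suspect
print(plausible("stop", "residential"))  # True
```

A real system would treat an implausible reading as a reason to fall back to other sensors or slow down, rather than simply discarding it.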
Source: arXiv.org
