Living in the 21st century: drones versus autonomous cars

Sometimes the 21st century actually sounds like the 21st century of futurism and science fiction.  Sometimes it appears that way through a certain story or the development of a given trend.

And sometimes trends coincide, then produce something new.

Case in point: an Israeli research team just published a paper describing a very creative hack combining trends in drones and autonomous cars.  They used a drone-mounted projector to mess with a car’s advanced driver assistance systems (ADAS).

Let me break this down, then explore implications.  I’ll engage in some good, old-fashioned futurist speculation.  And I’d love to hear everyone’s thoughts.

First, what’s the story?

From the paper’s abstract:

We injected spoofed traffic signs into Mobileye to assess the influence of environmental changes (e.g., changes in color, shape, projection speed, diameter and ambient light) on the outcome of an attack. To conduct this experiment in a realistic scenario, we used a drone to carry a portable projector which projected the spoofed traffic sign on a driving car. Our experiments show that it is possible to fool Mobileye so that it interprets the drone carried spoofed traffic sign as a real traffic sign.

From the paper’s figure caption: “Attacking a car while driving: (a) the drone with the projector used in our experiments, (b) visualization of the threat model implementation, (c) the moment of the attack (the projected sign is boxed in blue, the attacker’s drone is boxed in purple, and the victim’s car is boxed in red).”

In other words, they used a drone to fool a car’s autonomous systems, convincing it of a false speed limit.  Not in a simulation, not in a lab, but on a street, using a real car (a Renault Captur), real software, and a real drone.

This isn’t *exactly* hacking a self-driving car with a drone.  It does involve a drone, and it is a hack, but advanced driver assistance systems aren’t full autonomous drivers.  A human controls an ADAS-equipped vehicle, relying on the software for information and, in some cases, driving assistance, which makes this a Level 0 car on the SAE’s 0–5 scale.  Read the paper for more details, such as the color of the projected light and the size and shape of the projection.  And check their short video:

(Note that three of the five authors are interns.  One’s a grad student.)

But you can see where this might be headed in the near future: drones versus autonomous cars.  As self-driving car tech gets better and more widely deployed, these drone attack tactics could keep pace, evolving new capabilities and dimensions. It’s a 21st century arms race.  For the sake of argument, let’s bank on certain present-day forces continuing to exert themselves: the ingenuity of hackers; the creativity of the emerging autonomous car world; our social love of gadgets.

Now, if this is one possible way those forces and technologies are interacting, where could it all go?  What does this metatrend portend?

Since the idea is out there now, it’s likely that people will try implementing and iterating variations of the attack.  Based on what we’ve seen of drones, imagine: human-operated versus automated attacks; multiple drones, building up to swarms; multiple attack vectors (imagine one shining lights into the car to distract the driver, while another targets the LIDAR and a third projects images onto other surfaces, including other cars).  Autonomous cars have to deal with the classic trolley problem; how about drones convincing them that a small child is darting into the road, forcing the car to stop or swerve suddenly?  Or several children, one after the other?  And that’s all short of physical attacks, like sacrificing a drone under a tire or against a windshield, printing and pasting down AI-confusing stickers, or directly delivering an explosive charge.

Remember that drones have far greater freedom of movement than cars do.  They are also much cheaper, and getting more so.  Imagine spamming an autonomous car with dozens or hundreds of drones, each with a different attack vector and mission.  You could be driving down a highway, or being driven in your new autonomous car, when the car alongside you is engulfed in a blizzard of buzzing, flashing confusion.

Such attacks could be launched from a nearby site or directed from around the world, as the American global drone war has proven.  If it’s from nearby, a van stuffed full of equipment and personnel could follow the target at a safe distance, disgorging drones in swarms.  Or drones could be launched by hand by pedestrians walking nearby, equipped with nothing more than duffel bags or carrying cases.

If people actually deploy this kind of hack or attack significantly, or if the idea scares us enough, countermeasures are inevitable.  Based on recent history and technological practice, we could imagine watermarking traffic signs and requiring car software to acknowledge only watermarked signs.  Nassi et al. suggest using QR codes

contain[ing] a signed message to the approaching car, informing the car of the traffic sign ahead, and its type, coordinates, and digital signature for authentication during the recognition process.

The report also suggests using updated digital information, a la Waze, shared between cars.  Or perhaps we could make signs into IoT devices, networked and smart, so that each sign must complete a handshake with an approaching car before the car’s software will act on it.  Naturally attackers can try spoofing or otherwise outflanking such countermeasures, so we could expect a car-sign-drone arms race.  And the short history of IoT provides all kinds of ways this can go wrong.  Imagine a distributed denial-of-service attack mounted through smart traffic signs or, less grandiosely, car collisions piling up because a smart sign has been compromised or simply failed.
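To make the signed-sign idea a bit more concrete, here is a minimal sketch of how a car might verify a sign’s message, assuming Ed25519 keys via the Python `cryptography` library and a made-up message format (sign type plus coordinates); the paper doesn’t prescribe any particular implementation, so treat this as illustration only.

```python
# Minimal sketch of the signed-sign countermeasure: a transport authority signs
# each sign's message (e.g., embedded in a QR code), and the car verifies the
# signature before trusting the detection. Message format and field names here
# are hypothetical illustrations, not taken from the paper.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The authority's keypair; cars would ship with (or securely fetch) the public key.
authority_key = Ed25519PrivateKey.generate()
authority_pub = authority_key.public_key()

def make_signed_sign(sign_type: str, lat: float, lon: float) -> tuple[bytes, bytes]:
    """Encode a sign's type and coordinates, then sign the encoding."""
    message = f"{sign_type}|{lat:.6f}|{lon:.6f}".encode()
    return message, authority_key.sign(message)

def car_accepts(message: bytes, signature: bytes) -> bool:
    """The car's side: act only on signs whose signature verifies."""
    try:
        authority_pub.verify(signature, message)
        return True
    except InvalidSignature:
        return False

msg, sig = make_signed_sign("SPEED_LIMIT_90", 32.085300, 34.781800)
print(car_accepts(msg, sig))                        # True: genuine, signed sign
print(car_accepts(b"SPEED_LIMIT_30|0.0|0.0", sig))  # False: a projected spoof fails
```

Even then, the car still has to trust its own position fix and the authority’s key distribution, which is presumably where attackers would probe next.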

How else could cars and their passengers defend themselves?  The aforementioned Waze-like system could also carry information on the latest drone attacks, updated as each iteration occurs.  Perhaps cars could fire off drones of their own, leading to aerial combat.  There’s precedent for this in the design world: Box Clever has already proposed car-launched drones for traffic guidance, and Blade Runner 2049 portrayed aerial cars sending off little drones.  Meanwhile, some drivers, especially in America, might feel impelled to shoot down enemies with weapons of all sorts, which might make things awkward for other drivers as shots ricochet and tumbling drone parts rain down.

In a crowded urban environment such countermeasures might not work, given rapidly changing conditions and the dense congestion of many non-hostile drones, cars, and buildings.

Such congestion would also provide better cover for an attack.  In fact, a proliferation of drones could make cities much more crowded, even dangerously so.

Such an attack could easily be invisible to the human eye.  The report linked above contains one quiet, fascinating detail:

We discovered that a projection speed of 100 ms is sufficient for fooling the system. We were unable to fool the system with faster projection speeds probably due to the frame per second rate of the optical sensor of the Mobileye.

Did you catch that?  The projection needed to last only 100 milliseconds to work.  It’s just a flash, not a drive-in (drive-by!) movie.  Could attackers in a crowded area command drones to rapidly flicker images, indistinguishable to humans from the ambient lighting of ads, streetlights, and so on?  Or could such hack-flashes be embedded in otherwise benign projections, the autonomous car’s equivalent of a subliminal ad?
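For a rough sense of why a 100 ms flash suffices, here’s a back-of-the-envelope sketch; the frame rates below are my assumptions for illustration, since the paper only infers that faster flashes failed because of the sensor’s frame rate.

```python
# Back-of-the-envelope: how many camera frames a brief projected flash spans.
# The frame rates are assumed for illustration; the paper does not state the
# Mobileye sensor's actual rate.
def frames_spanned(flash_ms: float, sensor_fps: float) -> float:
    """Approximate number of frames during which the projection is visible."""
    return flash_ms / 1000.0 * sensor_fps

for fps in (15, 30, 60):
    print(f"At {fps} fps, a 100 ms flash spans ~{frames_spanned(100, fps):.1f} frames, "
          f"while a 50 ms flash spans ~{frames_spanned(50, fps):.1f}")
```

A flash that brief is long enough for the camera pipeline to catch it in at least a frame or two, yet short enough that a human glance could write it off as glare.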

Perhaps the threat would be enough to influence behavior.  Consider, for example, a driver receiving a message (by text, drone-projected image, or phone call) informing them that unless they take certain steps (drive to a certain location, cough up some bitcoin, host drones for another attack) they’ll be subject to a drone hack.  It would be ransomware for the Fourth Industrial Revolution.

Let’s scale up from the question of an individual car and think in terms of groups and cultures.  Imagine a malign actor herding cars along certain vectors with projector-wielding drones.  Multiple, simultaneous attacks on multiple vehicles could wreak havoc in a given area.  This could take the form of a large, sudden crash, or several different hits across an urban area.  Cars could, for example, be steered into key infrastructure points, seize up traffic chokepoints, or block emergency vehicles.  How much would it take for drone-car attacks to paralyze a city?

How would we react, once such attacks are known to occur?  We might not change our behavior, of course; as every IT professional knows, we have a long history of happily ignoring security recommendations.  We could also become irrationally afraid, shunning all self-driving cars and/or drones, or at least avoiding certain areas of town because of rumors about attacks there.  “It’s a bad area, all right.  My sister’s cousin knew someone whose car was swarmed with trolley problems!  He couldn’t get to work for hours!”  “I saw hundreds of drones coming off of that building on CNN.com!  You couldn’t catch me anywhere near it.”

If cities are easier terrain for such drone hacks, perhaps we’ll come to see open spaces as safer.  This could darken our imagination of cities and add a bit of allure to rural areas.

Business opportunities would ramify.  Armored car firms would have to prove their security against such attacks.  Autonomous car providers would face pressure to provide defenses, perhaps drawn from new third party companies.  Imagine a defensive drone deployment service which nervous car passengers could summon from a mobile device.  Data-fueled digital giants could see opportunities to deploy their strengths and expand markets.  Illicit businesses would obviously appear as well.

Imagine the regulatory challenges.  Would car-defense drones have to be licensed, or all drones carefully monitored to make sure they aren’t up to no good?  I could see local and national governments somehow barring drones from carrying projectors, or mandating geofencing or certain sonic signatures to keep them away from roads.  Does a chauffeur’s license start requiring familiarity with anti-drone skills?  If aerial overcrowding becomes a fear, cities could start charging drone owners congestion pricing to keep airspace clear.  Could police forces use drone projectors to safely stop suspects in high-speed pursuits, say by convincing cars to pull right over?  Or would some local police forces whitelist approved drones and promise to down all others?  A well-publicized drone-car attack could stampede politicians into short-sighted and ill-informed regulations, not for the first time.

The politics could become quite tricky.  Industries that depend on stable transport would lobby the heck out of government officials for protection.  There’s always the chance of politically motivated drone-car hacking, of course, with one group (a revolutionary cell, a nation-state) using projections to cause spectacular crashes or immobilizing part of a city with carefully misdirected traffic.  As Ars Technica observes, a drone projector attack doesn’t leave much of a trail, which makes it well suited to anonymous attacks.

No physical alteration of the scenery is required; this means no chain of physical evidence, and no human needs to be on the scene. It also means setup and teardown time amounts to “how fast does your drone fly?” which may even make targeted attacks possible—a drone might acquire and shadow a target car, then wait for an optimal time to spoof a sign in a place and at an angle most likely to affect the target with minimal “collateral damage” in the form of other nearby cars also reading the fake sign.

Autonomous car policies and politics could separate by ideology, party, region, or even religion.  “My opponent is weak on car defense – typical of a Democrat!”  “Drone defense is a right, not a privilege!”  “Our state doesn’t suffer from projections, unlike our southern neighbors.  Coincidence?”  “Drones swarming cars are a sign of divine displeasure.”   Anti-technology groups would see this as further evidence of society ignoring Kaczynski’s warnings, and maybe win some adherents.  Today’s critics of digital giants would scrutinize Waze-like operations most carefully.

All of this is just within a few years of today.  I haven’t started connecting this dual-trend story with other trends, but you can.  Think of developments in technology and how they could be involved: AI, 3D printing, augmented/virtual/mixed reality, for starters.  Consider non-tech trends and how they might impact and be impacted: demographics, identity politics, globalization and its opponents, to name a few.

Then imagine what happens once this wave rolls through civilization.  Say we have a running arms race between increasingly smart cars and ever more capable drones, and some of the basic political, cultural, social, and technological implications I’ve speculated upon here have come to pass.  That’s when the harder question becomes: what’s next?

(drone in case photo by Cola Richmond; drones in front of crowd by Ars Electronica; ditto drones on grass; drone on red background by Michael Coughlan; America’s Watch by Jeff Gates)
