I have long been fascinated by why people are so disquieted by drones. While pretty much everyone alive in 2019 experiences some degree of acute tech anxiety, drones, as a category of objects, still inspire an unusual amount of disquiet – much more so than, say, an iPhone. This distrust extends both to consumer drones you can buy at the mall and to enormous militarized drones: anything with the word ‘drone’ appended to it inspires clickbaity headlines and nervous conversations at bars. Some of this, I think, can be attributed to the fact that we still lack widely agreed-upon modifiers for the word ‘drone’ in our society, which would differentiate between objects in the overall ‘drone’ category – which means that it’s easy for people to assume that a $500 multirotor you can buy at the mall and a Predator drone are actually somewhat similar objects.
But that’s not all that’s going on. It is apparent to me that there isn’t a single, easily-explained reason why people distrust and fear drones as a general category of things. There are actually many interlocking reasons, ranging from the very obvious (there isn’t a person in them and some of them can be used to blow things up) to those that are more subtle.
What I want to do with this series of blog posts is to describe some of the reasons for drone-distrust that I’ve come across or conjectured about. This is both for my own amusement, and for a more practical reason: public distrust of drones drives me and my drone sector colleagues absolutely nuts, and the first step towards figuring out ways to address the public fears is clearly describing what they are. So let’s begin with this reason: Drones are inscrutable.
By that, I mean that it is very hard for people to know how much a drone knows, or how smart it is, just by looking at it. Most people do not know very much about drones as a general category of flying objects. Certainly they haven’t delved (and have no desire to delve) into the various technical details regarding each platform’s specific capabilities. Nor is it likely that this is going to change much. Despite the lofty hopes of many consumer drone companies, we do not presently and probably never will live in a world where the average person has much use for a drone, beyond their obvious utility as a last-minute present for Father’s Day. While civil service and public utility drones are becoming much more common, they’re still nowhere near typical – nowhere near as boring and unnoticeable as a city truck with its flashers on. Nor do most people know much about, or have reason to think much about, the distinctions between a cheap consumer drone and, say, a massively expensive Reaper UAS used by the military to zap people on other continents. This understandable lack of public familiarity with drones leaves plenty of room for confusion, misapprehension, and outside influence.
The average person’s knowledge gap around drones is then reinforced by our media (writ large), which absolutely loves to portray drones – even small, cheap-looking ones – as much more intelligent and sophisticated than they actually are today. (This is also why my loved ones hate watching TV with me now). TV and movies today are absolutely lousy with small drones that can fly for days at a time, stealthily and silently track people both inside and outside of buildings, shrug off terrible weather and the occasional bullet, and make complicated strategic decisions on their own, without human input. TV and movie drones are majestically capable, terrifying devices. They make real drones, the kind I actually fly, look like total delicate dumbshits.
Which, let’s be frank, they are. Real drones are dumb and fragile as hell, especially when we compare them to their fictional brethren. In Real Life, my (nice!) $1200 quadcopter drone is not capable of: reliably navigating around trees, following me in an area that isn’t an open field, avoiding taking photos of things I don’t want it to take photos of, being controlled from more than about a mile and a half away, and reliably avoiding plummeting into a lake or something if it loses radio signal. Its battery lasts for approximately 25 minutes if I’m lucky, and it makes a sound somewhere between a dentist drill and a demented, enormous bumblebee. It cannot fly in rain or snow beyond a gentle drizzle, or navigate winds above 20 miles an hour. While it can navigate a pre-programmed flight path using GPS capability, and take photographs at pre-determined intervals (both very cool features), it’s still not capable of making its own value judgements about where to go or what to photograph.
It has no autonomy, but instead just does exactly what I tell it. Drone facial recognition software, while a frightening possibility, remains mostly theoretical and is even more error-prone than the already horrifyingly error-ridden terrestrial varieties of facial recognition tools. While there are fancier small drones on the market, they’re really not capable of doing much more than mine can. But again, the average person doesn’t know much about the actual dumbshitness of civilian drones, and doesn’t really have a good way to find out, unless they use drones themselves. Certainly drone companies aren’t eager to brag about how stupid their drones actually are, and ‘Drone Fails to Do Anything Exciting’ is not a compelling tech magazine headline.
This leaves people to draw their own conclusions about drones, and what I think they often conclude is that drones are much more intelligent and capable than they actually are. However, the actual parameters of that smartness are also confused, because these media portrayals often vary widely in what they show drones doing. Taken together, this means that people know that (1) drones are smart and capable and (2) it’s unclear exactly how smart and capable they are, or what that actually means in practice – which is scary. In other words, to most people, drones are inscrutable.
This inscrutability is compounded by a few factors. For starters, consider that a drone, visually, betrays almost no information to the average schmo on the ground about what it is up to or who is flying it. Police and fire vehicles, even news helicopters, usually have some kind of marking on them that gives us a hint about where they’re from and what they’re up to. Drones do not, and even if they did, they’re so small and so distant that it would be hard for us to see them. We still lack any system that might allow someone on the ground to identify a drone, or at least read off a drone license number: the only way to know what a drone is for is to find the pilot. There’s no clear means of determining exactly who is looking through the computerized eyes of a drone, and what they’re looking for. Or *what* is looking through the drone’s eyes, because we are also uncertain about the degree to which drones have minds of their own.
While no one relishes the idea of being spied upon by a person, being spied upon by a creepy human being with a telescope is not a particularly novel or shocking idea. A human being, even a creepy one, operates within human parameters and can presumably be subject to human reason and justice. Being spied upon by a machine, however, is much more disturbing. A good old-fashioned creeper, in the pre-Internet era, had a limited ability to disseminate imagery of you in your underpants far and wide. A machine-equipped creeper can disseminate those images to everyone on the planet with a pulse within 15 minutes, before you even know it has happened. This is terrifying. Even more terrifying, perhaps, is the idea of a machine itself making the decision to disseminate those images – or perhaps making the decision in collaboration with a creepy human being.
To be sure, this human concern over what it means to have a machine mediating our perception of the world is a relatively old one, an anxiety that ebbs and flows in our culture. Andre Bazin wrote, in 1945: “Originality in photography as distinct from originality in painting lies in the essentially objective character of photography. For the first time, between the originating object and its reproduction there intervenes only the instrumentality of a nonliving agent… All the arts are based on the presence of man, only photography derives an advantage from his absence.”
Drone photography takes this modern dynamic that Bazin describes even further, because a human being is by definition absent (or at least, is in control of the situation from very far away) when a drone snaps a picture. Perhaps even more importantly, we – living as we do in an era of AI hype – may not be entirely confident that a drone *is* a “nonliving agent.” AI is new, and so are drones, and there is plenty of kerfuffle in the media about AI being attached to drones.
We therefore shouldn’t be surprised if some people also assume, as part of their overestimation of the technology’s general abilities, that all drones are equipped with some degree of AI that is much more sophisticated (and frightening) than what is actually possible today. For all they know, a drone may very well be looking at them – and making choices about following them – on its own, without being explicitly guided in its every action by a human being.
The drone is thus turned into something less like a ‘dumb’ remote-controlled airplane and more like a trained attack dog, or even a trained spy. It may be carrying out sinister directives at the behest of someone else, but it is also capable of making its own choices in the process. It is an object that has been imbued with an unusual amount of agency: one could argue that it has become an agent in and of itself. If we look at this through the lens of actor-network theory, a drone of uncertain (and currently impossible) intelligence is an entity that exists in a weird place, somewhere between being an artifact and being a social actor in its own right. We don’t know that it isn’t exerting a certain automated capacity to choose, or to act.
It is a thing that is ontologically uncertain, and mostly, we don’t like that. (I wouldn’t either, if I didn’t know about drones). This makes sense. After all, it is considerably harder to outsmart something that wants to hurt or spy on you and is imbued with its own intelligence and autonomy than it is to outsmart a remote-controlled machine piloted by a far-away human who can only see so much ‘through’ the eyes of their mechanical avatar. While we may distrust the motives of the people who made our Amazon Echo or our iPhone, we generally don’t think that the object itself has bad intentions. Yet I often get the sense that people who distrust drones are imbuing the drone itself with bad motives or intent, because they suspect that the drone is capable of having them.
In another sense, our uncertainty about what drones are up to and how much they know makes animists of us, leading us to anthropomorphize drones to an extent that we do not anthropomorphize, say, our smart thermostat or our iPad. My drone is not very smart, but its blinking lights and what I interpret as its occasionally temperamental opinions about the conditions it will operate under lead me – even though I know better – to treat it more like a pet at times than like a tool. (I’ve certainly found myself talking to it).
Multirotor drones even move in ways that suggest living things, more like darting hummingbirds than like conventional aircraft. When I’m bringing my drone in for a landing, it hovers expectantly in front of me: it is an object, but it’s definitely not inanimate. The fact that drones move, too, is another source of distrust. Even if we distrust our Amazon Echo or our iPhone, they are unmoving objects that stay where we’ve put them. Drones, though – they move, and presumably might be able to do so without human input. Creepy.
So then, if drones are inscrutable: what can we do to make them less inscrutable, to make their motives more apparent? Time will certainly help, in that we generally become less frightened of technologies as we grow more accustomed to them. The development of systems that will permit the average person to get a sense of what a drone is up to – perhaps by pointing their phone at it to pull a “license plate,” or something of that nature – will help as well. Still, I don’t see this central ontological issue with drones as something that we are somehow going to fix, or explain away. These fears are human, and therefore we are going to have to learn to live with them.