Drones, also known as unmanned aerial vehicles or UAVs, are flying robots that can be as small as a dragonfly or as big as a house. Depending on how you define them, they've been around in one form or another since World War I, but what they all have in common is that the living brain—whether it belongs to a fly or a fighter pilot—has been removed from the aircraft. And that means that those smarts have to be recreated from ground control. In fact, it takes considerable support in both equipment and people to operate a military UAV, for example.
History to Today
The rise of the drone was made possible over the past few decades by a host of technological innovations, including small digital sensors for navigation and flight stabilization, GPS, long-range data links, and lightweight materials. Depending on how you count them, the U.S. military today has between 7,000 and 15,000 UAVs, according to Peter Singer, Director of the 21st Century Defense Initiative at Brookings. Missy Cummings, a former fighter pilot who is now a professor of aeronautics and astronautics at MIT, considers interest in manned fighters old-fashioned. "Manned aircraft? Who does that anymore?"
Consistent use of UAVs began during the Korean and Vietnam Wars for targeting and reconnaissance, a role that continued with the so-called Predator, used in the 1990s in Bosnia. Beginning in 2001, weapons were added to the Predator. The main advantage of UAVs, weaponized or not, is that they save lives by keeping soldiers out of harm's way. But UAVs also save money. They're less expensive than comparable aircraft because they don't have to be as reliable; if they crash or are shot down, no pilots are lost. And the years of training that fighter pilots put in are no longer required. In fact, it turns out that non-pilots are better at controlling UAVs, perhaps because flying a drone requires different skills; untrained officers don't have to overcome the habits fighter pilots have formed from years of experience.
While reports of civilians killed during drone strikes continue to make headlines, in theory the use of UAVs should enable more accurate attacks and reduce the number of innocent bystanders who are killed or wounded. As counterintuitive as it may seem, a pilot flying 35,000 feet directly above a target may have a harder time aiming carefully than a UAV operator some 7,000 miles away, in part because the pilot is multitasking. Meanwhile, smart visual sensors capable of seeing an object on the ground and identifying it are only improving.
Advantages aside, the current generation of UAVs suffers from a number of vulnerabilities. Operator boredom is one of them. While backyard drone inventors enthusiastically set aside hours of free time to pilot their craft, a military operator must maintain attention on a scene where nothing important may happen for days on end. Inevitably, vigilance decreases over time.
On the flip side, this same operator will often get to know their targets well. They see their patterns—when they get up in the morning, where they go, and who they live with. And, after the operator has hit a target, they continue to monitor the scene, sometimes over the course of several hours. This can provide valuable intelligence, but it also means the operator witnesses death, destruction, and the aftermath, including the reaction of people on the ground.
"I think the worst part about it," says Cummings, "is not necessarily that you're engaged in a fight that led to a death, but then your mission's over, and then you drive 35 minutes back to north Las Vegas, and then you go to church the next day." In the past, soldiers were surrounded by people who understood and could talk about what they had experienced that day. Not so with drone operators.
But arguably the greatest vulnerability of the current generation of UAVs is their intelligence, or lack thereof. "UAVs sometimes aren't that smart," says Bill Sweetman, Chief Editor for Defense Technology at the Aviation Week and Space Technology Group. "Their self-diagnosis isn't that smart, and by the time they have a problem, it's too late. They lose a link, they lose power, and you're gone." When that happens, information relayed up to the aircraft, such as changes in direction, altitude, or even whether to terminate the mission, is lost. Information coming back down to the operator—such as real-time streaming images from the aircraft and an understanding of how its systems are performing—is also gone. The data link can go down because of a simple equipment malfunction, but it can also be jammed by enemies, or even "spoofed," tricking the UAV into doing something its operator didn't intend. In 2012, scientists at the University of Texas at Austin succeeded in doing just that, hacking into a drone and hijacking its flight.
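The lost-link failure Sweetman describes is commonly handled with a watchdog timer: the aircraft tracks the last heartbeat received from ground control and, when too much time passes, falls back to a pre-programmed behavior such as loitering or returning home. The sketch below is illustrative only; the class, mode names, and timeout value are assumptions, not any real autopilot's logic.

```python
import time

class LinkWatchdog:
    """Illustrative lost-link watchdog: track command-link heartbeats
    and fall back to a pre-programmed failsafe when they stop.

    All names and behaviors here are hypothetical, not a real autopilot's.
    """

    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()
        self.mode = "LINKED"

    def on_heartbeat(self):
        # Called whenever a valid command packet arrives from ground control.
        self.last_heartbeat = time.monotonic()
        self.mode = "LINKED"

    def tick(self, now=None):
        # Called periodically by the flight loop; returns the current mode.
        now = time.monotonic() if now is None else now
        if now - self.last_heartbeat > self.timeout_s:
            # No commands for too long: fly a pre-programmed failsafe,
            # e.g. loiter, then return to a recovery point.
            self.mode = "LOST_LINK_RETURN_HOME"
        return self.mode
```

The point of the design is that the failsafe lives on the aircraft itself, so losing the link does not mean losing the aircraft, though, as the article notes, it does mean losing the operator's ability to redirect or abort the mission.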
WHAT MAKES A DRONE SMART?
For the purposes of a UAV, intelligence boils down to how independently the aircraft can operate. What decisions can the machine make on its own, and what requires human intervention? One of the most advanced UAVs is the Navy's X-47B, which will be capable of taking off and landing on an aircraft carrier on its own. While this requires significant instrumentation so that the aircraft can respond to the environment and be aware of how it's performing, not everyone sees it as a breakthrough achievement. "It is not a particularly smart aircraft in its current instantiation," says Cummings. "We've been able to take airplanes off and land themselves on the carrier for my entire career in the Navy."
Experts make a distinction between automation—the ability of a system to follow a pre-programmed course—and autonomy. Paul Eremenko is the Deputy Director of the Tactical Technology Office at DARPA. "Autonomy," he says, "is the ability of systems to take independent action: to reason about the environment, perceive it, build some model of the world around them, and adapt to that environment without human intervention."
In practical terms, this would mean, for example, being able to instruct a UAV to go to a region in Iraq, make observations until something noteworthy happens, and then relay that information back to ground control without a human being in the loop at all times. The benefits of autonomy are hardly unique to UAVs. In fact, some of the greatest advances are likely to come out of the development of ground robots, such as driverless cars that must navigate difficult terrain, or from interstellar space probes or rovers on distant planets that must be capable of operating on their own for longer periods of time than a UAV.
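The automation-versus-autonomy distinction can be made concrete with the article's own scenario: an automated system replays a fixed plan, while an autonomous one loops through perceiving, deciding, and acting until something noteworthy happens. The sketch below is a toy model of that loop; the function names, action tuples, and `sense` callback are hypothetical, invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    noteworthy: bool
    summary: str

def automated_mission(waypoints):
    """Automation: follow a pre-programmed course exactly, no decisions."""
    return [("FLY_TO", wp) for wp in waypoints]

def autonomous_mission(region, sense, max_loops=1000):
    """Autonomy (toy sketch): perceive, decide, and act without an operator
    in the loop.

    `sense` is a hypothetical perception function returning an Observation;
    a real system would fuse imagery and other sensor data.
    """
    actions = [("FLY_TO", region)]
    for _ in range(max_loops):
        obs = sense()
        if obs.noteworthy:
            # Decide on its own that this is worth relaying back.
            actions.append(("RELAY", obs.summary))
            break
        # Nothing noteworthy yet: keep watching the region.
        actions.append(("LOITER", region))
    return actions
```

The contrast is the decision point inside the loop: the automated mission's behavior is fixed before takeoff, while the autonomous one chooses its next action from what it perceives.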
Other advances in autonomy come from the world of tiny drones that swarm, just like bees or ants, which requires them to sense and maintain a set distance from one another, free of any link to ground control. The ultimate in autonomy would be so-called adaptive reasoning, in which the UAV would be able to adapt to an infinite number of possibilities. Chuck Heber, who managed the development of two UAVs for DARPA in the 1990s—Global Hawk and DarkStar—believes that ultimately "UAVs will be able to do just about everything autonomously short of pulling the trigger." Some argue that an even greater challenge than making UAVs truly autonomous will be proving that they can operate reliably enough for us to depend on them. As Eremenko puts it, "We really are quite far away from being able to prove that those kinds of systems will always behave in a reasonable manner, in a manner where either the human operator or those around it can depend and trust the system to behave appropriately."
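The swarming behavior mentioned above—each drone sensing its neighbors and keeping a set distance with no ground link—resembles the classic "separation" rule from flocking simulations. The article doesn't specify any algorithm, so the following is a textbook flocking sketch, not any program's actual control law; all parameters are invented.

```python
import math

def separation_velocity(me, neighbors, min_dist=10.0, gain=1.0):
    """Flocking-style separation rule: steer away from any neighbor
    closer than min_dist.

    Positions are (x, y) tuples in meters; returns a (vx, vy) velocity
    correction. Illustrative only, not a real UAV's control law.
    """
    vx, vy = 0.0, 0.0
    for (nx, ny) in neighbors:
        dx, dy = me[0] - nx, me[1] - ny
        d = math.hypot(dx, dy)
        if 0.0 < d < min_dist:
            # Push directly away, harder the closer the neighbor is.
            push = gain * (min_dist - d) / d
            vx += push * dx
            vy += push * dy
    return (vx, vy)
```

Because each drone computes this from local sensing alone, the swarm holds its spacing with no central coordinator—which is exactly why such behavior counts as autonomy rather than automation.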
Today, almost everyone agrees that autonomous vehicles will require human monitoring and the ability to intervene. As Bill Sweetman describes it, a UAV is a very smart dog running around a lot of people with the potential to trip them. Someone has to be responsible. Even the UAVs of the future, which will likely be much smarter than a smart dog and better than humans at many things, will be missing human moral judgment, or, as Peter Singer puts it, "a sense on the back of your neck that something's not right. You don't know why, but you can react to it. Your moral compass...knowing when and when not to use force." In essence, the path forward will be to develop UAVs that are good at what machines can be good at, such as planning trajectories and withstanding long trips and high Gs, while leaving such things as judgment, the creative response to the unexpected, and social intelligence to the humans.