“Someday, most major developed cities in the world will live under the unblinking gaze of some form of wide-area surveillance,” writes Arthur Holland Michel in Eyes in the Sky, a startling, disturbing, and deeply reported account of the powerful new technologies that promise safety and imperil privacy on an unprecedented scale.
Imagine an aerial video camera with the power to spot an object 15 centimeters wide from an altitude of 7,600 meters in a frame twice the width of Manhattan. Actually, you don’t need to imagine it. The US Defense Advanced Research Projects Agency (DARPA) developed the ARGUS camera a decade ago with those capabilities. In 2014, 10 were deployed in Afghanistan on Reaper drones, in a covert program known as Gorgon Stare.
In Greek mythology, the Gorgons were underworld monsters whose rigid, unblinking stare was their most frightening feature. The most famous Gorgon, the snake-haired Medusa, turned people to stone with just a glance. In real life, “wide-area motion imagery,” known as WAMI, can record everything that happens on a battlefield, in a conflict zone, or within a city neighborhood, as well as zero in on any target it chooses.
Using a system of 368 compact 5-million-pixel cameras to capture images of about 1.8 billion pixels, the Gorgon Stare technology enables the US military and, increasingly, police departments to surveil vast areas from the safety of the skies. If an enemy attack (or a street crime) is detected, the administrators of the system can rewind the video and see where the perpetrators came from and where they went. Early adopters called it “combat TiVo.”
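The camera’s headline numbers are roughly self-consistent, as a quick back-of-the-envelope check shows. (Manhattan’s width of about 3.7 kilometers and the assumption of a square frame are mine, not figures from the book.)

```python
# Sanity check of the ARGUS figures quoted above.
sensors = 368
pixels_per_sensor = 5_000_000
total_pixels = sensors * pixels_per_sensor   # 1,840,000,000 -- the ~1.8 billion reported

# Ground coverage: a frame "twice the width of Manhattan",
# resolving objects 15 cm wide.
manhattan_width_m = 3_700          # assumption: ~3.7 km at its widest
frame_width_m = 2 * manhattan_width_m
resolution_m = 0.15                # 15 cm per resolvable object

pixels_needed_across = frame_width_m / resolution_m   # ~49,300 pixels
pixels_available_across = total_pixels ** 0.5         # ~42,900 if the frame were square

# The two figures agree to within about 15%, so the quoted
# resolution, altitude, and frame width hang together.
print(total_pixels, round(pixels_needed_across), round(pixels_available_across))
```

The point of the check is only that the reported specifications are mutually plausible, not that this is how the sensor array is actually laid out.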
Michel, a scrupulous and genial reporter, is both scared of and intrigued by WAMI. He traces its origins back to the 1998 movie Enemy of the State, which imagined such a surveillance system, and prompted some scientists to think about how one could be created. The spread of improvised explosive devices (IEDs) in the Iraq war theater after 2003 made the task urgent.
Scientists from the Central Intelligence Agency and the Lawrence Livermore National Laboratory raced to create ever more powerful cameras to track, identify, and eliminate the attackers who planted IEDs. One engineer boasted online that his US Air Force unit had captured or killed 1,200 insurgents with WAMI. “The little evidence that is available suggests that WAMI’s short history has been a violent one,” Michel notes.
Yet its future seems unlimited. WAMI, he writes, “opens our lives to a view from the heavens that had, from the dawn of civilization until not so long ago, been reserved for the gods and the stars.” Now a half-dozen technology firms are offering wide-area surveillance systems to US police departments; Chinese, Israeli and Hungarian firms are selling these systems all over the world.
“It is impossible to deny that such a power could be used as a force for good,” Michel writes. “But there is no single widely used surveillance technology that has not, at some point, crept too far beyond its original purpose or given rise to unintended consequences.… The twin furies of WAMI and automation have the potential, if mismanaged, to make surveillance much more invasive, unscrupulous, mysterious, and unfair.”
The potential benefits of WAMI for combat soldiers, law-enforcement officers and natural-disaster responders are obvious. The Baltimore Police Department secretly deployed a persistent surveillance system in 2016 to address the US city’s catastrophic crime problem and passionately defended its effectiveness after its existence was revealed. According to advocates, the system generated information on 105 crimes and identified 73 people and vehicles deemed suspects over nine months. After the program was publicly revealed, daytime shootings in the city dropped from six a day to one, according to a police spokesman.
In 2017, the US Forest Service deployed WAMI to fight wildfires all over the country. In 2018, the Indiana National Guard deployed wide-area surveillance in the aftermath of Hurricane Florence to identify critical infrastructure, blocked roads, and stranded survivors.
But the effectiveness of WAMI as a war-fighting or crime-fighting tool is not well documented. The eyes in the skies did not ensure victory in Afghanistan, where US forces are now drawing down after 19 years. The USAF, as an institution, has repeatedly declined to make any claims about what the technology achieved, Michel notes. In the US, a Police Foundation study concluded only that “persistent surveillance holds potential [emphasis added] for helping solve crime.”
Persistent surveillance also has potential for abuse, especially when fused with other data streams from facial-recognition software, social-media archives, and public records. One Chinese company now offers a system that can recognize faces from closed-circuit television cameras and automatically trace all of a person’s movements across all other cameras for the past week, as well as identify his or her car. “In a fully fused city,” Michel observes, “there may be nowhere to hide.”
Public suspicions are an obstacle to adoption. Baltimore had to cancel its WAMI program because of public protest. When the government of Dade County, Florida, applied for a US Justice Department grant to study a WAMI program for Miami’s crime-ridden north side, objections from elected officials killed the idea. But with 600 US police departments already using drones, the adoption of WAMI is likely to spread.
In early August, The Guardian reported that the Pentagon was testing unmanned balloons equipped with persistent surveillance systems over five Midwestern states. The purpose: “to locate and deter narcotic trafficking and homeland security threats.”
Like Google Maps or the iPhone, WAMI is “a technology with far more uses than its creators could have possibly imagined themselves,” Michel says. In the hands of a police department, it might curb crime. In the hands of a military, it is a potent battlefield weapon. In the hands of secret intelligence agencies, it can serve as an instrument of social control. Other uses – and abuses – are surely coming.
In closing, Michel offers a thoughtful agenda for regulating the use of WAMI. Governments and private companies need to practice transparency and accountability to earn the public trust that would make WAMI’s benefits possible. They need to safeguard the data they collect and dispose of it promptly to protect privacy and civil liberties. They need to limit the use of artificial intelligence in making decisions about the use of force or probable cause.
But WAMI will also have another effect that will be harder to control, he notes: its impact on our thinking about privacy in public spaces. “While it could be a good thing if violent criminals thought twice before stepping out into the open air,” he writes, “that’s no way for everyone else to live.”
Michel fears that our wariness of the all-seeing eye in the sky may subside:
“We might learn to steer clear of protest or political gatherings or bodegas [frequented by undocumented migrants]. We’d think twice before parking our cars in majority-Muslim neighborhoods or interacting with people who, for whatever reason, have come under government or police scrutiny. We would eventually come to accept that those sacrosanct spaces are no longer ours – but that’s probably the scariest scenario of all.”
This article was produced by the Deep State, a project of the Independent Media Institute, which provided it to Asia Times.