The radio signal that is the lifeblood of the Global Positioning System originates from a constellation of twenty-four satellites, orbiting more than twelve thousand miles above Earth. When it reaches the ground, after about sixty-seven milliseconds, it is so weak as to be almost imperceptible. (G.P.S. experts often compare processing the signal to trying to read by the light of a single bulb in a city thousands of miles away.) The signal tells the receiver the precise moment at which it left the satellite. Given four of these cues, processed simultaneously, the receiver can calculate its position in three dimensions. A timing error of as little as a millisecond can throw its calculation off by nearly two hundred miles.
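The figures above follow from the speed of light. A back-of-the-envelope check (a sketch only: it takes the constellation's nominal altitude of roughly 12,550 miles, about 20,200 kilometres, as the signal's path length, and ignores receiver geometry) reproduces both numbers:

```python
# Rough arithmetic behind the article's figures; real signal paths
# and satellite geometry are more complicated than this.

C = 299_792_458            # speed of light, metres per second
METERS_PER_MILE = 1609.344

# Travel time from a satellite ~12,550 miles overhead:
altitude_m = 12_550 * METERS_PER_MILE
travel_time_ms = altitude_m / C * 1000
print(f"travel time: {travel_time_ms:.0f} ms")        # about 67 ms

# Position error implied by a one-millisecond clock error:
error_miles = C * 0.001 / METERS_PER_MILE
print(f"1 ms error -> {error_miles:.0f} miles")       # about 186 miles
```

Light covers roughly 186 miles in a millisecond, which is why a clock error that small corrupts the fix so badly.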
The United States Air Force, which runs the G.P.S. Master Control Station, in Colorado, calls G.P.S. “the world’s only global utility.” Wholly owned by the U.S. government, the system is available free to everyone, everywhere; an ISIS terrorist glancing at his phone for a position fix benefits from the Pentagon’s largesse as much as a commuter on I-95. Since the first G.P.S. satellite was launched, in 1978, the system has steadily become the most powerful of its kind. (Other countries have navigation satellite networks, but none are as dependable or as widely available.) There are now around seven G.P.S. receivers on this planet for every ten people. Estimates of the system’s economic value often run into the trillions of dollars.
That’s a lot of responsibility for such a weak signal. The Pentagon’s Defense Advanced Research Projects Agency recently determined that, within thirty seconds of a catastrophic G.P.S. shutdown, a position reading would have a margin of error the size of Washington, D.C. After an hour, it would be Montana-sized. Drivers might merely miss their freeway exits, but planes would be grounded, ships would drift off course, commuter-rail systems would be tied up, and millions of freight-train cars with G.P.S. beacons would disappear from the map.
Fortunately, a worldwide G.P.S. failure is unlikely. A hacker or terrorist would require either a weapon powerful enough to destroy the satellites or a way to infiltrate the heavily fortified Master Control Station. The bigger worry is spoofing, the transmission of a bogus G.P.S. signal that nearby receivers mistake for the real thing. Although localized, such an attack could have profound consequences. The U.S. Department of Homeland Security classifies sixteen infrastructure sectors—including dams, agriculture, health care, emergency services, and information technology—as critical, and therefore particularly vulnerable to sabotage. All but three require G.P.S. for essential functions. Most use the system for timing, not positioning. The atomic clocks aboard the satellites, synchronized to within nanoseconds, are used to link clocks over large geographic distances. G.P.S. time allows cellular calls to bounce flawlessly between towers, regulates the measurement of power flowing through electrical grids, and time-stamps financial transactions—particularly important in the era of high-frequency trading, when milliseconds are worth millions.
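A toy sketch (with invented trade times and a hypothetical five-millisecond clock skew) shows why a shared, G.P.S.-disciplined clock matters to those financial records: without one, two trading venues can disagree about which of two trades came first.

```python
# Toy model with invented numbers: venue B's clock runs five
# milliseconds slow, so its time stamps reverse the true order.

true_order = ["A", "B"]          # trade A actually executed first
stamps = {"A": 1000.0,           # venue A stamps the true time (ms)
          "B": 1002.0 - 5.0}     # venue B stamps 5 ms early

# Reconstructing the sequence from the stamps misorders the trades:
reconstructed = sorted(stamps, key=stamps.get)
print(true_order, reconstructed)   # ['A', 'B'] ['B', 'A']
```

A spoofer does not need to stop the clocks; nudging them is enough to make the record lie.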
Security officials have been concerned about the susceptibility of G.P.S. to spoofing since at least the early two-thousands. Fourteen years ago, a team at Los Alamos National Laboratory, in New Mexico, built a spoofer by modifying a G.P.S.-signal simulator (a legal device that tests receivers’ accuracy) and aiming it at a stationary receiver more than a mile away. The receiver’s display revealed that it believed it was zipping across the desert at six hundred miles per hour. The world’s most powerful spoofer, however, wasn’t built for another six years. It began as a graduate-school project by Todd Humphreys, now an engineering professor and the head of the Radionavigation Laboratory at the University of Texas at Austin. Humphreys believed that the government was underestimating G.P.S.’s vulnerability to the Los Alamos team’s spoofer, but he also felt that the device would be readily detected in the real world. So he set about building a more covert version. An expert in software-defined radio—the modification of radio signals with a computer, as opposed to mixers, amplifiers, and other hardware—Humphreys used a general-purpose processor to build what he calls a “formidable lying machine,” a box that “listens” to the G.P.S. signal, gradually builds a bogus signal that aligns perfectly with the real one, and then slowly overtakes it.
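The overtaking maneuver can be caricatured in a few lines. This is a deliberately simplified model, not real signal processing: it assumes only that a receiver's tracking loop follows whichever signal is stronger, so a spoofer that first matches the authentic signal, then ramps up its power, can walk the apparent timing away without the receiver ever losing lock.

```python
# Toy "carry-off" spoofing attack: align with the true signal,
# overpower it, then slowly drag the timing off. All numbers are
# illustrative.

def track(steps=10):
    true_delay_ms = 67.0        # authentic signal's travel time
    spoof_delay_ms = 67.0       # spoofer starts perfectly aligned
    true_power, spoof_power = 1.0, 0.5
    readings = []
    for _ in range(steps):
        spoof_power += 0.2      # ramp up until the lie is louder...
        if spoof_power > true_power:
            spoof_delay_ms += 0.1   # ...then walk the timing away
        # the receiver's loop follows the stronger signal
        locked = spoof_delay_ms if spoof_power > true_power else true_delay_ms
        readings.append(locked)
    return readings

readings = track()
print(readings[0], readings[-1])   # begins honest, ends captured
```

The receiver sees no jump, only a slow drift, which is what made Humphreys's version so hard to detect.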
Humphreys’s team debuted its spoofer in 2012. In front of an audience of military and government officials, Humphreys fed faulty G.P.S. coordinates to a test drone aircraft from a third of a mile away, causing it to plummet. (He would later hear that officials were “shocked” upon reviewing video of the demonstration.) A month after that, he testified before Congress. The good news, he said, was that there were, at most, a hundred people in the world who could build a spoofer as mighty as his. For the moment, it was probably beyond the reach of organized crime or terrorist groups but “well within the capabilities of near-peer nation-states.” He spent the next few years documenting how his spoofer could induce the kinds of clock errors that might undermine power companies, telecoms, and financial firms. The last of these, Humphreys thinks, are particularly vulnerable to attack. Although he believes that the major trading houses, such as the New York Stock Exchange, have installed strong anti-spoofing measures, he worries about individual traders, who often use their own timing feed, jacking directly into the unsecured G.P.S. data stream. A spoofing attack on these targets could have disastrous ripple effects: blackouts, communications breakdowns, and market failures akin to the “flash crash” of 2010.
Humphreys’s spoofer consists almost entirely of code. Even if few people in the world have the expertise to recreate it, it is still vulnerable to so-called script kiddies, hackers with enough skill to steal and install it. In the three years after Humphreys testified, F.B.I. agents regularly visited his office to check that the code was secure. Last August, he gave them bad news. The hackers hadn’t gotten to him, Humphreys said, but they now had other options. A Japanese researcher had recently uploaded a software-defined G.P.S. simulator to the online repository GitHub. Humphreys assumed that the researcher’s intentions were good—hardware simulators, used by engineers to design and test G.P.S. equipment, can cost hundreds of thousands of dollars—but that the larger ramifications had probably not occurred to him. “He was uploading a spoofer,” Humphreys wrote in an op-ed for the journal IEEE ComSoc Technology News. “He was handing his code to the script kiddies.” That same month, at DEF CON, a hacker conference held annually in Las Vegas, a team from a Beijing-based cybersecurity company showed off a new G.P.S. spoofer, which it had built by cobbling together publicly available code. Suddenly, it seemed much easier to tell a formidable lie to G.P.S., the technology that runs the world. How long before a terrorist or a script kiddie decides to aim one at a cell tower or electrical substation, just to see what happens?