How Do Military Drones Fly Without GPS? | Ian Laffey, Theseus
The Problem With Knowing Where You Are
There is a quiet assumption buried inside most modern navigation technology: that the sky will cooperate. GPS works because a constellation of satellites broadcasts precise timing signals, and a receiver on the ground — or in the air — triangulates its position from the geometry of those signals. The system is elegant, global, and deeply fragile. Jam the signal, spoof it, or simply operate somewhere the satellites cannot reliably reach, and the whole edifice collapses. For a commercial delivery drone, that is an inconvenience. For a military drone operating over a contested battlefield, it is a matter of catastrophic failure, and the adversary knows it.
This is the central problem Ian Laffey and the Theseus team are grappling with: how does an autonomous aerial vehicle continue to navigate, make decisions, and complete a mission when the one positioning system it was probably designed around has been taken away? The answer, it turns out, requires rethinking navigation from first principles — which is precisely why this podcast sits in that intellectual neighborhood.
Dead Reckoning, Reborn
The classical solution to navigating without external reference is dead reckoning: you know where you started, you know your velocity and heading, you integrate over time, and you estimate where you must be now. Inertial measurement units do exactly this, using accelerometers and gyroscopes to track motion. The problem is that errors accumulate. Every tiny imprecision in the sensor compounds with the next, and over minutes the position estimate drifts into uselessness. This is not a new observation — naval navigators knew it centuries ago — but the rate of drift in low-cost MEMS inertial sensors makes it particularly vicious for small drones operating on tight tolerances.
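The compounding-error problem can be made concrete with a toy simulation. The sketch below (illustrative values throughout, not real sensor specs) integrates a 1-D accelerometer signal twice, with a small constant bias standing in for MEMS imperfection; because the bias is integrated twice, the position error grows roughly quadratically with time.

```python
import random

def dead_reckon(true_accel, bias, noise_sd, duration_s, dt=0.01, seed=0):
    """Integrate biased, noisy accelerometer samples twice (1-D dead reckoning)."""
    rng = random.Random(seed)
    vel = pos = 0.0
    for _ in range(int(duration_s / dt)):
        measured = true_accel + bias + rng.gauss(0.0, noise_sd)
        vel += measured * dt   # first integration: velocity
        pos += vel * dt        # second integration: position
    return pos

# True trajectory under constant acceleration a: x(t) = 0.5 * a * t^2
a = 0.2                                  # m/s^2, hypothetical maneuver
err_10 = abs(dead_reckon(a, bias=0.05, noise_sd=0.1, duration_s=10.0)
             - 0.5 * a * 10.0 ** 2)
err_60 = abs(dead_reckon(a, bias=0.05, noise_sd=0.1, duration_s=60.0)
             - 0.5 * a * 60.0 ** 2)
# A 0.05 m/s^2 bias alone contributes 0.5 * bias * t^2 of error:
# roughly 2.5 m after 10 s, roughly 90 m after 60 s.
```

Six times the flight time yields roughly thirty-six times the error, which is why an unaided low-cost IMU is useful for seconds, not minutes.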
What Laffey’s work points toward is the fusion of multiple weak signals into something more robust than any individual source. Inertial data provides high-frequency, short-term accuracy. Visual odometry — using cameras to track how the world appears to move beneath the vehicle — provides a kind of optical dead reckoning that is independent of radio signals. Terrain-relative navigation matches observed ground features against stored maps. Barometric altitude gives a stable vertical reference. None of these is sufficient alone. Together, weighted intelligently by a Kalman filter or one of its more sophisticated descendants, they can maintain a workable position estimate across meaningful operational timescales.
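The fusion pattern described above can be sketched in miniature. The following 1-D Kalman filter (scalar state, illustrative noise values, not any specific Theseus implementation) runs a high-rate inertial prediction and periodically corrects it with a lower-rate visual-odometry position fix; the key behavior is that the estimate's variance grows between fixes and shrinks at each one, instead of drifting without bound.

```python
def predict(x, p, vel, dt, q):
    """Propagate the position estimate with dead-reckoned velocity.

    Uncertainty p grows by process noise q at every step."""
    return x + vel * dt, p + q

def update(x, p, z, r):
    """Fuse a position measurement z with variance r.

    The Kalman gain k weights measurement against prediction."""
    k = p / (p + r)
    return x + k * (z - x), (1 - k) * p

x, p = 0.0, 1.0                                      # state and variance
for step in range(100):
    x, p = predict(x, p, vel=1.0, dt=0.1, q=0.05)    # 10 Hz inertial prediction
    if step % 10 == 9:                               # 1 Hz visual-odometry fix
        z = (step + 1) * 0.1 * 1.0                   # noiseless fix, for clarity
        x, p = update(x, p, z, r=0.5)
# After 10 s the position estimate tracks the true 10 m, and the variance
# settles near a steady value rather than growing without bound.
```

Real filters track full 3-D pose and velocity, handle asynchronous sensors, and use extended or unscented variants, but the predict-then-correct structure is the same.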
The insight here is architectural rather than merely technical: resilience comes from redundancy across diverse sensing modalities, not from making any single sensor better. This is a design philosophy, and it has implications well beyond drones.
The Adversarial Dimension
What makes this problem genuinely hard, and what distinguishes military from civilian autonomy, is that the environment is not merely uncertain — it is actively hostile. A drone flying over Ukraine or any other contested zone is not dealing with sensor noise as an abstract statistical property; it is dealing with an adversary who has read the same papers, understands the vulnerabilities, and is deliberately trying to exploit them. GPS spoofing, for instance, does not simply remove the signal; it injects a false signal that looks plausible. A naïve fusion algorithm might accept the spoofed GPS and reject the correct inertial estimate, because the GPS “looks” more certain. Building systems that can detect and reject adversarial inputs — without generating so many false positives that the vehicle becomes useless — is a genuinely open research problem at the boundary of signal processing, machine learning, and formal verification.
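One standard first line of defense against the spoofing scenario above is innovation gating: before fusing a GPS fix, compare it against the filter's own prediction, and reject it when the disagreement exceeds what the combined uncertainties can explain. The sketch below uses hypothetical 1-D values; production systems apply the same test in multiple dimensions via a Mahalanobis distance, and tuning the threshold is exactly the false-alarm trade-off the text describes.

```python
def innovation_gate(x_pred, p_pred, z, r, n_sigma=3.0):
    """Return True if measurement z is statistically consistent with the
    filter's predicted position x_pred (variance p_pred, sensor variance r)."""
    innovation = z - x_pred            # how far the fix is from expectation
    s = p_pred + r                     # innovation variance
    return innovation ** 2 <= (n_sigma ** 2) * s

# Filter believes it is near 100 m (variance 4 m^2); GPS variance 9 m^2.
x_pred, p_pred, r = 100.0, 4.0, 9.0
honest  = innovation_gate(x_pred, p_pred, z=103.0, r=r)  # plausible → accept
spoofed = innovation_gate(x_pred, p_pred, z=180.0, r=r)  # 80 m jump → reject
```

The hard part, as the text notes, is the adversary who spoofs slowly enough to stay inside the gate, which is where the open research in anomaly detection begins.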
This connects naturally to the broader literature on robust estimation and anomaly detection. The same intellectual scaffolding that supports fraud detection in financial systems or fault isolation in industrial control applies here, adapted for the latency and weight constraints of a vehicle that cannot afford either a missed detection or a false alarm.
Navigation as Epistemology
There is a deeper thread worth pulling. Navigation without GPS is, at its core, a problem of knowing what you know and knowing what you do not know — maintaining a calibrated uncertainty over your own state and updating it as new evidence arrives. This is Bayesian inference in hardware, running in real time, under adversarial conditions, with the stakes measured in human lives and strategic outcomes.
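The "Bayesian inference in hardware" framing can be reduced to its smallest possible instance: a discrete Bayes update over a handful of candidate positions, with toy numbers. The point is that the output is not a single position but a calibrated distribution, which is exactly the self-knowledge the paragraph describes.

```python
def bayes_update(prior, likelihood):
    """Posterior is proportional to prior times likelihood, renormalized."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Uniform prior over five candidate positions; a hypothetical terrain match
# favors position 2 and disfavors the ends.
prior = [0.2] * 5
likelihood = [0.05, 0.2, 0.5, 0.2, 0.05]
posterior = bayes_update(prior, likelihood)
# The belief concentrates on position 2 while retaining honest uncertainty
# over its neighbors, rather than collapsing to a point estimate.
```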
This frame connects the problem to philosophy of mind, to robotics, to autonomous vehicles, and to the literature on reliable systems more broadly. The question of how a system maintains coherent self-knowledge under sensory degradation is not categorically different from questions asked in cognitive science about human spatial navigation, or in reliability engineering about how a power grid knows its own state under partial observability. The specific domain is military autonomy; the general domain is epistemic robustness under adversarial uncertainty.
Why This Matters Now
The proliferation of drone warfare has moved faster than the policy, the doctrine, and frankly the engineering. Systems that were designed with GPS as a given are being fielded in environments where GPS is unreliable or actively denied. The gap between what the hardware was designed to do and what it is being asked to do is not theoretical — it shows up as mission failures, navigational errors, and in the worst cases, as autonomous systems making consequential decisions based on corrupted state estimates.
Work like Laffey’s matters because it is trying to close that gap honestly, from first principles, without papering over the hard parts. The unglamorous reality of sensor fusion, filter tuning, and adversarial robustness is where the actual capability lives. Understanding that is prerequisite to understanding what autonomous military systems can and cannot responsibly be asked to do.