Articles of War has featured discussion of the law of armed conflict (LOAC) rules concerning booby-traps (see, e.g., here, here, here, and here). All have been based in the land domain. I am interested in the application of the idea in the cyber domain.
Under the Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices (Protocol II) to the Convention on Certain Conventional Weapons (CCW), a booby-trap is defined as “any device or material which is designed, constructed or adapted to kill or injure, and which functions unexpectedly when a person disturbs or approaches an apparently harmless object or performs an apparently safe act.”
That definition was crafted long before malware, phishing links, or networked industrial control systems. Yet its simplicity, and its silence on physical form, give it surprising relevance today. The question is not whether cyber booby-traps exist (they do) but whether the law, drafted in an analogue age, can meaningfully restrain them in a digital one.
This post argues three simple things. First, cyber booby-traps are real, and their potential for harm is no longer hypothetical. Second, existing international legal frameworks, especially the CCW and its Protocol, whilst domain-agnostic, were not built with cyberspace in mind and their fit is uncomfortably loose. Third, closing the gap requires both legal imagination and technical humility.
A Conventional Understanding of the Convention on Conventional Weapons
At its core, a booby-trap weaponises trust. It makes the safe feel dangerous and the benign lethal: the tripwire hidden under leaves, the grenade wired to a door, the explosive disguised in a child’s toy. These devices work not because they are powerful, but because they are unexpected. You step, lift, open … and then you suffer the consequences. The definition of a booby-trap can be broken down into four parts:
(a) device or material;
(b) designed, constructed or adapted to kill or injure;
(c) functions unexpectedly; and
(d) triggered by disturbing or approaching an apparently harmless object, or by performing an apparently safe act.
The first debate usually concerns whether code can be a “device.” Purists argue that a device implies physicality. But the law’s object is not preserving hardware terminology; it is preventing harm. Code that causes foreseeable physical harm can stand in for a device, just as cyber operations are already accepted as “attacks” under the LOAC when they have destructive effects.
The second element, intent to injure, is also workable in cyberspace. International law already accepts that “killing or injuring” includes direct and foreseeable effects. Outside the CCW, the International Committee of the Red Cross considers that physical harm need not be instantaneous, whilst the Genocide Convention’s logic supports the inclusion of mental and indirect harm when intentional and foreseeable. Even if not directly related to the CCW, it is clear that international law is moving towards a wider perspective on killing or injuring. If malicious code is deliberately designed so that triggering it cuts electricity to life-support machines, an intention to cause lethal harm could be inferred even in the absence of a bullet or bomb.
The third and fourth elements, unexpected functioning tied to a harmless-seeming act, are the heart of the booby-trap concept. Gary Pattison has expertly covered it already. Historically, this covered everything from sharpened bamboo stakes hidden under leaves to grenades wired to door handles. A rifle fired in battle is expected. A minefield marked or recorded on a map is foreseeable. A teddy bear tied to a detonator is forbidden because it subverts the normal logic of war: it punishes trust, not threat. And that principle does not magically evaporate when the battlefield moves online. Cyberspace is saturated with digital normalcy; icons, updates, PDFs, login screens are all trusted, all mundane. When a “normal” click or insert unleashes destructive code, the victim’s expectation mirrors the soldier stepping on a covered pit. The domain has changed; the deception has not.
The Drafters Saw This Coming … Sort Of
The CCW was born out of the wreckage of the twentieth century’s wars, drafted in 1980 to address a world still littered with the remnants of conflict. Its negotiators were concerned with landmines that maimed long after wars ended, incendiary weapons that burned indiscriminately, and booby-traps that turned homes and hospitals into death zones. Protocol II was a response to conflicts in places like Vietnam, Laos, and Cambodia, where makeshift devices like grenades tied to doors and explosives hidden in toys inflicted decades of suffering on civilians long after combatants had gone. The CCW’s mission was humanitarian: to limit the effects of weapons that refused to distinguish between peace and war, soldier and civilian.
Although cyber warfare was beyond the imagination of 1970s diplomats, the treaty’s negotiating history reveals deliberate foresight about technological evolution. Defining “booby-trap” was no simple matter: the drafters adopted a function-based approach, focusing on effects rather than the mechanism of detonation (p. 289). Even before cyberspace became ubiquitous, negotiators were concerned with a range of technologically advanced devices: those activated remotely, functioning after a delay, or disguised as civilian objects (p. 351). Delegations emphasised that the definition should be broad enough to encompass such scenarios, while maintaining humanitarian safeguards for both civilians and combatants (p. 396).
Negotiation of the Technical Annex further illustrates the drafters’ intentions. States like the United Kingdom submitted specific proposals concerning detectability standards, while France, Russia, the United Kingdom, and the United States jointly addressed broader technical issues for inclusion in the Annex, including the criteria for triggering mechanisms and the safe clearance of devices after hostilities have ceased (p. 249). These discussions demonstrate that the drafters were not only concerned with defining what constitutes a booby-trap, but also with practical technical parameters that minimise disproportionate harm (p. 412).
State interventions reveal points of convergence, but also points of divergence in the drafting process. The United Kingdom emphasised detectability as a core technical safeguard (p. 438). The United States focused on implementation and compliance issues, proposing mechanisms to monitor adherence to Protocol provisions (p. 39). France and Canada jointly submitted technical specifications. Collectively, these interventions reflect that some States sought more precise technical standards, while others prioritised flexibility to allow for practical implementation in a variety of situations (p. 307). The travaux préparatoires also clarify the drafters’ intentions regarding the prohibition on booby-traps disguised as apparently harmless portable objects.
Delegations emphasised that traps should not be attached to or associated with certain objects, such as toys, food, or medical supplies (p. 27). There was extensive discussion on whether “associated with” included proximity. While the travaux do not provide a definitive legal test, they indicate that the drafters’ approach was guided by the principle of minimising harm to civilians (p. 207).
The travaux préparatoires also show attention to technical safeguards like detectability standards, clearance procedures, and markings, as well as debates about whether proximity to civilian-like objects counted as “association.” This reflected a deeper principle: civilians must not unknowingly walk into hidden harm. Applied today, that principle easily accommodates cyber operations. In fact, the digital environment is arguably more vulnerable to disguised threats than a jungle battlefield. The modern civilian spends more time with computers than with unexploded ordnance. A malicious attachment posing as a medical lab form is far closer to a child’s toy rigged with explosives than to a conventional missile.
So what? The term “cyber booby-trap” is, of course, a misnomer; the booby-trap definition is domain-agnostic, and we do not speak of “maritime booby-traps” or “space booby-traps.” Yet the term usefully captures digital mechanisms designed to function unexpectedly when someone performs an ordinary act—opening an email, plugging in a USB stick, updating a server—and that act unleashes harm on people, property, or critical systems. It remains a niche category: the apparently harmless object (the email, the USB, the update) must be designed to kill or injure.
Why the Law Struggles to Keep Up
The CCW’s spirit—to protect civilians and limit unnecessary suffering—translates perfectly into cyberspace. Yet its text struggles to stretch that far. The treaty’s basic assumptions clash with the realities of digital technology.
First, the treaty’s references to devices and materials presume physicality. Software isn’t tangible, but it performs the same function. The law can be read functionally, but doing so requires interpretive confidence that many States still lack. Expanding the meaning of “device” to include code risks political friction, as it touches national cyber capabilities and secrecy.
Second, uncontrolled spread, the defining danger of cyberspace, creates a new kind of indiscriminate effect. A physical booby-trap remains where it’s placed; a digital one can propagate across networks, mutate through updates, and travel across borders in milliseconds. The CCW prohibits weapons that cannot be directed at specific military targets, yet the interconnected nature of civilian and military systems means containment is nearly impossible. How does a commander comply with a rule designed for fixed geography in a domain without borders?
Third, attachment and protected objects. Article 7 of Protocol II prohibits booby-traps connected to protected persons or objects, such as medical equipment, food, or children’s toys. In cyberspace, these attachments become virtual. Embedding malware in a hospital’s update server or in widely used medical device firmware is functionally the same as lacing a med kit with explosives, but harder to detect, trace, or remove. The physical notion of “attachment” simply fails to capture the distributed reality of modern systems.
Fourth, recordkeeping and removal obligations. Articles 9 and 10 of Protocol II require belligerents to record the locations of mines and booby-traps and to remove them after hostilities. In 1980, this meant maps and grid references. In 2025, it would mean hashes, IP ranges, and digital signatures. Yet these forms of data are often classified or proprietary. Worse, remediation may be technically impossible. Eradicating malware from civilian systems may require revealing national vulnerabilities or compromising private infrastructure. The legal obligation remains sound, but its execution falters in the digital environment.
Fifth, the very notion of harm has changed. The drafters of the CCW were preoccupied with blood and bone. Cyber operations rarely cause immediate bodily injury, but they can cripple essential systems with lethal consequences. A ransomware attack on a hospital, a corrupted water treatment system, or a power grid collapse can all kill indirectly. The law’s humanitarian logic applies, but its thresholds need reinterpretation to reflect functional harm, not just physical injury.
Finally, there’s the problem of accountability. The CCW assumed a world of identifiable actors and physical evidence. Cyber operations thrive in attributional grey zones. Tracing a malicious implant to its source often takes months, if it’s possible at all. That gap undermines deterrence and weakens compliance: laws that depend on attribution struggle in a domain built on anonymity.
In short, the CCW’s framers built a treaty for a world of tangible weapons, where harm could be seen and mapped. The cyber domain dissolves those boundaries. The law’s moral compass remains sound, but its instruments need recalibration.
Bridging Law and Code
How, then, can we bring the old prohibition into the digital age? The answer is not to write a brand-new treaty, but to interpret and implement existing rules with the same purpose that guided their creation: reducing human suffering.
1. Interpret by function, not form
States should affirm, perhaps through an interpretive protocol or official guidance, that the CCW’s definition of a “device” includes software, firmware, and virtual mechanisms designed to cause harmful effects. The law already provides the moral foundation; what’s needed is an authoritative statement that technology doesn’t create immunity.
2. Define a “cyber booby-trap,” reflecting the unique situation that cyberspace creates
A working definition helps focus the debate: “A cyber booby-trap is a digital mechanism that functions unexpectedly when a person performs an ordinary act, and that has the potential to cause direct or foreseeable indirect harm to persons, property, or essential systems.” It makes clear that foreseeability and function are key, not the medium.
3. Legal and technical reviews before deployment
Many countries already review new weapons for compliance with international law. Cyber capabilities should undergo the same scrutiny. Pre-deployment reviews must examine the likelihood of spread, civilian impact, and deactivation options. Confidential summaries of those reviews could be shared internationally to build mutual trust.
4. Build in containment and deactivation
Good engineering can reinforce good law. Code can include containment mechanisms like sandboxing, rate limits, or authentication checks to prevent uncontrollable spread. Developers can include kill-switches or time limits that automatically deactivate the code once an operation ends or peace is restored.
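A minimal sketch of what such safeguards could look like in practice, assuming a hypothetical gating routine. The expiry date, the `ALLOWED_TARGETS` identifier, and the `safeguards_permit` function are all invented for illustration; they are not drawn from any real capability or doctrine.

```python
from datetime import datetime, timezone

# Hypothetical parameters, assumed for illustration only.
EXPIRY = datetime(2026, 1, 1, tzinfo=timezone.utc)  # hard time limit (kill-switch by date)
ALLOWED_TARGETS = {"MIL-NET-ALPHA"}                 # explicit containment scope

def safeguards_permit(target_id: str, now: datetime) -> bool:
    """Return True only if the code is inside its authorised time window
    and aimed at an explicitly allow-listed system."""
    if now >= EXPIRY:                     # automatic deactivation after the deadline
        return False
    if target_id not in ALLOWED_TARGETS:  # containment: refuse to act outside scope
        return False
    return True

# The operational effect runs only when every safeguard passes.
if safeguards_permit("MIL-NET-ALPHA", datetime(2025, 6, 1, tzinfo=timezone.utc)):
    pass  # destructive routine would be gated here
```

The design choice is the point: by gating the harmful routine behind checks that default to refusal, the code itself enforces the time limits and target discrimination the law demands.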
5. Post-conflict responsibility
The obligation to clean up after conflict shouldn’t vanish online. States that deploy cyber capabilities should retain and share enough technical data to assist in remediation, including code hashes, affected systems, and mitigation steps. A neutral international body could coordinate post-conflict cybersecurity clean-up, much as mine-clearance agencies operate on the ground.
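A sketch of the kind of remediation record such data-sharing might produce, under stated assumptions: the record schema, field names, and file names below are made up for illustration, but the hashing itself uses the standard SHA-256 routine defenders already rely on to locate known malware.

```python
import hashlib
import json

def remediation_record(artifacts: dict[str, bytes]) -> str:
    """Build a shareable record mapping artifact names to SHA-256 hashes,
    so defenders can locate and remove the corresponding implants."""
    hashes = {name: hashlib.sha256(blob).hexdigest() for name, blob in artifacts.items()}
    # "remediation/v1" is a hypothetical schema label, not a real standard.
    return json.dumps({"schema": "remediation/v1", "sha256": hashes},
                      indent=2, sort_keys=True)

# Example with two invented implant components.
record = remediation_record({
    "dropper.bin": b"\x90\x90example-dropper",
    "payload.dll": b"example-payload",
})
print(record)
```

This is the digital analogue of the Article 9 minefield map: a machine-readable list of what was emplaced and where to look for it.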
Conclusion
Cyber booby-traps reveal something about the law’s resilience. The basic humanitarian instinct to limit deceitful, indiscriminate harm has survived every technological shift. What’s missing is the confidence to extend those instincts into code.
The CCW’s framers could not have imagined a world where the same legal concept that banned a hidden grenade might one day govern a malicious software update. But the principle is timeless: war should not rely on treachery disguised as trust. If cyberspace has blurred the front lines, then the law must learn to travel across them too.
Bringing booby-traps into the digital era doesn’t require rewriting humanitarian law from scratch. It means reaffirming that the rules we already have still apply, that protection of civilians doesn’t depend on whether the trap is made of steel or script. The law must follow the function, not the form. Ultimately, preventing cyber booby-traps is not just about regulating technology; it’s about safeguarding the predictability and trust that civilian life depends on. A booby-trap, whether digital or physical, violates that trust. The only real difference now is speed and scale. If we fail to update our interpretation of the old rules, the next trap won’t be triggered in a foxhole. It will be sprung in a hospital, a power grid, or a child’s laptop. And by then, it will be far too late to rewrite the law.
***
Dr Samuel White is the Senior Research Fellow in Peace and Security at the National University of Singapore’s Centre for International Law.
The views expressed are those of the authors, and do not necessarily reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Articles of War is a forum for professionals to share opinions and cultivate ideas. Articles of War does not screen articles to fit a particular editorial agenda, nor endorse or advocate material that is published. Authorship does not indicate affiliation with Articles of War, the Lieber Institute, or the United States Military Academy West Point.
Photo credit: Getty Images via Unsplash