The Israel Defense Forces (IDF) is using new drone technology in its war against Hamas, and accounts of sniper drones targeting civilians underscore the urgent need for meaningful human oversight in the use of military technology.
Eyewitness reports describe firearm-equipped drones used for precision targeting in Gaza, but Israel’s restrictions on journalists in Gaza prevent comprehensive reporting and verification of events. As a result, information about the drones is limited, drawn largely from accounts by healthcare workers in Gaza. Dr. Nizam Mamode is a British surgeon who worked at a hospital in Central Gaza. He gave testimony to a committee in the UK Parliament that sheds light on the reports of sniper drones, called “quadcopters,” and their devastating impact on civilians.
According to Mamode, the quadcopters are typically deployed following airstrikes and have become a common occurrence in Gaza. Some civilians describe instances where the drones targeted people trying to pull others from the remnants of a building hit by an airstrike, while other reports describe quadcopters shooting at civilians moving through the streets and trying to get to hospitals. Notably, healthcare workers have been targeted by drones while trying to provide care to patients. The total number of civilians wounded or killed by these drone attacks is unknown, but one doctor attributed over 20 injuries in one day to the quadcopters.
Eyewitnesses describe the quadcopters as slightly larger than commercial drones, with four small rotors on top, a camera, and a long rifle barrel that can be fired remotely. According to Dr. Mamode, the drones fire “small, cuboid pellets” that ricochet inside the body, leaving victims with multiple internal injuries. Adding to the ambiguity, it remains unclear which manufacturer makes the sniper drones the IDF is deploying in Gaza.
Duke Robotics, Elbit Systems, and Smartshooter all have firearm-equipped drones on the market and a relationship with the IDF. While Smartshooter has stated the IDF is not using its Smash Dragon drone in Gaza, a look at the Smash Dragon’s capabilities can provide insight into the technology on the market: the Smash Dragon is a firearm-equipped drone “designed to lock, track, and hit unknown targets” with “real-time fire control algorithms.” In the case of the Smash Dragon, the drones are remotely controlled and require a human to pull the trigger, but the target tracking algorithms ensure the drone does not fire until the shot is sure to hit the target.
When asked about its use of sniper drones, the IDF declined to comment but maintains it does not target civilians. However, the IDF has already been experimenting with using AI to make targeting decisions, and the lack of information about the drones’ targeting processes raises an important question: what can be done to protect civilians in Gaza from gun-wielding drones?
The Path Forward
The use of “sniper drones” in conflict zones raises issues that extend far beyond Gaza, highlighting the broader implications of evolving military technology for targeting decisions. Addressing this global concern requires a twofold approach. The immediate priority is to enhance protections for civilians in Gaza by holding the IDF accountable and ensuring adherence to international humanitarian law. This is a tricky proposal given the protections and support the US has afforded Israel throughout the war, and the impunity with which the Israeli government has been acting despite repeated international calls for restraint. The second, longer-term task is for the international community to critically assess the challenges and legality of killer robots so that institutions can establish regulations that evolve alongside the technology.
Meaningful Human Oversight
On Dec. 2, 2024, the United Nations General Assembly passed a resolution on lethal autonomous weapons (LAWs), adding a discussion of the legality of LAWs to the 2025 agenda. One avenue member states may choose to explore is codifying the emerging norm of “meaningful human control” into a legal threshold that militaries must meet, ensuring human oversight of emerging technology.
The concept of meaningful human control centers on ensuring that humans remain the ultimate decision-makers over a weapon system’s “critical functions,” regardless of technological advancements, including decisions about who or what is targeted and the level of force used. Proponents of codifying this concept argue that establishing a minimum threshold for meaningful human control upholds accountability under international law, ensures weapon use complies with international law, and guarantees human operators retain the ability to intervene, override, or disable a weapon system in case of malfunction or misuse. While critics may point to definitional disputes, the rapid evolution of technology, and monitoring and accountability challenges as obstacles to codifying the concept, the passage of the latest resolution demonstrates a willingness to discuss options.
The devastating use of drones in Gaza makes all too real the consequences of these theoretical legal arguments. While the international community can take time to weigh the costs and benefits of placing restrictions on autonomous weapons, there is an urgent need for a coordinated effort to pressure Israel into better protecting civilians.