
How an ordinary umbrella can trick and capture autonomous drones

Credit: Shaoyuan Xie / UC Irvine.

Autonomous drones that can follow people or objects on their own are becoming more common in policing, border security, and surveillance.

These aircraft use camera-based artificial intelligence systems to lock onto a target and track it without human control.

While the technology offers many benefits, new research shows it also has a serious weakness that could be exploited in the real world.

Computer scientists at the University of California, Irvine have discovered a simple but powerful way to manipulate these drones using nothing more than a specially patterned umbrella.

Their study, presented at the Network and Distributed System Security Symposium and posted online as a preprint, demonstrates how attackers could lure drones closer and potentially capture or crash them.

The researchers developed an attack method they call FlyTrap. It targets the visual tracking systems that allow drones to follow a selected subject automatically.

Many consumer drones include this feature under names such as “active track” or “dynamic track.” The system works by analyzing camera images and adjusting the drone’s position to keep a consistent distance from the target.

FlyTrap exploits how these systems interpret visual information. By placing a specific pattern on an umbrella, the researchers were able to fool the drone into thinking the target was moving away, even while the person holding it stood still.

To maintain its programmed distance, the drone moved closer and closer to the umbrella holder. Eventually, the aircraft could be caught with a net or forced to collide with an object.
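The dynamic described above can be illustrated with a toy control loop. This is a simplified sketch, not the researchers' code or any drone's actual firmware: it assumes a "follow" controller that estimates distance from the apparent size of the target's bounding box, with all names and numbers invented for illustration.

```python
# Simplified sketch (illustrative only, not real drone firmware): a visual
# "follow" controller that infers target distance from the apparent height
# of the tracker's bounding box, in pixels.

REFERENCE_BOX_HEIGHT = 200.0   # apparent height (px) at the desired distance
GAIN = 0.01                    # proportional gain (m/s per pixel of error)

def follow_step(detected_box_height: float) -> float:
    """Return a forward velocity command in m/s.

    If the target appears smaller than the reference size, the controller
    assumes it is farther away and flies forward to close the gap.
    """
    error = REFERENCE_BOX_HEIGHT - detected_box_height
    return GAIN * error

# Target at the set distance: no forward motion is commanded.
print(follow_step(200.0))  # 0.0

# A FlyTrap-style pattern that makes the detected box shrink (say, 120 px
# instead of 200 px) reads as "target receding", so the drone is commanded
# forward even though the umbrella holder has not moved.
print(follow_step(120.0))  # 0.8 -> the drone advances toward the umbrella
```

The point of the sketch is that the controller trusts a purely visual cue: anything that manipulates the apparent size of the target, as the patterned umbrella does, directly steers the aircraft.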

This type of attack is especially concerning because it happens entirely in the physical world. It does not require hacking the drone’s software or intercepting wireless signals. Simply opening the umbrella in view of the drone is enough to trigger the effect, and it works under different lighting and weather conditions.

The team tested the method on several popular consumer drones and found it worked consistently. In demonstrations, they were able to pull the aircraft close enough to be captured or to cause crashes. The researchers have reported the vulnerability to the manufacturers involved.

The implications are wide-ranging. Criminals could use the technique to disable surveillance drones used by law enforcement or border patrol. On the other hand, people who feel threatened by intrusive drones might use the same method for self-protection. The study highlights how technologies designed for safety can also create new risks if security weaknesses are not addressed.

Experts say the findings show the need for stronger safeguards before autonomous tracking drones are deployed in sensitive environments. Systems that rely heavily on visual data may need additional checks to prevent manipulation.

As drones become more advanced and widespread, understanding their vulnerabilities will be crucial. The UC Irvine researchers hope their work will encourage manufacturers and policymakers to improve security so that the benefits of autonomous flight can be realized without exposing the public to new dangers.

Source: UC Irvine.