Experimental drone uses AI to identify violent people in crowds: all you need to know

By Elton Gomes

Researchers from the United Kingdom and India are developing a method that combines artificial intelligence (AI) with drone surveillance to identify violent behaviour in crowds. The project uses an affordable drone, the Parrot AR.Drone 2.0, to watch crowds of people from above, and then uses AI to spot individuals in violent poses, as reported by CNET.

In a paper titled Eye in the Sky, the researchers describe how the system works: the Parrot AR.Drone 2.0 transmits video footage over a mobile internet connection for real-time analysis. An algorithm trained with deep learning estimates the poses of the humans in the video, and those poses are then matched against postures the researchers have designated as violent. Five poses are defined as violent for the purposes of the project: strangling, punching, kicking, shooting, and stabbing, The Verge reported.
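The paper's implementation is not reproduced in the reports, but the two-stage idea described above (estimate each person's body keypoints, then map the skeleton to one of the five violent classes or to a neutral class) can be sketched in a few lines of Python. Everything below is an illustrative stand-in: the pose estimator is faked with random keypoints, and the classifier is a toy linear scorer rather than the researchers' deep network.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_KEYPOINTS = 14  # e.g. head, neck, shoulders, elbows, wrists, hips, knees, ankles
VIOLENT_CLASSES = ["strangling", "punching", "kicking", "shooting", "stabbing"]
CLASSES = VIOLENT_CLASSES + ["neutral"]


def estimate_poses(frame):
    """Stand-in for the deep pose estimator: a real system would run a
    network on the video frame and return one (NUM_KEYPOINTS, 2) array of
    body keypoints per detected person. Here we fabricate two people."""
    return [rng.random((NUM_KEYPOINTS, 2)) for _ in range(2)]


def classify_pose(keypoints, weights, bias):
    """Toy linear classifier over the flattened skeleton. The paper trains
    a deep network for this step; a linear scorer keeps the sketch short."""
    scores = weights @ keypoints.ravel() + bias
    return CLASSES[int(np.argmax(scores))]


def analyse_frame(frame, weights, bias):
    """Flag every detected person whose estimated pose matches one of the
    five postures designated as violent."""
    alerts = []
    for person_id, keypoints in enumerate(estimate_poses(frame)):
        label = classify_pose(keypoints, weights, bias)
        if label in VIOLENT_CLASSES:
            alerts.append((person_id, label))
    return alerts


if __name__ == "__main__":
    # Random weights stand in for a trained model; a real system would
    # learn these from labelled examples of the five violent poses.
    weights = rng.normal(size=(len(CLASSES), NUM_KEYPOINTS * 2))
    bias = rng.normal(size=len(CLASSES))
    print(analyse_frame(frame=None, weights=weights, bias=bias))
```

The design point worth noting is that violence detection here is a per-person pose classification problem, which is why, as reported below, accuracy degrades as more people crowd into the frame.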

Although drone-based surveillance remains controversial, the researchers hope that their system can be used to detect crime in crowded public spaces and at large events. Events like the 2017 Manchester Arena bombing motivated the development of the system, lead researcher Amarjot Singh told The Verge. Singh added that surveillance cameras could help prevent such attacks by spotting suspicious behaviour, such as someone leaving a bag unattended.

The Verge reported that the system identifies violent poses with 94% accuracy. However, Singh admitted that the accuracy drops as more people appear in the frame: it fell to 79% when ten people were inspected. To test the system in real-world conditions, Singh plans to fly the drones at festivals like Technozion and Spring Spree, which take place at the National Institute of Technology, Warangal, in Telangana.

Drones and aerial surveillance

Several questions about drones and aerial surveillance remain unanswered, particularly around abuses of power and the reliability of facial recognition systems. Governments might use drone surveillance as a pretext to record aerial footage of people in public spaces with the intention of clamping down on dissent. Organisations that deploy such systems therefore need to make it explicitly clear that people are being filmed.

On the other hand, facial recognition is also being used to rescue victims of human trafficking. A tool called Traffic Jam matches a victim's photo, taken from a missing person report or social media, against photos that appear in online sex advertisements.
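At its core, this kind of photo matching typically reduces to comparing face embeddings: a network maps each face image to a fixed-length vector, and two photos are declared a match when their vectors are sufficiently similar. The sketch below shows only that comparison step with fabricated embeddings; the 128-dimension size, the 0.8 threshold, and the function names are assumptions for illustration, not Traffic Jam's actual implementation.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed cutoff; real systems tune this on labelled data


def cosine_similarity(a, b):
    """Similarity of two face embeddings: near 1.0 means the same face,
    near 0.0 means unrelated faces."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


def find_matches(victim_embedding, ad_embeddings):
    """Return indices of advertisement photos whose face embedding is
    close enough to the victim's photo to count as a match."""
    return [i for i, emb in enumerate(ad_embeddings)
            if cosine_similarity(victim_embedding, emb) >= MATCH_THRESHOLD]


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    victim = rng.normal(size=128)              # stand-in for a face embedding
    ads = [rng.normal(size=128) for _ in range(5)]
    ads.append(victim + rng.normal(scale=0.05, size=128))  # near-duplicate face
    print(find_matches(victim, ads))           # only the near-duplicate matches
```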

Technology has undoubtedly allowed us to manage certain security situations better, but it has also raised questions of privacy. As surveillance evolves, drones and AI should be used with accountability and caution.


Elton Gomes is a staff writer at Qrius
