Dr Tham (left) receiving the award certificate on behalf of the team
The DeepUTAR team from UTAR Lee Kong Chian Faculty of Engineering and Science (LKC FES) won third place at the eighth edition of the Drone-vs-Bird Detection Grand Challenge, held on 3 July 2025. The competition was part of the 2025 International Joint Conference on Neural Networks (IJCNN 2025), hosted at the Pontifical Gregorian University in Rome, Italy, and organised by the International Workshop on Small-Drone Surveillance, Detection and Counteraction Techniques (WOSDETC).
The team consisted of Doctor of Philosophy (Engineering) students Wong Yi Jie and Wingates Voon, LKC FES academic Assoc Prof Ir Ts Dr Tham Mau Luen, Department of Electrical and Electronic Engineering Head Ir Prof Dr Chang Yoong Choon, Assoc Prof Ir Ts Dr Hum Yan Chai, and Consultancy and Commercialisation Deputy Head Assoc Prof Dr Kwan Ban Hoe. The team was awarded a certificate for their achievement.
Dr Tham, who served as the team’s advisor, said, “I am incredibly proud of my students’ outstanding achievement in winning third place at the 8th WOSDETC Drone-vs-Bird Detection Data Competition during IJCNN 2025 in Rome. Their innovative model, WRN-YOLO, successfully tackled the challenge of distinguishing drones from birds in complex visual environments. This accomplishment reflects their technical skill, creativity, and the rigorous research training they received at UTAR. It was a privilege to guide such a dedicated team, and their success is a proud moment for our university.”
Meanwhile, the PhD students enthused, “We joined this competition to test our ideas in an international arena, particularly one co-hosted at the prestigious IJCNN 2025 conference. Our team developed a customised model to tackle the challenges of drone detection and created a synthetic dataset to enhance the diversity of the training set. We are proud that our solution was recognised among the top three winning teams. It’s a meaningful validation of our technical rigour and collaborative effort.”
The students added, “We thank UTAR for providing a platform that cultivated our critical thinking skills, skills that guided the design of our detection model and synthetic dataset. The research training we received empowered us to plan and execute a systematic ablation study to rigorously evaluate each component of our proposed solution.”
Candid photo of Wingates Voon (left) and Wong Yi Jie (right)
Their award-winning project, titled “WRN-YOLO: An Improved YOLO for Drone Detection using Wide ResNet”, aimed to improve drone monitoring and mitigate the risks associated with unauthorised drone activity in restricted zones.
They explained, “Drone detection and surveillance are critical tasks to prevent unauthorised unmanned aerial vehicles (UAVs) from entering restricted zones. However, distinguishing drones from birds remains a significant challenge. To address this, we propose a novel detection model, WRN-YOLO, which integrates the Wide Residual Network (WRN) architecture with the You Only Look Once (YOLO) object detection framework, enhancing feature extraction and detection accuracy. Recognising the complexities of real-world environments, we also developed a synthetic dataset to train WRN-YOLO. This dataset incorporates a wide range of challenging conditions, including intricate backgrounds and confounding elements, thereby broadening the diversity of the training data available to our model.”
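For readers curious about the general idea, the sketch below is a minimal, hypothetical illustration in PyTorch of how a wide residual block might serve as a backbone stage inside a YOLO-style detector. It is not the team's actual implementation: the class names, channel widths, and widening factor k here are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class WideResidualBlock(nn.Module):
    """A basic wide residual block: two 3x3 convolutions with a widened
    channel count and an identity (or 1x1 projection) shortcut.
    Illustrative only; not the team's published architecture."""

    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, 3, stride, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, 3, 1, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        self.act = nn.SiLU()
        # Project the shortcut with a 1x1 convolution when the shape changes.
        self.shortcut = (
            nn.Identity()
            if stride == 1 and in_channels == out_channels
            else nn.Conv2d(in_channels, out_channels, 1, stride, bias=False)
        )

    def forward(self, x):
        out = self.act(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.act(out + self.shortcut(x))


class WRNBackboneStage(nn.Module):
    """One backbone stage: a stack of wide blocks whose channel count is
    scaled by a widening factor k relative to a base width. A YOLO-style
    detector would feed such stages' feature maps into its neck and head."""

    def __init__(self, in_channels, base_width, k=2, num_blocks=2, stride=2):
        super().__init__()
        width = base_width * k  # widening factor k is an assumed parameter
        blocks = [WideResidualBlock(in_channels, width, stride)]
        blocks += [WideResidualBlock(width, width) for _ in range(num_blocks - 1)]
        self.stage = nn.Sequential(*blocks)

    def forward(self, x):
        return self.stage(x)


if __name__ == "__main__":
    stage = WRNBackboneStage(in_channels=64, base_width=64, k=2)
    x = torch.randn(1, 64, 160, 160)  # e.g. a feature map from an earlier stage
    print(stage(x).shape)  # torch.Size([1, 128, 80, 80])
```

The widening factor trades network depth for width, which is the core Wide ResNet idea: richer features at each scale without a deeper, harder-to-train stack. That aligns with the team's stated goal of stronger feature extraction for separating small drones from visually similar birds.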
The project aligns with the United Nations’ Sustainable Development Goal (SDG) 9: Industry, Innovation and Infrastructure, by contributing to advancements in smart surveillance and threat mitigation technologies. It also supports SDG 16: Peace, Justice and Strong Institutions, by aiding security enforcement and monitoring efforts and reducing the risks posed by drones in restricted areas such as government facilities and borders.
Example of drone detection in a challenging environment: The box highlights the detected object, and the score represents the confidence level of the model