Best Paper Award at IWAIT 2022

UTAR Faculty of Information and Communication Technology (FICT) Bachelor of Computer Science (Honours) student Khoo Chia Hong won Best Paper Award at the International Workshop on Advanced Image Technology 2022. Held physically in Hong Kong from 4 to 6 January 2022, the workshop was also livestreamed for the participants.

Organised by The Hong Kong Polytechnic University, IWAIT 2022 provided an international forum for researchers, scientists, engineers and students interested in advanced image technologies. The workshop's field of interest covers all aspects of image, video, audio, multimedia, and information, including capture, processing, recognition, classification, communication, networking, computing, system design, security, implementation, and related technologies, with applications to various scientific, engineering, health, and social areas.

The international workshop was co-organised by the Korean Institute of Broadcast and Media Engineers (KIBME), Korea; the Institute of Electronics, Information and Communication Engineers (IEICE), Japan; the Institute of Image Information and Television Engineers (ITE), Japan; the Japanese Society of Precision Engineering (JSPE-IAPI), Japan; the National Science Council (NSC), Taiwan; IFMIA, Asia; and the University of Macau, Macau.

Khoo’s research, titled “Action Detection System for Alerting Driver Using Computer Vision”, was selected for the Best Paper Award, and he walked away with a certificate.

Supervised by FICT academic Assoc Prof Ts Dr Lau Phooi Yee, Khoo’s research focused mainly on recognising the driver’s secondary tasks using an action detection method. He explained, “Nowadays, the increasing number of careless drivers on the road has resulted in more accidents. Drivers’ decisions and behaviours are key to maintaining road safety. However, many drivers tend to perform secondary tasks such as playing with their phones, adjusting the radio, eating or drinking, answering phone calls and, worst of all, reading text messages. Many approaches have previously been introduced to recognise and capture potential problems related to careless driving. This project focuses mainly on recognising the driver’s secondary tasks using an action detection method. A camera will be set up inside the car to capture the driver’s actions in real time. The video will pass through a human pose estimator framework that extracts human pose frames without the background. Inside this framework, raw images will be fed into a CNN that computes human key-point activation maps. The key-point coordinates will then be computed from the output activation maps and drawn on a new blank frame. These frames will then be input into a pose-based convolutional neural network for action classification. If the action performed by the driver is considered a dangerous secondary task, an alert will be given. The proposed framework achieves a higher speed when run on a Raspberry Pi CPU. It is able to detect 10 different driver actions; only talking to passengers and normal driving will not trigger the buzzer that alerts the driver.”
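The pipeline Khoo describes (a CNN produces key-point activation maps, the key-point coordinates are drawn on a blank frame, and a pose-based classifier decides whether to alert) can be sketched roughly as follows. This is an illustrative outline only: all function names are assumptions, and the two CNN stages are stubbed out, since the article does not specify the project's actual models or code.

```python
import numpy as np

# Actions the described system distinguishes; only the two safe ones
# should not trigger the buzzer. Exact labels are assumed for illustration.
DANGEROUS_ACTIONS = {
    "texting", "phone_call", "eating_or_drinking", "adjusting_radio",
    "reaching_behind", "reading_text", "hair_and_makeup", "drowsy",
}
SAFE_ACTIONS = {"normal_driving", "talking_to_passenger"}

def keypoint_activation_maps(frame: np.ndarray) -> np.ndarray:
    """Stub for the CNN that outputs one activation map per key point."""
    n_keypoints = 17  # a common human-pose key-point count (assumption)
    return np.random.rand(n_keypoints, *frame.shape[:2])

def keypoints_from_maps(maps: np.ndarray) -> np.ndarray:
    """Take each map's peak activation as that key point's coordinate."""
    coords = [np.unravel_index(m.argmax(), m.shape) for m in maps]
    return np.array(coords)

def draw_pose_frame(coords: np.ndarray, shape: tuple) -> np.ndarray:
    """Draw the key points on a new blank frame, discarding the background."""
    pose = np.zeros(shape, dtype=np.uint8)
    for y, x in coords:
        pose[y, x] = 255
    return pose

def classify_action(pose_frame: np.ndarray) -> str:
    """Stub for the pose-based CNN that classifies the driver's action."""
    return "normal_driving"  # placeholder prediction

def should_alert(action: str) -> bool:
    """Only dangerous secondary tasks trigger the buzzer."""
    return action in DANGEROUS_ACTIONS

frame = np.zeros((120, 160, 3), dtype=np.uint8)   # one camera frame
maps = keypoint_activation_maps(frame)
coords = keypoints_from_maps(maps)
pose = draw_pose_frame(coords, frame.shape[:2])
print(should_alert(classify_action(pose)))        # False for safe driving
```

In the real system the two stub functions would be trained networks running on each captured frame; the sketch only shows how the stages hand data to one another.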

He added, “The human pose frame will be used to classify the driver’s action. The system runs on a Raspberry Pi, hence the pose estimation framework is relatively small, and by using a multi-threading technique, both accuracy and speed can be achieved.”
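The multi-threading idea mentioned here, decoupling frame capture from the slower inference step so the camera never waits on the model, might look like the following sketch. The names and the producer/consumer arrangement are assumptions for illustration, not the project's actual code; the camera and the model are both stubbed.

```python
import queue
import threading

# Bounded queue between the capture thread and the inference loop, so a
# slow model backpressures capture instead of growing memory unboundedly.
frames: "queue.Queue" = queue.Queue(maxsize=4)

def capture_loop(n_frames: int) -> None:
    """Producer: push frames as the camera delivers them."""
    for i in range(n_frames):
        frames.put(i)      # stand-in for a real camera frame
    frames.put(None)       # sentinel: capture finished

def infer(frame) -> str:
    """Stub for pose estimation + action classification on one frame."""
    return "normal_driving"

t = threading.Thread(target=capture_loop, args=(8,))
t.start()

results = []
while True:
    frame = frames.get()
    if frame is None:      # capture thread signalled completion
        break
    results.append(infer(frame))
t.join()
print(len(results))        # 8 frames processed
```

On a Raspberry Pi this pattern lets I/O-bound capture overlap with CPU-bound inference, which is one plausible reading of how "both accuracy and speed can be achieved".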

When asked how he felt about being announced as one of the winners, he enthused, “I am extremely honoured to receive this award. For this, I want to thank my FYP supervisor Dr Lau Phooi Yee for encouraging me to participate in this conference and providing support all the time. This award has helped strengthen my presentation skills, as well as my research skills, especially in the field of deep learning.”

Khoo was awarded a certificate for his win



Dr Lau (top row, second from left) and Khoo (third row, second from left) during the virtual Best Paper Award ceremony

