FICT researchers get Best Paper Award at IWAIT 2021
A team of three lecturers from the UTAR Faculty of Information and Communication Technology (FICT), namely Ts Lai Siew Cheng, Assoc Prof Ts Dr Lau Phooi Yee and Assoc Prof Ts Dr Tan Hung Khoon, received the Best Paper Award at the International Workshop on Advanced Image Technology 2021 (IWAIT 2021), which was held via Zoom from 5 to 6 January 2021.
Providing an international forum for researchers and engineers interested in advanced image technologies, IWAIT 2021 was organised by Kagoshima University, Japan, and co-organised by the Korean Institute of Broadcast and Media Engineers (KIBME), Korea; the Institute of Electronics, Information and Communication Engineers (IEICE), Japan; the Institute of Image Information and Television Engineers (ITE), Japan; the Japan Society for Precision Engineering (JSPE-IAPI), Japan; and the Image Processing and Pattern Recognition Society (IPPR), Taiwan.
Titled “3D Deformable Convolution for Action Classification in Videos”, the award-winning paper was one of 18 papers recognised with a Best Paper Award at the annual conference. Principal investigator Ts Lai and her two co-researchers also received a certificate of acknowledgement.
Focusing on convolutional neural networks for analysing visual imagery, the research paper investigated 3D deformable filters in deep neural networks for human action classification in the video domain. Ts Lai explained, “The conventional 3D filter is usually uniform in shape, and this might not capture human motion well during the convolution process, since human motion is usually not uniform in shape. Thus, we apply a 3D deformable filter, which is not uniform in shape, to a 3D convolutional neural network, which is more appropriate for capturing human motion.”
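The core idea Ts Lai describes can be illustrated in a few lines: a conventional 3D convolution samples the video volume on a fixed regular grid around each output position, while a deformable 3D convolution adds learned per-tap offsets to those sampling positions, fetching values at fractional coordinates via trilinear interpolation. The sketch below is a minimal NumPy illustration of this mechanism for a single output location and a 3x3x3 kernel, not the authors' implementation; the function names and shapes are assumptions made for the example.

```python
import numpy as np

def trilinear_sample(vol, t, y, x):
    """Trilinearly interpolate a (T, H, W) volume at fractional coords (t, y, x).

    Out-of-bounds corners contribute zero, mimicking zero padding.
    """
    T, H, W = vol.shape
    t0, y0, x0 = int(np.floor(t)), int(np.floor(y)), int(np.floor(x))
    out = 0.0
    for dt in (0, 1):
        for dy in (0, 1):
            for dx in (0, 1):
                tt, yy, xx = t0 + dt, y0 + dy, x0 + dx
                if 0 <= tt < T and 0 <= yy < H and 0 <= xx < W:
                    # Trilinear weight: product of 1D distances to the corner.
                    w = (1 - abs(t - tt)) * (1 - abs(y - yy)) * (1 - abs(x - xx))
                    out += w * vol[tt, yy, xx]
    return out

def deformable_conv3d_point(vol, weights, offsets, center):
    """One output value of a 3x3x3 deformable 3D convolution at `center`.

    weights: (3, 3, 3) kernel; offsets: (3, 3, 3, 3) per-tap (dt, dy, dx)
    displacements added to the regular sampling grid. With all-zero offsets
    this reduces to an ordinary (rigid-grid) 3D convolution tap sum.
    """
    ct, cy, cx = center
    out = 0.0
    for i in range(3):
        for j in range(3):
            for k in range(3):
                dt, dy, dx = offsets[i, j, k]
                out += weights[i, j, k] * trilinear_sample(
                    vol, ct + i - 1 + dt, cy + j - 1 + dy, cx + k - 1 + dx)
    return out
```

In a full network the offsets would be predicted by a companion convolutional layer and learned end-to-end, so the effective receptive field can bend to follow a moving person rather than staying a rigid cube.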
When asked what inspired her to take on the topic, she replied, “My research areas are related to human action classification using deep learning methods. Human actions are difficult to predict since there are many different types of actions and the domains they cover are huge. However, using deep learning methods to classify human actions has recently shown good performance, and this motivated us to take up this research in the hope that we can contribute some improvements to the existing methods.”
She added, “We are very honoured that our paper was among the 18 best papers selected at this conference. I would also like to thank both of my supervisors, Dr Lau Phooi Yee and Dr Tan Hung Khoon, for the ideas, guidance and support they provided throughout the preparation of this paper. This award is an encouragement and motivation for me to further my research in human action classification using deep learning.”