A team of three Doctor of Philosophy (Engineering) students from the Lee Kong Chian Faculty of Engineering and Science (LKC FES) achieved outstanding success at the RoboCup Japan Open 2025 @Home League. Held from 2 to 5 May 2025 at the Shiga Daihatsu Arena in Otsu City, Shiga, Japan, and organised by the RoboCup Japan Regional Committee, the competition brought together leading robotics teams from around the world to demonstrate innovative solutions for service and assistive robots in domestic environments. The UTAR team won Second Place in the Overall Game, the Best Poster Award, and First Place in the Open Challenge, all within the Bridge Competition.
From left: Chin, Kenny Chu, Dr Danny Ng, and Sim
The UTAR team, comprising team leader Sim Sheng Wei and members Chin Wee Jian and Kenny Chu Sau Kang, was supervised by LKC FES academic Ir Dr Danny Ng Wee Kiat and LKC FES Department of Mechatronics and BioMedical Engineering Head Assoc Prof Ir Dr Goh Choon Hian. Representing UTAR in the Open Platform League’s Bridge Competition, the team was tasked with designing a fully autonomous mobile service robot capable of handling real-world tasks in a home setting. Unlike the Domestic Standard Platform League where teams use identical hardware, the Open Platform League allowed full creative freedom in hardware and software development, encouraging innovation in design and architecture.
The team with their developed mobile service robot
The competition focused on two main tasks: Carry My Luggage and Receptionist. In the former, the robot had to recognise, pick up, and transport luggage items while navigating through a dynamic environment. In the latter, the robot was required to interact with guests using vision-based detection and natural speech capabilities to provide assistance and information. The team’s robot demonstrated its competence by completing both tasks efficiently, reflecting strong integration of cutting-edge AI technologies with practical robotics systems.
Kenny Chu introducing the team’s robot
A key highlight of the team’s project was the integration of Large Language Models (LLMs) and Vision-Language Models (VLMs) into the robot’s cognitive architecture. These models were employed alongside the Robot Operating System 2 (ROS 2) to achieve real-time modular performance. The LLM enabled natural language understanding and task reasoning, enhancing human–robot interaction, while the VLM facilitated context-aware object recognition and scene interpretation. This allowed the robot to understand and respond to commands such as identifying objects, locating people, and navigating to specific locations in real-world scenarios.
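To illustrate how an LLM can sit inside a ROS 2 cognitive pipeline of the kind described above, the following is a minimal sketch, not the team's actual code: a node that receives transcribed speech, passes it through an LLM-style parser, and republishes a structured task command for downstream planners. The topic names and the parse_command_with_llm helper are hypothetical placeholders; a real deployment would call a large language model where the stub does simple keyword matching.

```python
# Illustrative sketch only (assumptions: topic names, parse_command_with_llm).
import json

import rclpy
from rclpy.node import Node
from std_msgs.msg import String


def parse_command_with_llm(utterance: str) -> dict:
    """Placeholder for an LLM call that maps free-form speech to a task.

    A real system would prompt a large language model here; this stub only
    does keyword matching so the sketch stays self-contained and runnable.
    """
    text = utterance.lower()
    if "luggage" in text:
        return {"task": "carry_my_luggage", "argument": None}
    if "guest" in text:
        return {"task": "receptionist", "argument": None}
    return {"task": "unknown", "argument": utterance}


class CommandReasoner(Node):
    """Bridges speech-recognition output to the task-execution layer."""

    def __init__(self) -> None:
        super().__init__("command_reasoner")
        # Transcribed speech arrives here (topic name is an assumption).
        self.subscription = self.create_subscription(
            String, "speech/transcript", self.on_transcript, 10
        )
        # Structured task commands go out for downstream planners.
        self.publisher = self.create_publisher(String, "task/command", 10)

    def on_transcript(self, msg: String) -> None:
        command = parse_command_with_llm(msg.data)
        out = String()
        out.data = json.dumps(command)
        self.publisher.publish(out)
        self.get_logger().info(f"Parsed command: {out.data}")


def main() -> None:
    rclpy.init()
    node = CommandReasoner()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Keeping language reasoning in its own node like this mirrors the modular, real-time design the team attributes to ROS 2: perception, reasoning, and navigation can be developed and swapped independently while communicating over topics.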
Reflecting on their achievement, the team shared that the competition provided a unique opportunity to bridge the gap between academic theory and real-world implementation. “We joined RoboCup Japan Open because it offers a realistic and challenging platform to test our service robot in actual domestic scenarios. It pushes us to go beyond simulation and theory, and to build systems that can perform reliably in real-world environments,” they said. They emphasised that participating in the Open Platform League was especially rewarding as it allowed the team full freedom to innovate. “The league aligns perfectly with our research on integrating large AI models into robotics,” the team enthused.
Expressing pride in their achievement, they added, “We’re incredibly proud to have secured Second Place in such a competitive and technically demanding event. It’s a testament to our team’s hard work and to the strength of our design, particularly our use of large language and vision-language models to enhance robot autonomy and interaction.” They further noted the broader impact of their accomplishment, stating, “This achievement validates our research direction. It proves that LLMs and VLMs are not just powerful in controlled settings; they can be deployed on actual robots to improve adaptability, perception, and interaction.”
The winning entry showcased the team’s expertise in developing an AI-powered mobile service robot with robust capabilities in perception, interaction, navigation, and task execution. Through this accomplishment, the UTAR team has demonstrated the feasibility of deploying advanced foundation models in real-world service robotics and reinforced the University’s standing in the field of intelligent systems and applied AI.