11 Nov 2024
By placing a smartphone inside ordinary VR goggles, students can practise interpretation and public speaking in lifelike scenarios, greatly enhancing the sense of realism.
Dr Chan's research results show that the XR MALL can motivate students to learn independently and enhance their learning outcomes.
Screenshots of XR MALL showing virtual interpretation and public-speaking scenarios.
Simultaneous interpretation and public speaking both demand quick on-the-spot reactions, yet providing students with realistic interpreting and speaking scenarios for practice is no easy task. To address this, Dr Venus Chan, Assistant Professor in the School of Arts and Social Sciences of Hong Kong Metropolitan University (HKMU), led a team to develop the “XR MALL” interactive mobile app, which combines Virtual Reality (VR) and Augmented Reality (AR) technologies to offer immersive simultaneous interpretation, consecutive interpretation and public-speaking scenarios on a smartphone. It enables students, as well as the general public, to overcome the constraints of time, space and resources and practise interpretation and public speaking anytime, anywhere, providing a flexible, student-centred learning model. The project, funded by the University Grants Committee (UGC), was found to significantly enhance students' learning outcomes, experience and motivation.
XR MALL consists of 16 modules: eight on interpretation, covering bi-directional Chinese-English interpretation, consecutive interpretation and simultaneous interpretation training, and eight focusing on public speaking. Each module contains interactive learning content developed by Dr Chan and her team, allowing students to learn independently. Upon completing a module, students can place their smartphones inside ordinary VR goggles and experience lifelike interpreting or speaking situations in virtual scenarios, such as press conferences and large-scale conferences. The app records the students' performance for review and sharing, provides automatic voice analysis for quick evaluation, and uses XR technology to integrate virtual scenarios with real-life environments. This integration enables students to practise note-taking while interpreting, providing a highly realistic reproduction of real-life situations and deepening their learning experience.
To evaluate the effectiveness of the app, Dr Chan and her team conducted a two-semester trial with some 60 local university students. The study findings revealed that the participating students outperformed the control group in the areas of visual interpretation, consecutive and simultaneous interpretation, and English speaking. Dr Chan elaborated, “The XR MALL programme addresses students' enthusiasm for new technologies and utilises a variety of technological solutions to create an extended environment that blends the virtual and the real, facilitating learning anytime, anywhere. XR MALL's virtual teaching environment creates contextual learning experiences, empowering students to explore virtual worlds through immersive role-playing and first-person experiences.”
In addition, students can gain a deeper understanding of their strengths and weaknesses through feedback from both the app and their peers by using the app's built-in features, such as automatic voice analysis and online discussion forums. Dr Chan added, “Research data demonstrates that XR MALL significantly enhances students' interpreting and public-speaking performance, as well as their cognitive and affective development. It enriches the learning experience and increases motivation.” She emphasised that the app is not only for students studying interpretation, but also for individuals interested in public speaking to practise comfortably in a virtual environment on their own.
Dr Chan's team received funding in 2020 for two projects, titled “Developing public speaking skills via virtual speaking practice” and “Development of a mobile-based mixed reality application for interpreting learning”, to develop VR web apps for English public speaking and VR mobile apps for interpreting exercises, respectively, and to study the effectiveness of the applications. Building on these two research projects, the team received nearly HK$800,000 from the UGC to initiate the XR MALL project.
Looking ahead, Dr Chan's team will continue to fine-tune the design of XR MALL and incorporate more innovative and flexible learning content. This includes offering more scenarios to train students in different disciplines to deliver oral presentations and speeches. The team also plans to integrate AI for real-time evaluation of students' body language so that students can get more comprehensive feedback, unlocking further possibilities for next-generation learning.