Oct. 14, 2025 - Lecture

Designing the Built Environment through XR and Multisensory Simulation

Time: 14:00-15:00, Tuesday, Oct. 14th

Venue: Lecture hall, 11th FL, Zhiyuan C1

Working language: English

Zoom meeting ID: 892 9529 9854

Passcode: 782467

Abstract:

Our work expands the boundaries of XR by demonstrating that immersive technologies can and should engage the full spectrum of human senses to create more meaningful, responsive environments. We’ve shown that XR is not just a tool for visualisation, but a platform for deeply embodied experiences that can influence wellbeing, behaviour, and social connection. What’s exciting is how these insights are already translating into other domains. We’re applying our findings to the design of multi-sensory HCI interfaces for fully autonomous vehicles, where immersive feedback can improve safety and user trust. We’re also exploring how haptic and spatial simulations in XR can support social interaction for older adults, helping to reduce isolation and enhance emotional connection in virtual settings. By grounding XR design in real human responses, measured through biometric and cognitive data, we’re helping to shape a future where immersive technologies are not only more realistic but also more empathetic, inclusive, and impactful across a wide range of applications.

This project aims to rethink how we design and evaluate built environments by creating immersive, multi-sensory XR simulations that go far beyond traditional, ocular-centric visual tools. We’re developing virtual spaces that people can not only see but also experience through temperature, touch, spatial sound, and even smell, making the simulation much closer to real life. What makes this project especially innovative is our use of brain-activity tracking and biometric data, such as heart rate, skin response, and eye movement, to understand how people react to different environments. This helps us measure emotional and physical reactions in real time, giving us deeper insight into how a space might affect someone’s comfort, stress levels, or overall well-being. We’re not just building digital models; we’re creating interactive environments that respond to the human body and mind. This allows designers and stakeholders to test and improve spaces before they’re built, making them more inclusive, responsive, and supportive of human experience from the very beginning.

About the Speaker:

Dr. Globa is a researcher, academic, and designer working in the field of design and architecture, with strong research interests in algorithmic design, interactive systems, and multi-sensory simulation. She is currently Associate Head of Research Education and Senior Lecturer in Computational Design and Advanced Manufacturing at the University of Sydney. She is a member of the CoCoA research lab, the Sydney Environment Institute, the Net Zero Institute, and the SydneyNano research centre, and serves on the executive committee of the Computer-Aided Architectural Design in Asia Association. Her research and most of her teaching focus on the use of immersive technologies, multisensory simulation, biomaterials, biodiversity, and sustainability in architectural design and beyond. Dr. Globa has developed multiple research projects using interactive virtual reality (VR) and augmented reality (AR) environments, applying them to design exploration, data visualisation, and pre-occupancy simulation. She has extensive experience with advanced computational analysis and simulation, real-time interactive applications, and immersive analytics built in computer modelling environments and game engines. Dr. Globa specialises in developing and testing digital and physical prototypes, interdisciplinary design, and academic and industry collaborations.
