HUMAN PERCEPTION EXPERIMENT IN UNDERGROUND SPACE ENHANCED WITH MIXED REALITY

19th WORLD CONFERENCE OF THE ASSOCIATED RESEARCH CENTRES FOR THE URBAN UNDERGROUND SPACE, Belgrade, Serbia, November 4–7, 2025. (Paper No. 4.13.65, pp. 777–783)

 

АУТОР(И) / AUTHOR(S): Mingyi Lin, Fang Liu

 


DOI: 10.46793/ACUUS2025.4.13.65

САЖЕТАК / ABSTRACT:

Underground spaces are expanding rapidly, with their functions increasingly extending to commercial, entertainment, and residential uses. However, the inherent enclosure and complexity of these environments significantly affect user experience and safety. An experimental system was developed to explore human perception in underground spaces.

The system consisted of three main components: a mixed reality (MR) device, a set of wearable biosensors, and a data platform. Typical elements of the underground environment were built with digital tools and overlaid onto the real-world physical space through the MR device. Scenarios such as fire hazards or earthquakes were simulated based on physical mechanisms and introduced into the MR environment. During these simulations, human responses were monitored via the wearable biosensors. All data were transmitted to and stored on the data platform, where analysis revealed correlations between human responses and environmental factors, providing insights for the optimization of underground spaces.
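
As an illustration of the data-platform step described above, the following is a minimal sketch (not the authors' implementation) of how timestamped biosensor samples might be aligned with MR scenario states and correlated with an environmental factor. The file layout, column names (timestamp, eda_microsiemens, smoke_density), and function name are assumptions introduced here for illustration.

import pandas as pd
from scipy.stats import pearsonr

def correlate_response_with_environment(biosensor_csv, scenario_csv):
    """Pearson correlation between EDA level and local smoke density (hypothetical columns)."""
    bio = pd.read_csv(biosensor_csv, parse_dates=["timestamp"]).sort_values("timestamp")
    env = pd.read_csv(scenario_csv, parse_dates=["timestamp"]).sort_values("timestamp")
    # Match each biosensor sample to the nearest MR scenario state within 0.5 s.
    merged = pd.merge_asof(bio, env, on="timestamp",
                           tolerance=pd.Timedelta("500ms"),
                           direction="nearest").dropna()
    r, p = pearsonr(merged["eda_microsiemens"], merged["smoke_density"])
    return r, p

The same alignment step would apply to EEG or PPG features; only the signal column and any derived feature (e.g., heart rate from PPG) would change.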

A fire drill scenario was studied as an application of this system. Smoke was generated by the mixed reality device according to data derived from the Fire Dynamics Simulator (FDS), after which the participant was asked to find the exit under low visibility. Meanwhile, participants' responses, including electroencephalogram (EEG), electrodermal activity (EDA), and photoplethysmography (PPG) signals, were collected for further analysis. The results indicated that the system could build a vivid experimental scenario with less modeling work and greater freedom of movement than virtual reality (VR), while providing a more immersive experience than augmented reality (AR). These physiological signals can be recorded and used to supplement subjective responses in future studies.
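
As a further illustration of the FDS-driven smoke rendering described above, the sketch below shows one plausible way to map a soot extinction coefficient taken from FDS output to a per-frame smoke opacity and a visibility estimate. The Beer-Lambert mapping, the constant C = 3 for light-reflecting signs, and the function names are simplifying assumptions, not the paper's actual pipeline.

import math

def smoke_opacity(extinction_coeff_per_m, view_distance_m):
    """Opacity in [0, 1] over a given viewing distance via Beer-Lambert attenuation."""
    transmittance = math.exp(-extinction_coeff_per_m * view_distance_m)
    return 1.0 - transmittance

def visibility_m(extinction_coeff_per_m, c=3.0):
    """FDS-style visibility estimate S = C / K (C of about 3 for light-reflecting signs)."""
    return c / max(extinction_coeff_per_m, 1e-6)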

КЉУЧНЕ РЕЧИ / KEYWORDS:

human factors, underground space, mixed reality, fire drill

ПРОЈЕКАТ / ACKNOWLEDGEMENT:

ЛИТЕРАТУРА / REFERENCES:

  • Cailing, W. (2018). Effects of Light and Signage on Evacuation Wayfinding Behavior in Underground Commercial Buildings. Chongqing University.
  • Chirag Deb, A. R. (2010). Evaluation of thermal comfort in a rail terminal location in India. Building and Environment, 45(11), 2571–2580.
  • Demirkan, D. C. (2020). An Evaluation of AR-Assisted Navigation for Search and Rescue in Underground Spaces. 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 3–4.
  • Mann, S., Furness, T., Yuan, Y., Iorio, J., & Wang, Z. (2018). All Reality: Virtual, Augmented, Mixed (X), Mediated (X, Y), and Multimediated Reality. arXiv preprint. https://arxiv.org/pdf/1804.08386.
  • Milgram, P., & Kishino, F. (1994). A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems, E77-D(12), 1321–1329.
  • Mitsuhara, H. (2024). Metaverse-Based Evacuation Training: Design, Implementation, and Experiment Focusing on Earthquake Evacuation. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 53(4), 1–26.
  • Mossberg, A., Nilsson, D., & Wahlqvist, J. (2021). Evacuation elevators in an underground metro station: A Virtual Reality evacuation experiment. Fire Safety Journal, 120(April 2020), 1–9.
  • Lu, M. Rodriguez, Z. Feng, D. Paes, A. B. Daemei, R. Vancetti, S. Mander, T. Mandal, K. R. R. & R. L. (2025). A Virtual Reality Exit Choice Experiment to Assess the Impact of Social Influence and Fire Wardens in a Metro Station Evacuation. 1–24.
  • S, G. (2019). Evaluation of Smart Underground Mine Evacuation Efficiency through Virtual Reality Simulations. University of Nevada.
  • Sun, L., Ding, S., Ren, Y., Li, M., & Wang, B. (2022). Research on the Material and Spatial Psychological Perception Virtual Reality. Buildings, 12(9), 1–19.
  • Wang, C., Li, C., Zhou, T., Wang, D., An, X., Lv, J., & Wang, J. (2025). Immersive virtual reality experiments for emergency evacuation response in deep underground space. Tunnelling and Underground Space Technology incorporating Trenchless Technology, 163(April), 1–19.
  • Wang, S., Xu, J., Minping, L., & Ling, H. (2021). VR-based Data Acquisition and Evaluation of Pedestrian Traffic Behaviors in Buildings. South Architecture, 6, 32–37.
  • Yang, W., & Jun, H. (2018). Cross-modal effects of illuminance and room temperature on indoor environmental perception. Building and Environment, 146(May), 280–288.
  • Yang, W., & Jun, H. (2019). Combined effects of acoustic, thermal, and illumination conditions on the comfort of discrete senses and overall indoor environment. Building and Environment, 148(December 2018), 623–633.
  • Yang, W., & Moon, H. J. (2018). Combined effects of sound and illuminance on indoor environmental perception. Applied Acoustics, 141(July), 136–143.
  • Yao, G., Yusong, Z., Kaiyun, C., Yiying, C., & Xianzhi, S. (2019). Solving Method of Effective Staying Activity Coefficient for Underground Commercial Streets Based on VR Experiment. Research and Exploration in Laboratory, 38(12), 10–13.
  • Zhuang, Y., Wang, F., Huang, Y., Song, X., & Rao, Y. (2025). Influence of guidance signs on platform evacuation in suburban railway tunnel under smoke and obstacle environment. High-Speed Railway, 3(1), 1–16.