Meskir, Arvando L.
Unknown Affiliation

Published: 1 Document

Articles

Human-Centered AI for Immersive XR Environments: A Multisensor Fusion Approach for Adaptive Interaction and Cognitive Modeling
Meskir, Arvando L.; Juvens, Talira N.; Vorsteyn, Junelle
Jurnal Teknik Informatika C.I.T Medicom Vol 17 No 4 (2025): Intelligent Decision Support System (IDSS)
Publisher : Institute of Computer Science (IOCS)

DOI: 10.35335/cit.Vol17.2025.1397.pp219-229

Abstract

Immersive Extended Reality (XR) systems are rapidly expanding across education, training, healthcare, and industrial applications, yet most existing frameworks lack real-time adaptivity and personalized support based on users’ cognitive and emotional states. This research proposes a human-centered AI framework that integrates multisensor fusion with cognitive state modeling to enable adaptive and intelligent interaction within XR environments. The system combines data from eye tracking, body and hand motion capture, environmental sensors, audio input, and physiological signals such as EEG, EMG, and HRV. A hierarchical fusion engine performs low-, mid-, and high-level integration of multimodal signals, while deep learning models, including CNNs, LSTMs, and multimodal transformers, estimate user states related to attention, workload, fatigue, and emotion. The framework dynamically adapts the XR environment through real-time modifications to UI complexity, lighting, haptic feedback, content pacing, and virtual assistant behavior. Experimental results demonstrate substantial improvements in cognitive load prediction accuracy, interaction robustness, and user immersion compared with single-sensor or static XR systems. Users experienced reduced cognitive overload, enhanced task performance, and greater engagement across various simulated tasks. Overall, this research advances human-centered AI by demonstrating how multisensor fusion and cognitive modeling can transform XR from passive simulation platforms into adaptive, perceptive, and user-responsive environments. The findings offer a foundation for next-generation XR systems that prioritize human well-being, performance, and comfort through continuous AI-driven personalization.
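The abstract's pipeline — low-, mid-, and high-level fusion of multimodal signals feeding an adaptation policy — can be sketched in miniature. The sketch below is a hypothetical illustration only: the function names, modality weights, and thresholds are assumptions for exposition, not the authors' implementation, and the deep learning models (CNNs, LSTMs, transformers) are stood in for by a simple weighted score.

```python
# Hypothetical sketch of the hierarchical fusion + adaptation loop.
# All names, weights, and thresholds are illustrative assumptions,
# not the framework described in the paper.

def low_level_features(raw):
    """Low-level stage: clamp each raw sensor reading to [0, 1]."""
    return {name: min(max(value, 0.0), 1.0) for name, value in raw.items()}

def mid_level_fuse(features, weights):
    """Mid-level stage: weighted combination of modality features
    into a single workload score in [0, 1]."""
    total = sum(weights.values())
    return sum(features[m] * w for m, w in weights.items()) / total

def high_level_state(workload, overload_threshold=0.7):
    """High-level stage: map the fused score to a discrete cognitive state.
    (In the paper this role is played by learned models, not thresholds.)"""
    if workload >= overload_threshold:
        return "overloaded"
    if workload >= 0.4:
        return "engaged"
    return "underloaded"

def adapt_xr(state):
    """Pick XR adaptations (UI complexity, content pacing) per state."""
    policy = {
        "overloaded":  {"ui_complexity": "minimal", "content_pacing": "slow"},
        "engaged":     {"ui_complexity": "full",    "content_pacing": "normal"},
        "underloaded": {"ui_complexity": "full",    "content_pacing": "fast"},
    }
    return policy[state]

# Example: elevated pupil dilation and HRV stress drive the UI to simplify.
raw = {"eye_pupil": 0.9, "hrv_stress": 0.8, "emg_tension": 0.6}
weights = {"eye_pupil": 0.5, "hrv_stress": 0.3, "emg_tension": 0.2}
state = high_level_state(mid_level_fuse(low_level_features(raw), weights))
print(state, adapt_xr(state))  # → overloaded {'ui_complexity': 'minimal', 'content_pacing': 'slow'}
```

The layering mirrors the paper's hierarchy: per-sensor normalization, cross-modal fusion, discrete state estimation, then a closed-loop adaptation of the XR environment.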