DEMO
Zoe Steine-Hanson, Musa Mahmood, Richard Waltman, Tharun Iyer, Blake Larkin, Jameson Moore, Joseph Artuso, Conor Russomanno
OpenBCI
Advances in wearable physiological sensing are enabling real-time inference of cognitive states such as attention and stress, creating new opportunities for adaptive virtual reality (VR) experiences. By moving beyond traditional input methods, real-time biosensing allows VR systems to respond dynamically to a user’s internal state, supporting more immersive, personalized, and responsive interactions. However, reliably estimating cognitive states from physiological data remains challenging due to noise, variability across users, and sensitivity to real-world conditions. We will demonstrate a real-time, multimodal approach to cognitive state detection within VR. We collected data from 45 participants performing attention, stress, and cybersickness-inducing tasks in VR while wearing the OpenBCI Galea headset, which records EEG, EMG, EOG, EDA, PPG, and eye-tracking signals. We used these data to train machine learning models that we then deployed in a real-time pipeline to generate continuous estimates of cognitive state during VR experiences.
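The abstract does not specify the pipeline's implementation; the following is a minimal, hypothetical sketch of the general pattern it describes — buffering streamed multimodal samples into sliding windows, extracting simple features, and scoring each window with a trained model. The window length, channel count, sampling rate, feature choices, and the stand-in logistic scorer are all illustrative assumptions, not the actual Galea pipeline.

```python
# Hypothetical sliding-window inference sketch (NOT the actual Galea pipeline).
# Window size, channel count, features, and the logistic scorer are assumptions.
import numpy as np

WINDOW = 250   # samples per window, e.g. 1 s at an assumed 250 Hz rate
CHANNELS = 4   # stand-in for a few EEG/EDA/PPG channels

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel mean and standard deviation as a flat feature vector."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def estimate_state(features: np.ndarray, w: np.ndarray, b: float) -> float:
    """Logistic score in [0, 1]; a placeholder for a trained classifier."""
    return float(1.0 / (1.0 + np.exp(-(features @ w + b))))

rng = np.random.default_rng(0)
w = rng.normal(size=2 * CHANNELS)  # placeholder for learned model weights
b = 0.0

# Stand-in for one window of streamed physiological samples.
buffer = rng.normal(size=(WINDOW, CHANNELS))
score = estimate_state(extract_features(buffer), w, b)
```

In a live loop, `buffer` would be refilled from the headset's data stream and `score` re-emitted every window, yielding the continuous cognitive-state estimate the demo visualizes.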
Participants will engage in a live VR flight simulator demo and see their physiological signals processed in real time. This demo highlights how real-time cognitive state detection can transform VR into an adaptive medium, enabling applications in training, performance optimization, and interactive environments that respond intelligently to the user.