Quirós-Ramírez, María A.; Wichert, Thomas; Hempel, Tom; Streuber, Stephan (2026)
Research Square Preprint.
DOI: 10.21203/rs.3.rs-8583699/v1
Streuber, Stephan; Rogula, S.; Quirós-Ramírez, Miguel; et al. (2026)
Scientific Reports 16, 3721.
DOI: 10.1038/s41598-026-35955-y
Physiological synchrony refers to the temporal alignment of bodily signals, such as heart rate variability, between two or more individuals during social interaction. It reflects implicit, often unconscious processes that arise when people share attention, emotions, or behavioral rhythms in close physical proximity. Because these coordinated physiological patterns are linked to social cohesion, rapport, and effective communication, physiological synchrony provides a valuable window into the quality and dynamics of social interaction. Here, we study physiological synchrony during virtual interaction where interaction partners are not physically co-located but remotely connected via technology. This allows us to capture aspects of social connectedness that are not accessible through self-report or behavior alone, making it a powerful tool for understanding how people engage and collaborate across different media. In our study, triads of participants performed a collective creativity task in one of three conditions: face-to-face (F2F) collaboration, remote collaboration using video conferencing (Video), or remote collaboration using immersive Virtual Reality (VR). To quantify social interaction quality, we measured creative group performance, social presence, and heart rate variability synchrony (HRVS) as a marker of social cohesion. As expected, creative group performance and social presence were highest in the F2F condition and significantly reduced in the VR and Video conditions. However, we observed strong HRV synchrony in the VR and F2F conditions and significantly weaker HRV synchrony in the Video condition. Our study supports the idea that VR (unlike video conferencing) supports physiological synchronization processes important for social interactions. Future studies need to identify the underlying physiological and psychological processes.
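The windowed-correlation idea behind such an HRV synchrony measure can be sketched as follows. The function name, window parameters, and the use of Pearson correlation over sliding windows are illustrative assumptions, not the study's actual analysis pipeline.

```python
import numpy as np

def hrv_synchrony(hrv_a, hrv_b, win=30, step=15):
    """Mean absolute Pearson correlation of two HRV time series,
    computed over sliding windows (win and step in samples)."""
    hrv_a = np.asarray(hrv_a, dtype=float)
    hrv_b = np.asarray(hrv_b, dtype=float)
    rs = []
    for start in range(0, len(hrv_a) - win + 1, step):
        a = hrv_a[start:start + win]
        b = hrv_b[start:start + win]
        if a.std() > 0 and b.std() > 0:  # skip flat windows
            rs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(np.abs(rs))) if rs else 0.0

# Two identical slow oscillations are perfectly synchronized.
t = np.linspace(0.0, 10.0, 300)
shared = np.sin(2 * np.pi * 0.1 * t)
sync = hrv_synchrony(shared, shared)  # close to 1.0
```

Windowing makes the measure sensitive to transient alignment rather than only whole-session correlation, which is why sliding-window variants are common in synchrony research.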
Quirós-Ramírez, Alejandra; Haberland, Sarah; Hempel, Tobias; Arlt, Richard; Keune, Paul; Streuber, Stephan (2026)
Empathic Computing 2, 202523.
DOI: 10.70401/ec.2025.0014
Aims: This study introduces and evaluates a virtual reality (VR) prototype designed for Loving-Kindness Meditation (LKM) to support mental health rehabilitation and relaxation in clinical contexts. The aims include the co-creation of a VR-based mindfulness experience with clinical experts and the evaluation of its usability, user experience, and short-term effects on relaxation, affect, and self-compassion.
Methods: Following a design-thinking and co-creation approach, the VR-based LKM experience was developed iteratively with input from clinicians and computer scientists. The final prototype was implemented for the Meta Quest 3 and included five immersive scenes representing phases of the LKM and transition scenes guided by a professionally narrated audio track. Eleven participants (M = 36.5 years, SD = 14.6) experienced the 12-minute session. Pre- and post-session measures included relaxation, the Positive and Negative Affect Schedule, and self-compassion, complemented at the end by the Igroup Presence Questionnaire, usability measures, and a semi-structured qualitative interview.
Results: Participants reported significant decreases in negative affect (t(10) = -2.512, p = .0307, d = -1.037) and stress (t(10) = -3.318, p = .007, d = -1.328), as well as increases in relaxation (t(10) = 5.487, p < .0001, d = 2.471) and self-compassion (t(10) = 2.231, p = .0497, d = 0.283). Usability was rated as excellent (M = 92.5), and presence as good (M = 4.0, SD = 0.43). Qualitative feedback described the experience as calming, aesthetically pleasing, and easy to engage with, highlighting the falling leaves and pulsating orb as effective design elements.
Conclusion: The co-designed VR-LKM prototype was perceived as highly usable and beneficial for inducing relaxation and self-compassion, suggesting its potential as a supportive tool for clinical mindfulness interventions. The results indicate that immersive VR can effectively facilitate engagement and emotional regulation, providing a foundation for future clinical trials and broader implementation in therapeutic and wellness settings.
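For readers unfamiliar with the reported statistics, a paired (pre/post) t value and Cohen's d for dependent samples are related as sketched below; the scores are invented for illustration and are not the study's data.

```python
import math

def paired_t_and_d(pre, post):
    """Paired t statistic (df = n - 1) and Cohen's d for dependent
    samples (d_z = mean difference / SD of differences)."""
    n = len(pre)
    diffs = [b - a for a, b in zip(pre, post)]
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in diffs) / (n - 1))
    t = mean_d / (sd_d / math.sqrt(n))
    d_z = mean_d / sd_d
    return t, d_z

# Invented pre/post relaxation scores for n = 11 participants.
pre  = [2.1, 2.4, 1.9, 2.8, 2.2, 2.5, 2.0, 2.6, 2.3, 2.7, 2.1]
post = [3.0, 3.1, 2.5, 3.4, 2.9, 3.2, 2.6, 3.3, 2.8, 3.5, 2.7]
t_val, d_val = paired_t_and_d(pre, post)  # both positive: scores increased
```

Note the fixed relation t = d_z · √n for paired designs, which is a quick consistency check on reported values.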
Schuil, Isabel; Kalamkar, Snehanjali; Simm, Stefan; Grubert, Jens; Streuber, Stephan (2025)
DGMP/DGMS Kongress, Jena, Germany.
Streuber, Stephan; Wetzel, Nicole; Pastel, Stefan; Bürger, Dan; Witte, Kerstin (2025)
Springer Virtual Reality 29, 56.
DOI: 10.1007/s10055-025-01111-6
Virtual reality (VR) technologies are increasingly used in neuropsychological assessment of various cognitive functions. Compared to traditional laboratory studies, VR allows for a more natural environment and more complex task-related movements with a high degree of control over the environment. However, there are still few studies that transfer well-established paradigms for measuring attentional distraction by novel sounds in laboratory settings to virtual environments and sports activities. In this study, the oddball paradigm, which is well established in laboratory settings for studying attention, is transferred to table tennis in a virtual environment. While 33 subjects played virtual table tennis, they were presented with a task-irrelevant sequence of frequent standard sounds and infrequent novel sounds. Trials in which an unexpected novel sound preceded the ball’s appearance resulted in a delayed racket movement compared to trials in which a standard sound was presented. This distraction effect was observed in the first part of the experiment but disappeared with increasing exposure. The results suggest that unexpected and task-irrelevant novel sounds can initially distract attention and impair performance on a complex movement task in a rich environment. The results demonstrate that versions of the well-established oddball distraction paradigm can be used to study attentional distraction, its dynamics, and its effects on complex movements in naturalistic environments.
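The trial structure of such an oddball paradigm, frequent standard sounds with rare task-irrelevant novels, can be sketched as follows; the 10% novelty rate and the fixed seed are illustrative choices, not necessarily those of the study.

```python
import random

def oddball_sequence(n_trials, p_novel=0.1, seed=0):
    """Trial sequence of frequent 'standard' and rare 'novel' sounds."""
    rng = random.Random(seed)
    return ["novel" if rng.random() < p_novel else "standard"
            for _ in range(n_trials)]

seq = oddball_sequence(200)
n_novel = seq.count("novel")  # rare by construction
```

Keeping novels rare and unpredictable is what makes them involuntary attention-capturing events rather than expected task stimuli.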
Behrens, Simone; Giel, Katrin; Schroeder, Philipp; Capobianco, Antonio; Quirós-Ramírez, María; Streuber, Stephan; Beck, Anne; Lenz, Bernd; Wolbers, Thomas; Karger, André; et al. (2025)
Der Nervenarzt, 1–5.
DOI: 10.1007/s00115-025-01924-5
Schuil, Isabel; Kalamkar, Snehanjali; Grubert, Jens; Streuber, Stephan; Meißner, Karin (2024)
Mind-Bull. Mind-Body Med. Res. 3, 16–17.
Albrecht, Matthias; Assländer, Lorenz; Streuber, Stephan (2024)
Springer Virtual Reality 28 (28).
DOI: 10.1007/s10055-024-01006-y
Falls are a major health concern. Existing augmented reality (AR) and virtual reality solutions for fall prevention aim to improve balance in dedicated training sessions. We propose a novel AR prototype as an assistive wearable device to improve balance and prevent falls in daily life. We use a custom head-mounted display toolkit to present augmented visual orientation cues in the peripheral field of view. The cues provide a continuous space-stationary visual reference frame for balance control using the user’s tracked head position. In a proof of concept study, users performed a series of balance trials to test the effect of the displayed visual cues on body sway. Our results showed that body sway can be reduced with our device, indicating improved balance. We also showed that superimposed movements of the visual reference in forward-backward or sideways directions induce respective sway responses. This indicates a direction-specific balance integration of the displayed cues. Based on our findings, we conclude that artificially generated visual orientation cues using AR can improve balance and could possibly reduce fall risk.
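The core idea of a space-stationary visual reference, re-expressing a world-fixed anchor in the moving head frame every frame so the cue counteracts head motion, can be sketched in 2D. The function name and the simplified planar geometry are illustrative, not the prototype's actual rendering code.

```python
import math

def world_to_head(anchor_xy, head_xy, head_yaw):
    """Express a world-fixed 2D point in head-relative coordinates by
    undoing the head's translation and yaw rotation."""
    dx = anchor_xy[0] - head_xy[0]
    dy = anchor_xy[1] - head_xy[1]
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

# A cue 1 m ahead stays put while the head is still ...
ahead = world_to_head((1.0, 0.0), (0.0, 0.0), 0.0)
# ... and shifts opposite to a 90° head turn, acting as a stable
# external reference for balance control.
turned = world_to_head((1.0, 0.0), (0.0, 0.0), math.pi / 2)
```

Because the cue's head-relative position moves opposite to head sway, the visual system receives a consistent error signal about self-motion, which is the mechanism the abstract attributes to the peripheral cues.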
Assländer, Lorenz; Albrecht, Matthias; Diehl, Moritz; Missen, Kyle J.; Carpenter, Mark G.; Streuber, Stephan (2023)
Scientific Reports 13 (1), 2594.
DOI: 10.1038/s41598-023-29713-7
Sensory perturbations are a valuable tool to assess sensory integration mechanisms underlying balance. Implemented as systems-identification approaches, they can be used to quantitatively assess balance deficits and separate underlying causes. However, the experiments require controlled perturbations and sophisticated modeling and optimization techniques. Here we propose and validate a virtual reality implementation of moving visual scene experiments together with model-based interpretations of the results. The approach simplifies the experimental implementation and offers a platform to implement standardized analysis routines. Sway of 14 healthy young subjects wearing a virtual reality head-mounted display was measured. Subjects viewed a virtual room or a screen inside the room, which were both moved during a series of sinusoidal or pseudo-random room or screen tilt sequences recorded on two days. In a between-subject comparison of ten 6-min-long pseudo-random sequences, each applied at 5 amplitudes, our results showed no difference to a real-world moving screen experiment from the literature. We used the independent-channel model to interpret our data, which provides a direct estimate of the visual contribution to balance, together with parameters characterizing the dynamics of the feedback system. Reliability estimates of single-subject parameters from six repetitions of a 6 × 20-s pseudo-random sequence showed poor test–retest agreement. Estimated parameters show excellent reliability when averaging across three repetitions within each day and comparing across days (intra-class correlation; ICC 0.7–0.9 for visual weight, time delay, and feedback gain). Sway responses strongly depended on the visual scene, where the high-contrast, abstract screen evoked larger sway as compared to the photo-realistic room.
In conclusion, our proposed virtual reality approach allows researchers to reliably assess balance control dynamics including the visual contribution to balance with minimal implementation effort.
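The systems-identification step behind such experiments, estimating a frequency response function (FRF) from scene tilt to body sway, can be sketched as follows. The single-frequency stimulus and signal names are illustrative assumptions, not the study's pseudo-random sequences or independent-channel model fit.

```python
import numpy as np

def frf(stimulus, response):
    """Frequency response estimate H(f) = S_xy(f) / S_xx(f)."""
    x = np.fft.rfft(stimulus)
    y = np.fft.rfft(response)
    return (np.conj(x) * y) / (np.conj(x) * x + 1e-12)

fs = 100.0                                      # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)                    # 60-s record
tilt = np.sin(2 * np.pi * 0.5 * t)              # 0.5 Hz scene-tilt stimulus
sway = 0.4 * np.sin(2 * np.pi * 0.5 * t - 0.6)  # attenuated, phase-lagged sway
H = frf(tilt, sway)
k = int(round(0.5 * 60))                        # DFT bin of 0.5 Hz (f * record length)
gain, phase = abs(H[k]), float(np.angle(H[k]))  # recovers gain 0.4, phase -0.6 rad
```

Gain and phase across stimulus frequencies are exactly the quantities that model parameters such as visual weight, time delay, and feedback gain are fitted to.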
Albrecht, Matthias; Assländer, Lorenz; Reiterer, Harald; Streuber, Stephan (2023)
2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 691–701.
DOI: 10.1109/VR55154.2023.00084
Peripheral vision plays a significant role in human perception and orientation. However, its relevance for human-computer interaction, especially head-mounted displays, has not been fully explored yet. In the past, a few specialized appliances were developed to display visual cues in the periphery, each designed for a single specific use case only. A multi-purpose headset to exclusively augment peripheral vision did not exist yet. We introduce MoPeDT: Modular Peripheral Display Toolkit, a freely available, flexible, reconfigurable, and extendable headset to conduct peripheral vision research. MoPeDT can be built with a 3D printer and off-the-shelf components. It features multiple spatially configurable near-eye display modules and full 3D tracking inside and outside the lab. With our system, researchers and designers may easily develop and prototype novel peripheral vision interaction and visualization techniques. We demonstrate the versatility of our headset with several possible applications for spatial awareness, balance, interaction, feedback, and notifications. We conducted a small study to evaluate the usability of the system. We found that participants were largely not irritated by the peripheral cues, but the headset's comfort could be further improved. We also evaluated our system based on established heuristics for human-computer interaction toolkits to show how MoPeDT adapts to changing requirements, lowers the entry barrier for peripheral vision research, and facilitates expressive power in the combination of modular building blocks.
Fakultät Elektrotechnik und Informatik (FEI)
Friedrich-Streib-Str. 2
96450 Coburg
T +49 9561 317 648 Stephan.Streuber[at]hs-coburg.de