Albrecht, Matthias; Assländer, Lorenz; Streuber, Stephan (2024)
Virtual Reality (Springer) 28.
DOI: 10.1007/s10055-024-01006-y
Assländer, Lorenz; Albrecht, Matthias; Diehl, Moritz; Missen, Kyle J. ; Carpenter, Mark G. ; Streuber, Stephan (2023)
Scientific Reports 13 (1), Article 2594.
DOI: 10.1038/s41598-023-29713-7
Sensory perturbations are a valuable tool to assess the sensory integration mechanisms underlying balance. Implemented as systems-identification approaches, they can be used to quantitatively assess balance deficits and separate their underlying causes. However, the experiments require controlled perturbations and sophisticated modeling and optimization techniques. Here we propose and validate a virtual reality implementation of moving-visual-scene experiments together with model-based interpretations of the results. The approach simplifies the experimental implementation and offers a platform for standardized analysis routines. Sway of 14 healthy young subjects wearing a virtual reality head-mounted display was measured. Subjects viewed a virtual room or a screen inside the room, both of which were moved during a series of sinusoidal or pseudo-random room or screen tilt sequences recorded on two days. In a between-subject comparison of ten 6-min-long pseudo-random sequences, each applied at 5 amplitudes, our results showed no difference from a real-world moving-screen experiment from the literature. We used the independent-channel model to interpret our data, which provides a direct estimate of the visual contribution to balance, together with parameters characterizing the dynamics of the feedback system. Reliability estimates of single-subject parameters from six repetitions of a 6 × 20-s pseudo-random sequence showed poor test–retest agreement. Estimated parameters show excellent reliability when averaging across the three repetitions within each day and comparing across days (intra-class correlation, ICC 0.7–0.9 for visual weight, time delay, and feedback gain). Sway responses strongly depended on the visual scene, where the high-contrast, abstract screen evoked larger sway than the photo-realistic room.
In conclusion, our proposed virtual reality approach allows researchers to reliably assess balance control dynamics including the visual contribution to balance with minimal implementation effort.
Albrecht, Matthias; Assländer, Lorenz; Reiterer, Harald; Streuber, Stephan (2023)
2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 691–701.
DOI: 10.1109/VR55154.2023.00084
Peripheral vision plays a significant role in human perception and orientation. However, its relevance for human-computer interaction, especially for head-mounted displays, has not yet been fully explored. In the past, a few specialized appliances were developed to display visual cues in the periphery, each designed for only a single specific use case. A multi-purpose headset to exclusively augment peripheral vision did not yet exist. We introduce MoPeDT: Modular Peripheral Display Toolkit, a freely available, flexible, reconfigurable, and extendable headset for conducting peripheral vision research. MoPeDT can be built with a 3D printer and off-the-shelf components. It features multiple spatially configurable near-eye display modules and full 3D tracking inside and outside the lab. With our system, researchers and designers may easily develop and prototype novel peripheral vision interaction and visualization techniques. We demonstrate the versatility of our headset with several possible applications for spatial awareness, balance, interaction, feedback, and notifications. We conducted a small study to evaluate the usability of the system. We found that participants were largely not irritated by the peripheral cues, but that the headset's comfort could be further improved. We also evaluated our system against established heuristics for human-computer interaction toolkits to show how MoPeDT adapts to changing requirements, lowers the entry barrier for peripheral vision research, and offers expressive power through the combination of modular building blocks.
Fakultät Elektrotechnik und Informatik (FEI)
Friedrich-Streib-Str. 2
96450 Coburg
T +49 9561 317 648 Stephan.Streuber[at]hs-coburg.de