Beyond Being Real

Parastoo Abtahi, Sidney Q. Hough, James A. Landay, Sean Follmer

This diagram shows the Beyond Being Real Framework illustrated as a simplified control block diagram, centered on the sensory integration component. On the left, a block represents the real world; on the right, a block represents the VR system. The real world is connected to the VR system through the beyond-real transformation block. The relevant features of the signal path from the real world to the beyond-real transformation block are the sensing and tracking features: the physical user state, cognitive user state, and environment state. The beyond-real transformation has the following features: the interaction task (selection, manipulation, and locomotion), the remapping parameters of the transformation (time (t), space (x, y, z), and body representation), and the mapping type (direct, fixed remapping, and dynamic remapping). The beyond-real transformation is then fed into the VR system block. In the VR system, the key features are the remapping signifiers (invisible, egocentric, and exocentric). The real world and the VR system also produce sensory feedback that feeds into a summing point and then into the sensory integration block. To capture the multisensory signals feeding into the sensory integration block, a connected subcomponent represents the human sensory systems (vestibular, somatosensory, gustatory, olfactory, auditory, and visual). The sensory integration block feeds into the internal model, and the internal model feeds back into the sensory integration block.

The Beyond Being Real Framework: The VR system receives input from the real world, applies beyond-real transformations, and renders the remapping in VR. Users receive sensory information from both the real world and the VR system, which is then integrated. Understanding sensory integration, and how the user's internal model is updated accordingly, is integral to exploring open research questions around beyond-real VR interactions.

Abstract

We can create Virtual Reality (VR) interactions that have no equivalent in the real world by remapping spacetime or altering users’ body representation, such as stretching the user’s virtual arm for manipulation of distant objects or scaling up the user’s avatar to enable rapid locomotion. Prior research has leveraged such approaches, which we call beyond-real techniques, to make interactions in VR more practical, efficient, ergonomic, and accessible. We present a survey categorizing prior movement-based VR interaction literature as reality-based, illusory, or beyond-real interactions. We survey relevant conferences (CHI, IEEE VR, VRST, UIST, and DIS), focusing on selection, manipulation, locomotion, and navigation in VR. For beyond-real interactions, we describe the transformations that prior works have used to create novel remappings. We discuss open research questions through the lens of the human sensorimotor control system and highlight challenges that need to be addressed for effective utilization of beyond-real interactions in future VR applications, including plausibility, control, long-term adaptation, and individual differences.
