We develop advanced technologies in robotics, mechatronics, and sensing to create interactive, dynamic physical 3D displays and haptic interfaces that allow 3D information to be touched as well as seen. We are especially interested in using these novel interfaces to support richer remote collaboration, computer-aided design, education, and interfaces for people with visual impairments. In pursuit of these goals, we follow a design process grounded in iterative prototyping and human-centered design, and we seek to create new understanding of human perception and interaction through controlled studies.
Our research in Human-Computer Interaction and Human-Robot Interaction is currently directed in the following areas:
Dynamic Physical Shape Displays
Wearable Haptics for Grasping in VR
Ubiquitous Robotic Interfaces
Soft Actuation and Sensing
See a video of our vision!
How Kinesthetic Haptics Affects Causal Perception
Creating Spatial Tactile Effects Automatically by Analyzing Cross-Modality Features of a Video
A Haptic and Audio Guidance System To Support Tactile Graphics Exploration
Extending the Limits of Haptic Mobile Robots with Redirection in VR
User-defined Swarm Robot Control
Augmenting Perceived Softness of Haptic Proxy Objects
Shape-Changing Truss Robots That Crawl and Engulf
Towards High Spatial Resolution Refreshable 2.5D Tactile Shape Displays
An Accessible 3D Modelling Workflow for the Blind and Visually-Impaired Via 2.5D Shape Displays
A Tool to Help Blind Programmers Feel the Structure of Code
Augmenting Haptic Interaction in VR Through Perceptual Illusions
Understanding User Perception of Expressive Robotic Motion Through Different Sensory Modalities
Using Quadcopters to Appropriate Objects and the Environment for Haptics in Virtual Reality
Haptic Display with Swarm Robots
Automated Instrumentation for In-Circuit PCB Debugging with Dynamic Component Isolation
Editing Spatial Layouts through Tactile Templates for People with Visual Impairments
Physical Visualizations That Use Collections of Self-Propelled Objects to Represent Data
Visuo-Haptic Illusions for Improving the Perceived Performance of Shape Displays
A Functional Optimization Based Approach for Continuous 3D Retargeted Touch of Arbitrary, Complex Boundaries in Haptic Virtual Reality
A Mobile Tabletop Shape Display for Tangible and Haptic Interaction
Robotic Assembly of Haptic Proxy Objects for Tangible Interaction and Virtual Reality
A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality
Interactive Design and Debugging of Analog Circuits with Programmable Hardware
Ubiquitous Robotic Interfaces and Investigation of Abstract Motion as a Display
High-Extension Pneumatic Actuator
Reconfigurable Tactile Elements for Dynamic Physical Controls
Building Block for Swarm User Interfaces
A Wearable Haptic Interface for Grasping in Virtual Reality
Miniature On-Body Robots as Mobile Wearables
Applications in Shape Change and Tactile Display
Display for Mobile Tactile Interaction