We develop advanced technologies in robotics, mechatronics, and sensing to create interactive, dynamic physical 3D displays and haptic interfaces that allow 3D information to be touched as well as seen. We are specifically interested in using these novel interfaces to support richer remote collaboration, computer-aided design, education, and interfaces for people with visual impairments. In pursuit of these goals, we follow a design process grounded in iterative prototyping and human-centered design, and we seek to build new understanding of human perception and interaction through controlled studies.

Our research in Human-Computer Interaction and Human-Robot Interaction is currently directed in five areas:

  • Dynamic physical shape displays
  • Wearable haptics for grasping in VR
  • Ubiquitous robotic interfaces
  • Mobile haptics
  • Soft actuation and sensing

Robotic Assembly

Robotic Assembly of Haptic Proxy Objects for Tangible Interaction and Virtual Reality


A Wearable Haptic Interface for Simulating Weight and Grasping in Virtual Reality


Interactive Design and Debugging of Analog Circuits with Programmable Hardware


A Mobile Tabletop Shape Display for Tangible and Haptic Interaction


Ubiquitous Robotic Interfaces and Investigation of Abstract Motion as a Display

Pneumatic Reel Actuator

A high-extension pneumatic actuator.


Reconfigurable Tactile Elements for Dynamic Physical Controls


A building block for swarm user interfaces.


A wearable haptic interface for grasping in virtual reality.


Miniature on-body robots as mobile wearables

Switchable Permanent Magnetic Actuators

Applications in shape change and tactile display.

Haptic Edge Display

A display for mobile tactile interaction.