Research

We develop advanced technologies in robotics, mechatronics, and sensing to create interactive, dynamic physical 3D displays and haptic interfaces that allow 3D information to be touched as well as seen. We are specifically interested in using these novel interfaces to support richer remote collaboration, computer-aided design, education, and interfaces for people with visual impairments. In pursuit of these goals, we use a design process grounded in iterative prototyping and human-centered design, and we look to create new understanding of human perception and interaction through controlled studies.

Our research in Human-Computer Interaction and Human-Robot Interaction is currently directed toward five areas:

  • Dynamic physical shape displays
  • Wearable haptics for grasping in VR
  • Ubiquitous robotic interfaces
  • Mobile haptics
  • Soft actuation and sensing

UbiSwarm

Ubiquitous robotic interfaces and an investigation of abstract motion as a display.

Pneumatic Reel Actuator

A high-extension pneumatic actuator.

shiftIO

Reconfigurable tactile elements for dynamic physical controls.

Zooids

Building blocks for swarm user interfaces.

Wolverine

A wearable haptic interface for grasping in virtual reality.

Rovables

Miniature on-body robots as mobile wearables.

Switchable Permanent Magnetic Actuators

Applications in shape change and tactile displays.

Haptic Edge Display

A display for mobile tactile interaction.