Remote Touch

These two phones represent two users at separate locations.

Developed in: Unity 5.x

Remote Touch is a networked, location-based mobile experience designed to create a deeper connection between two people through technology and vibrotactile haptic feedback. It’s a prototype combining gesture, haptics, and location to create the illusion of social touch at a distance.

Functionally, each Android device sends its GPS location to the paired device over the network, which lets the on-screen “rope” align with the direction of the other user. When a user tugs on the rope, the vibration actuator responds with a magnitude proportional to how far the rope is stretched.

This work was published in the proceedings of, and presented at, the Human Computer Interaction International conference in Vancouver, BC, in the Virtual, Augmented and Mixed Reality track. You can read more about the experience and see the presentation in my blog post.


Two user journeys are connected over a network. When User 1 pulls on the rope, User 2 sees and feels the effects (vibrotactile feedback and animations) mirrored at the opposite end of the rope.