Remote Touch

These two phones represent two users at separate locations.

Developed in: Unity 5.x

Remote Touch is a networked, location-based mobile experience meant to foster a deeper connection between two people through vibrotactile haptic feedback. It’s a prototype that combines gesture, haptics, and location to create the illusion of social touch at a distance.

Functionally, each Android device sends its GPS location to its networked counterpart, which allows the “rope” on screen to align with the direction of the other user. When a user tugs on the rope, the actuator’s vibration magnitude scales with how far the rope is stretched.
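As a rough illustration of the direction part, here is a minimal Unity (C#) sketch that rotates a needle toward the remote user’s coordinates using the standard initial-bearing formula; the class and field names are hypothetical, not the project’s actual code:

```csharp
using UnityEngine;

// Illustrative sketch only: points a 2D "rope" needle toward the remote user.
// Assumes the remote coordinates are filled in by the networking layer.
public class RopeCompass : MonoBehaviour
{
    // Latest coordinates received from the networked peer, in degrees.
    public float remoteLatitude;
    public float remoteLongitude;

    void Start()
    {
        Input.location.Start();       // needs location permission on Android
        Input.compass.enabled = true; // needed for trueHeading below
    }

    void Update()
    {
        if (Input.location.status != LocationServiceStatus.Running) return;

        float lat1 = Input.location.lastData.latitude * Mathf.Deg2Rad;
        float lat2 = remoteLatitude * Mathf.Deg2Rad;
        float dLon = (remoteLongitude - Input.location.lastData.longitude) * Mathf.Deg2Rad;

        // Standard initial-bearing formula between two latitude/longitude points.
        float y = Mathf.Sin(dLon) * Mathf.Cos(lat2);
        float x = Mathf.Cos(lat1) * Mathf.Sin(lat2)
                - Mathf.Sin(lat1) * Mathf.Cos(lat2) * Mathf.Cos(dLon);
        float bearingDeg = Mathf.Atan2(y, x) * Mathf.Rad2Deg;

        // Offset by the compass heading so the needle stays correct as the phone turns.
        transform.localRotation = Quaternion.Euler(0f, 0f, -(bearingDeg - Input.compass.trueHeading));
    }
}
```

At the distances involved, the initial bearing is effectively just the direction to the other person, so no further great-circle math is needed.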

The project was published in the proceedings of, and presented at, the Human-Computer Interaction International conference in Vancouver, BC, in the Virtual, Augmented and Mixed Reality track. You can read more about the experience and see the presentation in my blog post.

My contributions to the project included:

Building the experience – I developed the application in Unity and built it to Android devices. Networking was a large challenge, though the Photon plugin handled most of that work. Sending latitude and longitude to the other device and rotating the needle toward the other user was a big part of the development (the sketch after this list illustrates the idea).

Designing look and feel – I created all of the art assets, designed the bounce-back animation of the tugged rope, and designed the haptic effects to add tension to the pull and release.
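The location exchange itself can be sketched roughly as follows using PUN 1.x-era Photon calls; the component name, the once-per-second send rate, and the RPC name are all assumptions for illustration:

```csharp
using UnityEngine;

// Illustrative sketch of the latitude/longitude exchange over Photon (PUN 1.x era).
// Not the project's actual code; Photon handles room joining and message delivery.
public class LocationSync : Photon.MonoBehaviour
{
    public RopeCompass compass; // the needle component sketched earlier
    float sendTimer;

    void Update()
    {
        if (Input.location.status != LocationServiceStatus.Running) return;

        sendTimer += Time.deltaTime;
        if (sendTimer >= 1f) // once per second is plenty at walking speeds
        {
            sendTimer = 0f;
            photonView.RPC("ReceiveLocation", PhotonTargets.Others,
                Input.location.lastData.latitude,
                Input.location.lastData.longitude);
        }
    }

    [PunRPC]
    void ReceiveLocation(float latitude, float longitude)
    {
        // Runs on the other user's device; feeds the needle its target.
        compass.remoteLatitude = latitude;
        compass.remoteLongitude = longitude;
    }
}
```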

The two user journeys are connected over the network: when User 1 pulls on the rope, User 2 sees and feels the mirrored effect (animation plus vibrotactile feedback) on their own device.
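On the receiving side, the tug magnitude has to become a vibration. Unity 5.x’s built-in Handheld.Vibrate() has no amplitude control, so the sketch below approximates magnitude with pulse count; the shipped haptic effects may well have been tuned differently (for example through a native Android plugin):

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch: map an incoming tug's stretch (0..1) to haptic pulses.
// Handheld.Vibrate() is fixed-strength, so magnitude is approximated by pulse count.
public class TugHaptics : MonoBehaviour
{
    // Call this when the peer's tug arrives over the network.
    public void OnRemoteTug(float stretch)
    {
        int pulses = Mathf.CeilToInt(Mathf.Lerp(1f, 5f, Mathf.Clamp01(stretch)));
        StartCoroutine(Pulse(pulses));
    }

    IEnumerator Pulse(int pulses)
    {
        for (int i = 0; i < pulses; i++)
        {
            Handheld.Vibrate();
            yield return new WaitForSeconds(0.15f);
        }
    }
}
```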