3D Gestural Interfaces for Mobile

In this lightning talk, Aidan Feldman demos a prototype he built to manipulate virtual 3D objects using a mobile device. "Gestures" on a phone generally mean tapping, swiping, and so on, but all of these are confined to the 2D plane of the screen. Using the movement of the phone itself opens up new dimensions for gestural interfaces. With a gyroscope-enabled mobile device in hand and a 3D model displayed on the computer, this project passes device orientation data from the phone to the display over WebSockets, adjusting the model in real time.
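The phone-to-display pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the WebSocket endpoint, function names, and the idea of sending the browser's `deviceorientation` Euler angles (alpha, beta, gamma) as JSON are all assumptions for the example.

```javascript
// Package the three Euler angles the DeviceOrientation API reports
// (in degrees) into a compact JSON message for the display side.
function orientationMessage(alpha, beta, gamma) {
  return JSON.stringify({
    alpha, // rotation around the z axis (0 to 360)
    beta,  // front-back tilt (-180 to 180)
    gamma  // left-right tilt (-90 to 90)
  });
}

// Convert degrees (what the sensor reports) to radians
// (what most 3D libraries expect for rotations).
function toRadians(deg) {
  return deg * Math.PI / 180;
}

// In the phone's browser, this would run on every sensor reading
// (endpoint is hypothetical):
//
//   const socket = new WebSocket('ws://example.local:8080');
//   window.addEventListener('deviceorientation', (e) => {
//     if (socket.readyState === WebSocket.OPEN) {
//       socket.send(orientationMessage(e.alpha, e.beta, e.gamma));
//     }
//   });
//
// The display side parses each message and applies the angles
// (converted to radians) to the 3D model's rotation.
```

The key design point is that the phone only streams raw orientation; all rendering stays on the display machine, which keeps the per-message payload tiny and the update loop fast.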


The GitHub repo for the project is here.

This talk was presented at the Node.js meetup at Spotify in New York.