This is the latest project I have been working on this semester: a six-week project on Interaction Design. We were asked to pick any topic we were interested in. Initially I took up “Interactive Movie” but later I changed it to Gesture Recognition. Apart from exploring the technology, we read a lot of theory related to interaction and New Media design in particular. The books suggested to us explained the fundamentals of different aspects of design. Another reason for the book study was to inculcate the habit of reading in us – especially the engineers, who have lost touch with reading or never had the habit. Not only did we have to study, we also had to present our readings in front of the class so that each of us shared our knowledge of a particular topic. This was one of the most interesting projects I did at NID.
Human gestures are mainly divided into three categories:
- Facial Gestures
- Hand Gestures
- Full body Gestures
I started by studying the gestures an aircraft marshaller uses to guide aircraft. Then I tried to capture these gestures through blob detection. The main problem with blob detection was that the blobs were fixed on screen, so detection depended on the height of the person using it. Also, any disturbance over a blob would be captured as motion.
I recognized two gestures during this phase:
- Wingwalker / Guide
- Straight Ahead
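The fixed-blob approach above can be sketched roughly as follows. This is a minimal illustration, not the actual code from the project: frames are plain 2D lists of grayscale values, and the region names and thresholds are invented for the example.

```python
# A minimal sketch of fixed-region blob detection, assuming grayscale
# frames are 2D lists of pixel intensities. Region names and thresholds
# are illustrative, not from the original project.

def motion_in_region(prev, curr, region, threshold=30):
    """Return True if pixels inside the fixed region changed noticeably."""
    x0, y0, x1, y1 = region
    changed = sum(
        1
        for y in range(y0, y1)
        for x in range(x0, x1)
        if abs(curr[y][x] - prev[y][x]) > threshold
    )
    area = (x1 - x0) * (y1 - y0)
    return changed / area > 0.2  # blob "fires" if 20% of its area changed

# Fixed on-screen hotspots: a taller user's hands land inside them, a
# shorter user's may not -- the height-dependence problem described above.
LEFT_HAND = (0, 0, 4, 4)
RIGHT_HAND = (6, 0, 10, 4)

prev = [[0] * 10 for _ in range(10)]
curr = [[0] * 10 for _ in range(10)]
for y in range(4):
    for x in range(4):
        curr[y][x] = 255  # bright change inside the left hotspot only

print(motion_in_region(prev, curr, LEFT_HAND))   # True: left hotspot fires
print(motion_in_region(prev, curr, RIGHT_HAND))  # False: right stays quiet
```

Because the hotspots never move, any stray disturbance inside one (a shadow, another person walking by) counts as a gesture, and hands that fall outside them are simply missed.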
Then I started looking at ways to capture gestures or movements independent of the user’s height or position. A few ideas that emerged were color tracking, tracking IR (infrared) LEDs, and tracking fiducial markers.
Color tracking was too dependent on the light reflecting off the color. The settings that detected the color during daylight would not work at night under artificial lighting.
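A small illustration of why fixed color thresholds break when lighting changes: the same red object, seen under dimmer light, falls outside an RGB range tuned for daylight. The pixel values and thresholds here are made up for the example.

```python
# Illustrates the lighting problem with color tracking: an RGB range
# tuned in daylight misses the same object under dim artificial light.
import colorsys

def in_rgb_range(pixel, lo, hi):
    """True if every channel of the pixel lies inside the threshold box."""
    return all(lo[i] <= pixel[i] <= hi[i] for i in range(3))

daylight_red = (220, 40, 30)   # marker color under daylight (invented)
evening_red = (120, 22, 16)    # same marker under dim lighting (invented)

# Thresholds tuned during the day...
lo, hi = (180, 0, 0), (255, 80, 60)
print(in_rgb_range(daylight_red, lo, hi))  # True
print(in_rgb_range(evening_red, lo, hi))   # False -- tracking fails at night

# Hue is far more stable across brightness changes than raw RGB,
# which is why trackers usually threshold in HSV space instead:
h_day, _, _ = colorsys.rgb_to_hsv(*(c / 255 for c in daylight_red))
h_eve, _, _ = colorsys.rgb_to_hsv(*(c / 255 for c in evening_red))
print(abs(h_day - h_eve) < 0.02)  # True: nearly identical hue
```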
Fiducial markers seemed a more accurate and dependable choice for detecting movements, so I started with basic fiducial tracking, getting the X, Y coordinates of the markers on screen. Initially I got the markers printed on five faces of a cube.
Later I got the markers printed on a thick board and attached it to an elastic band, so that I could wear it easily.
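The basic tracking step looks something like this: a fiducial tracker (reacTIVision and ARToolKit are common examples) reports the corner points of each detected marker, and those are reduced to a single on-screen position. The corner coordinates below are hard-coded stand-ins for tracker output.

```python
# A minimal sketch of turning a detected fiducial marker's four corner
# points into a single on-screen (x, y) position. The corner values are
# hard-coded here in place of real tracker output.

def marker_centre(corners):
    """Average the four corner points to get the marker's centre."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    return sum(xs) / 4, sum(ys) / 4

# Four corners of one detected marker (illustrative values)
corners = [(100, 50), (140, 52), (138, 92), (98, 90)]
print(marker_centre(corners))  # (119.0, 71.0)
```

Tracking this centre point frame by frame gives a trajectory that is independent of where on screen the user happens to stand, which is exactly what the fixed blobs could not provide.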
I recognized a few basic gestures, like:
- STOP – right and left hand
- Turn the knob left and right
- Come Here / Go Away
- Zoom In / Zoom Out
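One way gestures like these can be read off the marker trajectories is to compare positions at the start and end of a gesture window. The sketch below shows only the zoom case, classifying zoom in/out from the change in distance between the two hand markers; the threshold and coordinates are invented for illustration.

```python
# A hedged sketch of classifying the zoom gesture from two marker
# trajectories: hands moving apart -> zoom in, together -> zoom out.
import math

def classify_zoom(left_track, right_track, threshold=20):
    """Compare hand separation at the start and end of the window."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    start = dist(left_track[0], right_track[0])
    end = dist(left_track[-1], right_track[-1])
    if end - start > threshold:
        return "ZOOM IN"     # hands moved apart
    if start - end > threshold:
        return "ZOOM OUT"    # hands moved together
    return "NONE"            # not enough movement to count

left = [(200, 300), (180, 300), (150, 300)]   # left hand moving left
right = [(400, 300), (420, 300), (450, 300)]  # right hand moving right
print(classify_zoom(left, right))  # ZOOM IN
```

The other gestures follow the same pattern with different features: direction of motion for come-here/go-away, and the markers' rotation for turning the knob.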
Once the gestures were being recognized reliably, I was asked to map an application onto them. So I made a 3D interactive globe that can be rotated in any direction and also zoomed in and out.
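The mapping from gesture to globe can be sketched as below: frame-to-frame marker displacement becomes a rotation delta, and the zoom gesture scales the view. The class, sensitivity constant, and zoom limits are all invented for the example; the real project's rendering code is not shown here.

```python
# A sketch of mapping marker motion onto a 3D globe: pixel displacement
# becomes rotation, and a detected zoom gesture scales the view.
# Sensitivity and zoom limits are illustrative values.

class Globe:
    def __init__(self):
        self.yaw = 0.0    # rotation about the vertical axis, degrees
        self.pitch = 0.0  # rotation about the horizontal axis, degrees
        self.zoom = 1.0   # view scale factor

    def drag(self, dx, dy, sensitivity=0.5):
        """Marker displacement in pixels -> rotation in degrees."""
        self.yaw += dx * sensitivity
        self.pitch += dy * sensitivity

    def apply_zoom(self, factor):
        """Scale the view, clamped so the globe stays usable."""
        self.zoom = max(0.5, min(4.0, self.zoom * factor))

globe = Globe()
globe.drag(40, -10)       # marker moved right and up
globe.apply_zoom(1.25)    # "zoom in" gesture detected
print(globe.yaw, globe.pitch, globe.zoom)  # 20.0 -5.0 1.25
```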
Gesture Recognition in action!