Interaction with on-screen objects using visual gesture recognition
Abstract
This paper describes the design of a working system that visually recognizes hand gestures for the control of a window-based user interface. After an overview of the system, it explores one aspect of gestural interaction, hand tracking, in depth, and examines what is needed for the user to interact comfortably with on-screen objects. We describe how the location of the hand is mapped to a location on the screen, and how it is both necessary and possible to smooth the camera input using a non-linear physical model of the cursor. The performance of the system is examined, especially with respect to object selection. We show how a standard HCI model of object selection (Fitts' Law) can be extended to model the selection performance of free-hand pointing.
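The abstract does not spell out the smoothing model, but the idea can be sketched as follows. In this minimal Python sketch, the class name, parameter values, and the particular velocity-dependent damping law are illustrative assumptions, not the authors' formulation: the cursor is treated as a physical body pulled toward the tracked hand position by a spring, with damping that is strong at low speed (absorbing camera jitter) and weaker during fast, deliberate motion.

    class CursorFilter:
        """One axis of a smoothed cursor; run two instances for (x, y)."""

        def __init__(self, stiffness=120.0, base_damping=8.0, jitter_damping=40.0):
            # All parameter values here are illustrative assumptions.
            self.stiffness = stiffness            # spring pull toward the hand
            self.base_damping = base_damping      # damping at high cursor speed
            self.jitter_damping = jitter_damping  # extra damping near rest
            self.pos = 0.0                        # cursor position
            self.vel = 0.0                        # cursor velocity

        def update(self, hand_pos, dt):
            # Spring force pulls the cursor toward the tracked hand position.
            error = hand_pos - self.pos
            # Non-linearity: damping is strongest when the cursor is nearly
            # at rest, freezing out tracker jitter; it relaxes as speed grows,
            # so large deliberate motions are followed closely.
            damping = self.base_damping + self.jitter_damping / (1.0 + abs(self.vel))
            self.vel += (self.stiffness * error - damping * self.vel) * dt
            self.pos += self.vel * dt
            return self.pos

Fed one tracked hand coordinate per camera frame (dt of roughly 1/30 s), a filter of this kind holds the cursor steady against jitter while still yielding quickly to intentional movement.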
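For context, Fitts' Law in its common Shannon formulation predicts the movement time MT needed to acquire a target of width W at distance D as

\[ MT = a + b \log_2\!\left(\frac{D}{W} + 1\right), \]

where a and b are constants fitted empirically for a given input device; the paper extends this model to selection by free-hand pointing.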