
A Prototype System for Computer Vision Based Human Computer Interaction

Lars Bretzner, Ivan Laptev, Tony Lindeberg, Sören Lenman, Yngve Sundblad

Technical report CVAP251, ISRN KTH NA/P--01/09--SE. Department of Numerical Analysis and Computer Science, KTH (Royal Institute of Technology), S-100 44 Stockholm, Sweden, April 23-25, 2001.

Demo presented at the Swedish IT-fair Connect 2001, Älvsjömässan, Stockholm, Sweden, April 2001. View the news reports about our gesture control system at www.idg.se, Dagens IT.

Introduction

With the ongoing development of information technology, we can expect computer systems to become increasingly embedded in our environment. Such environments will create a need for new types of human-computer interaction, with interfaces that are natural and easy to use. In particular, the ability to interact with computerized equipment without special external devices is attractive.

Today, the keyboard, the mouse and the remote control are the main interfaces for transferring information and commands to computerized equipment. In some applications involving three-dimensional information, such as visualization, computer games and robot control, other interfaces based on trackballs, joysticks and data gloves are used. In our daily lives, however, we humans use vision and hearing as our main sources of information about the environment. One may therefore ask to what extent it would be possible to develop computerized equipment that communicates with humans in a similar way, by understanding visual and auditory input.

Perceptual interfaces based on speech have already found a number of commercial and technical applications. For example, systems are now available in which speech commands can be used for dialing numbers on cellular phones or for making ticket reservations. Concerning visual input, the processing power of computers has reached a point where real-time processing of visual information is possible on common workstations.

The purpose of this article is to describe ongoing work on developing new perceptual interfaces, with emphasis on commands expressed as hand gestures. Examples of applications of hand gesture analysis include:

  • Control of consumer electronics
  • Interaction with visualization systems
  • Control of mechanical systems
  • Computer games

Potential advantages of using visual input in this context are that visual information makes it possible to communicate with computerized equipment at a distance, without physical contact with the equipment being controlled. Moreover, no specialized external equipment, such as a remote control, is needed. The idea is that users should be able to control the equipment just as they are, with their bare hands; a minimal illustrative sketch of such a control loop follows below.
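
To make the idea concrete, the following is a minimal sketch, in Python with OpenCV, of the simplest possible vision-based control loop: grab camera frames, segment skin-coloured pixels, and track the largest resulting blob. This is deliberately naive and is not the feature-based tracking and posture recognition method developed in this work (see the related publications below); the HSV skin-colour bounds are hypothetical and would need tuning to the camera and lighting at hand.

    import cv2          # assumes OpenCV 4.x (pip install opencv-python)
    import numpy as np

    # Hypothetical HSV skin-colour bounds; real values depend on camera and lighting.
    SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
    SKIN_HIGH = np.array([25, 180, 255], dtype=np.uint8)

    def largest_skin_blob(frame):
        """Return (centroid, area) of the largest skin-coloured region, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
        # Remove small noise with a morphological opening.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
        return centroid, cv2.contourArea(blob)

    cap = cv2.VideoCapture(0)                     # default camera
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = largest_skin_blob(frame)
        if result is not None:
            (cx, cy), area = result
            # In a real interface, the hand position/size would be mapped to a
            # command, e.g. moving a cursor or adjusting a volume level.
            cv2.circle(frame, (int(cx), int(cy)), 10, (0, 255, 0), 2)
        cv2.imshow("hand tracking sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):     # press 'q' to quit
            break
    cap.release()
    cv2.destroyAllWindows()

In a deployed interface, the tracked position and size of the blob would be mapped to commands, and raw colour segmentation would be replaced by more robust simultaneous hand tracking and hand posture recognition of the kind described in the related publications.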

PDF (522 kb)

Video clips illustrating applications of gesture control:

Related publications:

  • Feature-based method for simultaneous hand tracking and hand posture recognition including colour cues
  • Earlier version of the feature-based method for simultaneous hand tracking and hand posture recognition
  • Direct method for simultaneous hand tracking and hand posture recognition

Related project: Computer vision based human-computer interaction

Responsible for this page: Tony Lindeberg, Lars Bretzner