Abstract
Recently, attention has focused on the development of computer-user interfaces that combine digital information with physical environments. In this thesis, the author pursued the hypothesis that Computer Vision is a technique with great potential to support the concept of marrying the digital world with the physical one. This was demonstrated by developing a novel sensing technology that allows users to employ a home computer and an ordinary web camera to detect multiple small physical objects, referred to as “Interactive Toys” (e.g. pawns, toy cars, animal figures), in a collaborative environment, referred to as the “Interactive Toys Environment” (e.g. playsets or board games).
The study comprises the development of novel algorithms that detect the intrusion of hands into the camera’s view by monitoring users’ actions (“move and place”). When the user finishes the current task and withdraws their hands from the camera’s view, the system automatically responds according to the positions, directions and colours of the moved objects or toys. Toy tracking and identification are based on novel change detection algorithms that allow the system to determine which toy or toys were moved in the last turn. Moreover, particular effort was devoted to mapping the positions of the toys in the Interactive Toys Environment to the image coordinate system under various working conditions (e.g. camera vibration or slight background movement).
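The general idea of turn-based change detection can be illustrated with a minimal sketch. This is not the thesis’s actual algorithm; it is a simple frame-differencing scheme under assumed thresholds, showing how hand intrusion might be flagged (a large fraction of pixels changing at once) and how moved toys might be located (differencing the stable frames before and after a turn). All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def changed_fraction(frame_a, frame_b, threshold=30):
    """Fraction of pixels whose grey-level difference exceeds threshold.

    Frames are assumed to be 8-bit greyscale arrays of equal shape.
    """
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return float(np.mean(diff > threshold))

def hand_present(reference, current, intrusion_ratio=0.05, threshold=30):
    """Flag a hand intrusion when a large share of the view has changed.

    A hand sweeping over the play area alters far more pixels than a
    single small toy, so a coarse ratio test can separate the two cases.
    """
    return changed_fraction(reference, current, threshold) > intrusion_ratio

def moved_regions(before, after, threshold=30):
    """Boolean mask of pixels that differ between the pre-turn and
    post-turn frames, i.e. where a toy was moved, placed or removed."""
    diff = np.abs(before.astype(np.int16) - after.astype(np.int16))
    return diff > threshold
```

In a real pipeline, the “before” frame would be captured once the scene stabilises after the hands withdraw, and the changed regions would then be matched against known toy colours and positions; robustness to camera vibration would require additional registration or morphological filtering not shown here.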
The usability and effectiveness of the Interactive Toys Environment are demonstrated through a number of computer-game applications played by children. Some of these applications were evaluated in user studies, which showed that the Interactive Toys Environment provides a new application of computer vision that facilitates natural human-computer interaction (e.g. without forcing users to wear special equipment such as sensing gloves or to use chipped or tagged toys). It also encourages active thinking and social communication and improves the interaction between users and computers.
Date of Award: Mar 2005