A Synthetic Visual Environment with Hand Gesturing and Voice Input
01 January 1989
This paper describes work in progress on synthetic visual environments. We propose a practical system that can be applied to mechanical CAD and that can later serve as a platform for teleoperation. Instead of using expensive head-tracking and head-mounted display systems, we use a standard display and compute smooth-shaded images on an AT&T Pixel Machine. Our work has concentrated on using a VPL DataGlove to track the hand, bringing the synthetic world into the space of the user's hand. We describe some initial results in establishing a gesturing interface, to be augmented later with voice input. This interface has been used to implement a virtual control panel and some standard geometric modeling tasks. Finally, we outline some requirements for implementing an interface for teleoperation.
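As a rough illustration of the kind of gesturing interface the abstract mentions, a static hand pose from glove flex-sensor readings can be classified by nearest-template matching. This is a minimal sketch under stated assumptions: the five-value reading (one per finger, 0.0 straight to 1.0 fully bent), the template names, and the distance metric are all hypothetical and not taken from the paper's implementation.

```python
# Hypothetical sketch: recognizing a static hand pose from glove
# flex-sensor readings by nearest-template matching. The sensor layout
# (thumb, index, middle, ring, little; 0.0 = straight, 1.0 = fully bent)
# and the template poses are illustrative assumptions.

TEMPLATES = {
    "open":  (0.0, 0.0, 0.0, 0.0, 0.0),
    "fist":  (0.9, 0.9, 0.9, 0.9, 0.9),
    "point": (0.9, 0.0, 0.9, 0.9, 0.9),  # index finger extended
}

def classify(reading):
    """Return the name of the template pose closest to the reading
    under squared Euclidean distance."""
    def dist(name):
        return sum((r - t) ** 2 for r, t in zip(reading, TEMPLATES[name]))
    return min(TEMPLATES, key=dist)

# A noisy pointing-like pose is matched to the "point" template.
print(classify((0.8, 0.1, 0.85, 0.9, 0.8)))  # → point
```

A real gesture interface would also need hysteresis or a short dwell time before accepting a pose, so that transitional hand shapes between gestures are not misread as commands.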