
Augmented reality

Ivan Sutherland described a head-mounted display with half-silvered mirrors so that the wearer could see a virtual world superimposed on reality [12]. Sutherland's work, as well as more recent work by others [13], was characterized by its tethered nature. Because the wearer was tethered to a workstation, which was generally powered from an AC outlet, the apparatus was confined to a lab or other fixed location. Another reason augmented reality systems have been confined to a limited space is the use of two-part trackers that require the user, wearing one part, to remain near the other part, which is typically heavy and often carefully positioned relative to surrounding objects [13].

Generation-1 and generation-2 WearComp were typically characterized by a display over only one eye. Early wearable displays (such as the one in Fig 1(c)) were constructed with a CRT mounted above the eye and aimed downward, with a mirror at a 45-degree angle directing the light through a lens and into the eye. In some cases, the lens was placed between the mirror and the CRT, and a partially silvered mirror was used. Regardless of whether the display was see-through or blocked one eye completely, the perceptual effect was to see the computer screen as though it were overlaid on reality. This resulted in an augmented reality experience, but it differed from previous work by others in that I was free to roam about untethered while engaged in this experience.

Much more recently, Reflection Technology introduced a display called the ``Private Eye'', making it possible to put together a wearable computer from off-the-shelf board-level components and giving rise to a number of independent wearable computing research efforts that appeared simultaneously in the early 1990s [14]. Thad Starner, one of those who began wearable computing work in the 1990s, refers to the experience of seeing text on a non-see-through display as a form of augmented reality [15], because, even though the text completely blocks part of the vision in one eye, there is an illusion of overlay on account of the way the brain perceives input from both eyes as a combined image.

Unfortunately, the Private Eye-based wearable computers, unlike my earlier video-based systems, were primarily text-based. The Private Eye display consisted of a row of red LEDs that could be switched on and off rapidly, together with a vibrating mirror that swept the row across the field of view to create an apparent two-dimensional image, but this could not, of course, produce a good continuous-tone greyscale image. Thus I did not adopt this technology; instead, I envisioned, designed, and built the third generation of WearComp based on the notion that it should support personal imaging, yet be at least as small and unobtrusive as the off-the-shelf solutions built around the Private Eye.
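To make the greyscale limitation concrete, the following minimal Python sketch simulates the scanned-LED principle described above: a single line of on/off LEDs is swept across successive positions by the mirror, so an apparent two-dimensional image is built up one column at a time. The dimensions and function names here are assumptions chosen only for illustration, not the actual Private Eye specifications.

import numpy as np

# Assumed dimensions for illustration only.
N_LEDS = 280      # LEDs in the scanned line (hypothetical)
N_COLUMNS = 720   # mirror positions swept per frame (hypothetical)

def scan_frame(bilevel_frame):
    """Simulate one sweep of the vibrating mirror.

    At each mirror position, the LED line is loaded with one column of a
    bilevel frame buffer (each LED simply on or off); persistence of vision
    then fuses the swept columns into an apparent 2-D image.
    """
    assert bilevel_frame.shape == (N_LEDS, N_COLUMNS)
    image = np.zeros((N_LEDS, N_COLUMNS), dtype=np.uint8)
    for x in range(N_COLUMNS):            # mirror position selects a column
        image[:, x] = bilevel_frame[:, x]  # each LED is either on (1) or off (0)
    return image

# Because each LED is only on or off, a greyscale source must be reduced to
# one bit per pixel (e.g. by thresholding), losing continuous tone.
grey = np.random.default_rng(0).random((N_LEDS, N_COLUMNS))
bilevel = (grey > 0.5).astype(np.uint8)
apparent_image = scan_frame(bilevel)
print(apparent_image.shape, np.unique(apparent_image))

The sketch prints only the values 0 and 1, illustrating why such a display is well suited to text and line graphics but not to the continuous-tone imagery needed for personal imaging.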


