Publication | Closed Access
Augmented Reality through Wearable Computing
Year: 1997
Citations: 415
References: 9
Keywords: Wearable System, Engineering, Wearable Technology, Wearable Computer, Computer-mediated Reality, Virtual Reality, Affective Computing, Remembrance Agent, Multimodal Human Computer Interface, Assistive Technology, Mobile Computing, Augmented Reality, Real World, Graphical Overlay, Extended Reality, Business, Human-computer Interaction, Technology, Wearable Computing
Wearable computing moves computation from the desktop to the user. We are forming a community of networked, wearable-computer users to explore, over a long period, the augmented realities that these systems can provide. By adapting its behavior to the user's changing environment, a body-worn computer can assist the user more intelligently, consistently, and continuously than a desktop system. A text-based augmented reality, the Remembrance Agent, is presented to illustrate this approach. Video cameras are used both to warp the visual input (mediated reality) and to sense the user's world for graphical overlay. With a camera, the computer could track the user's finger to act as the system's mouse; perform face recognition; and detect passive objects to overlay 2.5D and 3D graphics onto the real world. Additional apparatus such as audio systems, infrared beacons for sensing location, and biosensors for learning about the wearer's affect are described. With the use of input from these interface devices and sensors, a long-term goal of this project is to model the user's actions, anticipate his or her needs, and perform a seamless interaction between the virtual and physical environments.