3 results for Paper hand-held record PHR
in SAPIENTIA - Universidade do Algarve - Portugal
Abstract:
The goal of the project "SmartVision: active vision for the blind" is to develop a small, portable, yet intelligent and reliable system for assisting the blind and visually impaired while navigating autonomously, both outdoors and indoors. In this paper we present an overview of the prototype, its design issues, and its different modules, which integrate a GIS with GPS, Wi-Fi, RFID tags, and computer vision. The prototype addresses global navigation by following known landmarks, local navigation with path tracking and obstacle avoidance, and object recognition. The system does not replace the white cane, but extends it beyond its reach. The user-friendly interface consists of a 4-button hand-held box, a vibration actuator in the handle of the cane, and speech synthesis. A future version may also employ active RFID tags for marking navigation landmarks, and speech recognition may complement speech synthesis.
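The global-navigation idea described above, following known landmarks stored in a GIS against live GPS fixes, can be illustrated with a minimal sketch. The landmark names, coordinates, and the distance threshold below are hypothetical, not taken from the SmartVision prototype:

```python
import math
from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    lat: float
    lon: float

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_landmark(position, landmarks):
    """Return the closest known landmark and its distance in metres,
    the basic step in following a route of GIS landmarks."""
    lat, lon = position
    best = min(landmarks, key=lambda lm: haversine_m(lat, lon, lm.lat, lm.lon))
    return best, haversine_m(lat, lon, best.lat, best.lon)

# Illustrative landmark list (coordinates are made up for this sketch).
route = [
    Landmark("main entrance", 37.0440, -7.9720),
    Landmark("library door", 37.0452, -7.9731),
]
lm, dist = nearest_landmark((37.0441, -7.9721), route)
```

In a real prototype this distance would be one cue among several (Wi-Fi, RFID, vision), since GPS alone is too coarse and unavailable indoors.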
Abstract:
The general goal of my research is to find out what is questioned whenever an animated film is made by an author who chooses to have maximum control over the device's automatisms. I am trying to understand in what ways that specific kind of film relates to Cinema and the History of Art as a whole and, more specifically, how its filmic discourse is built within cinematic codes, workings, and machinery. This paper, in particular, aims to establish that each time an author makes a film by suspending both automatic 'motion' and image-recording functions—what is often known as "cameraless" film—a process is initiated that questions not only Cinema, in terms of both expression and technology, but also the ontological position this same technology occupies in current media.
Abstract:
Human-robot interaction is an interdisciplinary research area which aims at integrating human factors, cognitive psychology, and robot technology. The ultimate goal is the development of social robots. These robots are expected to work in human environments and to understand the behavior of people through gestures and body movements. In this paper we present a biological, real-time framework for detecting and tracking hands. This framework is based on keypoints extracted from cortical V1 end-stopped cells. Detected keypoints and the cells' responses are used to classify the junction type. By combining annotated keypoints in a hierarchical, multi-scale tree structure, moving and deformable hands can be segregated, their movements can be obtained, and they can be tracked over time. By using hand templates with keypoints at only two scales, a hand's gestures can be recognized.
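The final step above, matching observed keypoints against hand templates defined at only two scales, can be sketched with a toy nearest-neighbour score. The template names, keypoint coordinates, and scoring function here are illustrative assumptions, not the paper's actual method:

```python
import math

# Each template holds keypoints at two scales (coarse and fine), echoing the
# two-scale templates in the abstract; all coordinates are made up.
TEMPLATES = {
    "open_hand": {
        "coarse": [(0.5, 0.5)],
        "fine": [(0.2, 0.1), (0.4, 0.05), (0.6, 0.05), (0.8, 0.1), (0.9, 0.4)],
    },
    "fist": {
        "coarse": [(0.5, 0.6)],
        "fine": [(0.35, 0.45), (0.5, 0.4), (0.65, 0.45)],
    },
}

def _set_distance(pts_a, pts_b):
    """Mean nearest-neighbour distance from pts_a to pts_b
    (a simple one-sided Chamfer-style score)."""
    return sum(min(math.dist(a, b) for b in pts_b) for a in pts_a) / len(pts_a)

def classify_gesture(coarse_kps, fine_kps):
    """Pick the template whose two-scale keypoint sets best match the input."""
    def score(tpl):
        return (_set_distance(coarse_kps, tpl["coarse"])
                + _set_distance(fine_kps, tpl["fine"]))
    return min(TEMPLATES, key=lambda name: score(TEMPLATES[name]))

gesture = classify_gesture([(0.5, 0.55)],
                           [(0.36, 0.46), (0.5, 0.41), (0.64, 0.44)])
```

The two-scale split keeps matching cheap: the coarse keypoint roughly localizes the hand, while the fine keypoints discriminate between gestures.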