9 results for Wearable Computing Augmented Reality User Interface Smart Glass Android
in CentAUR: Central Archive University of Reading - UK
Abstract:
Augmented Reality systems overlay computer-generated information onto a user's natural senses. Where this additional information is visual, it is overlaid on the user's natural visual field of view through a head-mounted (or “head-up”) display device. Integrated Home Systems provides a network that links every electrical device in the home, giving the user both control and data transparency across the network.
Abstract:
The problems encountered by individuals with disabilities when accessing large public buildings are described, and a solution based on the generation of virtual models of the built environment is proposed. These models are superimposed on a control network infrastructure of the kind currently utilised in intelligent building applications such as lighting, heating and access control. The use of control network architectures facilitates the creation of distributed models that closely mirror both the physical and control properties of the environment. The model of the environment is kept local to the installation, which allows the virtual representation of a large building to be decomposed into an interconnecting series of smaller models. This paper describes two methods of interacting with the virtual model: firstly, a two-dimensional aural representation that can be used as the basis of a portable navigational device; secondly, an augmented reality system called DAMOCLES that overlays additional information on a user’s normal field of view. The provision of virtual environments offers new possibilities in the man-machine interface, so that intuitive access to network-based services and control functions can be given to a user.
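The decomposition described above — a large building represented as smaller, locally held zone models interconnected over the control network — can be sketched as a simple graph that a portable navigational aid could traverse. This is a minimal illustration with hypothetical zone names and device lists, not the paper's implementation:

```python
class ZoneModel:
    """A local virtual model of one building zone and its control points."""
    def __init__(self, name, devices):
        self.name = name
        self.devices = devices      # e.g. lighting, heating, access control
        self.neighbours = []        # links to adjacent zone models

    def connect(self, other):
        # Interconnect two zone models so the composite building model
        # can be traversed without any single global representation.
        self.neighbours.append(other)
        other.neighbours.append(self)

def route(start, goal):
    """Breadth-first search across the interconnected zone models."""
    frontier, seen = [[start]], {start}
    while frontier:
        path = frontier.pop(0)
        if path[-1] is goal:
            return [z.name for z in path]
        for nxt in path[-1].neighbours:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

# Hypothetical three-zone building
foyer = ZoneModel("foyer", ["lighting", "access"])
lift = ZoneModel("lift", ["access"])
ward = ZoneModel("ward", ["lighting", "heating"])
foyer.connect(lift)
lift.connect(ward)
print(route(foyer, ward))   # a path a navigation aid could announce
```

Each `ZoneModel` stays local to its installation; only the links between them need to cross the network, which is what makes the decomposition scale to large buildings.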
Abstract:
The interface between humans and technology is a rapidly changing field. In particular, as technological methods have improved dramatically, interaction has become possible that could only be speculated about even a decade earlier. This interaction can, though, take on a wide range of forms. Indeed, standard buttons and dials with televisual feedback are perhaps a common example. But now virtual reality systems, wearable computers and, most of all, implant technology are throwing up a completely new concept, namely a symbiosis of human and machine. No longer is it sensible simply to consider how a human interacts with a machine, but rather how the human-machine symbiotic combination interacts with the outside world. In this paper we take a look at some of the recent approaches, putting implant technology in context. We also consider some specific practical examples which may well alter the way we look at this symbiosis in the future. The main area of interest as far as symbiotic studies are concerned is clearly the use of implant technology, particularly where a connection is made between technology and the human brain and/or nervous system. Often pilot tests and experimentation have been carried out a priori to investigate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies are discussed briefly here. The paper however concentrates on human experimentation, in particular that carried out by the authors themselves, firstly to indicate what possibilities exist as of now with available technology, but perhaps more importantly to also show what might be possible with such technology in the future and how this may well have extensive social effects.
The driving force behind the integration of technology with humans on a neural level has historically been to restore lost functionality in individuals who have suffered neurological trauma such as spinal cord damage, or who suffer from a debilitating disease such as amyotrophic lateral sclerosis. Very few would argue against the development of implants to enable such people to control their environment, or some aspect of their own body functions. Indeed, this technology in the short term has applications for amelioration of symptoms for the physically impaired, such as alternative senses being bestowed on a blind or deaf individual. However, the issue becomes distinctly more complex when it is proposed that such technology be used on those with no medical need, but who instead wish to enhance and augment their own bodies, particularly in terms of their mental attributes. These issues are discussed here in the light of practical experimental test results and their ethical consequences.
Abstract:
Pocket Data Mining (PDM) is our new term describing collaborative mining of streaming data in mobile and distributed computing environments. With sheer amounts of streaming data now available for subscription on our smart mobile phones, the potential of using these data for decision making with data stream mining techniques has become achievable owing to the increasing power of these handheld devices. Wireless communication among these devices using Bluetooth and WiFi technologies has opened the door wide for collaborative mining among mobile devices within the same range that are running data mining techniques targeting the same application. This paper proposes a new architecture that we have prototyped for realizing the significant applications in this area. We have proposed using mobile software agents in this application for several reasons. Most importantly, the autonomic, intelligent behaviour of agent technology has been the driving force for using it in this application. Other efficiency reasons are discussed in detail in this paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.
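The collaborative-mining idea above — several co-located devices, each running its own mining technique on the same stream, pooling their answers — can be sketched as a toy ensemble. The agent names and rules below are hypothetical illustrations, not the paper's prototyped architecture or agent platform:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine one prediction per agent into a single decision."""
    return Counter(predictions).most_common(1)[0][0]

# Each "agent" is just a labelled rule here; in PDM-style settings each
# would wrap a data stream mining technique on a nearby mobile device,
# reachable over Bluetooth or WiFi.
agents = {
    "agent_a": lambda x: "spam" if x["links"] > 3 else "ham",
    "agent_b": lambda x: "spam" if x["caps_ratio"] > 0.5 else "ham",
    "agent_c": lambda x: "spam" if x["links"] > 1 else "ham",
}

record = {"links": 2, "caps_ratio": 0.7}          # one stream record
votes = [classify(record) for classify in agents.values()]
print(majority_vote(votes))   # decision from the agent collective
```

The design point is that no single device needs the full model or the full stream; each contributes a local judgement and the owner's device aggregates them.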
Abstract:
A world of ubiquitous computing, full of networked mobile and embedded technologies, is approaching. The benefits of this technology are numerous, and act as the major driving force behind its development. These benefits are brought about, in part, by ubiquitous monitoring (UM): the continuous and widespread collection of significant amounts of data about users.
Abstract:
In their contribution to PNAS, Penner et al. (1) used a climate model to estimate the radiative forcing by the aerosol first indirect effect (cloud albedo effect) in two different ways: first, by deriving a statistical relationship between the logarithm of cloud droplet number concentration, ln Nc, and the logarithm of aerosol optical depth, ln AOD (or the logarithm of the aerosol index, ln AI) for present-day and preindustrial aerosol fields, a method that was applied earlier to satellite data (2), and, second, by computing the radiative flux perturbation between two simulations with and without anthropogenic aerosol sources. They find a radiative forcing that is a factor of 3 lower in the former approach than in the latter [as Penner et al. (1) correctly noted, only their “inline” results are useful for the comparison]. This study is a very interesting contribution, but we believe it deserves several clarifications.
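The first approach described above rests on a fitted linear relationship between ln Nc and ln AOD. As a minimal numerical sketch (synthetic values only, not Penner et al.'s fields), the slope of that log-log regression recovers the assumed power-law exponent relating droplet number to aerosol optical depth:

```python
import numpy as np

aod = np.array([0.05, 0.1, 0.2, 0.4, 0.8])   # aerosol optical depth (synthetic)
nc = 80.0 * aod ** 0.5                       # droplet number conc., assuming Nc ∝ AOD^0.5

# Statistical relationship between ln Nc and ln AOD: fit a straight line
# in log-log space; the slope is the sensitivity used by the first method.
slope, intercept = np.polyfit(np.log(aod), np.log(nc), 1)
print(round(slope, 3))   # recovers the exponent 0.5 built into the data
```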
Abstract:
Smart healthcare is a complex domain for systems integration due to the human and technical factors and heterogeneous data sources involved. As a part of the smart city, it is a complex area in which clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many challenges regarding integration and interoperability, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to conduct an analysis of data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We select an on-going project at a local hospital for our case study. The project is to achieve data sharing and interoperability among Radiology Information Systems (RIS), Electronic Patient Record (EPR), and Picture Archiving and Communication Systems (PACS). Qualitative data collection and analysis methods are used. The data sources consisted of documentation including publications and internal working papers, one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists and secretaries. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding the integration in the radiology context.
Abstract:
In this study we applied a smart biomaterial formed from a self-assembling, multi-functional synthetic peptide amphiphile (PA) to coat substrates with various surface chemistries. The combination of PA coating and alignment-inducing functionalised substrates provided a template to instruct human corneal stromal fibroblasts to adhere, become aligned and then bio-fabricate a highly ordered, multi-layered, three-dimensional tissue by depositing an aligned, native-like extracellular matrix. The newly-formed corneal tissue equivalent was subsequently able to eliminate the adhesive properties of the template and govern its own complete release via the action of endogenous proteases. Tissues recovered through this method were structurally stable, easily handled, and carrier-free. Furthermore, topographical and mechanical analysis by atomic force microscopy showed that tissue equivalents formed on the alignment-inducing PA template had highly ordered, compact collagen deposition, with a two-fold higher elastic modulus compared to the less compact tissues produced on the non-alignment template, the PA-coated glass. We suggest that this technology represents a new paradigm in tissue engineering and regenerative medicine, whereby all processes for the biofabrication and subsequent self-release of natural, bioprosthetic human tissues depend solely on simple template-tissue feedback interactions.