935 results for INFORMATICS
Abstract:
Adequate hand-washing has been shown to be a critical activity in preventing the transmission of infections such as MRSA in health-care environments. Hand-washing guidelines published by various health-care institutions recommend a technique incorporating six hand-washing poses that ensure all areas of the hands are thoroughly cleaned. In this paper, an embedded wireless vision system (VAMP) capable of accurately monitoring hand-washing quality is presented. The VAMP system hardware consists of a low-resolution CMOS image sensor and an FPGA processor, which are integrated with a microcontroller and a ZigBee-standard wireless transceiver to create a wireless sensor network (WSN) based vision system that can be retargeted at a variety of health-care applications. The device captures and processes images locally in real time, determines whether hand-washing procedures have been correctly undertaken, and then passes the resulting high-level data over a low-bandwidth wireless link. The paper outlines the hardware and software mechanisms of the VAMP system and illustrates that it offers an easy-to-integrate sensor solution for adequately monitoring and improving hand hygiene quality. Future work to develop a miniaturized, low-cost system capable of being integrated into everyday products is also discussed.
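The abstract does not give implementation details, but the data flow it describes (process frames locally, then send only a compact high-level result over the low-bandwidth link) can be illustrated with a short hedged sketch. The classify_pose routine, the two-byte message layout and the simulated frames below are hypothetical placeholders, not the VAMP firmware.

    # Hedged sketch of the high-level VAMP data flow described above:
    # process frames locally, transmit only a compact summary.
    # classify_pose() and the payload format are illustrative stand-ins.
    import struct
    import numpy as np

    POSES = ["palm_to_palm", "back_of_hands", "interlaced_fingers",
             "backs_of_fingers", "thumbs", "fingertips"]   # the six recommended poses

    def classify_pose(frame: np.ndarray) -> int:
        """Placeholder classifier: the real system runs its vision pipeline on the FPGA."""
        return int(frame.mean()) % len(POSES)

    def summarise(session_poses: set[int]) -> bytes:
        """Pack the result as one completeness byte plus a pose bitmask."""
        bitmask = 0
        for p in session_poses:
            bitmask |= 1 << p
        complete = int(len(session_poses) == len(POSES))
        return struct.pack("BB", complete, bitmask)

    # Simulated session: low-resolution frames processed locally;
    # only two bytes would need to cross the ZigBee link.
    frames = [np.random.randint(0, 255, (64, 64), dtype=np.uint8) for _ in range(30)]
    seen = {classify_pose(f) for f in frames}
    payload = summarise(seen)
    print("payload to transmit:", payload.hex())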
Abstract:
In this research we focus on energy-aware topology management for the Tyndall 25mm and 10mm nodes, to extend sensor network lifespan and optimise node power consumption. The two-tiered Tyndall Heterogeneous Automated Wireless Sensors (THAWS) tool is used to quickly create and configure application-specific sensor networks. To this end, we propose to implement a distributed route-discovery algorithm and a practical energy-aware reaction model on the 25mm nodes. Triggered by energy-warning events, the miniaturised Tyndall 10mm data-collector nodes adaptively and periodically change their association to 25mm base-station nodes, while the 25mm nodes also change the interconnections between themselves, which results in a reconfiguration of the 25mm node tier topology. The distributed routing protocol uses combined weight functions to balance the sensor network traffic. A system-level simulation is used to quantify the benefit of the route-management framework, in terms of system power saving, when compared with other state-of-the-art approaches.
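The abstract mentions a distributed route-discovery protocol that uses combined weight functions to balance traffic, but does not state the weights. The sketch below assumes, purely for illustration, a linear combination of hop count, residual energy and recent load; it is not the THAWS protocol.

    # Hedged sketch of next-hop selection with a combined weight function.
    # The weight terms and coefficients are assumptions, not the published protocol.
    from dataclasses import dataclass

    @dataclass
    class Neighbour:
        node_id: int
        hops_to_sink: int       # hop count advertised by the neighbour
        residual_energy: float  # 0.0 (empty) .. 1.0 (full)
        recent_load: float      # normalised traffic forwarded recently

    def route_weight(n: Neighbour, a=1.0, b=2.0, c=1.0) -> float:
        """Lower is better: few hops, plenty of energy, little congestion."""
        return a * n.hops_to_sink + b * (1.0 - n.residual_energy) + c * n.recent_load

    def choose_next_hop(neighbours: list[Neighbour]) -> Neighbour:
        return min(neighbours, key=route_weight)

    table = [Neighbour(1, 2, 0.9, 0.1), Neighbour(2, 1, 0.2, 0.6), Neighbour(3, 2, 0.7, 0.2)]
    print("next hop:", choose_next_hop(table).node_id)

A node could re-run this selection whenever an energy-warning event arrives, which is one way to realise the adaptive re-association the abstract describes.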
Abstract:
The power consumption of wireless sensor network (WSN) modules is an important practical concern in building energy management (BEM) system deployments. A set of metrics is created to assess the power profiles of WSNs under real-world conditions. The aim of this work is to understand, and eventually eliminate, the uncertainties in WSN power consumption during long-term deployments, as well as their compatibility with existing and emerging energy-harvesting technologies. This paper investigates the key metrics of data processing, wireless data transmission, data sensing and the duty-cycle parameter to understand the system power profile from a practical deployment perspective. Based on the proposed analysis, the impact of each individual metric on power consumption in a typical BEM application is presented, and the corresponding low-power solutions are investigated.
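Among the metrics listed, the duty-cycle parameter lends itself to a simple back-of-the-envelope model: average power is the time-weighted sum of the active and sleep consumption. The current and voltage figures below are illustrative assumptions, not measurements from the paper.

    # Hedged sketch: duty-cycle model of average WSN node power.
    # All current/voltage figures are illustrative assumptions.
    def average_power_mw(duty_cycle: float,
                         active_current_ma: float = 20.0,  # radio + MCU awake
                         sleep_current_ma: float = 0.01,   # deep sleep
                         supply_voltage_v: float = 3.0) -> float:
        avg_current_ma = duty_cycle * active_current_ma + (1 - duty_cycle) * sleep_current_ma
        return avg_current_ma * supply_voltage_v

    for dc in (0.001, 0.01, 0.1):
        print(f"duty cycle {dc:>5}: {average_power_mw(dc):.3f} mW average")

Comparing such estimates against a harvester's power budget is one simple way to reason about compatibility with energy-harvesting supplies.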
Abstract:
Evaluation of temperature distribution in cold rooms is an important consideration in the design of food storage solutions. Two common approaches used in both industry and academia to address this question are the deployment of wireless sensors and modelling with Computational Fluid Dynamics (CFD). However, for a real-world evaluation of temperature distribution in a cold room, both approaches have their limitations. For wireless sensors, large-scale deployment (to obtain a high-resolution picture of the temperature distribution) is economically unfeasible, while CFD modelling alone is usually not accurate enough to give reliable results. In this paper, we propose a model-based framework which combines the wireless sensor technique with the CFD modelling technique to achieve a satisfactory trade-off between the number of wireless sensors required and the accuracy of the temperature profile in cold rooms. A case study is presented to demonstrate the usability of the framework.
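The framework itself is not detailed in the abstract. One common way to combine a coarse CFD prediction with a handful of sensor readings is to interpolate the sensor-versus-model residuals and use them to correct the modelled field; the sketch below assumes that approach purely for illustration and is not the paper's method.

    # Hedged sketch: correcting a CFD temperature field with sparse sensor readings
    # by interpolating the residuals (an assumed approach, not the paper's framework).
    import numpy as np
    from scipy.interpolate import griddata

    # Coarse CFD prediction on a 2-D grid of the cold room (degrees C), toy field.
    xs, ys = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 5, 25))
    cfd_temp = 2.0 + 0.1 * xs

    # A few wireless sensors: positions and measured temperatures.
    sensor_xy = np.array([[1.0, 1.0], [5.0, 2.5], [9.0, 4.0]])
    sensor_temp = np.array([2.3, 2.9, 3.4])

    # Residual = measurement - model at the sensor locations.
    model_at_sensors = 2.0 + 0.1 * sensor_xy[:, 0]
    residuals = sensor_temp - model_at_sensors

    # Spread the residuals over the whole grid and correct the CFD field.
    correction = griddata(sensor_xy, residuals, (xs, ys), method="nearest")
    corrected = cfd_temp + correction
    print("max correction applied:", float(np.abs(correction).max()), "degrees C")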
Abstract:
Traditionally, attacks on cryptographic algorithms looked for mathematical weaknesses in the underlying structure of a cipher. Side-channel attacks, however, look to extract secret-key information based on the leakage from the device on which the cipher is implemented, be it a smart card, microprocessor, dedicated hardware or personal computer. Attacks based on power consumption, electromagnetic emanations and execution time have all been practically demonstrated on a range of devices to reveal partial secret-key information from which the full key can be reconstructed. The focus of this thesis is power analysis, more specifically a class of attacks known as profiling attacks. These attacks assume that a potential attacker has access to, or can control, a device identical to the one under attack, which allows him to profile the power consumption of operations or data flow during encryption. This assumes a stronger adversary than traditional non-profiling attacks such as differential or correlation power analysis; however, the ability to model a device allows templates to be used post-profiling to extract key information from many different target devices using the power consumption of very few encryptions. This allows an adversary to overcome protocols intended to prevent secret-key recovery by restricting the number of available traces. In this thesis a detailed investigation of template attacks is conducted, examining how the selection of various attack parameters practically affects the efficiency of secret-key recovery, as well as the underlying assumption of profiling attacks: that the power consumption of one device can be used to extract secret keys from another. Trace-only attacks, where the corresponding plaintext or ciphertext data is unavailable, are then investigated against both symmetric and asymmetric algorithms with the goal of key recovery from a single trace. This allows an adversary to bypass many of the currently proposed countermeasures, particularly in the asymmetric domain. An investigation into machine-learning methods for side-channel analysis, as an alternative to template or stochastic methods, is also conducted, with support vector machines, logistic regression and neural networks examined from a side-channel viewpoint. Both binary and multi-class classification attack scenarios are examined in order to explore the relative strengths of each algorithm. Finally, these machine-learning-based alternatives are empirically compared with template attacks, and their respective merits are examined with regard to attack efficiency.
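As a concrete illustration of the binary-classification scenario mentioned at the end of the abstract, the sketch below trains a support vector machine to separate two key-dependent classes of simulated power traces. The Hamming-weight leakage model, the noise level and the sample sizes are assumptions chosen for illustration; they are not the thesis experiments.

    # Hedged sketch: binary classification of simulated power traces with an SVM,
    # in the spirit of profiling attacks. The leakage model is an assumption.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    def simulate_trace(value: int, length: int = 100, noise: float = 2.0) -> np.ndarray:
        """Toy leakage: Hamming weight of the processed byte added at one sample point."""
        trace = rng.normal(0.0, noise, length)
        trace[50] += bin(value).count("1")      # leaky sample
        return trace

    # Two classes: traces where a key-dependent intermediate byte is 0x00 vs 0xFF.
    X = np.array([simulate_trace(v) for v in ([0x00] * 500 + [0xFF] * 500)])
    y = np.array([0] * 500 + [1] * 500)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = SVC(kernel="linear").fit(X_tr, y_tr)
    print("classification accuracy:", clf.score(X_te, y_te))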
Abstract:
BACKGROUND: A Royal Statistical Society Working Party recently recommended that "Greater use should be made of numerical, as opposed to verbal, descriptions of risk" in first-in-man clinical trials. This echoed the view of many clinicians and psychologists about risk communication. As the clinical trial industry expands rapidly across the globe, it is important to understand risk communication in Asian countries. METHODS: We conducted a cognitive experiment about participation in a hypothetical clinical trial of a pain relief medication and a survey in cancer and arthritis patients in Singapore. In part 1 of the experiment, the patients received information about the risk of side effects in one of three formats (frequency, percentage and verbal descriptor) and in one of two sequences (from least to most severe and from most to least severe), and were asked about their willingness to participate. In part 2, the patients received information about the risk in all three formats, in the same sequence, and were again asked about their willingness to participate. A survey of preference for risk presentation methods and usage of verbal descriptors immediately followed. RESULTS: Willingness to participate and the likelihood of changing one's decision were not affected by the risk presentation methods. Most patients indicated a preference for the frequency format, but patients with primary school or no formal education were indifferent. While the patients used the verbal descriptors "very common", "common" and "very rare" in ways similar to the European Commission's Guidelines, their usage of the descriptors "uncommon" and "rare" was substantially different from the EU's. CONCLUSION: In this sample of Asian cancer and arthritis patients, risk presentation format had no impact on willingness to participate in a clinical trial. However, there is a clear preference for the frequency format. The lay use of verbal descriptors was substantially different from the EU's.
Abstract:
Within industrial automation systems, three-dimensional (3-D) vision provides very useful feedback information for the autonomous operation of various manufacturing equipment (e.g., industrial robots, material handling devices, assembly systems, and machine tools). The hardware performance of contemporary 3-D scanning devices is suitable for online utilization. However, the bottleneck is the lack of real-time algorithms for recognition of geometric primitives (e.g., planes and natural quadrics) from a scanned point cloud. One of the most important and most frequent geometric primitives in various engineering tasks is the plane. In this paper, we propose a new fast one-pass algorithm for recognition (segmentation and fitting) of planar segments from a point cloud. To effectively segment planar regions, we exploit the orthonormality of certain wavelets to polynomial functions, as well as their sensitivity to abrupt changes. After segmentation of the planar regions, we estimate the parameters of the corresponding planes using standard fitting procedures. For point cloud structuring, a z-buffer algorithm with mesh-triangle representation in barycentric coordinates is employed. The proposed recognition method is tested and experimentally validated in several real-world case studies.
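The fitting step mentioned above is typically a least-squares plane fit; one standard formulation uses the SVD of the centred points, as in the sketch below. This is only an illustration of such a standard procedure; the paper's wavelet-based segmentation and z-buffer structuring are not reproduced here.

    # Hedged sketch: least-squares plane fit to a segmented point set via SVD
    # (a standard fitting procedure; the segmentation step is not shown).
    import numpy as np

    def fit_plane(points: np.ndarray):
        """Return (unit normal, centroid) of the best-fit plane through Nx3 points."""
        centroid = points.mean(axis=0)
        centred = points - centroid
        # The normal is the right singular vector with the smallest singular value.
        _, _, vt = np.linalg.svd(centred, full_matrices=False)
        normal = vt[-1]
        return normal, centroid

    # Noisy samples from the plane z = 0.5x + 0.2y + 1.
    rng = np.random.default_rng(1)
    xy = rng.uniform(-1, 1, (200, 2))
    z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + 1 + rng.normal(0, 0.01, 200)
    pts = np.column_stack([xy, z])

    normal, centroid = fit_plane(pts)
    print("estimated normal:", np.round(normal / normal[2], 3))  # compare to (-0.5, -0.2, 1)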
Abstract:
In recent years international policies have aimed to stimulate the use of information and communication technologies (ICT) in the field of health care. Belgium has also been affected by these developments and, for example, regional electronic health networks ("HNs") have been established. Through a qualitative case study, we have explored the implementation of such innovations (HNs) to better understand how health professionals collaborate through the HN and how the HN affects their relationships. Within the HNs studied, a common good unites the actors: continuity of care for a better quality of care. However, behind this objective of continuity of care, other individual motivations emerge. Some controversies also need to be resolved in order to achieve cooperative relationships. Notably, HNs have to take national developments into account. These developments raise the question of the control of medical knowledge and medical practice. Professional issues, and not only practical changes, are involved in these innovations.
Abstract:
This paper describes the architecture of the case based reasoning (CBR) component of Smartfire, a fire field modelling tool for use by members of the Fire Safety Engineering community who are not experts in modelling techniques. The CBR system captures the qualitative reasoning of an experienced modeller in the assessment of room geometries so as to set up the important initial parameters of the problem. The system relies on two important reasoning principles obtained from the expert: 1) there is a natural hierarchical retrieval mechanism which may be employed; and 2) much of the reasoning on a qualitative level is linear in nature, although the computational solution of the problem is non-linear. The paper describes the qualitative representation of geometric room information on which the system is based, and the principles on which the CBR system operates.
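The two reasoning principles quoted from the expert suggest a retrieval scheme of the kind sketched below: cases are first filtered by a coarse geometry class (the hierarchical step), then ranked by a simple linear similarity over the remaining attributes. The attributes, weights and case base are hypothetical; this is not the Smartfire implementation.

    # Hedged sketch: hierarchical case retrieval followed by linear similarity ranking.
    # Case attributes and weights are hypothetical, not Smartfire's.
    from dataclasses import dataclass

    @dataclass
    class Case:
        geometry_class: str      # coarse class used for hierarchical retrieval
        floor_area_m2: float
        ceiling_height_m: float
        mesh_density: int        # the stored set-up parameter we want to reuse

    def similarity(query: Case, case: Case) -> float:
        """Weighted linear similarity, reflecting the expert's near-linear reasoning."""
        return -(0.7 * abs(query.floor_area_m2 - case.floor_area_m2)
                 + 0.3 * abs(query.ceiling_height_m - case.ceiling_height_m))

    def retrieve(query: Case, case_base: list[Case]) -> Case:
        candidates = [c for c in case_base if c.geometry_class == query.geometry_class]
        return max(candidates or case_base, key=lambda c: similarity(query, c))

    base = [Case("single_room", 20.0, 2.4, 40), Case("single_room", 60.0, 3.0, 60),
            Case("corridor", 30.0, 2.4, 50)]
    query = Case("single_room", 25.0, 2.5, 0)
    print("suggested mesh density:", retrieve(query, base).mesh_density)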
Abstract:
Daedalus is a computer tool developed by an Italian magistrate, Carmelo Asaro, and integrated into his own daily routine as an investigating magistrate conducting inquiries, and then as a prosecutor if and when the case investigated goes to court. The tool has recently been adopted by magistrates in judiciary offices throughout Italy, and has moreover spawned other related projects. First, this paper describes a sample session with Daedalus. Next, an overview of an array of judicial tools leads to positioning Daedalus within that spectrum.
Abstract:
A formal representation is given of the situational structure, and of the agents' beliefs about personal identity, in the Smemorato di Collegno amnesia case tried in 1927 in Pollenza, Italy. Another section discusses and formalizes a sample heuristic rule for conjecturing whether an individual identity other than personal, conveyed by a toponym, was used literally or fictitiously in a given historical corpus of legal casenotes. For example, a landlocked city being named and referred to as though it were a sea port is a fairly good cue for assuming that the toponym is a disguise. Yet the interpretation is governed by other conventions when, in a play by Shakespeare, it is stated that a given scene is set on the sea coast of Bohemia. Further discussion of a situational casuistry for identification (especially individual and personal), along with more formal representations, will appear in a companion paper, "nissanidentifpirandello", also at the disciplinary meeting point of AI formalisms and legal applications.
Abstract:
Existing election algorithms suffer from limited scalability. This limit stems from their communication design, which in turn stems from their fundamentally two-state behaviour. This paper presents a new election algorithm specifically designed to be highly scalable in broadcast networks whilst allowing any processing node to become coordinator with an initially equal probability. To achieve this, careful attention has been paid to the communication design, and an additional state has been introduced. The design of the tri-state election algorithm has been motivated by the requirements analysis of a major research project to deliver robust, scalable distributed applications, including load sharing, in hostile computing environments in which it is common for processing nodes to be rebooted frequently without notice. The new election algorithm is based in part on a simple 'emergent' design. The science of emergence is of great relevance to developers of distributed applications because it describes how higher-level self-regulatory behaviour can arise from many participants following a small set of simple rules. The tri-state election algorithm is shown to have very low communication complexity, in which the number of messages generated remains loosely bounded regardless of scale for large systems; to be highly scalable, because nodes in the idle state do not transmit any messages; and, because of its self-organising characteristics, to be very stable.
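The abstract names the three states and the key scalability property (idle nodes stay silent) but not the concrete message rules, so the state machine below is a hypothetical reading of such a design, not the published algorithm. Each node starts with an equal random chance of standing; the first announcement heard silences everyone else.

    # Hedged sketch: a tri-state election state machine (IDLE / CANDIDATE / COORDINATOR).
    # The timing and message rules are assumptions, not the published algorithm.
    import random
    from enum import Enum, auto

    class State(Enum):
        IDLE = auto()         # silent: sends no election messages
        CANDIDATE = auto()    # has announced itself on the broadcast medium
        COORDINATOR = auto()  # won the election

    class Node:
        def __init__(self, node_id: int):
            self.node_id = node_id
            self.state = State.IDLE
            self.backoff = random.random()   # equal initial chance to stand

        def on_announcement_heard(self):
            # Hearing another node's announcement keeps (or returns) us to IDLE.
            if self.state != State.COORDINATOR:
                self.state = State.IDLE

    def run_election(nodes):
        # The node whose random backoff expires first announces itself (CANDIDATE);
        # every other node stays silent in IDLE, so the candidate promotes itself.
        winner = min(nodes, key=lambda n: n.backoff)
        winner.state = State.CANDIDATE
        for n in nodes:
            if n is not winner:
                n.on_announcement_heard()
        winner.state = State.COORDINATOR
        return winner

    nodes = [Node(i) for i in range(10)]
    print("coordinator:", run_election(nodes).node_id)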
Abstract:
The Symposium “Towards the sustainable use of Europe’s forests”, with the sub-title “Forest ecosystem and landscape research: scientific challenges and opportunities”, lists three fundamental substantive areas of research: Forest management and practices, Ecosystem processes and functional ecology, and Environmental economics and sociology. This paper argues that essential catalytic elements are missing! Without these elements there is a great danger that the aimed-for world leadership in the forest sciences will not materialize. What are the missing elements? All the sciences, and in particular biology, environmental sciences, sociology, economics, and forestry, have evolved so that they include good scientific methodology. Good methodology is imperative in the design and analysis of research studies, the management of research data, and the interpretation of research findings. The methodological disciplines of Statistics, Modelling and Informatics (“SMI”) are crucial elements of a proposed Centre of European Forest Science, and the full involvement of professionals in these methodological disciplines is needed if the research of the Centre is to be world-class. The Distributed Virtual Institute (DVI) for Statistics, Modelling and Informatics in Forestry and the Environment (SMIFE) is a consortium with the aim of providing world-class methodological support and collaboration to European research in the areas of Forestry and the Environment. It is suggested that DVI: SMIFE should be a formal partner in the proposed Centre for European Forest Science.