Abstract:
Medical imaging is a powerful diagnostic tool. Consequently, the number of medical images taken has increased vastly over the past few decades. The most common medical imaging techniques use X-radiation as the primary investigative tool. The main limitation of using X-radiation is the associated risk of developing cancers. Alongside this, technology has advanced and more centres now use CT scanners; these can incur significant radiation burdens compared with traditional X-ray imaging systems. The net effect is that the population radiation burden is rising steadily. The risk arising from X-radiation for diagnostic medical purposes needs to be minimised, and one way to achieve this is by reducing radiation dose whilst optimising image quality. All ages are affected by risk from X-radiation; however, the increasing population age highlights the elderly as a new group that may require consideration. Of greatest concern are paediatric patients: firstly, they are more sensitive to radiation; secondly, their younger age means that the potential detriment to this group is greater. Containment of radiation exposure falls to a number of professionals within medical fields, from those who request imaging to those who produce the image. These staff are supported in their radiation protection role by engineers, physicists and technicians. It is important to realise that radiation protection is currently a major European focus of interest, and minimum competence levels in radiation protection for radiographers have been defined through the integrated activities of the EU consortium MEDRAPET. The outcomes of this project have been used by the European Federation of Radiographer Societies to describe the European Qualifications Framework levels for radiographers in radiation protection.
Though variations exist between European countries, radiographers and nuclear medicine technologists are normally the professional groups responsible for exposing screening populations and patients to X-radiation. As part of their training they learn the fundamental principles of radiation protection and theoretical and practical approaches to dose minimisation. However, dose minimisation is complex: it is not simply a matter of reducing X-radiation without taking major contextual factors into account. These factors relate to the real world of clinical imaging and include the need to measure clinical image quality and lesion visibility when applying X-radiation dose reduction strategies. This requires the use of validated psychological and physics techniques to measure clinical image quality and lesion perceptibility.
Abstract:
WiDom is a wireless prioritized medium access control protocol which offers a very large number of priority levels. Hence, it brings the potential to employ non-preemptive static-priority scheduling and schedulability analysis for a wireless channel, assuming that the overhead of WiDom is modeled properly. One schedulability analysis for WiDom has already been proposed, but recent research has created a new version of WiDom (which we call slotted WiDom) with lower overhead, and for this version no schedulability analysis exists. In this paper we propose a new schedulability analysis for slotted WiDom and extend it to also cover message streams with release jitter. We have performed experiments with an implementation of slotted WiDom on a real-world platform (MicaZ). We find that, for each message stream, the maximum observed response time never exceeds the calculated response time, which corroborates our belief that the new scheduling theory is applicable in practice.
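The abstract does not reproduce the analysis itself; as a hedged illustration of the general shape such an analysis takes, here is a textbook non-preemptive fixed-priority response-time recurrence with release jitter (in the style of CAN-bus analysis, not the paper's exact slotted-WiDom equations). The names `C`, `T`, `J` for transmission time, period, and release jitter are assumptions for this sketch.

```python
from math import ceil

def response_time(streams, i, eps=1e-9, max_iter=1000):
    """Worst-case response time of message stream i under non-preemptive
    static-priority scheduling. streams[k] = (C, T, J): transmission time,
    period, release jitter; index 0 is the highest priority."""
    C, T, J = streams[i]
    # blocking: at most one lower-priority message already on the channel
    B = max((c for c, _, _ in streams[i + 1:]), default=0.0)
    w = B  # queueing delay before this stream's own transmission starts
    for _ in range(max_iter):
        # higher-priority interference; eps models that a message released
        # exactly when transmission would start still wins the channel
        w_next = B + sum(ceil((w + Jj + eps) / Tj) * Cj
                         for Cj, Tj, Jj in streams[:i])
        if w_next == w:
            return J + w + C  # jitter + queueing delay + own transmission
        w = w_next
    raise RuntimeError("recurrence did not converge (stream unschedulable?)")
```

For example, two streams with (C, T, J) = (1, 5, 0) and (2, 10, 0) both come out with a worst-case response time of 3 time units: the high-priority stream is blocked by the lower one, and the lower one waits for one high-priority transmission.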
Abstract:
Electrocardiography (ECG) biometrics is emerging as a viable biometric trait. Recent developments at the sensor level have shown the feasibility of performing signal acquisition at the fingers and hand palms, using one-lead sensor technology and dry electrodes. These new locations lead to ECG signals with a lower signal-to-noise ratio that are more prone to noise artifacts; heart rate variability is another major challenge of this biometric trait. In this paper we propose a novel approach to ECG biometrics, with the purpose of reducing the computational complexity and increasing the robustness of the recognition process, enabling the fusion of information across sessions. Our approach is based on clustering, grouping individual heartbeats by their morphology. We study several methods to perform automatic template selection and account for variations observed in a person's biometric data. This approach allows the identification of different template groupings, taking heart rate variability into account, and the removal of outliers due to noise artifacts. Experimental evaluation on real-world data demonstrates the advantages of our approach.
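The paper specifies its own clustering and template-selection methods; as a minimal, hedged sketch of the general idea (morphology-based grouping plus outlier removal), assuming fixed-length heartbeat segments and plain k-means rather than the authors' actual algorithm:

```python
import numpy as np

def cluster_templates(beats, k=3, iters=50, outlier_z=2.5, seed=0):
    """Group fixed-length heartbeat segments by morphology with plain
    k-means, then drop beats far from their cluster centroid as outliers.
    Returns the surviving cluster centroids, usable as enrollment templates."""
    rng = np.random.default_rng(seed)
    X = np.asarray(beats, dtype=float)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # distance of every beat to every centroid, then nearest assignment
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    # outlier removal: discard beats whose distance to their centroid is extreme
    dist = np.linalg.norm(X - centroids[labels], axis=1)
    keep = dist <= dist.mean() + outlier_z * dist.std()
    labels_kept = labels[keep]
    return [X[keep][labels_kept == j].mean(axis=0)
            for j in range(k) if np.any(labels_kept == j)]
```

Each returned template summarizes one morphology grouping, so heartbeats recorded at different heart rates (and hence with different shapes) can each contribute their own template.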
Abstract:
This report describes the development of a test-bed application for the ART-WiSe framework, with the aim of providing a means to access, validate, and demonstrate that architecture. The chosen application is a kind of pursuit-evasion game in which a remotely controlled robot, navigating through an area covered by a wireless sensor network (WSN), is detected and continuously tracked by the WSN. A centralized control station then takes the appropriate actions for a pursuit robot to chase and “capture” the intruder. This kind of application imposes stringent timing requirements on the underlying communication infrastructure. It also involves interesting research problems in WSNs, such as tracking, localization, cooperation between nodes, energy concerns, and mobility. Additionally, it can easily be ported to a real-world application; surveillance and search-and-rescue operations are two examples where this kind of functionality can be applied. This is still a first approach to the test-bed application, and this development effort will be pushed forward until all the envisaged objectives for the ART-WiSe architecture are accomplished.
Abstract:
Physical computing has sparked a true global revolution in the way the digital world interfaces with the real world. From bicycle jackets with turn-signal lights to Twitter-controlled Christmas trees, the Do-it-Yourself (DiY) hardware movement has been driving endless innovations and stimulating an age of creative engineering. This ongoing (r)evolution has been led by popular electronics platforms such as the Arduino, the Lilypad, or the Raspberry Pi; however, these are not designed with the specific requirements of biosignal acquisition in mind. To date, the physiological computing community has severely lacked a parallel to what is found in the DiY electronics realm, especially regarding suitable hardware frameworks. In this paper, we build on previous work developed within our group, focusing on an all-in-one, low-cost, and modular biosignal acquisition hardware platform that makes it quicker and easier to build biomedical devices. We describe the main design considerations, experimental evaluation and circuit characterization results, together with the results of a usability study performed with volunteers from multiple target user groups, namely health sciences and electrical, biomedical, and computer engineering. Copyright © 2014 SCITEPRESS - Science and Technology Publications. All rights reserved.
Abstract:
This paper addresses the problem of finding several different solutions with the same optimum performance in single-objective real-world engineering problems, illustrated here with a parallel robot design problem. To this end, the paper presents a genetic algorithm to optimize single-objective problems with an infinite number of optimal solutions. The algorithm uses the maximin concept and ε-dominance to promote diversity over the admissible space. The performance of the proposed algorithm is analyzed on three well-known test functions and on a function obtained from a practical real-world engineering optimization problem. A spreading analysis shows that the solutions produced by the algorithm are well dispersed.
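The maximin concept and ε-dominance are standard diversity-preservation tools; as a hedged sketch (function names are mine, and the paper's actual operators may differ), a maximin spread score and an ε-box deduplication filter over the decision space can be written as:

```python
import numpy as np

def maximin_spread(pop):
    """For each individual, the distance to its nearest neighbour in
    decision space; maximising this value promotes diversity among
    solutions of equal objective value (the maximin concept)."""
    P = np.asarray(pop, dtype=float)
    d = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)   # ignore self-distance
    return d.min(axis=1)

def eps_filter(pop, eps):
    """Keep at most one individual per eps-box (epsilon-dominance style
    archiving): individuals whose coordinates fall in the same grid cell
    of side eps are considered equivalent and deduplicated."""
    seen, keep = set(), []
    for x in pop:
        box = tuple(np.floor(np.asarray(x) / eps).astype(int))
        if box not in seen:
            seen.add(box)
            keep.append(x)
    return keep
```

In a GA loop these would act as a secondary fitness (prefer individuals with a large maximin score among equally good ones) and as an archive filter keeping the set of equal-optimum solutions well dispersed.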
Abstract:
The goal of this paper is to discuss the benefits and challenges of building an intercontinental network of remote laboratories supported and used by both European and Latin American Institutions of Higher Education. Since remote experimentation, understood as the ability to carry out real-world experiments through a simple Web browser, is already a proven solution for the educational community as a supplement to on-site practical lab work (and in some cases, namely distance learning courses, a replacement for that work), the purpose is not to discuss its technical, pedagogical, or economic strengths, but rather to raise, and try to answer, some questions about the underlying benefits and challenges of establishing a peer-to-peer network of remote labs. Ultimately, we regard such a network as a constructive mechanism to help students gain the working and social skills often valued by multinational/global companies, while also providing awareness of local cultural aspects.
Abstract:
We prove existence, uniqueness, and stability of solutions of the prescribed curvature problem $\left(u'/\sqrt{1+u'^2}\right)' = au - b/\sqrt{1+u'^2}$ in $[0,1]$, $u'(0) = u(1) = 0$, for any given $a > 0$ and $b > 0$. We also develop a linear monotone iterative scheme for approximating the solution. This equation has been proposed as a model of the corneal shape in the recent paper (Okrasiński and Płociniczak, Nonlinear Anal. Real World Appl. 13:1498-1505, 2012), where a simplified version obtained by partial linearization has been investigated.
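For intuition, a fully linearized version of this problem (an assumption on my part: dropping both square-root factors gives $u'' = au - b$ with the same boundary conditions) has the closed form $u(x) = \frac{b}{a}\bigl(1 - \cosh(\sqrt{a}\,x)/\cosh\sqrt{a}\bigr)$, which is easy to verify numerically:

```python
import math

def u(x, a, b):
    """Closed-form solution of u'' = a*u - b, u'(0) = 0, u(1) = 0 -- an
    assumed fully linearized form of the prescribed curvature problem."""
    return (b / a) * (1.0 - math.cosh(math.sqrt(a) * x) / math.cosh(math.sqrt(a)))

a, b = 2.0, 1.0
h = 1e-4
# boundary conditions
assert abs(u(1.0, a, b)) < 1e-12                          # u(1) = 0
assert abs((u(h, a, b) - u(-h, a, b)) / (2 * h)) < 1e-6   # u'(0) = 0
# ODE residual at an interior point via a second difference
x = 0.4
upp = (u(x + h, a, b) - 2 * u(x, a, b) + u(x - h, a, b)) / h**2
assert abs(upp - (a * u(x, a, b) - b)) < 1e-6
```

The nonlinear problem of the abstract has no such closed form, which is why a monotone iterative scheme is needed there.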
Abstract:
Remote laboratories are an emergent technological and pedagogical tool at all education levels, and their widespread use is an important part of their own improvement and evolution. This paper describes several issues encountered in laboratory classes, in higher education courses, when using remote laboratories based on PXI systems, either with the VISIR system or with an alternate in-house solution. Three main issues are presented and explained, all reported by teachers who supported students' use of remote laboratories. The first issue deals with the need to allow students to select the actual place where an ammeter is to be inserted in an electric circuit, even incorrectly, thereby emulating real-world difficulties. The second deals with timing problems when several measurements are required at short intervals, as in the discharge cycle of a capacitor. The last issue deals with the use of a multimeter in dc mode when reading ac values, a use that collides with the lab settings. All scenarios are presented and discussed, including the solution found for each case. The conclusion derived from the described work is that remote laboratories are an expanding field, where practical use leads to improvement and evolution of the available solutions, requiring strict cooperation and information-sharing between all actors, i.e., developers, teachers, and students.
Abstract:
Density-dependent effects, both positive and negative, can have an important impact on the population dynamics of species by modifying their per-capita population growth rates. An important type of such density-dependent factors is given by the so-called Allee effects, widely studied in theoretical and field population biology. In this study, we analyze two discrete single-population models with overcompensating density dependence and Allee effects due to predator saturation and mating limitation, using symbolic dynamics theory. We focus on the scenarios of persistence and bistability, in which the species dynamics can be chaotic. For the chaotic regimes, we compute the topological entropy as well as the Lyapunov exponent under key ecological parameters and different initial conditions. We also provide co-dimension-two bifurcation diagrams for both systems, computing the periods of the orbits and characterizing the period-ordering routes toward the boundary crisis responsible for species extinction via transient chaos. Our results show that the topological entropy increases as we approach the parametric regions involving transient chaos, reaching its maximum when the full shift $(RL)^\infty$ occurs and the system enters the essential extinction regime. Finally, we characterize, analytically (using a complex-variable approach) and numerically, the inverse square-root scaling law arising in the vicinity of the saddle-node bifurcation responsible for the extinction scenario in the two studied models. The results are discussed in the context of species fragility under different Allee effects. (C) 2011 Elsevier Ltd. All rights reserved.
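The two Allee-effect maps are defined in the study itself; as a generic, hedged illustration of how a Lyapunov exponent is estimated for a one-dimensional map (the orbit average of $\log|f'(x_n)|$), demonstrated here on the logistic map at $r = 4$, whose exponent is known to be $\ln 2$:

```python
import math

def lyapunov(f, df, x0, n=50000, transient=1000):
    """Estimate the Lyapunov exponent of a 1-D map x -> f(x) as the
    long-run orbit average of log|f'(x_n)|."""
    x = x0
    for _ in range(transient):      # discard the transient
        x = f(x)
    s = 0.0
    for _ in range(n):
        # guard against a zero derivative on a measure-zero point
        s += math.log(max(abs(df(x)), 1e-300))
        x = f(x)
    return s / n

# demo: logistic map r = 4 (chaotic); its exponent is ln 2 ≈ 0.693
r = 4.0
lam = lyapunov(lambda x: r * x * (1 - x), lambda x: r * (1 - 2 * x), 0.2)
```

For the study's maps one would substitute the corresponding $f$ and $f'$ and sweep the ecological parameters; a positive estimate signals chaotic dynamics.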
Abstract:
Many datasets have been used to describe the growth of marine mammals, invertebrates, reptiles, seabirds, sea turtles, and fishes, using the logistic, Gompertz, and von Bertalanffy growth models. A generalized family of von Bertalanffy maps, which is proportional to the right-hand side of von Bertalanffy's growth equation, is studied and a dynamical approach to it is proposed. The system complexity is measured using Lyapunov exponents, which depend on two biological parameters: von Bertalanffy's growth rate constant and the asymptotic weight. Applications of synchronization in the real world are of current interest. The behavior of bird flocks, schools of fish and other animal groups is an important phenomenon characterized by the synchronized motion of individuals. In this work, we consider networks having a von Bertalanffy model in each node, and we study the synchronization interval of these networks as a function of those two biological parameters. Numerical simulations are also presented to support our approach.
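The networks studied here carry a von Bertalanffy map in each node; as a minimal two-node illustration of the synchronization phenomenon (with an assumed logistic node map and diffusive coupling, not the paper's model), the difference between nodes contracts on average whenever $|1-2c|\,e^{\lambda} < 1$, where $\lambda$ is the node map's Lyapunov exponent:

```python
def step_pair(f, x, y, c):
    """One step of two diffusively coupled identical maps:
    x' = (1-c) f(x) + c f(y),  y' = (1-c) f(y) + c f(x)."""
    fx, fy = f(x), f(y)
    return (1 - c) * fx + c * fy, (1 - c) * fy + c * fx

# demo: two coupled chaotic logistic maps; the difference between the
# orbits shrinks on average when |1-2c| * e^lambda < 1
f = lambda x: 4.0 * x * (1 - x)
x, y = 0.3, 0.6
for _ in range(2000):
    x, y = step_pair(f, x, y, c=0.3)
```

For the logistic map $\lambda = \ln 2$, so the condition above gives a synchronization interval $c \in (0.25, 0.75)$; $c = 0.3$ lies inside it and the two orbits collapse onto each other, which is the kind of parameter-dependent interval the paper derives for its von Bertalanffy networks.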
Abstract:
Concepts like E-learning and M-learning are changing the traditional learning place. No longer restricted to well-defined physical places, education in Automation and other Engineering areas is entering the so-called ubiquitous learning place, which even the more practical knowledge (acquired in lab classes) is now moving into, due to emergent concepts such as Remote Experimentation and Mobile Experimentation. While Remote Experimentation is traditionally regarded as the remote access to real-world experiments through a simple web browser running on a PC connected to the Internet, Mobile Experimentation may be seen as access to those same (or other) experiments through mobile devices, used in M-learning contexts. These two distinct client types (PCs versus mobile devices) pose specific requirements for the remote lab infrastructure, namely the ability to tune the experiment interface according to the characteristics (e.g. display size) of the accessing device. This paper addresses those requirements by proposing a new architecture for the remote lab infrastructure, able to accommodate both Remote and Mobile Experimentation scenarios.
Abstract:
Consolidation consists in scheduling multiple virtual machines onto fewer servers in order to improve resource utilization and to reduce operational costs due to power consumption. However, virtualization technologies do not offer performance isolation, causing applications to slow down. In this work, we propose a performance-enforcing mechanism composed of a slowdown estimator and an interference- and power-aware scheduling algorithm. The slowdown estimator determines, based on noisy slowdown data samples obtained from state-of-the-art slowdown meters, whether tasks will complete within their deadlines, invoking the scheduling algorithm if needed. When invoked, the scheduling algorithm builds performance- and power-aware virtual clusters to successfully execute the tasks. We conduct simulations injecting synthetic jobs whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our strategy can be efficiently integrated with state-of-the-art slowdown meters to fulfil contracted SLAs in real-world environments, while reducing operational costs by about 12%.
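The abstract does not give the estimator's internals; as a deliberately minimal sketch of the decision it makes (all names are mine, and a plain mean stands in for whatever noise filtering the paper's estimator uses), the deadline check can be read as: average the noisy slowdown samples, extrapolate the remaining runtime, and invoke the scheduler when the predicted finish exceeds the deadline.

```python
def needs_rescheduling(samples, work_done, work_total, elapsed, deadline):
    """Decide whether a task will miss its deadline, given noisy slowdown
    samples (observed_time / solo_time). Samples are averaged into a
    slowdown estimate, the remaining solo runtime is scaled by it, and the
    scheduler should be invoked when the predicted finish is too late."""
    slowdown = sum(samples) / len(samples)     # crude noise filter: the mean
    remaining_solo = work_total - work_done    # solo-execution time left
    predicted_finish = elapsed + slowdown * remaining_solo
    return predicted_finish > deadline
```

For instance, a task with 40 units of solo work left, 30 time units elapsed, and samples averaging a 1.5x slowdown is predicted to finish at t = 90, so a deadline of 85 triggers rescheduling while a deadline of 95 does not.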
Abstract:
Thesis to obtain the Master of Science Degree in Computer Science and Engineering