956 results for Probabilistic robotics


Relevance: 20.00%

Publisher:

Abstract:

Sensor networks have been an active research area in the past decade due to the variety of their applications. Many studies have addressed the middleware services underlying sensor networks, such as self-deployment, self-localization, and synchronization. With these middleware services in place, sensor networks have matured into a detection and surveillance paradigm for many real-world applications. Because individual sensors are small, they can be deployed in confined spaces and make unobstructed measurements in locations that traditional centralized systems would have trouble reaching. However, sensor networks face several physical limitations that can prevent sensors from performing at their full potential: individual sensors have a limited power supply, the wireless band can become cluttered when multiple sensors transmit at the same time, and the limited communication range means the network may not have a 1-hop topology, so routing can be a problem in many cases. Carefully designed algorithms can alleviate these physical limitations and allow sensor networks to be utilized to their full potential. Graphical models are an intuitive choice for designing sensor network algorithms. This thesis focuses on a classic application in sensor networks: the detection and tracking of targets. It develops feasible inference techniques for sensor networks using statistical graphical-model inference, binary sensor detection, event isolation, and dynamic clustering. The main strategy is to use only binary data for rough global inferences, and then dynamically form small-scale clusters around the target for detailed computations. This framework is then extended to network topology manipulation, so that it can be applied to tracking in different network topology settings.
Finally, the system was tested in both simulated and real-world environments. The simulations were performed on various network topologies, from regularly to randomly distributed networks. The results show that the algorithm performs well in randomly distributed networks and hence requires minimal deployment effort. The experiments were carried out in both corridor and open-space settings. An in-home fall detection system was also set up in a real-world setting, with 30 Bumblebee radars and 30 ultrasonic sensors driven by TI EZ430-RF2500 boards scanning a typical 800 sq ft apartment. The Bumblebee radars were calibrated to detect falls of the human body, and the two-tier tracking algorithm was used on the ultrasonic sensors to track the location of elderly residents.
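The two-tier strategy described above (binary data for a rough global inference, then a dynamic cluster around the target for detailed computation) can be sketched as follows. This is a minimal illustration; the function names, the centroid estimator, and the cluster radius are assumptions, not the thesis's actual algorithm.

```python
import math

def rough_estimate(sensors, detections):
    """Tier 1: rough global inference from binary data only --
    the centroid of all sensors currently reporting a detection."""
    hits = [pos for pos, hit in zip(sensors, detections) if hit]
    if not hits:
        return None  # no target detected anywhere in the network
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

def form_cluster(sensors, center, radius):
    """Tier 2: dynamically cluster the sensors near the rough estimate;
    only these sensors take part in the detailed tracking computation."""
    return [i for i, (x, y) in enumerate(sensors)
            if math.hypot(x - center[0], y - center[1]) <= radius]
```

As the target moves, the cluster is re-formed around each new rough estimate, so most of the network stays idle and conserves power.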

Relevance: 20.00%

Publisher:

Abstract:

Computer-aided microscopic surgery of the lateral skull base is a rare intervention in daily practice. It is often a delicate and difficult minimally invasive intervention, since orientation between the petrous bone and the petrous bone apex is challenging. In cases of aural atresia or tumors, the normal anatomical landmarks are often absent, making orientation even more difficult. Navigation support, together with imaging techniques such as CT, MR and angiography, enables the surgeon in such cases to perform the operation more accurately and, in some cases, in a shorter time. However, there are no internationally standardised indications for navigated surgery on the lateral skull base, and miniaturised robotic systems are still in the initial validation phase.

Relevance: 20.00%

Publisher:

Abstract:

Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and strand pairs as the fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on the amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator) in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices and amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s), multiple sets of experimental data imply contradictory structures; BETASCAN detects each competing structure as a potential structural variant. The ability to correlate multiple alternate beta-structures with experiment opens the possibility of computational investigation of prion strains and of the structural heterogeneity of amyloid.
BETASCAN is publicly accessible on the Web at http://betascan.csail.mit.edu.
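The core scoring idea, likelihoods for strand pairs derived from pairwise correlations in parallel beta-sheets, reduces to summing log-odds over aligned residue pairs. The sketch below uses a tiny hypothetical propensity table; the real BETASCAN tables are estimated from parallel beta-sheet statistics and cover all residue pairs.

```python
import math

# Hypothetical pairing odds P(a, b) / P_background for a few residue pairs.
# Values > 1 favor pairing (beta-branched residues), < 1 disfavor it.
# These numbers are illustrative only, not the published BETASCAN tables.
PAIR_ODDS = {("V", "V"): 2.0, ("I", "V"): 1.8, ("V", "I"): 1.8,
             ("G", "P"): 0.2, ("P", "G"): 0.2}

def strand_pair_score(seq_a, seq_b):
    """Log-likelihood score of two candidate strands paired in register:
    the sum of log-odds over aligned residue pairs (unknown pairs score 0)."""
    return sum(math.log(PAIR_ODDS.get((a, b), 1.0))
               for a, b in zip(seq_a, seq_b))
```

Scanning every strand/pair placement with such a score and keeping the locally maximal ones is what lets the method surface several alternate, mutually incompatible beta-structures for the same sequence.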

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we propose an intelligent method, named the Novelty Detection Power Meter (NodePM), to detect novelties in electronic equipment monitored by a smart grid. Considering the entropy of each monitored device, which is calculated from a Markov chain model, the proposed method identifies novelties through a machine learning algorithm. To this end, the NodePM is integrated into a platform for the remote monitoring of energy consumption, which consists of a wireless sensor network (WSN). It should be stressed that, unlike many related works that are evaluated in simulated environments, our experiments were conducted in real environments. The results show that the NodePM reduces the power consumption of the monitored equipment by 13.7%. In addition, the NodePM detects novelties more efficiently than an approach from the literature, surpassing it in every scenario and evaluation carried out.
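The entropy of a device derived from a Markov chain over its discrete states can be sketched as below. This is a generic first-order entropy-rate estimator, not the NodePM implementation; the state encoding and the novelty threshold are assumptions for illustration.

```python
import math
from collections import Counter

def entropy_rate(states):
    """Estimate the entropy rate (bits/step) of a first-order Markov chain
    fitted to a sequence of discrete device states, e.g. power levels."""
    transitions = Counter(zip(states, states[1:]))   # counts of (s -> t)
    from_counts = Counter(states[:-1])               # visits to each source state
    h = 0.0
    for (s, t), n in transitions.items():
        p_trans = n / from_counts[s]                     # P(t | s)
        p_state = from_counts[s] / (len(states) - 1)     # empirical P(s)
        h -= p_state * p_trans * math.log2(p_trans)
    return h

def is_novelty(states, baseline, tol=0.5):
    """Flag a novelty when the device's entropy drifts from its baseline."""
    return abs(entropy_rate(states) - baseline) > tol
```

A device cycling deterministically between states has entropy 0; erratic behavior raises the entropy, and a sustained shift away from the learned baseline is what a detector of this kind would flag.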

Relevance: 20.00%

Publisher:

Abstract:

How do probabilistic models represent their targets, and how do they allow us to learn about them? The answer depends on a number of details, in particular on the meaning of the probabilities involved. To classify the options, a minimalist conception of representation (Suárez 2004) is adopted: modelers devise substitutes ("sources") of their targets and investigate them to infer something about the target. Probabilistic models allow us to infer probabilities about the target from probabilities about the source. This leads to a framework in which we can systematically distinguish between different models of probabilistic modeling. I develop a fully Bayesian view of probabilistic modeling, but I argue that, as an alternative, Bayesian degrees of belief about the target may be derived from ontic probabilities about the source. Remarkably, some accounts of ontic probabilities can avoid problems if they are supposed to apply to sources only.

Relevance: 20.00%

Publisher:

Abstract:

Derivation of probability estimates complementary to geophysical data sets has gained special attention in recent years. Information about the confidence level of the provided physical quantities is required to construct an error budget for higher-level products and to correctly interpret the final results of a particular analysis. For the generation of products based on satellite data, a common input is a cloud mask, which allows discrimination between surface and cloud signals; the surface information is further divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may impair their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited to 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: cloudy/clear-sky, cloudy/snow and clear-sky/snow. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, the PCM algorithm uses all spectral, angular and ancillary information in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for a spectral test is overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds are enhanced by means of an invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to northern parts of Africa, which exhibit diverse difficulties for cloud/snow masking algorithms.
The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of these analyses demonstrate the good detection skill of the PCM method, with results comparable to or better than the reference PPS algorithm.
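The single-step LUT retrieval can be illustrated as follows: every observation indexes a bin in a multidimensional feature space, and the stored probability is simply the cloudy fraction seen in that bin during training. The bin edges, feature layout and function names below are illustrative assumptions, not the PCM algorithm's actual configuration.

```python
import numpy as np

def build_lut(features, labels, bin_edges):
    """Precompute P(cloudy | bin) over a multidimensional feature space.
    features: (n, d) array; labels: 1 = cloudy, 0 = clear-sky;
    bin_edges: one array of edges per feature dimension."""
    idx = tuple(np.clip(np.digitize(features[:, k], bin_edges[k]) - 1,
                        0, len(bin_edges[k]) - 2)
                for k in range(features.shape[1]))
    shape = tuple(len(e) - 1 for e in bin_edges)
    cloudy = np.zeros(shape)
    total = np.zeros(shape)
    np.add.at(cloudy, idx, labels)   # per-bin count of cloudy samples
    np.add.at(total, idx, 1)         # per-bin count of all samples
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(total > 0, cloudy / total, np.nan)  # NaN = no data

def lookup(lut, x, bin_edges):
    """Retrieve the probability estimate for one observation vector x."""
    idx = tuple(int(np.clip(np.digitize(x[k], bin_edges[k]) - 1,
                            0, len(bin_edges[k]) - 2))
                for k in range(len(x)))
    return lut[idx]
```

Because all dimensions are used at once, no per-test threshold has to be derived; a fine set of intervals per dimension plays the role of the "extensive set of intervals" mentioned above.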

Relevance: 20.00%

Publisher:

Abstract:

A fundamental capacity of the human brain is to learn relations (contingencies) between environmental stimuli and the consequences of their occurrence. Some contingencies are probabilistic; that is, they predict an event in some situations but not in all. Animal studies suggest that damage to limbic structures or the prefrontal cortex may disturb probabilistic learning. The authors studied the learning of probabilistic contingencies in amnesic patients with limbic lesions, patients with prefrontal cortex damage, and healthy controls. Across 120 trials, participants learned contingent relations between spatial sequences and a button press. Amnesic patients learned as well as control subjects but failed to indicate what they had learned. Across the last 60 trials, amnesic patients and control subjects learned to avoid a noncontingent choice better than frontal patients did. These results indicate that probabilistic learning does not depend on the brain structures supporting declarative memory.

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Robotics-assisted tilt table technology was introduced for the early rehabilitation of neurological patients. It provides cyclical stepping movement and physiological loading of the legs. The aim of the present study was to assess the feasibility of this type of device for peak cardiopulmonary performance testing using able-bodied subjects. METHODS: A robotics-assisted tilt table was augmented with force sensors in the thigh cuffs and a work rate estimation algorithm. A custom visual feedback system was employed to guide the subjects' work rate and to provide real-time feedback of actual work rate. Feasibility assessment focused on: (i) implementation (technical feasibility), and (ii) responsiveness (was there a measurable, high-level cardiopulmonary reaction?). For responsiveness testing, each subject carried out an incremental exercise test to the limit of functional capacity, with a work rate increment of 5 W/min for female subjects and 8 W/min for males. RESULTS: 11 able-bodied subjects were included (9 male, 2 female; age 29.6 ± 7.1 years: mean ± SD). Resting oxygen uptake (VO2) was 4.6 ± 0.7 mL/min/kg and VO2peak was 32.4 ± 5.1 mL/min/kg; this mean VO2peak was 81.1% of the predicted peak value for cycle ergometry. Peak heart rate (HRpeak) was 177.5 ± 9.7 beats/min; all subjects reached at least 85% of their predicted HRpeak value. The respiratory exchange ratio (RER) at VO2peak was 1.02 ± 0.07. Peak work rate was 61.3 ± 15.1 W. All subjects reported a Borg CR10 value of 7 or more for exertion and leg fatigue. CONCLUSIONS: The robotics-assisted tilt table is deemed feasible for peak cardiopulmonary performance testing: the approach was found to be technically implementable, and substantial cardiopulmonary responses were observed. Further testing in neurologically-impaired subjects is warranted.

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. However, reuse of individual health-related data faces several problems: either a unique personal identifier, like a social security number, is not available, or non-unique person-identifiable information, like names, is privacy protected and cannot be accessed. A solution that protects privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, the encrypted hash codes of two names differ completely even if the plain names differ by only a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS The P3RL method applies a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. It consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information needed to create the templates (i.e. the data structure) without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with the encrypted person-identifiable information and the plain non-sensitive variables.
RESULTS In this paper we describe, step by step, how to link existing health-related data using encryption methods that preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. The method is suitable not only for epidemiological research but for any setting with similar challenges.
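The Bloom-filter step that makes similarity computation possible on encrypted names can be sketched as follows: each name's character bigrams are hashed into a bit set, and two encodings are compared with the Dice coefficient. The padding scheme, filter size and hash count here are illustrative assumptions, not the P3RL production parameters, and a real deployment would use keyed hashes so the sites' encodings cannot be dictionary-attacked.

```python
import hashlib

def bloom_encode(name, size=100, num_hashes=2):
    """Encode a string's character bigrams into a Bloom filter bit set.
    Parameters are illustrative; production systems use secret-keyed hashes."""
    padded = f"_{name.lower()}_"                      # mark word boundaries
    bigrams = {padded[i:i + 2] for i in range(len(padded) - 1)}
    bits = set()
    for g in bigrams:
        for k in range(num_hashes):                   # k independent hash slots
            digest = hashlib.sha256(f"{k}:{g}".encode()).hexdigest()
            bits.add(int(digest, 16) % size)
    return bits

def dice(a, b):
    """Dice similarity of two bit sets: 1.0 for identical encodings,
    near 0 for unrelated ones; similar names score in between."""
    return 2 * len(a & b) / (len(a) + len(b))
```

Because similar names share most of their bigrams, and hence most of their set bits, the linkage center can compute similarity coefficients without ever seeing a plain name.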

Relevance: 20.00%

Publisher:

Abstract:

Robotics-assisted tilt table (RATT) technology provides body support, cyclical stepping movement and physiological loading. It can potentially be used to estimate peak cardiopulmonary performance parameters in patients whose neurological or other problems preclude testing on a treadmill or cycle ergometer. The aim of the study was to compare the magnitudes of peak cardiopulmonary performance parameters, including peak oxygen uptake (VO2peak) and peak heart rate (HRpeak), obtained from a RATT, a cycle ergometer and a treadmill. The strength of the correlations between the three devices, test-retest reliability and repeatability were also assessed. Eighteen healthy subjects performed six maximal exercise tests, with two tests on each of the three exercise modalities. Data from the second tests were used for the comparative and correlation analyses. For nine subjects, test-retest reliability and repeatability of VO2peak and HRpeak were assessed. Absolute VO2peak from the RATT, the cycle ergometer and the treadmill was (mean (SD)) 2.2 (0.56), 2.8 (0.80) and 3.2 (0.87) L/min, respectively (p < 0.001). HRpeak from the RATT, the cycle ergometer and the treadmill was 168 (9.5), 179 (7.9) and 184 (6.9) beats/min, respectively (p < 0.001). VO2peak and HRpeak from the RATT vs the cycle ergometer and from the RATT vs the treadmill showed strong correlations. Test-retest reliability and repeatability were high for VO2peak and HRpeak on all devices. The results demonstrate that the RATT is a valid and reliable device for exercise testing, with potential for use in severely impaired subjects who cannot use the standard modalities.
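Between-device correlation and test-retest repeatability of the kind reported here are commonly summarized with a Pearson coefficient and a Bland-Altman-style coefficient of repeatability. The sketch below shows generic versions of both statistics; it is an assumption that analyses of this form were used, not the study's exact procedure.

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between paired measurements from two devices."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def coefficient_of_repeatability(test, retest):
    """Bland-Altman repeatability: 1.96 * SD of the test-retest differences;
    95% of repeated measurements are expected to differ by less than this."""
    diffs = [a - b for a, b in zip(test, retest)]
    return 1.96 * statistics.stdev(diffs)
```

A high correlation with a small coefficient of repeatability is what supports the conclusion that a device is both valid and reliable.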