535 results for pretest probability


Relevance: 10.00%

Abstract:

This article presents an approach to improving and monitoring the behavior of a skid-steering rover on rough terrain. An adaptive locomotion control generates speed references to avoid slipping situations. An enhanced odometry provides a better estimate of the distance travelled. A probabilistic classification procedure evaluates locomotion efficiency online and detects locomotion faults. Results obtained with a Marsokhod rover are presented throughout the paper.

Relevance: 10.00%

Abstract:

This research identifies roadway, traffic, and environmental factors that influence the injury severity of road traffic crashes in Dhaka. Dhaka provides a rather unusual driving risk environment to study, since virtually anyone can obtain a driver's license, and traffic rules are rarely enforced or fined. To examine this city with presumed heightened crash severity risk, police-reported crash data from 2007 to 2011, containing about 2714 road traffic crashes, were collected. The injury severity of traffic crashes, recorded as fatal, serious injury, or property damage only, was modeled using an ordered Probit model. Significant factors increasing the probability of fatal injuries include crashes along highways (65%), absence of a road divider (80%), crashes during night time (54%), and vehicle-pedestrian collisions (367%); whereas two-way traffic configuration (21%) and traffic-police-controlled schemes (41%) decrease the probability of fatalities. Both similarities and differences between crash risk in Dhaka and in developed countries are discussed in policy-relevant terms.
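An ordered Probit model maps a latent crash-severity index through estimated thresholds into the three recorded outcome categories. A minimal sketch of how the category probabilities are obtained, using purely hypothetical index and threshold values (not the paper's estimates):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    """P(y = k) for an ordered probit: the latent index xb plus standard
    normal noise falls between thresholds cuts[k-1] and cuts[k]."""
    bounds = [-math.inf] + list(cuts) + [math.inf]
    return [norm_cdf(bounds[k + 1] - xb) - norm_cdf(bounds[k] - xb)
            for k in range(len(bounds) - 1)]

# Hypothetical latent index and thresholds for the three severity levels:
# property damage only < serious injury < fatal
probs = ordered_probit_probs(xb=0.4, cuts=[0.0, 1.0])
```

A factor with a positive estimated coefficient raises the latent index and therefore shifts probability mass toward the fatal category.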

Relevance: 10.00%

Abstract:

Risk taking is central to human activity. Consequently, it lies at the focal point of behavioral sciences such as neuroscience, economics, and finance. Many influential models from these sciences assume that financial risk preferences form a stable trait. Is this assumption justified and, if not, what causes the appetite for risk to fluctuate? We have previously found that traders experience a sustained increase in the stress hormone cortisol when the amount of uncertainty, in the form of market volatility, increases. Here we ask whether these elevated cortisol levels shift risk preferences. Using a double-blind, placebo-controlled, cross-over protocol, we raised cortisol levels in volunteers over eight days to the same extent previously observed in traders. We then tested for the utility and probability weighting functions underlying their risk taking, and found that participants became more risk averse. We also observed that the weighting of probabilities became more distorted among men relative to women. These results suggest that risk preferences are highly dynamic. Specifically, the stress response calibrates risk taking to our circumstances, reducing it in times of prolonged uncertainty, such as a financial crisis. Physiology-induced shifts in risk preferences may thus be an under-appreciated cause of market instability.
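The distortion of probability weighting that the study measures is commonly summarised with a one-parameter weighting function. A sketch using the standard Tversky-Kahneman form with a hypothetical curvature parameter (the paper's fitted values are not reproduced here):

```python
def tk_weight(p, gamma):
    """Tversky-Kahneman probability weighting function:
    w(p) = p^gamma / (p^gamma + (1-p)^gamma)^(1/gamma)."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1.0 / gamma))

# gamma = 1 gives undistorted (linear) weighting; gamma < 1 overweights
# small probabilities and underweights large ones, i.e. more distortion.
w_small = tk_weight(0.05, gamma=0.6)   # exceeds the objective 0.05
w_large = tk_weight(0.95, gamma=0.6)   # falls below the objective 0.95
```

A "more distorted" weighting, as reported for men under elevated cortisol, corresponds to a lower gamma and hence a more strongly S-inverted curve.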

Relevance: 10.00%

Abstract:

Whole-image descriptors have recently been shown to be remarkably robust to perceptual change, especially compared to local features. However, whole-image-based localization systems typically rely on heuristic methods for determining appropriate matching thresholds in a particular environment. These environment-specific tuning requirements, and the lack of a meaningful interpretation of the arbitrary thresholds, limit the general applicability of these systems. In this paper we present a Bayesian model of probability for whole-image descriptors that can be seamlessly integrated into localization systems designed for probabilistic visual input. We demonstrate this method using CAT-Graph, an appearance-based visual localization system originally designed for FAB-MAP-style probabilistic input. We show that using whole-image descriptors as visual input extends CAT-Graph's functionality to environments that experience a greater amount of perceptual change. We also present a method of estimating whole-image probability models in an online manner, removing the need for a prior training phase. We show that this online, automated training method can perform comparably to pre-trained, manually tuned local descriptor methods.
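A Bayesian treatment replaces a hard distance threshold with a posterior match probability. A minimal sketch of the idea, assuming Gaussian distance likelihoods for the match and non-match hypotheses and a hypothetical match prior (not the paper's actual model, whose likelihoods would be learned online):

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian density, used as a likelihood for descriptor distances."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def match_probability(d, prior_match=0.01,
                      mu_match=0.2, sd_match=0.1,
                      mu_nonmatch=0.8, sd_nonmatch=0.2):
    """Posterior P(match | whole-image descriptor distance d) via Bayes' rule.
    All distribution parameters here are hypothetical placeholders."""
    num = gauss_pdf(d, mu_match, sd_match) * prior_match
    den = num + gauss_pdf(d, mu_nonmatch, sd_nonmatch) * (1 - prior_match)
    return num / den
```

The output is a calibrated probability that can be fed directly to a probabilistic back end such as CAT-Graph, rather than an arbitrary per-environment threshold.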

Relevance: 10.00%

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited and often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly used in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, primarily influenced by surface characteristics and rainfall intensity.
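The difference between OLSR and WLSR is only the observation weights: WLSR down-weights measurements believed to be noisier, and OLSR is the special case of equal weights. A simple-regression sketch with made-up data illustrating the effect:

```python
def wls_fit(x, y, w):
    """Weighted least-squares fit of y = a + b*x.
    Equal weights reproduce ordinary least squares."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    b = sxy / sxx
    return ybar - b * xbar, b

x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 1.9, 3.2, 8.0]              # last observation is an unreliable outlier
a_ols, b_ols = wls_fit(x, y, [1, 1, 1, 1])      # OLS: outlier drags the slope up
a_wls, b_wls = wls_fit(x, y, [1, 1, 1, 0.1])    # WLS: outlier down-weighted
```

In the stormwater context the weights would encode measurement uncertainty per event, which is what yields the more realistic uncertainty bounds the study reports.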

Relevance: 10.00%

Abstract:

The aim of this work is to develop a demand-side-response model that assists electricity consumers exposed to the market price to independently and proactively manage air-conditioning peak electricity demand. The main contribution of this research is to show how consumers can optimize the energy cost of the air-conditioning load across several cases, e.g. a normal price, a price spike, and a probabilistic price-spike case. The model also investigates how air conditioning can apply a pre-cooling method when there is a substantial risk of a price spike. The results indicate the potential of the scheme to achieve financial benefits for consumers and to target the best economic performance for electricity generation, distribution, and transmission. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics for hot days from 2011 to 2012.
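The pre-cooling decision can be framed as comparing expected costs under the price-spike probability. A toy sketch with entirely hypothetical prices and loads (not the paper's optimization model; the numbers are illustrative only):

```python
def expected_cost(strategy, p_spike, normal_price=0.05, spike_price=12.5):
    """Expected cost of one peak hour of air conditioning ($/kWh figures
    are hypothetical). The peak-hour price is a probability mix of the
    normal price and a spike price."""
    if strategy == "precool":
        # Buy extra energy off-peak to store "coolth"; deliberately
        # inefficient (6 kWh stored to avoid 3 kWh of peak load).
        offpeak_kwh, peak_kwh = 6.0, 1.0
    else:
        offpeak_kwh, peak_kwh = 0.0, 4.0
    expected_peak_price = p_spike * spike_price + (1 - p_spike) * normal_price
    return offpeak_kwh * normal_price + peak_kwh * expected_peak_price

cost_bau_low = expected_cost("bau", p_spike=0.001)
cost_pre_low = expected_cost("precool", p_spike=0.001)
cost_bau_high = expected_cost("bau", p_spike=0.10)
cost_pre_high = expected_cost("precool", p_spike=0.10)
```

Even with thermal losses, pre-cooling becomes the cheaper strategy once the spike probability is high enough, which is the mechanism behind the consumer benefit the paper reports.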

Relevance: 10.00%

Abstract:

OBJECTIVE: To synthesise the available evidence and estimate the comparative efficacy of control strategies to prevent total hip replacement (THR)-related surgical site infections (SSIs) using a mixed treatment comparison. DESIGN: Systematic review and mixed treatment comparison. SETTING: Hospital and other healthcare settings. PARTICIPANTS: Patients undergoing THR. PRIMARY AND SECONDARY OUTCOME MEASURES: The number of THR-related SSIs occurring following the surgical operation. RESULTS: 12 studies involving 123 788 THRs and 9 infection control strategies were identified. The strategy of 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation' significantly reduced the risk of THR-related SSI compared with the referent strategy (no systemic antibiotics+plain cement+conventional ventilation), OR 0.13 (95% credible interval (CrI) 0.03-0.35), and had the highest probability (47-64%) and highest median rank of being the most effective strategy. There was some evidence to suggest that 'systemic antibiotics+antibiotic-impregnated cement+laminar airflow' could potentially increase infection risk compared with 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation', 1.96 (95% CrI 0.52-5.37). There was no high-quality evidence that antibiotic-impregnated cement without systemic antibiotic prophylaxis was effective in reducing infection compared with plain cement with systemic antibiotics, 1.28 (95% CrI 0.38-3.38). CONCLUSIONS: We found no convincing evidence in favour of the use of laminar airflow over conventional ventilation for prevention of THR-related SSIs, yet laminar airflow is costly and widely used. Antibiotic-impregnated cement without systemic antibiotics may not be effective in reducing THR-related SSIs. The combination with the highest confidence for reducing SSIs was 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation'. 
Our evidence synthesis underscores the need to review current guidelines based on the available evidence, and to conduct further high-quality double-blind randomised controlled trials to better inform the current clinical guidelines and practice for prevention of THR-related SSIs.

Relevance: 10.00%

Abstract:

A long query provides more useful hints for finding relevant documents, but it is likely to introduce noise, which degrades retrieval performance. To mitigate this adverse effect, it is important to reduce noisy terms and to introduce and boost additional relevant terms. This paper presents a comprehensive framework, called the Aspect Hidden Markov Model (AHMM), which integrates query reduction and expansion for retrieval with long queries. It optimizes the probability distribution of query terms by exploiting intra-query term dependencies as well as the relationships between query terms and words observed in relevance feedback documents. Empirical evaluation on three large-scale TREC collections demonstrates that our approach, which is automatic, achieves salient improvements over various strong baselines, and also reaches performance comparable to a state-of-the-art method based on users' interactive query term reduction and expansion.

Relevance: 10.00%

Abstract:

A predictive model of terrorist activity is developed by examining the daily number of terrorist attacks in Indonesia from 1994 through 2007. The dynamic model employs a shot noise process to explain the self-exciting nature of the terrorist activities. This estimates the probability of future attacks as a function of the times since the past attacks. In addition, the excess of nonattack days coupled with the presence of multiple coordinated attacks on the same day compelled the use of hurdle models to jointly model the probability of an attack day and corresponding number of attacks. A power law distribution with a shot noise driven parameter best modeled the number of attacks on an attack day. Interpretation of the model parameters is discussed and predictive performance of the models is evaluated.
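The self-exciting ingredient of the model is a shot-noise intensity: each past attack adds a decaying contribution to the current attack rate, so the probability of a future attack is a function of the times since past attacks. A sketch with hypothetical parameters (the hurdle component for attack-day counts is omitted):

```python
import math

def intensity(t, attack_times, mu=0.05, k=0.8, beta=0.3):
    """Shot-noise conditional intensity: baseline rate mu plus an
    exponentially decaying excitation k*exp(-beta*(t - t_i)) from
    each past attack at time t_i < t. Parameters are hypothetical."""
    return mu + sum(k * math.exp(-beta * (t - ti))
                    for ti in attack_times if ti < t)

past = [10.0, 11.0, 12.0]            # a hypothetical cluster of attack days
lam_soon = intensity(13.0, past)     # elevated rate shortly after the cluster
lam_later = intensity(40.0, past)    # decayed back to near the baseline
```

The elevated rate right after a cluster is what captures the observed bursts of coordinated activity, while the decay back to the baseline accounts for the excess of non-attack days.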

Relevance: 10.00%

Abstract:

An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging, as they are 'doubly stochastic', i.e. obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, while offering sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models without using visualization to render interpretable, complex spatio-temporal forecasts. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a large-scale discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
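The 'doubly stochastic' structure is easiest to see in a forward simulation: first draw a Gaussian log-intensity, then draw Poisson counts given it. A simplified grid sketch with independent cells (a real log-Gaussian Cox process would correlate cells through a GP kernel, and inference runs this generative logic in reverse):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for sampling a Poisson count with mean lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_lgcp(n_cells, mu, sigma, rng):
    """Two layers of randomness per cell: a Gaussian log-intensity,
    then a Poisson count given the exponentiated intensity."""
    log_lam = [rng.gauss(mu, sigma) for _ in range(n_cells)]
    return [poisson(math.exp(g), rng) for g in log_lam]

rng = random.Random(42)
counts = simulate_lgcp(100, mu=0.0, sigma=0.5, rng=rng)
```

The two sampling layers are exactly the two integrals the abstract mentions: the count distribution marginalises over both the Gaussian field and the Poisson noise.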

Relevance: 10.00%

Abstract:

This paper describes a texture recognition based method for segmenting kelp from images collected in highly dynamic shallow water environments by an Autonomous Underwater Vehicle (AUV). A particular challenge is image quality that is affected by uncontrolled lighting, reduced visibility, significantly varying perspective due to platform egomotion, and kelp sway from wave action. The kelp segmentation approach uses the Mahalanobis distance as a way to classify Haralick texture features from sub-regions within an image. The results illustrate the applicability of the method to classify kelp allowing construction of probability maps of kelp masses across a sequence of images.
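Classifying a sub-region reduces to a Mahalanobis distance between its texture feature vector and class statistics learned from labelled examples. A two-feature sketch with hypothetical kelp-class statistics (real Haralick feature vectors have more dimensions):

```python
def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance of a 2-D feature vector x from a
    class mean, using the explicit inverse of the 2x2 class covariance."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mean[0], x[1] - mean[1]]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

# Hypothetical kelp-class statistics for two texture features
kelp_mean = [0.6, 0.3]
kelp_cov = [[0.02, 0.0], [0.0, 0.01]]

d_kelp = mahalanobis2([0.58, 0.33], kelp_mean, kelp_cov)   # kelp-like region
d_far = mahalanobis2([0.10, 0.90], kelp_mean, kelp_cov)    # background region
is_kelp = d_kelp < 9.0   # e.g. accept within 3 standard deviations
```

Thresholding the distance per sub-region, or converting it to a likelihood, yields the per-image kelp probability maps described above.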

Relevance: 10.00%

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To use these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using the ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to the observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected in order of successively increasing elevation. For fixing a scalar ambiguity, a rounding method with controllable error probability is proposed. The established ionosphere-constrained model can be efficiently solved with a sequential Kalman filter. It can either be reduced to special models simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network.
The results show that the new widelane AR scheme achieves a 99.4% successful fixing rate with a 0.6% failure rate, while the new rounding method for narrowlane AR achieves a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with a rigorously controllable probability of incorrectly fixed ambiguities.
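For a single scalar ambiguity with an unbiased Gaussian float estimate, the probability that rounding to the nearest integer is correct has a closed form, which is what makes the error probability controllable by a tolerance. A sketch of this idea (the paper's exact decision rule may differ):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def round_success_prob(sigma):
    """Probability that rounding an unbiased Gaussian float ambiguity
    (std dev sigma, in cycles) to the nearest integer is correct:
    P = 2*Phi(1/(2*sigma)) - 1."""
    return 2.0 * norm_cdf(1.0 / (2.0 * sigma)) - 1.0

def try_fix(float_amb, sigma, max_fail=0.01):
    """Fix by rounding only if the failure probability is within tolerance;
    otherwise keep the float value (returns None here to signal 'not fixed')."""
    if 1.0 - round_success_prob(sigma) <= max_fail:
        return round(float_amb)
    return None
```

A precise estimate (small sigma) is fixed with near-certainty, while a noisy one is left unfixed rather than risk an incorrect integer, mirroring the controlled 0.8% failure rate reported above.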

Relevance: 10.00%

Abstract:

Reliability of carrier phase ambiguity resolution (AR) of an integer least-squares (ILS) problem depends on the ambiguity success rate (ASR), which in practice can be well approximated by the success probability of integer bootstrapping solutions. With the current GPS constellation, a sufficiently high ASR of the geometry-based model is only achievable a certain percentage of the time; as a result, high AR reliability cannot be assured by a single constellation. With a dual-constellation system (DCS), for example GPS and Beidou, which provides more satellites in view, users can expect significant performance benefits such as improved AR reliability and high-precision positioning solutions. Simply using all the satellites in view for AR and positioning is a straightforward solution, but does not necessarily lead to the hoped-for high reliability. The paper presents an alternative approach that selects a subset of the visible satellites, instead of using all of them, to achieve higher reliability of the AR solutions in a multi-GNSS environment. Traditionally, satellite selection algorithms are mostly based on the position dilution of precision (PDOP) in order to meet accuracy requirements. In this contribution, reliability criteria are introduced for GNSS satellite selection, and a novel satellite selection algorithm for reliable ambiguity resolution (SARA) is developed. The SARA algorithm allows receivers to select a subset of satellites achieving a high ASR, such as above 0.99.
Numerical results from simulated dual-constellation cases show that with the SARA procedure, the percentages of ASR values in excess of 0.99 and of ratio-test values passing the threshold 3 are both higher than when directly using all satellites in view. In the dual-constellation case, the percentages of ASRs (>0.99) and ratio-test values (>3) could be as high as 98.0% and 98.5% respectively, compared with 18.1% and 25.0% without the satellite selection process. It is also worth noting that the implementation of SARA is simple and its computation time low, so it can be applied in most real-time data processing applications.
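The bootstrapped ASR that approximates AR reliability is a product of per-ambiguity rounding success probabilities, so a single poorly determined ambiguity can drag the whole product down. A greatly simplified selection sketch over hypothetical conditional ambiguity standard deviations (real integer bootstrapping uses decorrelated, sequentially conditioned precisions derived from the full geometry model):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bootstrapped_asr(sigmas):
    """Bootstrapping success rate: product over conditional ambiguity
    std devs (cycles) of P(round correct) = 2*Phi(1/(2*sigma)) - 1."""
    p = 1.0
    for s in sigmas:
        p *= 2.0 * norm_cdf(1.0 / (2.0 * s)) - 1.0
    return p

def select_subset(sigmas, target=0.99):
    """SARA-flavoured sketch: add the best-determined ambiguities first,
    skipping any satellite that would pull the ASR below the target."""
    chosen = []
    for s in sorted(sigmas):
        if bootstrapped_asr(chosen + [s]) >= target:
            chosen.append(s)
    return chosen

all_sigmas = [0.05, 0.06, 0.08, 0.30, 0.45]   # hypothetical values, in cycles
subset = select_subset(all_sigmas)
```

Dropping the two noisy satellites keeps the ASR above the 0.99 target, whereas using all satellites in view fails it, which is the qualitative behaviour the simulation results above demonstrate.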

Relevance: 10.00%

Abstract:

The Quantum Probability Ranking Principle (QPRP) has recently been proposed; it accounts for interdependent document relevance when ranking. However, to be instantiated, the QPRP requires a method to approximate the "interference" between two documents. In this poster, we empirically evaluate a number of different methods of approximation on two TREC test collections for subtopic retrieval. It is shown that these approximations can lead to significantly better retrieval performance over the state of the art.
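The interference term the QPRP needs enters through the quantum analogue of the additive law of probability, where a phase angle between two events modulates their combined probability. A minimal sketch of the form of that term (the poster's candidate approximations of the phase are not reproduced here):

```python
import math

def quantum_prob_or(p_a, p_b, theta):
    """Quantum-style combination of two event probabilities with an
    interference term: P = P(A) + P(B) + 2*sqrt(P(A)*P(B))*cos(theta).
    theta = pi/2 makes the interference vanish (classical additivity)."""
    return p_a + p_b + 2.0 * math.sqrt(p_a * p_b) * math.cos(theta)

p_classical = quantum_prob_or(0.2, 0.3, math.pi / 2)         # no interference
p_destructive = quantum_prob_or(0.2, 0.3, 2 * math.pi / 3)   # negative term
```

In ranking terms, destructive interference between similar documents lowers their combined value, which is how the QPRP penalises redundancy; the practical question the poster studies is how to estimate the phase.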

Relevance: 10.00%

Abstract:

The assumptions underlying the Probability Ranking Principle (PRP) have led to a number of alternative approaches that cater for or compensate for the PRP's limitations. In this poster we focus on the Interactive PRP (iPRP), which rejects the assumption of independence between documents made by the PRP. Although the theoretical framework of the iPRP is appealing, no instantiation has yet been proposed and investigated. In this poster, we propose a possible instantiation of the principle, performing the first empirical comparison of the iPRP against the PRP. For document diversification, our results show that the iPRP is significantly better than the PRP, and comparable to or better than other methods such as Modern Portfolio Theory.
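One way a ranking principle can depart from the PRP's independence assumption is to make each rank decision depend on the documents already selected. A hypothetical greedy sketch in the spirit of portfolio-style diversification (this is not the poster's actual instantiation, whose objective is not given in the abstract):

```python
def diversify_rank(relevance, corr, b=1.0):
    """Greedy dependent ranking sketch: at each position pick the document
    maximising relevance minus b times its summed correlation with the
    documents already ranked. b = 0 recovers PRP-style relevance ordering."""
    remaining = list(range(len(relevance)))
    ranking = []
    while remaining:
        best = max(remaining,
                   key=lambda i: relevance[i]
                   - b * sum(corr[i][j] for j in ranking))
        ranking.append(best)
        remaining.remove(best)
    return ranking

rel = [0.9, 0.85, 0.4]
# doc 1 is a near-duplicate of doc 0; doc 2 covers a different subtopic
corr = [[1.0, 0.9, 0.1],
        [0.9, 1.0, 0.1],
        [0.1, 0.1, 1.0]]
ranking = diversify_rank(rel, corr)
```

With dependence switched on, the less relevant but novel document is promoted above the near-duplicate, which is the diversification behaviour the evaluation above measures.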