985 results for DETECTION PROBABILITY


Relevance: 30.00%

Abstract:

Documenting changes in distribution is necessary for understanding species' response to environmental changes, but data on species distributions are heterogeneous in accuracy and resolution. Combining different data sources and methodological approaches can fill gaps in knowledge about the dynamic processes driving changes in species-rich but data-poor regions. We combined recent bird survey data from the Neotropical Biodiversity Mapping Initiative (NeoMaps) with historical distribution records to estimate potential changes in the distribution of eight species of Amazon parrots in Venezuela. Using environmental covariates and presence-only data from museum collections and the literature, we first used maximum likelihood to fit a species distribution model (SDM) estimating a historical maximum probability of occurrence for each species. We then used recent NeoMaps survey data to build single-season occupancy models (OM) with the same environmental covariates, as well as with time- and effort-dependent detectability, resulting in estimates of the current probability of occurrence. Finally, we calculated the disagreement between the two predictions as a matrix of the probability of change in occurrence state. Our results suggested negative changes for the only restricted, threatened species, Amazona barbadensis, which has been independently confirmed with field studies. Two of the three remaining widespread species that were detected, Amazona amazonica and Amazona ochrocephala, also had a high probability of negative changes in northern Venezuela, but results were not conclusive for Amazona farinosa. The four remaining species were undetected in recent field surveys; three of these were most probably absent from the survey locations (Amazona autumnalis, Amazona mercenaria and Amazona festiva), while a fourth (Amazona dufresniana) requires more intensive targeted sampling to estimate its current status.
Our approach is unique in taking full advantage of available, but limited, data, and in detecting a high probability of change even for rare and patchily distributed species. However, it is presently limited to species meeting the strong assumptions required for maximum-likelihood estimation with presence-only data, including very high detectability and representative sampling of the historical distribution.
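To illustrate why the occupancy models above correct for detectability, a minimal sketch with invented numbers (not the study's estimates): a species occupying a site with probability ψ is missed on all K visits with probability (1 − p)^K, so the naive fraction of sites with at least one detection underestimates true occupancy.

```python
def p_missed(p_detect, n_visits):
    """Probability that a species present at a site is never detected in n_visits."""
    return (1.0 - p_detect) ** n_visits

def naive_vs_true_occupancy(psi, p_detect, n_visits):
    """Expected naive occupancy (fraction of sites with >= 1 detection)
    versus the true occupancy probability psi."""
    naive = psi * (1.0 - p_missed(p_detect, n_visits))
    return naive, psi

# Hypothetical values: half the sites occupied, 30% detection chance per visit
naive, psi_true = naive_vs_true_occupancy(psi=0.5, p_detect=0.3, n_visits=4)
```

With these numbers the species goes undetected at roughly a quarter of the occupied sites, which is exactly the gap the detectability terms in the occupancy model are meant to absorb.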

Relevance: 30.00%

Abstract:

Understanding changes over time in the distribution of interacting native and invasive species that may be symptomatic of competitive exclusion is critical to identify the need for, and effectiveness of, management interventions. Occupancy models greatly increase the robustness of inference that can be made from presence/absence data when species are imperfectly detected, and recent developments allow the strength of interaction between pairs of species to be quantified. We used a two-species multi-season occupancy model to quantify the impact of the invasive American mink on the native European mink in Spain through the analysis of their co-occurrence pattern over twelve years (2000–2011) in the entire Spanish range of European mink distribution, where both species were detected by live trapping but American mink were culled. We detected a negative temporal trend in the rate of occupancy of European mink and a simultaneous positive trend in the occupancy of American mink. The species co-occurred less often than expected, and the native mink was more likely to become extinct from sites occupied by the invasive species. Removal gave the American mink a high probability of local extinction at sites where it co-occurred with the endemic mink, but the overall increase in its probability of occupancy over the last decade indicates that the ongoing management is failing to halt its spread. More intensive culling effort where both species co-exist, as well as in adjacent areas where the invasive American mink is found at high densities, is required to stop the decline of European mink.
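The "co-occurred less often than expected" finding is commonly summarized with a species interaction factor (SIF), the ratio of joint occupancy to the product of the marginal occupancies. A small sketch with hypothetical occupancy values (not the study's estimates):

```python
def species_interaction_factor(psi_A, psi_B, psi_AB):
    """SIF = psi_AB / (psi_A * psi_B). Values < 1 suggest avoidance or
    exclusion, 1 independence, > 1 aggregation."""
    return psi_AB / (psi_A * psi_B)

# Hypothetical: invasive mink at 60% of sites, native at 50%,
# but joint occupancy only 20% instead of the 30% expected under independence
sif = species_interaction_factor(psi_A=0.6, psi_B=0.5, psi_AB=0.20)
```

A SIF of about 0.67 here would indicate the two species co-occur only two-thirds as often as independent occupancy would predict.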

Relevance: 30.00%

Abstract:

Is Benford's law a good instrument to detect fraud in reports of statistical and scientific data? For a valid test, the probabilities of "false positives" and "false negatives" have to be low. However, it is very doubtful whether the Benford distribution is an appropriate tool to discriminate between manipulated and non-manipulated estimates. Further research should focus more on the validity of the test, and test results should be interpreted more carefully.
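For reference, the Benford distribution itself is simple to state: the leading digit d occurs with probability log10(1 + 1/d). A minimal sketch of the expected frequencies and of extracting observed leading-digit frequencies from a data set (the comparison test itself, and its validity, is the contested part):

```python
import math
from collections import Counter

def benford_expected(d):
    """Benford's law: expected frequency of leading digit d in 1..9."""
    return math.log10(1.0 + 1.0 / d)

def leading_digit_freqs(values):
    """Observed leading-digit frequencies of positive numbers.
    Assumes str() gives a plain decimal form (no scientific notation)."""
    digits = [int(str(abs(v)).lstrip('0.')[0]) for v in values]
    counts = Counter(digits)
    n = len(digits)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}
```

Digit 1 is expected about 30.1% of the time and digit 9 only about 4.6%; a fraud test compares the observed frequencies against these expectations, which is where the false-positive and false-negative concerns arise.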

Relevance: 30.00%

Abstract:

In this paper, a layered architecture to spot and characterize vowel segments in running speech is presented. The detection process is based on neuromorphic principles: Hebbian units arranged in layers implement lateral inhibition, band probability estimation and mutual exclusion. Results are presented showing how the association between the acoustic set of patterns and the phonologic set of symbols may be created. Possible applications of this methodology include speech event spotting, the study of pathological voice and speaker biometric characterization, among others.

Relevance: 30.00%

Abstract:

This paper deals with the detection and tracking of an unknown number of targets using a Bayesian hierarchical model with target labels. To approximate the posterior probability density function, we develop a two-layer particle filter. One deals with track initiation, and the other with track maintenance. In addition, the parallel partition method is proposed to sample the states of the surviving targets.
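The core approximation step shared by such particle filters can be sketched with a generic single-target 1D bootstrap filter (not the paper's two-layer initiation/maintenance design; the random-walk motion model, noise levels and particle count are illustrative assumptions):

```python
import math
import random

def bootstrap_pf(observations, n_particles=2000, proc_std=0.5, obs_std=1.0, seed=1):
    """Minimal 1D bootstrap particle filter: random-walk state model,
    Gaussian measurement likelihood, multinomial resampling."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 5.0) for _ in range(n_particles)]  # diffuse prior
    estimates = []
    for z in observations:
        # propagate each particle through the motion model
        particles = [x + rng.gauss(0.0, proc_std) for x in particles]
        # weight by the Gaussian likelihood of the observation
        weights = [math.exp(-0.5 * ((z - x) / obs_std) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # posterior-mean state estimate
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # resample to concentrate particles on likely states
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

est = bootstrap_pf([4.8, 5.1, 4.9, 5.2, 5.0])
```

The paper's contribution layers track initiation and track maintenance on top of this kind of recursion, with the parallel partition method sampling the surviving targets' states.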

Relevance: 30.00%

Abstract:

In recent years, several moving-object detection strategies based on non-parametric background-foreground modeling have been proposed. To combine both models and obtain the probability that a pixel belongs to the foreground, these strategies make use of Bayesian classifiers. However, these classifiers do not allow additional prior information to be exploited at individual pixels. We therefore propose a novel and efficient alternative Bayesian classifier that is suitable for this kind of strategy and allows the use of any available prior information. Additionally, we present an effective method to dynamically estimate the prior probability from the output of a particle-filter-based tracking strategy.
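The pixel-wise Bayes rule underlying such classifiers is compact; a hedged sketch with invented likelihood and prior values (not the paper's classifier) showing how a tracker-supplied prior shifts the posterior:

```python
def foreground_posterior(lik_fg, lik_bg, prior_fg):
    """Per-pixel Bayes rule: P(foreground | x) from the two class
    likelihoods and a pixel-specific foreground prior."""
    num = lik_fg * prior_fg
    den = num + lik_bg * (1.0 - prior_fg)
    return num / den

# Same likelihoods, different priors: a tracker predicting an object
# at this pixel raises the foreground probability.
p_flat    = foreground_posterior(lik_fg=0.4, lik_bg=0.2, prior_fg=0.5)
p_tracked = foreground_posterior(lik_fg=0.4, lik_bg=0.2, prior_fg=0.9)
```

With a flat prior the pixel is foreground with probability 2/3; the tracker-informed prior pushes the same evidence above 0.94, which is the kind of per-pixel prior the proposed classifier is designed to accommodate.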

Relevance: 30.00%

Abstract:

A methodology has been developed for the study of molecular recognition at the level of single events and for the localization of sites on biosurfaces, by combining force microscopy with molecular recognition by specific ligands. For this goal, a sensor was designed by covalently linking an antibody (anti-human serum albumin, polyclonal) via a flexible spacer to the tip of a force microscope. This sensor permitted detection of single antibody-antigen recognition events by force signals of unique shape, with an unbinding force of 244 +/- 22 pN. Analysis revealed that the observed unbinding forces originate from the dissociation of individual Fab fragments from a human serum albumin molecule. The two Fab fragments of the antibody were found to bind independently and with equal probability. The flexible linkage provided the antibody with a 6-nm dynamical reach for binding, rendering the binding probability high (0.5 for encounter times of 60 ms). This permitted fast and reliable detection of antigenic sites during lateral scans with a positional accuracy of 1.5 nm. This methodology shows promise for characterizing rate constants and kinetics of molecular recognition complexes and for molecular mapping of biosurfaces such as membranes.
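If one assumes simple first-order binding kinetics (an assumption of this sketch, not a statement from the paper), the reported 0.5 binding probability at 60 ms encounter time implies an effective binding rate:

```python
import math

# First-order kinetics assumption: P(bind within t) = 1 - exp(-k * t).
# Solving for k from the reported P = 0.5 at t = 60 ms:
t = 0.060          # encounter time in seconds
P = 0.5            # reported binding probability
k = -math.log(1.0 - P) / t   # effective binding rate in 1/s
```

Under this model the effective rate is ln(2)/0.06 s, roughly 11.6 per second, the kind of rate-constant characterization the authors suggest the method could support.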

Relevance: 30.00%

Abstract:

In 1991, Bryant and Eckard estimated the annual probability that a cartel would be detected by the US federal authorities, conditional on eventual detection, to be at most between 13% and 17%. Fifteen years later, we estimated the same probability over a European sample and found an annual probability between 12.9% and 13.3%. We also develop a detection model to clarify this probability. Our estimate is based on detection durations, calculated from data reported for all the cartels convicted by the European Commission from 1969 to the present date, and on a statistical birth-and-death process model describing the onset and detection of cartels.

Relevance: 30.00%

Abstract:

Obstructive sleep apnea (OSA) is a highly prevalent disease in which upper airways are collapsed during sleep, leading to serious consequences. The gold standard of diagnosis, called polysomnography (PSG), requires a full-night hospital stay connected to over ten channels of measurements requiring physical contact with sensors. PSG is inconvenient, expensive and unsuited for community screening. Snoring is the earliest symptom of OSA, but its potential in clinical diagnosis is not fully recognized yet. Diagnostic systems intent on using snore-related sounds (SRS) face the tough problem of how to define a snore. In this paper, we present a working definition of a snore, and propose algorithms to segment SRS into classes of pure breathing, silence and voiced/unvoiced snores. We propose a novel feature termed the 'intra-snore-pitch-jump' (ISPJ) to diagnose OSA. Working on clinical data, we show that ISPJ delivers OSA detection sensitivities of 86-100% while holding specificity at 50-80%. These numbers indicate that snore sounds and the ISPJ have the potential to be good candidates for a take-home device for OSA screening. Snore sounds have the significant advantage in that they can be conveniently acquired with low-cost non-contact equipment. The segmentation results presented in this paper have been derived using data from eight patients as the training set and another eight patients as the testing set. ISPJ-based OSA detection results have been derived using training data from 16 subjects and testing data from 29 subjects.
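The reported sensitivity and specificity ranges reduce to standard confusion-matrix arithmetic; a sketch with hypothetical screening counts (not the paper's patient data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical OSA screening outcome: 29 patients with OSA, 10 without
sens, spec = sensitivity_specificity(tp=26, fn=3, tn=8, fp=2)
```

These counts give roughly 90% sensitivity at 80% specificity, in the spirit of the trade-off the ISPJ feature is reported to achieve.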

Relevance: 30.00%

Abstract:

This correspondence considers block detection for blind wireless digital transmission. At high signal-to-noise ratio (SNR), block detection errors are primarily due to the received sequence having multiple possible decoded sequences with the same likelihood. We derive analytic expressions for the probability of detection ambiguity written in terms of a Dedekind zeta function, in the zero noise case with large constellations. Expressions are also provided for finite constellations, which can be evaluated efficiently, independent of the block length. Simulations demonstrate that the analytically derived error floors exist at high SNR.

Relevance: 30.00%

Abstract:

We study the electrical transport of a harmonically bound, single-molecule C-60 shuttle operating in the Coulomb blockade regime, i.e. single electron shuttling. In particular, we examine the dependence of the tunnel current on an ultra-small stationary force exerted on the shuttle. As an example, we consider the force exerted on an endohedral N@C-60 by the magnetic field gradient generated by a nearby nanomagnet. We derive a Hamiltonian for the full shuttle system which includes the metallic contacts, the spatially dependent tunnel couplings to the shuttle, the electronic and motional degrees of freedom of the shuttle itself and a coupling of the shuttle's motion to a phonon bath. We analyse the resulting quantum master equation and find that, due to the exponential dependence of the tunnel probability on the shuttle-contact separation, the current is highly sensitive to very small forces. In particular, we predict that the spin state of the endohedral electrons of N@C-60 in a large magnetic gradient field can be distinguished from the resulting current signals within a few tens of nanoseconds. This effect could prove useful for the detection of the endohedral spin-state of individual paramagnetic molecules such as N@C-60 and P@C-60, or the detection of very small static forces acting on a C-60 shuttle.

Relevance: 30.00%

Abstract:

In this paper, we describe the evaluation of a method for building detection by the Dempster-Shafer fusion of LIDAR data and multispectral images. For that purpose, ground truth was digitised for two test sites with quite different characteristics. Using these data sets, the heuristic model for the probability mass assignments of the method is validated, and rules for the tuning of the parameters of this model are discussed. Further we evaluate the contributions of the individual cues used in the classification process to the quality of the classification results. Our results show the degree to which the overall correctness of the results can be improved by fusing LIDAR data with multispectral images.
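The Dempster-Shafer fusion step referred to above combines per-sensor mass assignments with Dempster's rule. A minimal sketch over a two-element frame {building, not-building} with invented mass values (the paper's actual masses come from its heuristic model):

```python
def dempster_combine(m1, m2, frame=('B', 'N')):
    """Dempster's rule of combination for two basic probability assignments
    over singletons 'B'(uilding), 'N'(ot building) and the full frame 'Theta'."""
    keys = list(frame) + ['Theta']
    combined = {k: 0.0 for k in keys}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            if a == b:
                combined[a] += wa * wb          # agreement (incl. Theta-Theta)
            elif a == 'Theta':
                combined[b] += wa * wb          # Theta intersected with b is b
            elif b == 'Theta':
                combined[a] += wa * wb
            else:
                conflict += wa * wb             # disjoint singletons
    scale = 1.0 - conflict                      # renormalize away conflict
    return {k: v / scale for k, v in combined.items()}

# e.g. LIDAR-derived evidence vs multispectral evidence for one pixel:
m = dempster_combine({'B': 0.6, 'N': 0.1, 'Theta': 0.3},
                     {'B': 0.5, 'N': 0.2, 'Theta': 0.3})
```

Two weakly agreeing sources reinforce the 'building' hypothesis while the mass left on Theta shrinks, which is the mechanism by which fusing LIDAR with multispectral imagery can improve the classification.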

Relevance: 30.00%

Abstract:

This paper addresses the problem of novelty detection in the case that the observed data is a mixture of a known 'background' process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework we describe here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, 'pdf') of the data generated by the 'background' process. The relative proportion of this 'background' component (the prior 'background' probability), as well as the 'pdf' and the prior probabilities of all other components, are assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known 'background' distribution. The method exploits the Kolmogorov-Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the 'joker' data set. We propose this method as a reliable means of novelty detection in emergency situations, which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm. © Springer-Verlag 2007.
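The Kolmogorov-Smirnov machinery at the heart of the method compares the empirical CDF of the observations against the known background CDF. A hand-rolled sketch of the one-sample KS statistic with a uniform background and invented data (the paper's proportion-estimation step is not reproduced here):

```python
def ks_statistic(sample, background_cdf):
    """One-sample Kolmogorov-Smirnov statistic: maximum gap between the
    empirical CDF of the sample and a known background CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = background_cdf(x)
        # the empirical CDF jumps from i/n to (i+1)/n at x
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

uniform_cdf = lambda x: min(max(x, 0.0), 1.0)   # Uniform(0,1) background

clean  = [0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.85, 0.95]
contam = [0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.97, 0.99]

d_clean  = ks_statistic(clean, uniform_cdf)
d_contam = ks_statistic(contam, uniform_cdf)
```

Replacing two background draws with values crowded near 1 more than triples the statistic, which is the kind of deviation the proportion-estimation step exploits to bound the background fraction.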

Relevance: 30.00%

Abstract:

The initial image-processing stages of visual cortex are well suited to a local (patchwise) analysis of the viewed scene. But the world's structures extend over space as textures and surfaces, suggesting the need for spatial integration. Most models of contrast vision fall shy of this process because (i) the weak area summation at detection threshold is attributed to probability summation (PS) and (ii) there is little or no advantage of area well above threshold. Both of these views are challenged here. First, it is shown that results at threshold are consistent with linear summation of contrast following retinal inhomogeneity, spatial filtering, nonlinear contrast transduction and multiple sources of additive Gaussian noise. We suggest that the suprathreshold loss of the area advantage in previous studies is due to a concomitant increase in suppression from the pedestal. To overcome this confound, a novel stimulus class is designed where: (i) the observer operates on a constant retinal area, (ii) the target area is controlled within this summation field, and (iii) the pedestal is fixed in size. Using this arrangement, substantial summation is found along the entire masking function, including the region of facilitation. Our analysis shows that PS and uncertainty cannot account for the results, and that suprathreshold summation of contrast extends over at least seven target cycles of grating. © 2007 The Royal Society.
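The probability-summation account that the paper argues against has a standard high-threshold form: with independent local detectors, the target is seen if any one of them fires. A minimal sketch with illustrative detector probabilities:

```python
def prob_summation(p_locals):
    """High-threshold probability summation: the target is detected if any
    of the independent local detectors fires."""
    miss = 1.0
    for p in p_locals:
        miss *= (1.0 - p)
    return 1.0 - miss

# Doubling the stimulus area from one to two equally sensitive patches:
p_one = prob_summation([0.5])        # single patch
p_two = prob_summation([0.5, 0.5])   # two patches
```

Probability summation predicts only the modest improvement from 0.5 to 0.75 here; the paper's claim is that the observed area advantage, at and above threshold, is instead consistent with linear summation of contrast after filtering and nonlinear transduction.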

Relevance: 30.00%

Abstract:

The detection of signals in the presence of noise is one of the most basic and important problems encountered by communication engineers. Although the literature abounds with analyses of communications in Gaussian noise, relatively little work has appeared dealing with communications in non-Gaussian noise. In this thesis, several digital communication systems disturbed by non-Gaussian noise are analysed. The thesis is divided into two main parts. In the first part, a filtered-Poisson impulse noise model is utilized to calculate the error probability characteristics of a linear receiver operating in additive impulsive noise. First, the effect that non-Gaussian interference has on the performance of a receiver that has been optimized for Gaussian noise is determined. The factors affecting the choice of modulation scheme so as to minimize the detrimental effects of non-Gaussian noise are then discussed. In the second part, a new theoretical model of impulsive noise that fits well with the observed statistics of noise in radio channels below 100 MHz has been developed. This empirical noise model is applied to the detection of known signals in the presence of noise to determine the optimal receiver structure. The performance of such a detector has been assessed and is found to depend on the signal shape and the time-bandwidth product, as well as the signal-to-noise ratio. The optimal signal to minimize the probability of error of the detector is determined. Attention is then turned to the problem of threshold detection. Detector structure, large-sample performance and robustness against errors in the detector parameters are examined. Finally, estimators of such parameters as the occurrence of an impulse and the parameters of an empirical noise model are developed for the case of an adaptive system with slowly varying conditions.
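The penalty a Gaussian-optimized receiver pays in impulsive noise can be illustrated with a simpler stand-in for the thesis's filtered-Poisson model: Bernoulli-Gaussian mixture noise, where an impulse occurring with probability ε inflates the noise standard deviation. For antipodal signalling with a threshold detector the error probability is then a mixture of Gaussian tail probabilities (all parameter values below are illustrative assumptions):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ber_bpsk_impulsive(amp, sigma0, eps, kappa):
    """Error probability of antipodal (+/-amp) signalling with a zero-threshold
    detector in Bernoulli-Gaussian noise: with probability eps an impulse
    inflates the noise std from sigma0 to kappa * sigma0."""
    return (1.0 - eps) * q_func(amp / sigma0) + eps * q_func(amp / (kappa * sigma0))

ber_gauss = q_func(3.0)                                          # Gaussian-only baseline
ber_imp   = ber_bpsk_impulsive(amp=3.0, sigma0=1.0, eps=0.01, kappa=10.0)
```

Even a 1% impulse rate dominates the error probability at this operating point, which is why a receiver optimized for Gaussian noise degrades badly and why the thesis pursues receiver structures matched to the impulsive-noise statistics.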