53 results for default probability


Relevance: 20.00%

Abstract:

Using the classical Parzen window estimate as the target function, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
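
As a rough sketch of the final weight-updating stage described here, the code below fits kernel mixing weights to a Parzen window target with a multiplicative nonnegative update. The function names are invented for illustration, the orthogonal forward regression selection step is omitted (a plain data subset stands in for the selected kernels), and the unity constraint is imposed by simple renormalization rather than the paper's exact MNQP formulation.

```python
import numpy as np

def sparse_kde_weights(X, centers, width, n_iter=200):
    # Gaussian kernel matrix between two point sets (isotropic, shared width).
    def gauss(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / width**2) / (np.sqrt(2 * np.pi) * width) ** A.shape[1]

    target = gauss(X, X).mean(axis=1)       # Parzen window estimate at the data points
    Phi = gauss(X, centers)                 # design matrix of candidate kernels
    w = np.full(len(centers), 1.0 / len(centers))
    B, c = Phi.T @ Phi, Phi.T @ target      # both entrywise nonnegative here
    for _ in range(n_iter):
        w *= c / np.maximum(B @ w, 1e-12)   # multiplicative step keeps w nonnegative
        w /= w.sum()                        # renormalize to satisfy the unity constraint
    return w                                # many weights end up (near) zero

# Usage: a plain data subset stands in for the OFR-selected kernel centers.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 1))
print(sparse_kde_weights(X, X[::20], width=0.3).round(3))
```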

Relevance: 20.00%

Abstract:

A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely, its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to satisfy the nonnegativity and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct very compact yet accurate density estimates.
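
The selection and leave-one-out tuning loop is beyond an abstract-level sketch, but the form of the resulting estimate is easy to show: a sparse Gaussian mixture in which each kernel carries its own center and diagonal covariance. A minimal evaluation routine, with hypothetical argument names, might look as follows.

```python
import numpy as np

def tunable_kernel_density(x, centers, diag_vars, weights):
    # x: (n, d) query points; centers: (m, d); diag_vars: (m, d) per-kernel
    # diagonal covariances; weights: (m,) nonnegative, summing to one.
    diff2 = (x[:, None, :] - centers[None, :, :]) ** 2            # (n, m, d)
    exponent = -0.5 * (diff2 / diag_vars[None, :, :]).sum(-1)     # (n, m)
    norm = np.sqrt((2 * np.pi) ** x.shape[1] * diag_vars.prod(axis=1))
    return (weights * np.exp(exponent) / norm).sum(axis=1)        # (n,)
```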

Relevance: 20.00%

Abstract:

A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
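
For context, the conventional baseline the paper extends multiplies single-locus match probabilities across loci. A sketch of that baseline using the standard Balding-Nichols subpopulation formulas (NRC II recommendation 4.2) is given below; the allele frequencies and theta value are illustrative, and the paper's consanguinity-corrected two-locus formulae are not reproduced here.

```python
def single_locus_match_prob(p, theta, genotype):
    # Balding-Nichols subpopulation formulas (NRC II recommendation 4.2):
    # the match probability for a profile already seen once, with
    # coancestry coefficient theta and allele-frequency dict p.
    denom = (1 + theta) * (1 + 2 * theta)
    a, b = genotype
    if a == b:                                   # homozygote A_i A_i
        pi = p[a]
        return (2*theta + (1 - theta)*pi) * (3*theta + (1 - theta)*pi) / denom
    pi, pj = p[a], p[b]                          # heterozygote A_i A_j
    return 2 * (theta + (1 - theta)*pi) * (theta + (1 - theta)*pj) / denom

# Conventional two-locus value under between-locus independence
# (illustrative allele frequencies and theta = 0.03):
freqs1 = {"10": 0.10, "11": 0.25}
freqs2 = {"7": 0.20, "9": 0.15}
p2 = (single_locus_match_prob(freqs1, 0.03, ("10", "10"))
      * single_locus_match_prob(freqs2, 0.03, ("7", "9")))
print(f"two-locus match probability: {p2:.3e}")
```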

Relevance: 20.00%

Abstract:

A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
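
The abstract does not state which smooth surrogate of the zero-norm is minimized, but a common choice in the sparsity literature, shown below purely for illustration, replaces the count of nonzero weights with a concave exponential approximation.

```python
import numpy as np

def approx_zero_norm(w, alpha=10.0):
    # Smooth concave surrogate for ||w||_0, the number of nonzero weights:
    # sum_i (1 - exp(-alpha * |w_i|)) approaches the true count as alpha grows.
    return np.sum(1.0 - np.exp(-alpha * np.abs(w)))

w = np.array([0.5, 0.0, 0.02, 0.0, 0.3])
print(approx_zero_norm(w), "vs exact", np.count_nonzero(w))
```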

Relevance: 20.00%

Abstract:

Recent research has suggested that relatively cold UK winters are more common when solar activity is low (Lockwood et al 2010 Environ. Res. Lett. 5 024001). Solar activity during the current sunspot minimum has fallen to levels unknown since the start of the 20th century (Lockwood 2010 Proc. R. Soc. A 466 303–29), and records of past solar variations inferred from cosmogenic isotopes (Abreu et al 2008 Geophys. Res. Lett. 35 L20109) and geomagnetic activity data (Lockwood et al 2009 Astrophys. J. 700 937–44) suggest that the current grand solar maximum is coming to an end, and hence that solar activity can be expected to continue to decline. Combining cosmogenic isotope data with the long record of temperatures measured in central England, we estimate how solar change could influence the future probability of UK winters that are cold relative to the hemispheric mean temperature, if all other factors remain constant. Global warming is taken into account only through the detrending using mean hemispheric temperatures. We show that some predictive skill may be obtained by including the solar effect.
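
A toy sketch of the kind of conditional-probability estimate described, on synthetic stand-in data (the real analysis uses central England temperatures, hemispheric means, and cosmogenic isotope records), might look like this:

```python
import numpy as np

# Synthetic stand-ins: winter-mean CET, hemispheric mean temperature and a
# solar-activity index, one value per winter.
rng = np.random.default_rng(0)
cet = rng.normal(4.0, 1.2, 300)
hemispheric = rng.normal(0.0, 0.3, 300)
solar = rng.normal(0.0, 1.0, 300)

# Global warming enters only through this detrending step: each winter's
# CET is expressed relative to the hemispheric mean.
relative_temp = cet - hemispheric

# "Cold" and "low activity" defined by sample quartiles (an assumption;
# the paper's thresholds differ).
cold = relative_temp < np.quantile(relative_temp, 0.25)
low_solar = solar < np.quantile(solar, 0.25)

# Conditional versus unconditional frequency of cold winters.
print(f"P(cold) = {cold.mean():.2f}, "
      f"P(cold | low solar) = {cold[low_solar].mean():.2f}")
```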

Relevance: 20.00%

Abstract:

The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, improving on a recently published method. The new method assumes that low similarity will be found in lesioned regions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. Using a normalized similarity measurement enables the method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected but normal tissues were also identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. Compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and had fewer false positives around the ventricles and the edge of the brain.
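
A minimal sketch of the core idea, assuming membership arrays whose rows sum to one and an illustrative overlap-based similarity (the paper's normalized measure is not specified in the abstract):

```python
import numpy as np

def detect_lesions(fuzzy_seg, tissue_prob, threshold=0.5):
    # fuzzy_seg, tissue_prob: (n_voxels, n_tissues) membership arrays whose
    # rows sum to one. The similarity used here, the overlap (element-wise
    # minimum) of the two membership vectors, is normalized to [0, 1], so
    # the detection threshold can be tuned directly.
    similarity = np.minimum(fuzzy_seg, tissue_prob).sum(axis=1)
    # Low similarity between the intensity-based and location-based views
    # of a voxel suggests a lesion.
    return similarity < threshold
```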

Relevance: 20.00%

Abstract:

Consideration is given to a standard CDMA system and to the determination of the density function of the interference, with and without Gaussian noise, using sampling theory concepts. The derived formula provides fast and accurate results and is a simple, useful alternative to other methods.
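
As an illustration of one common sampling-theory route, not necessarily the paper's exact formula, the density of a sum of interference terms can be recovered by numerically inverting its characteristic function; the interferer model below (uniform terms plus Gaussian noise) is an assumption made for the example.

```python
import numpy as np

# Assumed model: K interferers, each uniform on [-1, 1] (characteristic
# function sin(t)/t), plus Gaussian noise with standard deviation sigma.
K, sigma = 8, 0.5
t = np.linspace(-60, 60, 4001)                        # truncated frequency grid
phi = np.sinc(t / np.pi) ** K * np.exp(-0.5 * (sigma * t) ** 2)

def pdf(x):
    # Inversion formula f(x) = (1/(2*pi)) * integral phi(t) exp(-i t x) dt;
    # phi is real and even here, so the integrand reduces to phi(t) cos(t x).
    return np.trapz(phi * np.cos(t * x), t) / (2 * np.pi)

print([round(pdf(x), 4) for x in (-2.0, -1.0, 0.0, 1.0, 2.0)])
```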

Relevance: 20.00%

Abstract:

Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation value of the score allows for a decomposition into reliability and resolution related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides an overly optimistic estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given for obtaining better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
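
The empirical (binned) decomposition discussed here is standard for the Brier score; a minimal sketch follows, with the caveat from the text that these naive sample estimates of reliability and resolution are biased.

```python
import numpy as np

def brier_decomposition(forecasts, outcomes, n_bins=10):
    # Empirical Murphy decomposition BS ~= reliability - resolution +
    # uncertainty, computed by binning the forecast probabilities. The
    # identity is exact only when forecasts are constant within each bin,
    # and, as the text notes, these naive sample estimates are biased.
    forecasts = np.asarray(forecasts, float)
    outcomes = np.asarray(outcomes, float)
    base_rate = outcomes.mean()
    bins = np.clip((forecasts * n_bins).astype(int), 0, n_bins - 1)
    reliability = resolution = 0.0
    for b in range(n_bins):
        mask = bins == b
        if not mask.any():
            continue
        w = mask.mean()                  # fraction of cases in this bin
        f_bar = forecasts[mask].mean()   # mean forecast in the bin
        o_bar = outcomes[mask].mean()    # observed relative frequency
        reliability += w * (f_bar - o_bar) ** 2
        resolution += w * (o_bar - base_rate) ** 2
    uncertainty = base_rate * (1.0 - base_rate)
    return reliability, resolution, uncertainty
```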