19 results for Maximum entropy statistical estimate

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression that is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the MEM. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally, we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimate at T = 2.33 T_C.
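The flat-direction issue mentioned above can be illustrated numerically. The sketch below is a minimal illustration, not the paper's implementation: it compares the Shannon-Jaynes entropy density with the dimensionless prior density, assuming a constant default model m and taking the standard functional forms s_SJ = ρ - m - ρ log(ρ/m) and s_BR = 1 - ρ/m + log(ρ/m).

```python
import math

def s_shannon_jaynes(rho, m):
    # Shannon-Jaynes entropy density: rho - m - rho * log(rho / m)
    return rho - m - rho * math.log(rho / m)

def s_br(rho, m):
    # dimensionless prior density: 1 - rho/m + log(rho / m)
    return 1.0 - rho / m + math.log(rho / m)

m = 1.0
# Both densities vanish (are maximal) at rho = m ...
at_default = (s_shannon_jaynes(m, m), s_br(m, m))
# ... but as rho -> 0 the Shannon-Jaynes entropy stays finite
# (an asymptotically flat direction), while the new prior diverges,
# strongly penalizing departures from the default model there.
sj_small = s_shannon_jaynes(1e-12, m)   # stays near -m
br_small = s_br(1e-12, m)               # diverges towards -infinity
```

The finite limit of s_SJ as ρ → 0 is exactly the flat direction that the improved prior removes.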

Relevance: 100.00%

Abstract:

The extraction of the finite temperature heavy quark potential from lattice QCD relies on a spectral analysis of the Wilson loop. General arguments tell us that the lowest-lying spectral peak encodes, through its position and shape, the real and imaginary parts of this complex potential. Here we benchmark this extraction strategy using leading-order hard-thermal-loop (HTL) calculations. In other words, we analytically calculate the Wilson loop and determine the corresponding spectrum. By fitting its lowest-lying peak we obtain the real and imaginary parts and confirm that knowledge of the lowest peak alone is sufficient for obtaining the potential. Access to the full spectrum allows an investigation of spectral features that do not contribute to the potential but can pose a challenge to numerical attempts at an analytic continuation from imaginary-time data. Differences in these contributions between the Wilson loop and gauge-fixed Wilson line correlators are discussed. To better understand the difficulties of a numerical extraction we deploy the maximum entropy method with an extended search space on HTL correlators in Euclidean time and observe how well the known spectral function and the values of the real and imaginary parts are reproduced. Possible avenues for improving the extraction strategy are discussed.
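As a toy version of the peak-fitting step, the sketch below fits the lowest-lying peak of a mock spectrum, reading the real part of the potential off the peak position and the imaginary part off the width. It is illustrative only: the actual HTL peak shape is in general skewed, so the symmetric Lorentzian ansatz here is a simplifying assumption.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(omega, amp, re_v, im_v):
    # peak ansatz: position ~ Re[V], half-width ~ Im[V]
    return amp * im_v / ((omega - re_v) ** 2 + im_v ** 2)

# mock spectral function with known "potential" values
omega = np.linspace(-5.0, 5.0, 2001)
rho = lorentzian(omega, 1.0, 0.5, 0.25)

# fit the peak and read off the real and imaginary parts
popt, _ = curve_fit(lorentzian, omega, rho, p0=[1.0, 0.4, 0.3])
amp_fit, re_fit, im_fit = popt
```

With noiseless mock data the fit recovers the input position and width, mirroring the benchmark logic of the abstract.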

Relevance: 100.00%

Abstract:

We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact with modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of a constant prior function m(ω) imprints only smoothness on the reconstructed spectrum. In addition, we are able to analytically integrate out the only relevant overall hyperparameter α in the prior, removing the need for the Gaussian approximations found, e.g., in the maximum entropy method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of P[ρ|D] in the full N_ω ≫ N_τ dimensional search space. The method yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements we present mock data analyses for the case of zero-width delta peaks and for more realistic scenarios, based on the perturbative Euclidean Wilson loop as well as the Wilson line correlator in Coulomb gauge.
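A stripped-down numerical analogue of such a reconstruction can be sketched as follows. The setup is hypothetical and simplified: a fixed hyperparameter α instead of the analytic integration described above, a toy exponential kernel, and the default model itself used as the mock truth, so that the unique global minimum of the negative log-posterior sits exactly at ρ = m.

```python
import numpy as np
from scipy.optimize import minimize

n_tau, n_omega = 10, 50                    # N_omega >> N_tau search space
tau = np.linspace(0.1, 1.0, n_tau)
omega = np.linspace(0.1, 5.0, n_omega)
d_omega = omega[1] - omega[0]
kernel = np.exp(-np.outer(tau, omega)) * d_omega   # toy exponential kernel

m = np.ones(n_omega)                       # constant default model m(omega)
data = kernel @ m                          # mock correlator from rho = m
alpha, sigma = 1.0, 0.01                   # fixed hyperparameter, data error

def neg_log_posterior(x):
    rho = np.exp(x)                        # rho = exp(x) enforces positivity
    chi2 = np.sum((kernel @ rho - data) ** 2) / sigma ** 2
    s_prior = np.sum(1.0 - rho / m + np.log(rho / m)) * d_omega
    return 0.5 * chi2 - alpha * s_prior    # both terms >= 0, zero at rho = m

# quasi-Newton minimization in the full n_omega-dimensional search space
res = minimize(neg_log_posterior, 0.3 * np.ones(n_omega), method="L-BFGS-B")
rho_rec = np.exp(res.x)                    # should recover rho = m
```

Because both the likelihood and the prior term are individually minimized at ρ = m, the quasi-Newton search has a unique target even though the kernel itself is badly conditioned.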

Relevance: 100.00%

Abstract:

This paper presents a shallow dialogue analysis model aimed at human-human dialogues in the context of staff or business meetings. Four components of the model are defined, and several machine learning techniques are used to extract features from dialogue transcripts: maximum entropy classifiers for dialogue acts, latent semantic analysis for topic segmentation, and decision tree classifiers for discourse markers. A rule-based approach is proposed for resolving cross-modal references to meeting documents. The methods are trained and evaluated on a common data set and annotation format. The integration of the components into an automated shallow dialogue parser opens the way to multimodal meeting processing and retrieval applications.
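A maximum entropy classifier of the kind used for dialogue-act tagging is equivalent to (multinomial) logistic regression over utterance features. The sketch below uses hypothetical bag-of-words features and labels (statement vs. question) purely for illustration; they are not the paper's features or data.

```python
import numpy as np

# hypothetical bag-of-words features over the vocabulary
# ["what", "is", "the", "ok"]; labels: 0 = statement, 1 = question
X = np.array([[1, 1, 1, 0],    # "what is the ..." -> question
              [0, 1, 1, 0],    # "... is the ..."  -> statement
              [1, 1, 0, 0],    # "what is ..."     -> question
              [0, 0, 0, 1]],   # "ok"              -> statement
             dtype=float)
y = np.array([1, 0, 1, 0])

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# a maximum entropy classifier is multinomial logistic regression:
# train the weights by full-batch gradient ascent on the log-likelihood
W = np.zeros((4, 2))
onehot = np.eye(2)[y]
for _ in range(1000):
    p = softmax(X @ W)
    W += 0.1 * X.T @ (onehot - p)

pred = softmax(X @ W).argmax(axis=1)
```

In practice one would add regularization and richer features, but the training objective is the same.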

Relevance: 40.00%

Abstract:

rrlogit fits a maximum-likelihood logistic regression for randomized-response data.
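The idea behind randomized-response estimation can be sketched independently of rrlogit itself. The snippet below assumes Warner's classic design with truth-telling probability p; rrlogit's actual model and options may differ.

```python
# Warner's randomized-response design: with probability p the respondent
# answers the sensitive question truthfully, with probability 1 - p they
# answer its negation. Observed "yes" rate: lam = pi*p + (1 - pi)*(1 - p).
def rr_prevalence(yes_rate, p):
    # invert the mixture for the sensitive-trait prevalence pi (p != 0.5)
    return (yes_rate + p - 1.0) / (2.0 * p - 1.0)

# e.g. truth-telling probability 0.7 and true prevalence 0.2 give an
# observed yes rate of 0.2*0.7 + 0.8*0.3 = 0.38, which inverts back:
pi_hat = rr_prevalence(0.38, 0.7)   # -> 0.2
```

The logistic-regression version replaces the constant prevalence π with a logistic function of covariates and maximizes the corresponding mixture likelihood.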

Relevance: 30.00%

Abstract:

Statistical shape models (SSMs) have been widely used as a basis for segmenting and interpreting complex anatomical structures. The robustness of these models is sensitive to the registration procedure, i.e., the establishment of a dense correspondence across a training data set. In this work, two SSMs built from the same training data set of scoliotic vertebrae but with different registration procedures were compared. The first model was constructed from the original binary masks without any image pre- or post-processing, and the second was obtained by applying a feature-preserving smoothing method to the original training data set, followed by a standard rasterization algorithm. The accuracy of the correspondences was assessed quantitatively by means of the maximum of the mean minimum distance (MMMD) and the Hausdorff distance H(D). The anatomical validity of the models was quantified by means of three criteria: compactness, specificity, and model generalization ability. The objective of this study was to compare quasi-identical models based on standard metrics. Preliminary results suggest that the MMMD distance and the eigenvalues are not sensitive metrics for evaluating the performance and robustness of SSMs.
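One of the correspondence metrics above, the Hausdorff distance, can be sketched directly. This is a minimal NumPy illustration on hypothetical 2-D point sets, not the study's vertebra data.

```python
import numpy as np

def directed_hausdorff(a, b):
    # max over points of a of the distance to the nearest point of b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return d.min(axis=1).max()

def hausdorff(a, b):
    # symmetric Hausdorff distance H(D) between two point sets
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

# two hypothetical correspondence point sets differing in one point
a = np.array([[0.0, 0.0], [1.0, 0.0]])
b = np.array([[0.0, 0.0], [1.0, 0.5]])
h = hausdorff(a, b)   # -> 0.5
```

The MMMD criterion replaces the outer max of the directed distance with a mean before taking the maximum over the two directions.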

Relevance: 30.00%

Abstract:

We present a framework for statistical finite element analysis that combines shape and material properties and allows statistical statements about biomechanical performance to be made across a given population. In this paper, we focus on the design of orthopaedic implants that fit a maximum percentage of the target population, both in terms of geometry and biomechanical stability. CT scans of the bone under consideration are registered non-rigidly to obtain correspondences in position and intensity between them. A statistical model of shape and intensity (bone density) is computed by means of principal component analysis. Afterwards, finite element analysis (FEA) is performed to analyse the biomechanical performance of the bones: realistic forces are applied, and the resulting displacement and bone stress distribution are calculated. The mechanical behaviour of different PCA bone instances is then compared.
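The model-building step, principal component analysis of concatenated shape-and-intensity vectors, can be sketched as follows. The data here are a hypothetical toy set generated from two underlying modes of variation; new statistical instances are drawn as the mean plus weighted principal modes.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy training set: each row concatenates registered shape coordinates
# with intensity (bone density) samples for one bone instance
n_samples, n_features, n_modes = 20, 10, 2
mean_true = np.arange(n_features, dtype=float)
modes_true = rng.standard_normal((n_features, n_modes))
coeffs = rng.standard_normal((n_samples, n_modes))
data = mean_true + coeffs @ modes_true.T

# principal component analysis via SVD of the centered data matrix
mean = data.mean(axis=0)
u, s, vt = np.linalg.svd(data - mean, full_matrices=False)
eigenvalues = s ** 2 / (n_samples - 1)

# a new statistical instance: mean plus a weighted principal mode,
# which would then be meshed and passed to the FEA stage
instance = mean + 2.0 * np.sqrt(eigenvalues[0]) * vt[0]
```

Because the toy data contain only two modes, all eigenvalues beyond the second are numerically zero, mirroring how PCA truncates a real training population to its dominant variations.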

Relevance: 30.00%

Abstract:

Accurately gauging the maximum willingness to pay (WTP) for a product is a critical success factor that determines not only market performance but also financial results. A number of approaches have therefore been developed to estimate consumers' willingness to pay accurately. Here, four commonly used measurement approaches are compared using real purchase data as a benchmark. The relative strengths of each method are analyzed on the basis of statistical criteria and, more importantly, on their potential to predict managerially relevant quantities such as the optimal price, quantity, and profit. The results show a slight advantage for incentive-aligned approaches, though the market setting needs to be considered when choosing the best-fitting procedure.
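How an estimated WTP distribution translates into the managerially relevant quantities can be sketched with hypothetical numbers (none of these values come from the study): demand at a price is the share of consumers whose WTP is at least that price, and profit is margin times demand.

```python
# hypothetical elicited WTP values (one per respondent) and unit cost
wtp = [4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
cost = 2.0

def profit(price):
    # demand = share of consumers willing to pay at least `price`
    demand = sum(w >= price for w in wtp) / len(wtp)
    return (price - cost) * demand

# with a discrete WTP sample, candidate optimal prices are the WTP
# values themselves; pick the one maximizing profit
best_price = max(sorted(wtp), key=profit)   # -> 6.0
```

A measurement approach that shifts the estimated WTP distribution therefore shifts the implied optimal price and profit, which is why predictive validity matters more than purely statistical fit.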

Relevance: 30.00%

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies: a globalist one, as suggested by Loewer, and a localist one, as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, while ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.

Relevance: 30.00%

Abstract:

Background: One reason for the decision to delay the introduction of an Attenuated Psychosis Syndrome in the main text of the fifth edition of the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders was the concern that attenuated psychotic symptoms (APS) might in fact be common features in adolescents and young adults from the general population of no psychopathological significance in themselves. This concern was based on reports of high prevalence rates of psychotic-like experiences (PLEs) in the general population and the assumption that PLEs are a good estimate of APS. Although the criterion validity of self-reported PLEs had already been studied with respect to clinician-rated psychotic symptoms and found insufficient, it had been argued that PLEs might in fact be more comparable with mild, subclinical expressions of psychotic symptoms and, therefore, with APS. The present paper is the first to specifically study this assumption.
Sampling and Methods: The sample consisted of 123 persons seeking help at a service for the early detection of psychosis, of whom 54 had an at-risk mental state or psychosis, 55 had a nonpsychotic mental disorder and 14 had no full-blown mental disorder. PLEs were assessed with the Peters Delusion Inventory and the revised Launay-Slade Hallucination Scale, and psychotic symptoms and APS were assessed with the Structured Interview for Prodromal Syndromes.
Results: At a level of agreement between the presence of any PLE (in 98.4% of patients) and any APS (in 40.7%) just exceeding chance (κ = 0.022), the criterion validity of PLEs for APS was insufficient. Even if additional qualifiers (high agreement or distress, preoccupation and conviction) were considered, PLEs (in 52.8%) still tended to significantly overestimate APS, and agreement was only fair (κ = 0.340). Furthermore, the group effect on PLE prevalence was, at most, moderate (Cramer's V ≤ 0.382).
Conclusions: The prevalence of APS cannot be deduced from studies of PLEs. Thus, the high population prevalence rate of PLEs does not allow the conclusion that APS are common features of no pathological significance and would lack clinical validity as an Attenuated Psychosis Syndrome in the Diagnostic and Statistical Manual of Mental Disorders, 5th edition. Rather, the population prevalence rate of APS has to be assumed to be largely unknown at present but is likely lower than indicated by epidemiological studies of PLEs. Therefore, dedicated studies are warranted, in which APS are assessed in a way that equates to their clinical evaluation.
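For reference, Cohen's κ, the agreement statistic quoted above, is computed from a 2 × 2 table of joint presence/absence. The counts below are hypothetical, chosen only to be consistent with the reported marginal rates (n = 123, any PLE in about 98.4%, any APS in about 40.7%); they are not the study's actual cross-tabulation.

```python
def cohen_kappa(a, b, c, d):
    # 2x2 agreement table: a = both present, b = first only present,
    # c = second only present, d = both absent
    n = a + b + c + d
    po = (a + d) / n                                       # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (po - pe) / (1.0 - pe)

# hypothetical counts matching the reported marginals:
# any PLE in 121/123 (98.4%), any APS in 50/123 (40.7%)
kappa = cohen_kappa(50, 71, 0, 2)   # close to the reported 0.022
```

Because nearly every patient reports some PLE, observed agreement barely exceeds chance agreement, which is why κ comes out near zero despite the high PLE prevalence.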

Relevance: 30.00%

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two Boltzmannian accounts of the Second Law, viz. a globalist and a localist one. In both cases, the probabilities fail to be chances because they have rivals that are roughly equally good. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.