963 results for Statistics and probability
Abstract:
In a recent paper, Komaki studied the second-order asymptotic properties of predictive distributions, using the Kullback-Leibler divergence as a loss function. He showed that estimative distributions with asymptotically efficient estimators can be improved by predictive distributions that do not belong to the model. The model is assumed to be a multidimensional curved exponential family. In this paper we generalize the result, allowing the loss function to be any f-divergence. A relationship arises between alpha-connections and optimal predictive distributions. In particular, when an alpha-divergence is used to measure the goodness of a predictive distribution, the optimal shift of the estimative distribution is related to alpha-covariant derivatives. The expression we obtain for the asymptotic risk is also useful for studying the higher-order asymptotic properties of an estimator under this class of loss functions.
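For reference, the standard textbook definitions of the divergences involved are given below; they are not reproduced from the paper, and the normalization of the alpha-divergence varies across authors:

\[
D_f(p \,\|\, q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx, \qquad f \text{ convex},\ f(1)=0,
\]
\[
D_\alpha(p \,\|\, q) = \frac{4}{1-\alpha^2}\left(1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx\right), \qquad \alpha \neq \pm 1,
\]

the latter recovering the Kullback-Leibler divergence in the limits \(\alpha \to \pm 1\) (which limit corresponds to which ordering of the arguments depends on the sign convention).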
Abstract:
We prove a characterization of the support of the law of the solution of a stochastic wave equation with two-dimensional space variable, driven by a noise that is white in time and correlated in space. The result is a consequence of an approximation theorem, with respect to convergence in probability, for equations obtained by smoothing the random noise. For some particular classes of coefficients, approximation in the Lp-norm for p ≥ 1 is also proved.
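A generic form of this class of equations is written below; the coefficient names \(\sigma\), \(b\) and the correlation kernel \(f\) are placeholders, and the precise assumptions are those stated in the paper:

\[
\frac{\partial^2 u}{\partial t^2}(t,x) - \Delta u(t,x) = \sigma\bigl(u(t,x)\bigr)\,\dot F(t,x) + b\bigl(u(t,x)\bigr), \qquad t \ge 0,\ x \in \mathbb{R}^2,
\]

where \(\dot F\) is a Gaussian noise, white in time and correlated in space, i.e. \(\mathbb{E}\bigl[\dot F(t,x)\,\dot F(s,y)\bigr] = \delta(t-s)\, f(x-y)\).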
Abstract:
In this article we present the fusion of a stochastic metaheuristic, Simulated Annealing (SA), with classical convergence criteria for Blind Separation of Sources (BSS). Although BSS by means of various techniques, including ICA, PCA, and neural networks, has been amply discussed in the literature, the possibility of using simulated annealing algorithms has to date not been seriously explored. Based on experimental results, this paper demonstrates the benefits that SA can offer in combination with higher-order statistics and mutual information criteria for BSS, such as robustness against local minima and a high degree of flexibility in the energy function.
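A minimal sketch of the idea follows: SA minimizing a kurtosis-based (higher-order-statistics) contrast for a two-source instantaneous mixture. This is not the authors' implementation; the mixture, cooling schedule, and contrast function are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-source instantaneous mixture (illustrative, not the paper's data).
n = 5000
s = np.vstack([np.sign(rng.standard_normal(n)),   # sub-Gaussian (binary) source
               rng.laplace(size=n)])               # super-Gaussian source
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                         # unknown mixing matrix
x = A @ s

# Whiten the mixtures so that unmixing reduces to finding a rotation angle.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ x

def kurtosis(u):
    u = (u - u.mean()) / u.std()
    return np.mean(u**4) - 3.0

def energy(theta):
    """Higher-order-statistics contrast: minus the summed |kurtosis| of the outputs."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    y = R @ z
    return -sum(abs(kurtosis(yi)) for yi in y)

# Simulated annealing over the rotation angle with a geometric cooling schedule.
theta, T = 0.0, 1.0
current = energy(theta)
best_theta, best_energy = theta, current
for _ in range(2000):
    candidate = theta + rng.normal(scale=0.1)
    cand_energy = energy(candidate)
    # Metropolis rule: accept improvements always, worse moves with a T-dependent probability.
    if cand_energy < current or rng.random() < np.exp(-(cand_energy - current) / T):
        theta, current = candidate, cand_energy
    if current < best_energy:
        best_theta, best_energy = theta, current
    T *= 0.999

print("estimated unmixing rotation angle:", best_theta)
```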
Abstract:
In this thesis the old philosophical question "does every event have a cause?" is examined in the light of quantum mechanics and probability theory. In both physics and the philosophy of science, the orthodox position holds that the physical world is indeterministic. At the fundamental level of physical reality, the quantum level, events would occur without causes, by chance, by 'irreducible' randomness. The most precise physical theorem leading to this conclusion is Bell's theorem. Here the premises of that theorem are re-examined. It is recalled that solutions to the theorem other than indeterminism are conceivable, some of which are known but neglected, such as 'superdeterminism'. It is further argued that other solutions compatible with determinism exist, notably by studying model physical systems. One of the general conclusions of this thesis is that the interpretation of Bell's theorem and of quantum mechanics depends crucially on the philosophical premises from which one starts. For instance, within a Spinozist worldview, the quantum world may well be understood as deterministic. But it is argued that even a determinism far less radical than Spinoza's is not ruled out by physical experiments. If this is true, the debate between determinism and indeterminism is not settled in the laboratory: it remains philosophical and open, contrary to what is often thought. In the second part of the thesis a model for the interpretation of probability is proposed. A conceptual study of the notion of probability indicates that the hypothesis of determinism helps to better understand what a 'probabilistic system' is. Determinism appears able to answer certain questions for which indeterminism has no answer. For this reason we conclude that Laplace's conjecture, namely that probability theory presupposes an underlying deterministic reality, retains its full legitimacy. This thesis uses the methods of both philosophy and physics. The two fields prove to be tightly connected here and offer a large potential for cross-fertilization in both directions.
Abstract:
Given an observed test statistic and its degrees of freedom, one may compute the observed P value with most statistical packages. It is unknown to what extent test statistics and P values are congruent in published medical papers. Methods: We checked the congruence of statistical results reported in all the papers of volumes 409–412 of Nature (2001) and a random sample of 63 results from volumes 322–323 of BMJ (2001). We also tested whether the frequencies of the last digit of a sample of 610 test statistics deviated from a uniform distribution (i.e., equally probable digits). Results: 11.6% (21 of 181) and 11.1% (7 of 63) of the statistical results published in Nature and BMJ, respectively, during 2001 were incongruent, probably mostly due to rounding, transcription, or type-setting errors. At least one such error appeared in 38% and 25% of the papers of Nature and BMJ, respectively. In 12% of the cases, the significance level might change by one or more orders of magnitude. The frequencies of the last digit of the statistics deviated from the uniform distribution and suggested digit preference in rounding and reporting. Conclusions: This incongruence of test statistics and P values is another example that statistical practice is generally poor, even in the most renowned scientific journals, and that the quality of papers should be more carefully checked and valued.
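The congruence check described here is straightforward to reproduce with standard software; a minimal sketch follows, using a hypothetical reported result (not one taken from the surveyed papers) and an illustrative rounding tolerance.

```python
from scipy import stats

# Hypothetical reported result: "t = 2.30, 45 degrees of freedom, P = 0.03".
t_stat, df, reported_p = 2.30, 45, 0.03

recomputed_p = 2 * stats.t.sf(abs(t_stat), df)   # two-sided P value from t and df
print(f"recomputed P = {recomputed_p:.4f}")       # approximately 0.026

# Flag the pair as incongruent if the recomputed P value cannot be explained
# by plausible rounding of the reported one (tolerance chosen for illustration).
congruent = abs(recomputed_p - reported_p) <= 0.005
print("congruent with the reported P value:", congruent)
```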
Abstract:
In 2007, the Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF) was operated for a nine-month period in the Murg Valley, Black Forest, Germany, in support of the Convective and Orographically-induced Precipitation Study (COPS). The synergy of AMF and COPS partner instrumentation was exploited to derive a set of high-quality thermodynamic and cloud property profiles with 30 s resolution. In total, clouds were present 72% of the time, with multi-layer mixed-phase clouds (28.4%) and single-layer water clouds (11.3%) occurring most frequently. A comparison with the Cloudnet sites Chilbolton and Lindenberg for the same time period revealed that the Murg Valley exhibits lower liquid water paths (LWPs; median = 37.5 g m−2) than the two sites located in flat terrain. In order to evaluate the derived thermodynamic and cloud property profiles, a radiative closure study was performed with independent surface radiation measurements. In clear sky, average differences between calculated and observed surface fluxes are less than 2% and 4% for the short-wave and long-wave parts, respectively. In cloudy situations, differences between simulated and observed fluxes, particularly in the short-wave part, are much larger, but most of these can be related to broken-cloud situations. The daytime cloud radiative effect (CRE), i.e. the difference between cloudy and clear-sky net fluxes, was analysed for the whole nine-month period. For overcast, single-layer water clouds, sensitivity studies revealed that the CRE uncertainty is determined in comparable measure by uncertainties in liquid water content and effective radius. For low-LWP clouds, the CRE uncertainty is dominated by the LWP uncertainty; refined retrievals, such as those using infrared and/or higher microwave frequencies, are therefore needed.
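Written out, the daytime CRE analysed here is the difference between cloudy and clear-sky net surface fluxes; the component and sign convention below is a standard one, assumed here rather than quoted from the paper:

\[
\mathrm{CRE} = F^{\mathrm{net}}_{\mathrm{cloudy}} - F^{\mathrm{net}}_{\mathrm{clear}}, \qquad F^{\mathrm{net}} = \bigl(SW_{\downarrow} - SW_{\uparrow}\bigr) + \bigl(LW_{\downarrow} - LW_{\uparrow}\bigr).
\]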
An LDA and probability-based classifier for the diagnosis of Alzheimer's Disease from structural MRI
Abstract:
In this paper a custom classification algorithm based on linear discriminant analysis and probability-based weights is implemented and applied to hippocampus measurements from structural magnetic resonance images of healthy subjects and Alzheimer's Disease patients, with the aim of diagnosing them as accurately as possible. The classifier works by first classifying each hippocampal volume measurement as healthy-control-sized or Alzheimer's-Disease-sized; these new features are then weighted and used to classify the subject as a healthy control or as suffering from Alzheimer's Disease. The preliminary results reach an accuracy of 85.8%, which is comparable to state-of-the-art methods such as a Naive Bayes classifier and a Support Vector Machine. An advantage of the method proposed in this paper over these state-of-the-art classifiers is the descriptive ability of the classifications it produces. Such a descriptive model can help a doctor in the diagnosis of Alzheimer's Disease, or further the understanding of how Alzheimer's Disease affects the hippocampus.
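One plausible reading of this two-stage pipeline is sketched below, with synthetic hippocampal volumes and per-measurement training accuracy standing in for the paper's probability-based weights, which are not specified here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in data: a few hippocampal volume measurements per subject
# (e.g. left/right subvolumes); label 0 = healthy control, 1 = Alzheimer's Disease.
n_per_group, n_measurements = 100, 6
X = np.vstack([rng.normal(loc=3.2, scale=0.3, size=(n_per_group, n_measurements)),
               rng.normal(loc=2.7, scale=0.3, size=(n_per_group, n_measurements))])
y = np.repeat([0, 1], n_per_group)

# Stage 1: label each individual measurement as control-sized or AD-sized
# using a one-dimensional LDA per measurement.
measurement_labels = np.zeros_like(X, dtype=int)
weights = np.zeros(n_measurements)
for j in range(n_measurements):
    lda = LinearDiscriminantAnalysis().fit(X[:, [j]], y)
    measurement_labels[:, j] = lda.predict(X[:, [j]])
    # "Probability-based" weight: here simply the measurement's own accuracy,
    # an illustrative stand-in for the weighting used in the paper.
    weights[j] = lda.score(X[:, [j]], y)

# Stage 2: combine the weighted measurement labels into a subject-level decision.
subject_scores = measurement_labels @ weights / weights.sum()
subject_pred = (subject_scores > 0.5).astype(int)

print("subject-level training accuracy:", np.mean(subject_pred == y))
```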