912 results for Asymptotic normality of sums
Abstract:
A density function is to be estimated under the model assumption that it lies in a suitable Besov class and has compact support. For this purpose a wavelet estimator TW that employs thresholding methods is examined in detail. The asymptotic rate of convergence of TW for a large number of observations is stated and proved. Finally, further wavelet estimators are discussed in a survey and compared with TW. It turns out that TW attains the optimal rate of convergence under many model assumptions.
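The thresholding idea can be illustrated with a minimal Haar-wavelet sketch (an assumption for illustration; the abstract does not specify the wavelet family or the exact threshold of TW): estimate the empirical wavelet coefficients, keep only those exceeding a threshold of order sqrt(log(n)/n), and sum the surviving terms.

```python
import numpy as np

def haar_threshold_density(x, J=5, c=1.0, grid_size=256):
    """Hard-thresholded Haar wavelet density estimate on [0, 1].

    A minimal sketch: level-0 scaling term plus hard-thresholded wavelet
    coefficients up to level J, with threshold c * sqrt(log(n) / n).
    """
    n = len(x)
    t = (np.arange(grid_size) + 0.5) / grid_size          # evaluation grid
    lam = c * np.sqrt(np.log(n) / n)                      # threshold
    est = np.full(grid_size, np.mean((x >= 0) & (x <= 1)))  # scaling part
    for j in range(J):
        for k in range(2 ** j):
            # Haar wavelet psi_{jk}(u) = 2^{j/2} psi(2^j u - k)
            scale = 2.0 ** (j / 2)
            u = 2.0 ** j * x - k
            psi_x = scale * (((u >= 0) & (u < 0.5)).astype(float)
                             - ((u >= 0.5) & (u < 1)).astype(float))
            beta = psi_x.mean()                           # empirical coefficient
            if abs(beta) > lam:                           # hard thresholding
                ug = 2.0 ** j * t - k
                psi_t = scale * (((ug >= 0) & (ug < 0.5)).astype(float)
                                 - ((ug >= 0.5) & (ug < 1)).astype(float))
                est += beta * psi_t
    return t, est

rng = np.random.default_rng(0)
sample = rng.beta(2, 5, size=2000)    # a density supported on [0, 1]
t, f_hat = haar_threshold_density(sample)
```

Because each wavelet integrates to zero, the estimate still integrates to (approximately) one regardless of which coefficients survive the threshold.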
Abstract:
We analyse the role of "background independence" in the effective average action approach to quantum gravity. If the non-perturbative renormalization group (RG) flow is "background independent", the coarse graining must be defined in terms of an unspecified, variable metric. The requirement of "background independence" in quantum gravity entails that the functional RG equation depends on additional fields; as a result, the RG flow in quantum gravity differs markedly from the RG flow of an ordinary quantum theory whose mode cutoff depends on a rigid metric. For instance, a non-Gaussian fixed point may exist in the "background independent" theory even though the corresponding ordinary quantum theory develops none. We investigate the significance of this universal, purely kinematic effect by studying the RG flow of Quantum Einstein Gravity (QEG) in a "conformally reduced" setting in which only the conformal factor of the metric is quantized; all other degrees of freedom of the metric are neglected. The conformal reduction of the Einstein-Hilbert truncation exhibits exactly the same qualitative properties as the full Einstein-Hilbert truncation. In particular it possesses a non-Gaussian fixed point, which is necessary for gravity to be asymptotically safe. Without these additional field dependences, the RG flow of this truncation would be that of an ordinary $\phi^4$ theory. The local potential approximation for the conformal factor generalizes the RG flow in quantum gravity to an infinite-dimensional theory space. Here, too, we find both a Gaussian and a non-Gaussian fixed point, providing further evidence that quantum gravity is asymptotically safe.
The analogue of the metric invariant proportional to the third power of the curvature, which destroys perturbative renormalizability, is unproblematic for the asymptotic safety of the conformally reduced theory. We compute the scaling fields and scaling dimensions of both fixed points explicitly and discuss possible implications for the predictivity of the theory. Since the RG flow depends on the topology of the underlying spacetime, we discuss both flat space and the sphere. We solve the flow equation for the potential numerically and obtain examples of RG trajectories lying inside the ultraviolet critical manifold of the non-Gaussian fixed point. The quantum theories defined by some of these trajectories exhibit a phase transition from the familiar (low-energy) phase of gravity with spontaneously broken diffeomorphism invariance to a new phase of unbroken diffeomorphism invariance. This high-energy phase is characterized by a vanishing expectation value of the metric.
Abstract:
While the Standard Model of elementary particle physics is a consistent, renormalizable quantum field theory of three of the four known interactions, the quantization of gravity remains an unsolved problem. In recent years, however, evidence has accumulated that metric gravity is asymptotically safe. This means that a quantum field theory can be constructed for this interaction as well, one that is renormalizable in a generalized sense which no longer refers explicitly to perturbation theory. Moreover, this approach, based on the Wilsonian renormalization group, predicts the correct microscopic action of the theory. Classically, metric gravity is equivalent, at the level of the vacuum field equations, to Einstein-Cartan theory, which uses the vielbein and the spin connection as fundamental variables. This theory, however, has more degrees of freedom, a larger gauge group, and an underlying action of first order. All of these features complicate a treatment analogous to that of metric gravity.

In this thesis a three-dimensional truncation of the type of a generalized Hilbert-Palatini action is investigated, which captures, besides the running of Newton's constant and the cosmological constant, also the renormalization of the Immirzi parameter. Despite the difficulties indicated above, it was possible to compute the spectrum of the free Hilbert-Palatini propagator analytically. On this basis a flow equation of the proper-time type is constructed. In addition, suitable gauge conditions are chosen and analysed in detail; the structure of the gauge group requires a covariantization of the gauge transformations. The resulting flow is studied for various regularization schemes and gauge parameters.
This yields convincing evidence for asymptotic safety in the Einstein-Cartan approach as well, and thus for the possible existence of a mathematically consistent and predictive fundamental quantum theory of gravitation. In particular one finds a pair of non-Gaussian fixed points exhibiting anti-screening. At these fixed points, Newton's constant and the cosmological constant are both relevant couplings, whereas the Immirzi parameter is irrelevant at one fixed point and relevant at the other. Moreover, the beta function of the Immirzi parameter takes a remarkably simple form. The results are robust under variations of the regularization scheme, although future investigations should reduce the remaining gauge dependences.
Abstract:
Let O^{2n} be a symplectic toric orbifold with a fixed T^n-action and with a toric Kähler metric g. In [10] we explored whether, when O is a manifold, the equivariant spectrum of the Laplace operator Δ_g on C^∞(O) determines O up to symplectomorphism. In the setting of toric orbifolds we significantly improve upon our previous results and show that a generic toric orbifold is determined by its equivariant spectrum, up to two possibilities. This involves developing the asymptotic expansion of the heat trace on an orbifold in the presence of an isometry. We also show that the equivariant spectrum determines whether the toric Kähler metric has constant scalar curvature.
Abstract:
The concordance probability is used to evaluate the discriminatory power and the predictive accuracy of nonlinear statistical models. We derive an analytic expression for the concordance probability in the Cox proportional hazards model. The proposed estimator is a function of the regression parameters and the covariate distribution only and does not use the observed event and censoring times. For this reason it is asymptotically unbiased, unlike Harrell's c-index based on informative pairs. The asymptotic distribution of the concordance probability estimate is derived using U-statistic theory and the methodology is applied to a predictive model in lung cancer.
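The estimator described, a function of the regression parameters and the covariates only, can be sketched as follows (this follows the Gönen-Heller form; the linear predictor η_i = β'x_i and the toy data are illustrative assumptions):

```python
import numpy as np

def concordance_probability(beta, X):
    """Model-based concordance probability for a Cox model (a sketch of
    the Gonen-Heller form).

    Uses only the fitted coefficients and the covariate rows, never the
    observed event or censoring times: under proportional hazards, the
    probability that the subject with the larger linear predictor fails
    first is a logistic function of the linear-predictor difference.
    """
    eta = X @ np.asarray(beta)
    n = len(eta)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = eta[i] - eta[j]
            if d != 0.0:
                total += 1.0 / (1.0 + np.exp(-abs(d)))
    return 2.0 * total / (n * (n - 1))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))          # hypothetical covariate matrix
beta = np.array([1.0, -0.5])          # hypothetical fitted coefficients
cpe = concordance_probability(beta, X)
```

Because each pairwise term exceeds 1/2 whenever the linear predictors differ, the estimate lies between 0.5 (no discrimination) and 1.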
Abstract:
Power calculations in a small sample comparative study, with a continuous outcome measure, are typically undertaken using the asymptotic distribution of the test statistic. When the sample size is small, this asymptotic result can be a poor approximation. An alternative approach, using a rank based test statistic, is an exact power calculation. When the number of groups is greater than two, the number of calculations required to perform an exact power calculation is prohibitive. To reduce the computational burden, a Monte Carlo resampling procedure is used to approximate the exact power function of a k-sample rank test statistic under the family of Lehmann alternative hypotheses. The motivating example for this approach is the design of animal studies, where the number of animals per group is typically small.
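A minimal version of this Monte Carlo approximation can be sketched as follows, taking the Kruskal-Wallis statistic as the k-sample rank test (an assumption; the paper's specific statistic may differ). Rank tests are distribution-free, so F may be taken uniform on (0, 1), and a draw from the Lehmann alternative F^γ is U^{1/γ} by inversion.

```python
import numpy as np
from scipy.stats import kruskal

def mc_power_lehmann(gammas, n_per_group, reps=400, alpha=0.05, seed=0):
    """Monte Carlo approximation of the power of a k-sample rank test
    under Lehmann alternatives G_i(x) = F(x) ** gamma_i."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        # U**(1/gamma) has CDF F(x)**gamma when F is uniform on (0, 1)
        groups = [rng.uniform(size=n_per_group) ** (1.0 / g) for g in gammas]
        _, p = kruskal(*groups)
        if p < alpha:
            rejections += 1
    return rejections / reps

power_null = mc_power_lehmann([1.0, 1.0, 1.0], n_per_group=10)  # type I error
power_alt = mc_power_lehmann([1.0, 1.0, 6.0], n_per_group=10)   # Lehmann shift
```

With small groups (here 10 animals per group, as in the motivating designs) the Monte Carlo estimate replaces the prohibitive enumeration over all rank configurations.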
Abstract:
We introduce a diagnostic test for the mixing distribution in a generalised linear mixed model. The test is based on the difference between the marginal maximum likelihood and conditional maximum likelihood estimates of a subset of the fixed effects in the model. We derive the asymptotic variance of this difference, and propose a test statistic that has a limiting chi-square distribution under the null hypothesis that the mixing distribution is correctly specified. For the important special case of the logistic regression model with random intercepts, we evaluate via simulation the power of the test in finite samples under several alternative distributional forms for the mixing distribution. We illustrate the method by applying it to data from a clinical trial investigating the effects of hormonal contraceptives in women.
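A statistic of this kind has the familiar Hausman form: a quadratic form in the difference of the two estimates, weighted by the inverse of its asymptotic variance. The sketch below uses the standard Hausman variance V_cond - V_marg (an assumption; the paper derives its own variance expression), with made-up toy numbers for two shared fixed effects.

```python
import numpy as np
from scipy.stats import chi2

def mixing_diagnostic(beta_marginal, beta_conditional, V_marginal, V_conditional):
    """Hausman-type statistic for the difference between marginal ML and
    conditional ML estimates of shared fixed effects.

    If the mixing distribution is correctly specified, both estimators are
    consistent and the quadratic form is asymptotically chi-square with
    len(d) degrees of freedom.
    """
    d = np.asarray(beta_conditional) - np.asarray(beta_marginal)
    V = np.asarray(V_conditional) - np.asarray(V_marginal)
    stat = float(d @ np.linalg.solve(V, d))
    p = float(chi2.sf(stat, df=len(d)))
    return stat, p

# hypothetical estimates and covariance matrices
stat, p = mixing_diagnostic([0.50, -1.20], [0.62, -1.05],
                            [[0.010, 0.001], [0.001, 0.020]],
                            [[0.030, 0.002], [0.002, 0.045]])
```

A large statistic (small p) flags a misspecified mixing distribution, since the two estimators then converge to different limits.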
Abstract:
There is an emerging interest in modeling spatially correlated survival data in biomedical and epidemiological studies. In this paper, we propose a new class of semiparametric normal transformation models for right censored spatially correlated survival data. This class of models assumes that survival outcomes marginally follow a Cox proportional hazards model with unspecified baseline hazard, and their joint distribution is obtained by transforming survival outcomes to normal random variables, whose joint distribution is assumed to be multivariate normal with a spatial correlation structure. A key feature of the class of semiparametric normal transformation models is that it provides a rich class of spatial survival models where regression coefficients have population average interpretation and the spatial dependence of survival times is conveniently modeled using the transformed variables by flexible normal random fields. We study the relationship of the spatial correlation structure of the transformed normal variables and the dependence measures of the original survival times. Direct nonparametric maximum likelihood estimation in such models is practically prohibited due to the high dimensional intractable integration of the likelihood function and the infinite dimensional nuisance baseline hazard parameter. We hence develop a class of spatial semiparametric estimating equations, which conveniently estimate the population-level regression coefficients and the dependence parameters simultaneously. We study the asymptotic properties of the proposed estimators, and show that they are consistent and asymptotically normal. The proposed method is illustrated with an analysis of data from the East Boston Asthma Study and its performance is evaluated using simulations.
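The transformation construction can be illustrated generatively (a sketch under simplifying assumptions: exponential marginals standing in for the Cox marginal, and a single correlated pair standing in for the spatial field): draw correlated standard normals Z, then map each through T = F^{-1}(Φ(Z)), so the survival times have the desired marginals while their dependence is governed by the normal correlation.

```python
import numpy as np
from scipy.stats import norm

# Spatially correlated normals -> marginally exponential survival times.
rng = np.random.default_rng(2)
rho, lam, n = 0.5, 1.0, 20000
cov = np.array([[1.0, rho], [rho, 1.0]])
Z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Inverse probability transform: F^{-1}(u) = -log(1 - u) / lam for Exp(lam)
T = -np.log(1.0 - norm.cdf(Z)) / lam
```

The marginal mean of each coordinate is 1/lam, while rank-based dependence measures of (T_1, T_2) are inherited unchanged from the normal correlation, which is the relationship the paper studies.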
Abstract:
Growth codes are a subclass of Rateless codes that have found interesting applications in data dissemination problems. Compared to other Rateless and conventional channel codes, Growth codes show improved intermediate performance, which is particularly useful in applications where partial data presents some utility. In this paper, we investigate the asymptotic performance of Growth codes using the Wormald method, which was proposed for studying the Peeling Decoder of LDPC and LDGM codes. Compared to previous works, the Wormald differential equations are set from the nodes' perspective, which enables a numerical solution for the expected asymptotic decoding performance of Growth codes. Our framework is appropriate for any class of Rateless codes that does not include a precoding step. We further study the performance of Growth codes with moderate and large size codeblocks through simulations, and we use the generalized logistic function to model the decoding probability. We then exploit the decoding probability model in an illustrative application of Growth codes to error resilient video transmission. The video transmission problem is cast as a joint source and channel rate allocation problem that is shown to be convex with respect to the channel rate. This illustrative application highlights the main advantage of Growth codes, namely improved performance in the intermediate loss region.
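The generalized-logistic modeling step can be sketched as a curve fit (synthetic decoding-probability data, not real Growth-code measurements; the paper's exact parametrisation of the generalized logistic may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

def generalized_logistic(x, k, x0, nu):
    """Richards-type generalized logistic: a smooth S-curve in [0, 1]
    used to model decoding probability versus received-symbol ratio."""
    return (1.0 + np.exp(-k * (x - x0))) ** (-1.0 / nu)

# synthetic "measured" decoding probabilities
x = np.linspace(0.0, 2.0, 40)
y_true = generalized_logistic(x, 8.0, 1.0, 0.7)
rng = np.random.default_rng(3)
y = np.clip(y_true + rng.normal(scale=0.01, size=x.size), 0.0, 1.0)

params, _ = curve_fit(generalized_logistic, x, y, p0=[5.0, 1.0, 1.0])
fit = generalized_logistic(x, *params)
```

A smooth closed-form model like this is what makes the downstream joint source/channel rate allocation tractable as a convex problem.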
Abstract:
We present a new approach to the issues of spacetime singularities and cosmic censorship in general relativity. This is based on the idea that standard 4-dimensional spacetime is the conformal infinity of an ambient metric for the 5-dimensional Einstein equations with fluid sources. We then find that the existence of spacetime singularities in four dimensions is constrained by asymptotic properties of the ambient 5-metric, while the non-degeneracy of the latter crucially depends on cosmic censorship holding on the boundary.
Abstract:
Monte Carlo simulation has been conducted to investigate parameter estimation and hypothesis testing in some well known adaptive randomization procedures. The four urn models studied are Randomized Play-the-Winner (RPW), Randomized Pólya Urn (RPU), Birth and Death Urn with Immigration (BDUI), and Drop-the-Loser Urn (DL). Two sequential estimation methods, the sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of simple difference (RSIHR), relative risk (ORR), and odds ratio (OOR) respectively. The log likelihood ratio test and three Wald-type tests (simple difference, log of relative risk, log of odds ratio) are compared across the adaptive procedures. Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. When compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics in SMLE have better normality and a lower type I error rate, and the power of hypothesis testing is more comparable with that of equal randomization. Usually, RSIHR has the highest power among the three optimal allocation ratios. However, the ORR allocation has better power and a lower type I error rate when the log of relative risk is the test statistic, and the number of expected failures under ORR is smaller than under RSIHR. It is also shown that the simple difference of response rates has the worst normality among all four test statistics, and the power of the hypothesis test is always inflated when the simple difference is used. On the other hand, the normality of the log likelihood ratio test statistic is robust against the change of adaptive randomization procedure.
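The simplest of the four urns can be sketched directly (the classical RPW(1, 1) rule is assumed: start with one ball per treatment; a success adds a ball of the drawn type, a failure adds a ball of the opposite type; the success probabilities below are illustrative):

```python
import numpy as np

def simulate_rpw(p_success, n_patients, seed=0):
    """Randomized Play-the-Winner urn: draw a ball to pick the treatment;
    a success adds a ball of the same type, a failure adds a ball of the
    opposite type."""
    rng = np.random.default_rng(seed)
    urn = np.ones(2)                        # one ball per treatment
    assigned = np.zeros(2, dtype=int)
    for _ in range(n_patients):
        arm = rng.choice(2, p=urn / urn.sum())
        assigned[arm] += 1
        success = rng.random() < p_success[arm]
        urn[arm if success else 1 - arm] += 1
    return assigned

counts = simulate_rpw(p_success=[0.9, 0.3], n_patients=2000)
share_superior = counts[0] / counts.sum()
# limiting allocation to arm 0 is q1 / (q0 + q1) = 0.7 / 0.8 = 0.875,
# where q_i = 1 - p_i are the failure probabilities
```

The skew toward the superior arm is exactly the behaviour whose cost, in estimation variance and test normality, the simulation study quantifies.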
Abstract:
In geographical epidemiology, maps of disease rates and disease risk provide a spatial perspective for researching disease etiology. For rare diseases or when the population base is small, the rate and risk estimates may be unstable. Empirical Bayesian (EB) methods have been used to spatially smooth the estimates by permitting an area estimate to "borrow strength" from its neighbors. Such EB methods include the use of a Gamma model, of a James-Stein estimator, and of a conditional autoregressive (CAR) process. A fully Bayesian analysis of the CAR process is proposed. One advantage of this fully Bayesian analysis is that it can be implemented simply by repeated sampling from the posterior densities; use of a Markov chain Monte Carlo technique such as the Gibbs sampler is not necessary. Direct resampling from the posterior densities provides exact small sample inferences instead of the approximate asymptotic analyses of maximum likelihood methods (Clayton & Kaldor, 1987). Further, the proposed CAR model allows covariates to be included in the model. A simulation demonstrates the effect of sample size on the fully Bayesian analysis of the CAR process. The methods are applied to lip cancer data from Scotland, and the results are compared.
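The direct-resampling idea is easiest to see on the Poisson-Gamma model also mentioned above (a sketch; the CAR posterior itself is more involved, and the counts below are toy numbers, not the Scottish lip-cancer data). With O_i ~ Poisson(θ_i E_i) and θ_i ~ Gamma(a, b), the posterior of the relative risk θ_i is Gamma(a + O_i, b + E_i) in closed form, so one samples from it directly with no MCMC.

```python
import numpy as np

a, b = 2.0, 2.0                            # prior: mean a/b = 1 (reference risk)
observed = np.array([5, 0, 12, 3])         # O_i: observed case counts
expected = np.array([4.0, 1.5, 6.0, 3.2])  # E_i: expected counts
rng = np.random.default_rng(4)

# direct resampling from the closed-form Gamma posteriors
draws = rng.gamma(shape=a + observed[:, None],
                  scale=1.0 / (b + expected[:, None]),
                  size=(len(observed), 5000))
smoothed = draws.mean(axis=1)              # posterior mean relative risks
raw = observed / expected                  # unstable raw SMR estimates
```

Each smoothed estimate is a weighted average of the raw SMR and the prior mean, so areas with small expected counts (like the zero-count second area) are pulled strongly toward the prior, which is the "borrowing strength" behaviour described above.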
Abstract:
The report presents the results of CTD measurements carried out in the Bellingshausen Sea, an area where CTD measurements are sparse. The main part of the report consists of a brief description of the CTD data acquisition and processing routines, the vertical profiles of temperature, salinity and density, and plots of the distribution of these properties along the hydrographic sections. The final part of the report deals with the notably similar structure of the vertical density distribution at different locations when presented as a function of a non-dimensional vertical coordinate. It is pointed out that such a distribution could be an asymptotic limit of stationary mixing along neutral surfaces.
Abstract:
The study of the reliability of components and systems is of great importance in several fields of engineering, and in computer science in particular. When analysing the lifetimes of the sampled items one must take into account the items that do not fail during the experiment, as well as those that fail from causes other than the one under study. This gives rise to new sampling schemes covering these cases; the most general of them, censored sampling, is the one considered in this work. In this model both the time to failure of a component and the censoring time are random variables. Under the hypothesis that both times are exponentially distributed, Professor Hurt studied the asymptotic behaviour of the maximum likelihood estimator of the reliability function. Bayesian methods are attractive in reliability studies because they incorporate into the analysis the prior information usually available in real problems. We therefore consider two Bayes estimators of the reliability of an exponential distribution: the mean and the mode of the posterior distribution. We calculate the asymptotic expansion of the mean, variance and mean squared error of both estimators when the censoring distribution is exponential, and we also obtain the asymptotic distribution of the estimators in the more general case of a Weibull censoring distribution. Two types of large-sample confidence intervals are proposed for each estimator. The results are compared with those of the maximum likelihood estimator and with those of two nonparametric estimators, the product-limit estimator and a Bayesian one; one of our estimators shows superior behaviour.
Finally, we verify by simulation that our estimators are robust against the assumed censoring distribution, and that one of the proposed confidence intervals remains valid in small samples. This study also confirms the better behaviour of one of our estimators.
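The setting admits a compact sketch (an illustration, not the thesis's exact estimators): with exponential lifetimes of rate λ, randomly censored data (t_i, δ_i), and a Gamma(a, b) prior on λ, the posterior is Gamma(a + Σδ_i, b + Σt_i), and the posterior mean of the reliability R(t0) = exp(-λ t0) has a closed form via the Gamma moment generating function.

```python
import numpy as np

def bayes_reliability(times, events, t0, a=1.0, b=1.0):
    """Posterior mean of R(t0) = exp(-lambda * t0) for exponential
    lifetimes under random censoring, with a Gamma(a, b) prior
    (rate parametrisation) on the failure rate lambda.

    Posterior: lambda | data ~ Gamma(a + sum(events), b + sum(times)),
    hence E[exp(-lambda * t0)] = (B / (B + t0)) ** A.
    """
    A = a + np.sum(events)        # shape: prior + observed failures
    B = b + np.sum(times)         # rate: prior + total time on test
    return (B / (B + t0)) ** A

rng = np.random.default_rng(5)
lam_true, n = 0.5, 200
life = rng.exponential(1 / lam_true, size=n)
cens = rng.exponential(2.0, size=n)        # exponential censoring times
times = np.minimum(life, cens)             # observed (possibly censored) times
events = (life <= cens).astype(int)        # 1 = failure observed, 0 = censored

r_bayes = bayes_reliability(times, events, t0=1.0)
lam_mle = events.sum() / times.sum()       # censored-data MLE of the rate
r_mle = np.exp(-lam_mle * 1.0)
```

With a diffuse prior and a moderate sample, the Bayes estimate nearly coincides with the maximum likelihood plug-in, which is the baseline the thesis compares against.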
Abstract:
We introduce in this paper a method to calculate the Hessenberg matrix of a sum of measures from the Hessenberg matrices of the component measures. Our method extends the spectral techniques used by G. Mantica to calculate the Jacobi matrix associated with a sum of measures from the Jacobi matrices of each of the measures. We apply this method to approximate the Hessenberg matrix associated with a self-similar measure and compare it with the result obtained by an earlier method for self-similar measures that uses a fixed-point theorem for moment matrices. Results are given for a series of classical examples of self-similar measures. Finally, we also apply the method introduced in this paper to some examples of sums of (not self-similar) measures, obtaining the exact value of the sections of the Hessenberg matrix.
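For the real-line case underlying Mantica's techniques, the Jacobi matrix of a measure can be computed from its moments via a Cholesky factorisation of the Hankel moment matrix (a classical construction, sketched here for illustration; it is numerically unstable for large orders but fine for small examples such as the uniform measure, whose Jacobi matrix carries the Legendre recurrence).

```python
import numpy as np

def jacobi_from_moments(moments):
    """Three-term recurrence coefficients (alpha_k, beta_k) of the
    orthonormal polynomials of a measure, from its moments m_0..m_{2n}.

    Factor the Hankel moment matrix M[i, j] = m_{i+j} as M = H^T H with
    H upper triangular, then read the coefficients off H.
    """
    n = (len(moments) - 1) // 2
    M = np.array([[moments[i + j] for j in range(n + 1)] for i in range(n + 1)])
    H = np.linalg.cholesky(M).T          # upper-triangular Cholesky factor
    alpha = np.zeros(n)
    beta = np.zeros(n)                   # beta[k] is the off-diagonal b_{k+1}
    for k in range(n):
        alpha[k] = H[k, k + 1] / H[k, k]
        if k > 0:
            alpha[k] -= H[k - 1, k] / H[k - 1, k - 1]
        beta[k] = H[k + 1, k + 1] / H[k, k]
    return alpha, beta

# uniform measure dx/2 on [-1, 1]: m_k = 1/(k+1) for even k, 0 for odd k
moments = [1.0 / (k + 1) if k % 2 == 0 else 0.0 for k in range(9)]
alpha, beta = jacobi_from_moments(moments)
# Legendre recurrence: alpha_k = 0, off-diagonals b_k = k / sqrt(4k^2 - 1)
```

The Hessenberg matrix of the paper plays this same role for measures supported in the complex plane, where the orthogonal-polynomial recurrence is no longer tridiagonal.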