967 results for "Bivariate Gaussian distribution"


Relevance: 40.00%

Abstract:

The Logit-Logistic (LL), Johnson's SB, and the Beta (GBD) are flexible four-parameter probability distribution models in terms of the skewness-kurtosis region covered, and each has been used for modeling tree diameter distributions in forest stands. This article compares bivariate forms of these models in terms of their adequacy in representing empirical diameter-height distributions from 102 sample plots. Four bivariate models are compared: SBB, the natural, well-known, and much-used bivariate generalization of SB; and the bivariate distributions with LL, SB, and Beta as marginals, constructed using Plackett's method (LL-2P, etc.). All models are fitted using maximum likelihood, and their goodness-of-fit is compared using minus log-likelihood (equivalent to Akaike's Information Criterion, AIC). The performance ranking in this case study was SBB, LL-2P, GBD-2P, and SB-2P.
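Plackett's method, used above to couple the LL, SB, and Beta marginals, has a closed form: given marginal CDF values u and v and an odds ratio ψ, the joint CDF solves a quadratic in H. A minimal sketch (the function name and sample values are illustrative, not from the article):

```python
import math

def plackett_cdf(u, v, psi):
    """Joint CDF H(u, v) of a Plackett copula with odds ratio psi,
    evaluated at marginal CDF values u and v."""
    if psi == 1.0:
        return u * v  # psi = 1 is the independence case, H = F * G
    s = 1.0 + (psi - 1.0) * (u + v)
    # Root of the quadratic psi = H(1 - u - v + H) / ((u - H)(v - H))
    return (s - math.sqrt(s * s - 4.0 * psi * (psi - 1.0) * u * v)) / (2.0 * (psi - 1.0))
```

Substituting any of the four-parameter marginal CDFs for u and v yields the corresponding bivariate model (LL-2P, SB-2P, GBD-2P); ψ > 1 induces positive dependence between diameter and height.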

Relevance: 40.00%

Abstract:

In this thesis, (outlier-)robust estimators and tests for the unknown parameter of a continuous density function are developed using likelihood depths, introduced by Mizera and Müller (2004). The resulting procedures are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in a data set is computed as the minimum of the fraction of observations for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the fraction for which this derivative is non-positive. The parameter for which both fractions are equal therefore has the greatest depth. This parameter is initially chosen as the estimator, since the likelihood depth is meant to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood function with respect to the parameter is non-negative for an observation equals one half. If this does not hold for the underlying parameter, the estimator based on the likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplex likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that for the same distributions for which the likelihood depth yields biased estimators, the simplex likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is therefore known, and tests for various hypotheses can be formulated. For some hypotheses, however, the shift in depth leads to poor power of the associated test. Corrected tests are therefore introduced, together with conditions under which they are consistent. The thesis consists of two parts.
The first part presents the general theory of the estimators and tests and proves their consistency. The second part applies the theory to three different distributions: the Weibull distribution and the Gaussian and Gumbel copulas. This shows how the procedures of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of a distribution. Overall, robust estimators and tests based on likelihood depths can be found for all three distributions. On uncontaminated data, existing standard methods are sometimes superior, but the advantage of the new methods shows in contaminated data and data with outliers.
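The depth computation described above is easy to state in code. A minimal sketch for an exponential model (the example, data, and names are illustrative, not taken from the thesis); here the score is non-negative exactly when x <= 1/lam, the maximum-depth estimate solves 1/lam = median(x), and since the median of Exponential(lam) is ln(2)/lam, the bias-corrected estimator rescales by ln 2, mirroring the correction step the abstract describes:

```python
import math

def likelihood_depth(theta, data, score):
    """Likelihood depth of a parameter value: the minimum of the fraction of
    observations with non-negative score and the fraction with non-positive score."""
    n = len(data)
    nonneg = sum(1 for x in data if score(theta, x) >= 0) / n
    nonpos = sum(1 for x in data if score(theta, x) <= 0) / n
    return min(nonneg, nonpos)

# Exponential(lam): d/dlam log f(x; lam) = 1/lam - x.
score = lambda lam, x: 1.0 / lam - x

data = [0.1, 0.4, 0.5, 1.2, 2.0]        # sample median is 0.5
lam_raw = 1.0 / 0.5                      # maximizes the depth, but is biased
lam_corrected = math.log(2) / 0.5        # consistent after the ln(2) correction
```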

Relevance: 40.00%

Abstract:

In survival analysis, frailty is often used to model heterogeneity between individuals or correlation within clusters. Typically frailty is taken to be a continuous random effect, yielding a continuous mixture distribution for survival times. A Bayesian analysis of a correlated frailty model is discussed in the context of inverse Gaussian frailty. An MCMC approach is adopted, and the deviance information criterion is used to compare models. As an illustration of the approach, a bivariate data set of corneal graft survival times is analysed. (C) 2006 Elsevier B.V. All rights reserved.

Relevance: 40.00%

Abstract:

The aim of this study was to analyze weight at birth (BW) and weights adjusted to 205 (W205), 365 (W365) and 550 (W550) days of age in beef buffaloes from Brazil, using two approaches: a parametric one, based on the normal distribution, and a non-parametric one, based on a kernel function, and thus to estimate the genetic, environmental and phenotypic correlations among traits. Information on 5,169 animals at birth (BW), 3,792 at 205 days (W205), 3,883 at 365 days (W365) and 1,524 at 550 days of age (W550) was used. The birth weight distribution departed markedly from the normal distribution, whereas W205, W365 and W550 were normally distributed. Birth weight showed weak genetic, environmental, and phenotypic associations with the other weight measurements. On the other hand, the weight traits at 205, 365 and 550 days of age showed high genetic correlations.
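The two approaches contrasted above can be sketched side by side: a normal density with moments fitted to the data versus a Gaussian-kernel density estimate. The sample values and bandwidth rule below are illustrative assumptions, not the study's data:

```python
import math

def normal_pdf(x, mu, sigma):
    """Parametric approach: Gaussian density with fitted mean and SD."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def kde_pdf(x, data, h):
    """Non-parametric approach: Gaussian-kernel density estimate, bandwidth h."""
    return sum(normal_pdf(x, xi, h) for xi in data) / len(data)

weights = [28.0, 30.5, 31.0, 29.5, 33.0, 45.0]      # hypothetical birth weights, kg
mu = sum(weights) / len(weights)
sigma = math.sqrt(sum((w - mu) ** 2 for w in weights) / (len(weights) - 1))
h = 1.06 * sigma * len(weights) ** (-1 / 5)          # Silverman's rule of thumb
# Near the heavy observation at 45 kg the KDE keeps local mass that the
# fitted normal smooths away -- the kind of discrepancy reported for BW.
```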

Relevance: 40.00%

Abstract:

This paper considers statistical models in which two different types of events, such as the diagnosis of a disease and the remission of the disease, occur alternately over time and are observed subject to right censoring. We propose nonparametric estimators for the joint distribution of bivariate recurrence times and the marginal distribution of the first recurrence time. In general, the marginal distribution of the second recurrence time cannot be estimated due to an identifiability problem, but a conditional distribution of the second recurrence time can be estimated nonparametrically. In the literature, statistical methods have been developed to estimate the joint distribution of bivariate recurrence times based on data from the first pair of censored bivariate recurrence times. These methods are not efficient in the current model because recurrence times of higher orders are not used. Asymptotic properties of the estimators are established. Numerical studies demonstrate that the estimators perform well with practical sample sizes. We apply the proposed method to a Danish psychiatric case register data set to illustrate the methods and theory.

Relevance: 40.00%

Abstract:

We report the suitability of an Einstein-Podolsky-Rosen entanglement source for Gaussian continuous-variable quantum key distribution at 1550 nm. Our source is based on a single continuous-wave squeezed vacuum mode combined with a vacuum mode at a balanced beam splitter. Extending a recent security proof, we characterize the source by quantifying the extractable length of a composably secure key from a finite number of samples under the assumption of collective attacks. We show that distances on the order of 10 km are achievable with this source for a reasonable sample size, despite the fact that the entanglement was generated including a vacuum mode. Our security analysis applies to all states having an asymmetry in the field quadrature variances, including those generated by superposition of two squeezed modes with different squeezing strengths.

Relevance: 30.00%

Abstract:

Optimal design for generalized linear models has primarily focused on univariate data. Often experiments are performed that have multiple dependent responses described by regression-type models, and it is of interest and of value to design the experiment for all these responses. This requires a multivariate distribution underlying a pre-chosen model for the data. Here, we consider the design of experiments for bivariate binary data which are dependent. We explore copula functions, which provide a rich and flexible class of structures for deriving joint distributions for bivariate binary data. We present methods for deriving optimal experimental designs for dependent bivariate binary data using copulas, and demonstrate that, by including the dependence between responses in the design process, more efficient parameter estimates are obtained than by the usual practice of designing for a single variable only. Further, we investigate the robustness of designs with respect to initial parameter estimates and the choice of copula function, and also show the performance of compound criteria within this bivariate binary setting.
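One way to see how a copula induces a joint distribution for dependent bivariate binary responses: evaluate C at the marginal failure probabilities to obtain the (0,0) cell, then fill the remaining cells from the margins. A sketch (function names are illustrative) shown with the independence (product) copula; any other copula function, such as a Plackett or Gaussian copula, could be passed instead:

```python
def cell_probs(p1, p2, copula):
    """Joint cell probabilities (p00, p01, p10, p11) for bivariate binary
    responses with success probabilities p1, p2 and copula C(u, v)
    evaluated at the failure probabilities."""
    q1, q2 = 1.0 - p1, 1.0 - p2
    p00 = copula(q1, q2)          # P(Y1 = 0, Y2 = 0)
    p01 = q1 - p00                # P(Y1 = 0, Y2 = 1)
    p10 = q2 - p00                # P(Y1 = 1, Y2 = 0)
    p11 = 1.0 - p00 - p01 - p10   # P(Y1 = 1, Y2 = 1)
    return p00, p01, p10, p11

independence = lambda u, v: u * v
cells = cell_probs(0.3, 0.6, independence)
```

A design criterion (e.g. D-optimality) would then be evaluated on the likelihood built from these cell probabilities rather than on each margin separately.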

Relevance: 30.00%

Abstract:

Our results demonstrate that photorefractive residual amplitude modulation (RAM) noise in electro-optic modulators (EOMs) can be reduced by modifying the incident beam's intensity distribution. Here we report an order-of-magnitude reduction in RAM when beams with uniform-intensity (flat-top) profiles, generated with a liquid-crystal-on-silicon spatial light modulator (LCOS-SLM), are used instead of the usual fundamental Gaussian mode (TEM00). RAM arises from photorefractively amplified scatter noise off defects and impurities within the crystal. A reduction in RAM is observed with increasing intensity uniformity (flatness), which is attributed to a reduction in the space-charge field on the beam axis. The level of RAM reduction that can be achieved is physically limited by clipping at EOM apertures, with the observed results agreeing well with a simple model. These results are particularly important in applications where reducing residual amplitude modulation to the 10^-6 level is essential.

Relevance: 30.00%

Abstract:

Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation allowing analytical solutions to the update and prediction operations of the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, leading to the need for a re-parameterisation step. A computationally efficient re-parameterisation step will be demonstrated, resulting in a compact and accurate estimate of the true distribution.
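A common re-parameterisation strategy for keeping the component count bounded (one option, not necessarily the paper's exact method) is moment-preserving merging of mixture components. In one dimension:

```python
def merge_components(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two 1-D Gaussian mixture components
    (weight, mean, variance): the merged component matches the pair's
    total weight, overall mean, and overall variance."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    # Total variance = weighted within-component variance + between-means spread.
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v
```

Repeatedly merging the most similar pair (e.g. by a distance between components) collapses the multiplicatively grown mixture back to a fixed budget while preserving its first two moments.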

Relevance: 30.00%

Abstract:

Hydraulic conductivity (K) fields are used to parameterize groundwater flow and transport models. Numerical simulations require a detailed representation of the K field, synthesized to interpolate between available data. Several recent studies introduced high-resolution K data (HRK) at the Macro Dispersion Experiment (MADE) site, and used ground-penetrating radar (GPR) to delineate the main structural features of the aquifer. This paper describes a statistical analysis of these data, and the implications for K field modeling in alluvial aquifers. Two striking observations have emerged from this analysis. The first is that a simple fractional difference filter can have a profound effect on data histograms, organizing non-Gaussian ln K data into a coherent distribution. The second is that using GPR facies allows us to reproduce the significantly non-Gaussian shape seen in real HRK data profiles, using a simulated Gaussian ln K field in each facies. This illuminates a current controversy in the literature, between those who favor Gaussian ln K models, and those who observe non-Gaussian ln K fields. Both camps are correct, but at different scales.
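The fractional difference filter (1 - B)^d mentioned above can be applied through its binomial-expansion weights. A sketch (the memory parameter and series are illustrative, not the MADE site values):

```python
def fractional_difference(series, d):
    """Apply the fractional difference filter (1 - B)^d using the
    binomial-expansion weights w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    n = len(series)
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    # Convolve the weights with the series (truncated at the series start).
    return [sum(w[k] * series[t - k] for k in range(t + 1)) for t in range(n)]
```

With d = 1 this reduces to ordinary first differencing and with d = 0 to the identity; a fractional d between 0 and 1 applies the slowly decaying weights that can reorganize the histogram of an ln K profile as described above.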

Relevance: 30.00%

Abstract:

Diffusion weighted magnetic resonance (MR) imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional diffusion tensor imaging (DTI) is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with greater than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once the TDF that optimally describes the observed data is determined, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.

Relevance: 30.00%

Abstract:

Diffusion weighted magnetic resonance imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of six directions, second-order tensors (represented by three-by-three positive definite matrices) can be computed to model dominant diffusion processes. However, conventional DTI is not sufficient to resolve more complicated white matter configurations, e.g., crossing fiber tracts. Recently, a number of high-angular resolution schemes with more than six gradient directions have been employed to address this issue. In this article, we introduce the tensor distribution function (TDF), a probability function defined on the space of symmetric positive definite matrices. Using the calculus of variations, we solve for the TDF that optimally describes the observed data. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once this optimal TDF is determined, the orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function. Moreover, a tensor orientation distribution function (TOD) may also be derived from the TDF, allowing for the estimation of principal fiber directions and their corresponding eigenvalues.
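The ensemble-of-Gaussians model implies a simple forward model for the normalized diffusion-weighted signal, S/S0 = sum_i w_i * exp(-b * g^T D_i g), which the TDF fit inverts. A sketch with two hypothetical perpendicular fiber populations (the diffusivities and b-value are illustrative assumptions):

```python
import math

def quad_form(g, D):
    """g^T D g for a unit direction g and a 3x3 diffusion tensor D."""
    return sum(g[i] * D[i][j] * g[j] for i in range(3) for j in range(3))

def signal(g, b, tensors, weights):
    """Normalized diffusion-weighted signal S/S0 for a discrete tensor
    distribution: a weighted sum of Gaussian attenuation terms."""
    return sum(w * math.exp(-b * quad_form(g, D)) for w, D in zip(weights, tensors))

# Two perpendicular fiber populations (diffusivities in mm^2/s).
Dx = [[1.7e-3, 0, 0], [0, 2e-4, 0], [0, 0, 2e-4]]   # fiber along x
Dy = [[2e-4, 0, 0], [0, 1.7e-3, 0], [0, 0, 2e-4]]   # fiber along y
b = 1000.0
s_x = signal([1, 0, 0], b, [Dx, Dy], [0.5, 0.5])     # attenuated along a fiber
s_z = signal([0, 0, 1], b, [Dx, Dy], [0.5, 0.5])     # weakly attenuated across both
```

A single fitted tensor cannot reproduce both s_x and s_z here, which is the crossing-fiber limitation of conventional DTI that the TDF addresses.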

Relevance: 30.00%

Abstract:

The recently discovered twist phase is studied in the context of the full ten-parameter family of partially coherent general anisotropic Gaussian Schell-model beams. It is shown that the nonnegativity requirement on the cross-spectral density of the beam demands that the strength of the twist phase be bounded from above by the inverse of the transverse coherence area of the beam. The twist phase as a two-point function is shown to have the structure of the generalized Huygens kernel or Green's function of a first-order system. The ray-transfer matrix of this system is exhibited. Wolf-type coherent-mode decomposition of the twist phase is carried out. Imposition of the twist phase on an otherwise untwisted beam is shown to result in a linear transformation in the ray phase space of the Wigner distribution. Though this transformation preserves the four-dimensional phase-space volume, it is not symplectic and hence it can, when impressed on a Wigner distribution, push it out of the convex set of all bona fide Wigner distributions unless the original Wigner distribution was sufficiently deep into the interior of the set.