944 results for Likelihood principle
Abstract:
The Hardy-Weinberg law, formulated about 100 years ago, states that under certain assumptions the three genotypes AA, AB and BB at a bi-allelic locus are expected to occur in the proportions p², 2pq, and q² respectively, where p is the allele frequency of A and q = 1 − p. Many statistical tests are used to check whether empirical marker data obey the Hardy-Weinberg principle. Among these are the classical chi-square test (with or without continuity correction), the likelihood ratio test, Fisher's exact test, and exact tests in combination with Monte Carlo and Markov chain algorithms. Tests for Hardy-Weinberg equilibrium (HWE) are numerical in nature, requiring the computation of a test statistic and a p-value. There is, however, ample space for the use of graphics in HWE tests, in particular for the ternary plot. Nowadays, many genetic studies use genetic markers known as Single Nucleotide Polymorphisms (SNPs). SNP data come in the form of counts, but from the counts one typically computes genotype frequencies and allele frequencies. These frequencies satisfy the unit-sum constraint, and their analysis therefore falls within the realm of compositional data analysis (Aitchison, 1986). SNPs are usually bi-allelic, which implies that the genotype frequencies can be adequately represented in a ternary plot. Compositions that are in exact HWE describe a parabola in the ternary plot. Compositions for which HWE cannot be rejected in a statistical test are typically "close" to the parabola, whereas compositions that differ significantly from HWE are "far". By rewriting the statistics used to test for HWE in terms of heterozygote frequencies, acceptance regions for HWE can be obtained that can be depicted in the ternary plot. This way, compositions can be tested for HWE purely on the basis of their position in the ternary plot (Graffelman & Morales, 2008). This leads to informative graphical representations in which large numbers of SNPs can be tested for HWE in a single graph. Several examples of graphical tests for HWE (implemented in R software) will be shown, using SNP data from different human populations.
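As an illustration of the numerical tests the abstract mentions, here is a minimal sketch of the classical chi-square test for HWE from genotype counts (in Python, rather than the R implementation the abstract refers to; the function name and the genotype counts are hypothetical):

```python
import numpy as np
from scipy.stats import chi2

def hwe_chisq(n_aa, n_ab, n_bb, continuity=False):
    """Classical chi-square test for Hardy-Weinberg equilibrium
    at a bi-allelic locus, computed from genotype counts."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)       # allele frequency of A
    q = 1.0 - p
    expected = np.array([n * p**2, 2 * n * p * q, n * q**2])
    observed = np.array([n_aa, n_ab, n_bb], dtype=float)
    diff = np.abs(observed - expected)
    if continuity:                         # Yates continuity correction
        diff = np.maximum(diff - 0.5, 0.0)
    stat = np.sum(diff**2 / expected)
    pval = chi2.sf(stat, df=1)             # 1 degree of freedom
    return stat, pval

# Hypothetical genotype counts for one SNP
stat, pval = hwe_chisq(n_aa=298, n_ab=489, n_bb=213)
print(f"chi-square = {stat:.3f}, p = {pval:.4f}")
```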
Abstract:
Purpose: A cadaveric study at our institution demonstrated that optimal baseplate fixation of a reversed shoulder arthroplasty (RSA) could be achieved with screws in three major columns. Our aim was to review our early rate of aseptic glenoid loosening in a series of baseplates fixed according to this principle. Material and Methods: Between 2005 and 2008, 48 RSAs (Aequalis Reversed) were implanted in 48 patients with an average age of 74.4 years (range, 56 to 86 years). There were 37 women and 11 men. Twenty-seven primary RSAs were performed for cuff tear arthropathy, 3 after failed rotator cuff surgery, 6 for failed arthroplasties, 7 for acute fractures and 5 after failed ORIF. All baseplate fixations were done using a nonlocking posterior screw in the scapular spine, a nonlocking anterior screw in the glenoid body, a locking superior screw in the coracoid and a locking inferior screw in the pillar. All patients were reviewed with standardized radiographs. The number of screws was reported. We measured the position of the screws in relation to the scapular spine and the coracoid process in two different views. We defined screw positions as totally in, partially in, or out of the target. Finally, we reported aseptic glenoid loosening, which was defined as implant subsidence. Results: Four patients were lost to follow-up. Thus, 44 shoulders could be reviewed after a mean follow-up of 13 months (range, 6 to 32 months). All baseplates were fixed with 4 screws. Thirty-seven (84%) screws were either partially or totally in the spine; thus, 7 (16%) scapular spine screws were out of the target. No coracoid screw was out of the target. Two (4.5%) patients had glenoid loosening. Both had a scapular spine and a coracoid screw only partially in the bone. Conclusion: Early aseptic glenoid loosening occurred before two years of follow-up and was mostly related to technical problems and/or insufficient bone stock and bone quality. Our study demonstrates that baseplate fixation according to the three-columns principle is a reproducible technique and a valuable way to prevent early glenoid loosening.
Abstract:
Summary: Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performances of the WMLs based on each of them are studied. The result is that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean squared error and in terms of bias under contamination. Moreover, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviors. Two examples of application of the WML to real data are considered. In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
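A minimal sketch of the two-phase idea for a Poisson model (an illustrative simplification: a minimum Hellinger-distance estimate stands in for the MDE, and hard 0/1 weights stand in for the paper's adaptive weighting; all names and numbers are ours):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def hellinger_mde(x, lam_max=50.0):
    """Phase 1: minimum Hellinger-distance estimate of a Poisson mean,
    a robust (if possibly inefficient) initial estimator."""
    values, counts = np.unique(x, return_counts=True)
    d = counts / counts.sum()                    # empirical proportions
    def sq_hellinger(lam):
        f = poisson.pmf(values, lam)
        # off-support terms of the squared Hellinger distance equal f_k
        return np.sum((np.sqrt(d) - np.sqrt(f)) ** 2) + (1.0 - f.sum())
    return minimize_scalar(sq_hellinger, bounds=(1e-6, lam_max),
                           method="bounded").x

def weighted_mle(x, lam0, cutoff=1e-3):
    """Phase 2: give weight 0 to observations implausible under the initial
    fit, then compute the weighted ML estimate (for a Poisson, a weighted
    mean). Hard 0/1 weights stand in for the paper's adaptive weights."""
    w = (poisson.pmf(x, lam0) > cutoff).astype(float)
    return np.sum(w * x) / np.sum(w)

rng = np.random.default_rng(0)
x = np.concatenate([rng.poisson(3.0, 95), np.full(5, 25)])  # 5% outliers
lam0 = hellinger_mde(x)
print("plain MLE:", x.mean())                # pulled up by the outliers
print("WML      :", weighted_mle(x, lam0))
```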
Abstract:
Paltridge found reasonable values for the most significant climatic variables by maximizing the material-transport part of entropy production in a simple box model. Here, we analyse Paltridge's box model to obtain the energy and entropy balance equations separately. Derived expressions for global entropy production, which is a function of the radiation field, and even for its material-transport component, are shown to be different from those used by Paltridge. Plausible climatic states are found at extrema of these parameters. Feasible results are also obtained by minimizing the radiation part of entropy production, in agreement with one of Planck's results. Finally, globally averaged values of the entropy flux of radiation and of material entropy production are obtained for two dynamical extreme cases: an earth with uniform temperature, and an earth in radiative equilibrium at each latitudinal point.
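A toy two-box illustration of the maximum-entropy-production recipe the abstract discusses (the forcing values and the linearized outgoing long-wave radiation are illustrative assumptions, not Paltridge's actual model):

```python
import numpy as np

# Toy two-box maximum-entropy-production (MEP) demo in the spirit of
# Paltridge's box models. The linearized OLR = A + B*T and the absorbed
# solar fluxes are rough illustrative numbers, not Paltridge's values.
A, B = -390.0, 2.2              # OLR (W m^-2) with T in kelvin
S_trop, S_pole = 300.0, 170.0   # absorbed solar flux, tropics / poles

def temperatures(F):
    """Box temperatures from local energy balance, with meridional heat
    flux F (W m^-2) carried from the tropical to the polar box."""
    T_trop = (S_trop - F - A) / B
    T_pole = (S_pole + F - A) / B
    return T_trop, T_pole

def material_entropy_production(F):
    """Entropy produced by transporting heat F down the temperature
    gradient: sigma = F * (1/T_cold - 1/T_warm)."""
    T_trop, T_pole = temperatures(F)
    return F * (1.0 / T_pole - 1.0 / T_trop)

F_grid = np.linspace(0.0, 60.0, 601)
sigma = [material_entropy_production(F) for F in F_grid]
F_mep = F_grid[int(np.argmax(sigma))]
T_trop, T_pole = temperatures(F_mep)
print(f"MEP heat flux ~ {F_mep:.1f} W m^-2, "
      f"T_trop = {T_trop:.1f} K, T_pole = {T_pole:.1f} K")
```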
Abstract:
This letter to the Editor comments on the article "When 'neutral' evidence still has probative value (with implications from the Barry George case)" by N. Fenton et al. [1] (2014).
Abstract:
Introduction: Glenoid bone volume and bone quality can render the fixation of a reversed shoulder arthroplasty (RSA) baseplate hazardous. A cadaveric study at our institution demonstrated that optimal baseplate fixation could be achieved with screws in three major columns. Our aim was to review our early rate of aseptic glenoid loosening in a series of baseplates fixed according to this principle. Methods: Between 2005 and 2008, 48 consecutive RSAs (Reversed Aequalis) were implanted in 48 patients with an average age of 74.4 years (range, 56 to 86 years). There were 37 women and 11 men. Twenty-seven primary RSAs were performed for cuff tear arthropathy, 3 after failed rotator cuff surgery, 6 for failed arthroplasties, 7 for acute fractures and 5 after failed ORIF. All baseplate fixations were done using a nonlocking posterior screw in the scapular spine, a nonlocking anterior screw in the glenoid body, a locking superior screw in the coracoid and a locking inferior screw in the pillar. All patients were reviewed with standardized radiographs. We reported the positions of the screws in relation to the scapular spine and the coracoid process in two different views. We defined screw positions as totally in, partially in, or out of the target. Finally, we reported aseptic glenoid loosening, which was defined as implant subsidence. Results: Four patients were lost to follow-up. Thus, 44 shoulders could be reviewed after a mean follow-up of 16 months (range, 9 to 32 months). Thirty-seven (84%) screws were either partially or totally in the spine; thus, 7 (16%) scapular spine screws were out of the target. No coracoid screw was out of the target. At the final follow-up control, we observed no glenoid loosening. Conclusion: Early glenoid loosening occurs before two years of follow-up and is mostly related to technical problems and/or insufficient glenoid bone stock and bone quality. Our study demonstrates that baseplate fixation of an RSA according to the three-columns principle is a reproducible technique and a valuable way to prevent early glenoid loosening.
Abstract:
According to the Taylor principle a central bank should adjust the nominal interest rate by more than one-for-one in response to changes in current inflation. Most of the existing literature supports the view that by following this simple recommendation a central bank can avoid being a source of unnecessary fluctuations in economic activity. The present paper shows that this conclusion is not robust with respect to the modelling of capital accumulation. We use our insights to discuss the desirability of alternative interest rate rules. Our results suggest a reinterpretation of monetary policy under Volcker and Greenspan: The empirically plausible characterization of monetary policy can explain the stabilization of macroeconomic outcomes observed in the early eighties for the US economy. The Taylor principle in itself cannot.
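For reference, a textbook statement of the rule behind the Taylor principle (a generic formulation, not necessarily the exact specification used in the paper):

```latex
% Textbook Taylor rule: i_t nominal rate, \pi_t inflation, y_t output gap,
% r^* equilibrium real rate, \pi^* inflation target.
i_t = r^* + \pi^* + \phi_\pi (\pi_t - \pi^*) + \phi_y\, y_t ,
\qquad \phi_\pi > 1 \quad \text{(Taylor principle: more than one-for-one).}
```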
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) and matched with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, combination of likelihoods and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence, and a scale-free understanding of unbiased reasoning.
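A minimal numerical illustration of the compositional notions the abstract builds on: the centered log-ratio (clr) transform, the Aitchison distance, and perturbation as the simplex "addition" under which Bayesian updating is additive (the function names are ours):

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition x (positive parts
    summing to a constant): clr(x)_i = log(x_i) - mean(log(x))."""
    lx = np.log(x)
    return lx - lx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr coordinates."""
    return np.linalg.norm(clr(x) - clr(y))

def perturb(x, y):
    """Perturbation, the vector-space 'addition' on the simplex:
    componentwise product, renormalized. Combining a prior with a
    likelihood (Bayes' update) is exactly this operation."""
    z = x * y
    return z / z.sum()

prior = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.2, 0.5, 0.3])
posterior = perturb(prior, likelihood)   # Bayesian updating as perturbation
print(posterior, aitchison_distance(prior, posterior))
```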
Abstract:
Although hemoglobin (Hb) is mainly present in the cytoplasm of erythrocytes (red blood cells), low concentrations of pure, cell-free Hb are permanently released into the circulation owing to an inherent intravascular hemolytic disruption of erythrocytes. Previously, it was shown that the interaction of Hb with bacterial endotoxins (lipopolysaccharides, LPS) results in a significant increase of the biological activity of LPS. There is clear evidence that this enhancement of the biological activity of LPS by Hb is connected with a disaggregation of LPS. These findings raise the question of whether the property of enhancing the biological activity of endotoxin, in most cases demonstrated by an increased production of cytokines (tumor necrosis factor alpha, interleukins) in human mononuclear cells, is restricted to bacterial endotoxin or is a more general principle in nature. To elucidate this question, we investigated the interaction of various synthetic and natural virulence (pathogenicity) factors with hemoglobin of human or sheep origin. In addition to enterobacterial R-type LPS, a synthetic bacterial lipopeptide and synthetic phospholipid-like structures mimicking the lipid A portion of LPS were analysed. Furthermore, we also tested endotoxically inactive LPS and lipid A compounds, such as those from Chlamydia trachomatis. We found that the observations made for the endotoxically active form of LPS can be generalized to the other synthetic and natural virulence factors: in every case, the cytokine production they induce is increased by the addition of Hb. This biological property of Hb is connected with its physical ability to convert the aggregate structures of the virulence factors into structures with cubic symmetry, accompanied by a considerable reduction of the size and number of the original aggregates.
Abstract:
This paper extends the theory of network competition between telecommunications operators by allowing receivers to derive a surplus from receiving calls (call externality) and to affect the volume of communications by hanging up (receiver sovereignty). We investigate the extent to which receiver charges can lead to an internalization of the calling externality. When the receiver charge and the termination (access) charge are both regulated, there exists an efficient equilibrium. Efficiency requires a termination discount. When reception charges are market-determined, it is optimal for each operator to set the prices for emission and reception at their off-net costs. For an appropriately chosen termination charge, the symmetric equilibrium is again efficient. Lastly, we show that network-based price discrimination creates strong incentives for connectivity breakdowns, even between equal networks.
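One schematic way to state the internalization logic (the notation u, ũ, p, r, c is ours, not necessarily the paper's):

```latex
% Call externality: a marginal call yields utility u'(q) to the caller and
% \tilde{u}'(q) to the receiver, at total marginal cost c. The call is
% socially efficient iff the joint marginal surplus covers the cost:
u'(q) + \tilde{u}'(q) \;\ge\; c .
% With a usage price p for the caller and a reception charge r for the
% receiver, setting p + r = c lets the two parties jointly face the true
% marginal cost, internalizing the externality.
```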
Abstract:
Precise estimation of propagation parameters in precipitation media is of interest to improve the performance of communications systems and in remote sensing applications. In this paper, we present maximum-likelihood estimators of specific attenuation and specific differential phase in rain. The model used for obtaining the cited estimators assumes coherent propagation, reflection symmetry of the medium, and Gaussian statistics of the scattering matrix measurements. No assumptions about the microphysical properties of the medium are needed. The performance of the estimators is evaluated through simulated data. Results show negligible estimator bias and variances close to the Cramér-Rao bounds.
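A simplified sketch of the estimation problem (not the paper's estimator): under additive Gaussian noise, maximum-likelihood estimates of the attenuation and differential-phase slopes along the path reduce to least-squares fits against range; all parameter values below are made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated propagation through rain along a path sampled at ranges r (km).
# The true one-way specific attenuation (dB/km) and specific differential
# phase (deg/km) are illustrative values, not taken from the paper.
r = np.linspace(0.0, 10.0, 101)
alpha_true, kdp_true = 1.2, 2.5
amp_db = -alpha_true * r + 0.3 * rng.standard_normal(r.size)   # log-amplitude
phi_dp = kdp_true * 2 * r + 2.0 * rng.standard_normal(r.size)  # two-way phase

# Under additive Gaussian noise, the ML estimate of each slope is the
# ordinary least-squares fit of the measurements against range.
def ls_slope(x, y):
    X = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0]

alpha_hat = -ls_slope(r, amp_db)
kdp_hat = ls_slope(r, phi_dp) / 2.0    # K_dp is half the range derivative
print(f"attenuation ~ {alpha_hat:.2f} dB/km, K_dp ~ {kdp_hat:.2f} deg/km")
```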