918 results for cross likelihood ratio


Relevance: 80.00%

Abstract:

In this paper, the residual Kullback–Leibler discrimination information measure is extended to conditionally specified models. The extension is used to characterize some bivariate distributions. These distributions are also characterized in terms of proportional hazard rate models and weighted distributions. Moreover, we obtain some bounds for this dynamic discrimination function by using the likelihood ratio order and some preceding results.
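For reference, a common form of this residual (dynamic) discrimination measure in the literature is the Kullback–Leibler divergence between the residual lifetime densities; the notation below is assumed for illustration and may differ from the paper's:

```latex
% Residual KL discrimination between lifetime densities f and g beyond
% time t, with survival functions \bar{F} and \bar{G} (a standard form
% in this literature; assumed notation, not taken from the abstract):
\mathrm{KL}(f, g; t) = \int_{t}^{\infty} \frac{f(x)}{\bar{F}(t)}
  \log\!\left( \frac{f(x)/\bar{F}(t)}{g(x)/\bar{G}(t)} \right) \mathrm{d}x
```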

Relevance: 80.00%

Abstract:

In this article, we study some relevant information divergence measures, viz. the Rényi divergence and Kerridge's inaccuracy measure. These measures are extended to conditionally specified models and are used to characterize some bivariate distributions using the concepts of weighted and proportional hazard rate models. Moreover, some bounds are obtained for these measures using the likelihood ratio order.
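For reference, the standard definitions of the two measures named in this abstract (assumed notation, not taken from the text):

```latex
% Renyi divergence of order \alpha and Kerridge's inaccuracy between
% densities f and g (standard definitions; notation assumed):
D_{\alpha}(f \,\|\, g) = \frac{1}{\alpha - 1}
  \log \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, \mathrm{d}x,
  \qquad \alpha > 0,\ \alpha \neq 1
\qquad
K(f, g) = -\int f(x) \log g(x)\, \mathrm{d}x
```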

Relevance: 80.00%

Abstract:

Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop through standard methodology, such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
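As a minimal illustration of the simplex operations this abstract relies on, the sketch below implements closure, perturbation and the centred log-ratio transform in Python (function names and data are illustrative, not from the paper):

```python
# Minimal sketch of two core compositional-data operations: the
# perturbation operation on the unit simplex and the centred log-ratio
# (clr) transform. Illustrative code, not the authors' implementation.
import numpy as np

def closure(x):
    """Rescale a positive vector so its parts sum to 1 (the unit simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, p):
    """Perturbation of x by p: component-wise product followed by closure."""
    return closure(np.asarray(x) * np.asarray(p))

def clr(x):
    """Centred log-ratio transform, mapping the simplex to real space."""
    logx = np.log(np.asarray(x, dtype=float))
    return logx - logx.mean()

composition = closure([52.0, 31.0, 17.0])        # e.g. three major oxides
shifted = perturb(composition, [1.1, 1.0, 0.9])  # a perturbational change
print(clr(composition), clr(shifted))
```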

Relevance: 80.00%

Abstract:

The Hardy-Weinberg law, formulated about 100 years ago, states that under certain assumptions, the three genotypes AA, AB and BB at a bi-allelic locus are expected to occur in the proportions p², 2pq and q² respectively, where p is the allele frequency of A, and q = 1-p. Many statistical tests are used to check whether empirical marker data obey the Hardy-Weinberg principle. Among these are the classical chi-square test (with or without continuity correction), the likelihood ratio test, Fisher's exact test, and exact tests in combination with Monte Carlo and Markov chain algorithms. Tests for Hardy-Weinberg equilibrium (HWE) are numerical in nature, requiring the computation of a test statistic and a p-value. There is, however, ample scope for the use of graphics in HWE tests, in particular for the ternary plot. Nowadays, many genetic studies use genetic markers known as Single Nucleotide Polymorphisms (SNPs). SNP data come in the form of counts, but from the counts one typically computes genotype frequencies and allele frequencies. These frequencies satisfy the unit-sum constraint, and their analysis therefore falls within the realm of compositional data analysis (Aitchison, 1986). SNPs are usually bi-allelic, which implies that the genotype frequencies can be adequately represented in a ternary plot. Compositions that are in exact HWE describe a parabola in the ternary plot. Compositions for which HWE cannot be rejected in a statistical test are typically "close" to the parabola, whereas compositions that differ significantly from HWE are "far". By rewriting the statistics used to test for HWE in terms of heterozygote frequencies, acceptance regions for HWE can be obtained that can be depicted in the ternary plot. This way, compositions can be tested for HWE purely on the basis of their position in the ternary plot (Graffelman & Morales, 2008). This leads to attractive graphical representations in which large numbers of SNPs can be tested for HWE in a single graph. Several examples of graphical tests for HWE (implemented in R software) will be shown, using SNP data from different human populations.
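The abstract refers to an R implementation; purely as an illustration of the numerical test being visualized, here is a minimal Python sketch of the classical chi-square test for HWE from genotype counts (the counts in the example are made up):

```python
# Classical chi-square test for Hardy-Weinberg equilibrium at a
# bi-allelic locus (illustrative sketch; not the R software mentioned).
from scipy.stats import chi2

def hwe_chisq(n_AA, n_AB, n_BB):
    """Return (chi-square statistic, p-value) for HWE from genotype counts."""
    n = n_AA + n_AB + n_BB
    p = (2 * n_AA + n_AB) / (2 * n)       # allele frequency of A
    q = 1 - p
    expected = [n * p**2, 2 * n * p * q, n * q**2]
    observed = [n_AA, n_AB, n_BB]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return stat, chi2.sf(stat, df=1)      # 3 categories - 1 - 1 estimated parameter

print(hwe_chisq(298, 489, 213))           # invented counts for illustration
```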

Relevance: 80.00%

Abstract:

INTRODUCTION: The diagnosis of pulmonary embolism (PE) has been a clinical challenge; despite advances in diagnostic modalities and therapeutic options, PE remains an underdiagnosed and lethal condition. Measurement of D-dimer in blood, with a cut-off of 500 mcg/L, is therefore an excellent screening test for patients in the emergency department. This initial assessment should be complemented with chest CT angiography, a decision that must be taken early in order to avoid life-threatening complications. METHODOLOGY: A retrospective diagnostic test study was carried out, reviewing the clinical records of 109 adult patients at the Fundación Santa Fe de Bogotá who underwent chest CT angiography with a PE protocol, who had a low or intermediate diagnostic probability of pulmonary embolism according to the Wells criteria, and who also had a D-dimer measurement. The sensitivity and specificity of the D-dimer were calculated taking into account the pre-test clinical probability given by the Wells criteria, and positive and negative likelihood ratios were calculated for each D-dimer cut-off. RESULTS: The study showed a sensitivity of 100% for D-dimer values below 1100 mcg/L in low-probability patients, and a sensitivity of 100% for values below 700 mcg/L in intermediate-probability patients. DISCUSSION: Patients with a low pre-test probability according to the Wells criteria and D-dimer values below 1100 mcg/L, and patients with an intermediate probability and values below 700 mcg/L, do not require further studies, which markedly reduces the use of chest CT angiography and lowers the cost of care.
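As a minimal illustration of the diagnostic-test quantities reported above, the sketch below computes sensitivity, specificity and the positive/negative likelihood ratios from a 2x2 table (the counts are invented, not the study's data):

```python
# Standard 2x2 diagnostic-test metrics (illustrative sketch; the counts
# below are made up and are not taken from the study).
def test_metrics(tp, fn, fp, tn):
    """Return sensitivity, specificity, LR+ and LR- from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # likelihood ratio of a positive result
    lr_neg = (1 - sens) / spec   # likelihood ratio of a negative result
    return sens, spec, lr_pos, lr_neg

print(test_metrics(tp=20, fn=0, fp=40, tn=49))   # invented counts
```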

Relevance: 80.00%

Abstract:

The cerebrospinal fluid (CSF) lactic acid concentration in patients with suspected postoperative meningitis after cerebral aneurysm clipping and spontaneous subarachnoid haemorrhage was measured prospectively over a three-year period. A total of 32 CSF samples were analysed: the lactic acid concentration was measured and compared against the CSF culture. Cultures were positive in five patients, for an infection prevalence of 15%. Using a lactic acid threshold of 4 mmol/L, we found a sensitivity of 80%, a specificity of 52%, a PPV of 23%, an NPV of 93% and a positive likelihood ratio (LR+) of 1.66, with a post-test probability of 15%, for the lactic acid concentration in the diagnosis of postoperative meningitis in patients with aneurysmal subarachnoid haemorrhage. The CSF lactic acid concentration has limited diagnostic performance for postoperative meningitis in patients with aneurysmal subarachnoid haemorrhage.
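The reported positive likelihood ratio can be checked from the reported sensitivity and specificity using the standard definition:

```latex
% Standard definition; plugging in the reported 80% sensitivity and 52%
% specificity reproduces the reported LR+ up to rounding:
LR^{+} = \frac{\text{sensitivity}}{1 - \text{specificity}}
       = \frac{0.80}{1 - 0.52} \approx 1.67
```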

Relevance: 80.00%

Abstract:

This paper examines the effect of adjustment filters on the size and power of residual-based cointegration tests such as the ADF and PP tests, by means of Monte Carlo procedures and an empirical application. Our results indicate that the use of filters distorts the size and reduces the power of these tests.
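As a minimal sketch of the kind of residual-based test studied here, the example below runs an Engle-Granger cointegration test (an ADF test on the residuals of a cointegrating regression) via statsmodels; the data are simulated, not the paper's:

```python
# Residual-based (Engle-Granger) cointegration test: statsmodels' coint
# applies an ADF test to the residuals of a cointegrating regression.
# Illustrative sketch with simulated data.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=500))    # a random walk
y = 0.8 * x + rng.normal(size=500)     # cointegrated with x by construction

t_stat, p_value, crit_values = coint(y, x)
print(t_stat, p_value)                 # small p-value: reject "no cointegration"
```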

Relevance: 80.00%

Abstract:

An exploration of the consumption habits of schoolchildren aged 0-6 years, with particular attention to products of low nutritional quality (known by the Spanish acronym BACAN), better known as sweets, and the design of an intervention programme aimed at the different sectors of the educational community (schools, families and children) whose objective is to reduce the excessive consumption of such products. The sample comprised 492 parents of children enrolled in the first cycle of Early Childhood Education centres run by the CARM and in the second cycle of Early Childhood Education centres run by the MEC. Information was gathered by administering a questionnaire to the parents, followed by data analysis and the design of an intervention programme for schools. Instruments: a purpose-built consumption questionnaire; correlations between variables; Pearson chi-square and likelihood-ratio statistics. The consumption of low-nutritional-quality products is not homogeneous: it diversifies and increases with age, with the exception of industrially fried snacks, whose consumption appears at the earliest ages. Most families place their children's consumption at between one and three units of these products. It was found that parents provide these products to their children despite being informed of their harmful effects on health.
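As a minimal illustration of the two contingency-table statistics named above, the sketch below computes Pearson's chi-square and the likelihood-ratio (G) statistic with scipy; the table is invented for illustration:

```python
# Pearson chi-square and likelihood-ratio (G) statistics for a
# contingency table, via scipy's chi2_contingency. The table below is
# invented and does not come from the study.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: age group; columns: reported units of sweets consumed per day.
table = np.array([[30, 50, 20],
                  [20, 60, 40]])

pearson_stat, pearson_p, dof, _ = chi2_contingency(table)
g_stat, g_p, _, _ = chi2_contingency(table, lambda_="log-likelihood")
print(pearson_stat, pearson_p, g_stat, g_p)
```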

Relevance: 80.00%

Abstract:

For many networks in nature, science and technology, it is possible to order the nodes so that most links are short-range, connecting near-neighbours, and relatively few long-range links, or shortcuts, are present. Given a network as a set of observed links (interactions), the task of finding an ordering of the nodes that reveals such a range-dependent structure is closely related to some sparse matrix reordering problems arising in scientific computation. The spectral, or Fiedler vector, approach for sparse matrix reordering has successfully been applied to biological data sets, revealing useful structures and subpatterns. In this work we argue that a periodic analogue of the standard reordering task is also highly relevant. Here, rather than encouraging nonzeros only to lie close to the diagonal of a suitably ordered adjacency matrix, we also allow them to inhabit the off-diagonal corners. Indeed, for the classic small-world model of Watts & Strogatz (1998, Collective dynamics of ‘small-world’ networks. Nature, 393, 440–442) this type of periodic structure is inherent. We therefore devise and test a new spectral algorithm for periodic reordering. By generalizing the range-dependent random graph class of Grindrod (2002, Range-dependent random graphs and their application to modeling large small-world proteome datasets. Phys. Rev. E, 66, 066702-1–066702-7) to the periodic case, we can also construct a computable likelihood ratio that suggests whether a given network is inherently linear or periodic. Tests on synthetic data show that the new algorithm can detect periodic structure, even in the presence of noise. Further experiments on real biological data sets then show that some networks are better regarded as periodic than linear. Hence, we find both qualitative (reordered networks plots) and quantitative (likelihood ratios) evidence of periodicity in biological networks.
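As a minimal sketch of the standard (linear) spectral reordering that the paper generalizes to the periodic case, the example below orders nodes by the Fiedler vector of the graph Laplacian (illustrative code, not the authors' algorithm):

```python
# Spectral (Fiedler-vector) node reordering: order nodes by the entries
# of the eigenvector for the second-smallest Laplacian eigenvalue.
# Illustrative sketch only.
import numpy as np

def fiedler_ordering(adjacency):
    """Return a node ordering from the Fiedler vector of the graph Laplacian."""
    degrees = np.diag(adjacency.sum(axis=1))
    laplacian = degrees - adjacency
    eigenvalues, eigenvectors = np.linalg.eigh(laplacian)
    fiedler = eigenvectors[:, 1]       # vector for second-smallest eigenvalue
    return np.argsort(fiedler)

# Tiny example: the path 0-2-1-3 presented in scrambled order.
A = np.array([[0, 0, 1, 0],
              [0, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
print(fiedler_ordering(A))             # recovers the linear ordering of the path
```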

Relevance: 80.00%

Abstract:

Dual Carrier Modulation (DCM) is currently used as the higher data rate modulation scheme for Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) in the ECMA-368 defined Ultra-Wideband (UWB) radio platform. ECMA-368 has been chosen as the physical radio platform for many systems including Wireless USB (W-USB), Bluetooth 3.0 and Wireless HDMI; hence ECMA-368 is important to consumer electronics and to the user's experience of these products. In this paper, the Log Likelihood Ratio (LLR) demapping method is used for the DCM demapper, implemented in a fixed-point model. A Channel State Information (CSI) aided scheme, coupled with the band-hopping information, is used as a further technique to improve DCM demapping performance. The receiver performance of the fixed-point DCM is simulated in realistic multipath environments.
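As a hedged illustration of LLR soft demapping with CSI, the sketch below computes max-log LLRs for a simple Gray-mapped QPSK constellation; the actual ECMA-368 DCM constellation pair is more involved than this toy case:

```python
# Max-log LLR soft demapping for a generic complex constellation with
# channel state information. Illustrative QPSK sketch, not the ECMA-368
# DCM demapper itself.
import numpy as np

# Gray-mapped QPSK: row i of BITS gives the bit pair for SYMBOLS[i]
# (first bit sets the imaginary sign, second bit the real sign).
SYMBOLS = np.array([1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j]) / np.sqrt(2)
BITS = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

def max_log_llr(y, h, noise_var):
    """Per-bit max-log LLRs for one received sample y with channel gain h."""
    metrics = -np.abs(y - h * SYMBOLS) ** 2 / noise_var   # CSI enters via h
    llrs = []
    for bit in range(BITS.shape[1]):
        m0 = metrics[BITS[:, bit] == 0].max()
        m1 = metrics[BITS[:, bit] == 1].max()
        llrs.append(m0 - m1)        # >0 favours bit 0, <0 favours bit 1
    return llrs

print(max_log_llr(y=0.7 + 0.6j, h=1.0 + 0.0j, noise_var=0.1))
```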

Relevance: 80.00%

Abstract:

A score test is developed for binary clinical trial data, which incorporates patient non-compliance while respecting randomization. It is assumed in this paper that compliance is all-or-nothing, in the sense that a patient either accepts all of the treatment assigned as specified in the protocol, or none of it. Direct analytic comparisons of the adjusted test statistic for both the score test and the likelihood ratio test are made with the corresponding test statistics that adhere to the intention-to-treat principle. It is shown that no gain in power over the intention-to-treat analysis is possible by adjusting for patient non-compliance. Sample size formulae are derived and simulation studies are used to demonstrate that the sample size approximation holds.
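For context, a minimal sketch of an intention-to-treat likelihood ratio test for binary outcomes in two randomized arms (illustrative only, not the paper's compliance-adjusted score statistic):

```python
# Intention-to-treat likelihood ratio test comparing event probabilities
# in two randomized arms. Illustrative sketch; the counts are invented.
import numpy as np
from scipy.stats import chi2

def binom_loglik(successes, n, p):
    """Binomial log-likelihood (combinatorial constant dropped)."""
    return successes * np.log(p) + (n - successes) * np.log(1 - p)

def itt_lrt(x1, n1, x2, n2):
    """LRT of H0: p1 == p2 against H1: p1 != p2."""
    p_pooled = (x1 + x2) / (n1 + n2)
    ll0 = binom_loglik(x1, n1, p_pooled) + binom_loglik(x2, n2, p_pooled)
    ll1 = binom_loglik(x1, n1, x1 / n1) + binom_loglik(x2, n2, x2 / n2)
    stat = 2 * (ll1 - ll0)
    return stat, chi2.sf(stat, df=1)

print(itt_lrt(x1=45, n1=100, x2=30, n2=100))   # invented counts
```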

Relevance: 80.00%

Abstract:

Background: The present paper investigates the question of a suitable basic model for the number of scrapie cases in a holding, and applications of this knowledge to the estimation of the size of the scrapie-affected holding population and to the adequacy of control measures within holdings. Is the number of scrapie cases proportional to the size of the holding, in which case it should be incorporated into the parameter of the error distribution for the scrapie counts? Or is there a different, potentially more complex, relationship between case count and holding size, in which case the information about the size of the holding would be better incorporated as a covariate in the modelling? Methods: We show that this question can be appropriately addressed via a simple zero-truncated Poisson model, in which the hypothesis of proportionality enters as a special offset model. Model comparisons can be achieved by means of likelihood ratio testing. The procedure is illustrated by means of surveillance data on classical scrapie in Great Britain. Furthermore, the model with the best fit is used to estimate the size of the scrapie-affected holding population in Great Britain by means of two capture-recapture estimators: the Poisson estimator and the generalized Zelterman estimator. Results: No evidence could be found for the hypothesis of proportionality. In fact, there is some evidence that this relationship follows a curved line which increases for small holdings up to a maximum, after which it declines again. Furthermore, it is pointed out how crucial the correct model choice is when capture-recapture estimation is applied on the basis of zero-truncated Poisson models, as well as on the basis of the generalized Zelterman estimator. Estimators based on the proportionality model return very different and unreasonable estimates of the population size. Conclusion: Our results stress the importance of an adequate modelling approach to the association between holding size and the number of cases of classical scrapie within a holding. Reporting artefacts and speculative biological effects are hypothesized as the underlying causes of the observed curved relationship. The lack of adjustment for these artefacts might well render ineffective the current strategies for the control of the disease.
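As a minimal sketch of the zero-truncated Poisson machinery described above, the example below fits lambda by maximum likelihood on the observed non-zero counts and forms the Poisson capture-recapture estimate of the total number of affected holdings (the counts are invented):

```python
# Zero-truncated Poisson MLE plus the Poisson capture-recapture estimate
# N-hat = n / (1 - exp(-lambda-hat)) for the unobserved zero-count
# holdings. Illustrative sketch; counts are invented.
import numpy as np
from scipy.optimize import minimize_scalar

def ztp_negloglik(lam, counts):
    """Zero-truncated Poisson negative log-likelihood (constants dropped)."""
    n = len(counts)
    return -(np.sum(counts) * np.log(lam) - n * lam
             - n * np.log1p(-np.exp(-lam)))

def ztp_population_estimate(counts):
    """MLE of lambda, then the estimated total population size."""
    res = minimize_scalar(ztp_negloglik, bounds=(1e-6, 50.0),
                          args=(counts,), method="bounded")
    lam = res.x
    return lam, len(counts) / (1 - np.exp(-lam))

# Invented case counts for holdings with at least one observed case.
cases = [1, 1, 1, 2, 1, 3, 1, 2, 1, 1]
print(ztp_population_estimate(cases))
```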

Relevance: 80.00%

Abstract:

Survival times for the Acacia mangium plantation in the Segaliud Lokan Project, Sabah, East Malaysia were analysed based on 20 permanent sample plots (PSPs) established in 1988 as a spacing experiment. The PSPs were established following a complete randomized block design with five levels of spacing randomly assigned to units within four blocks at different sites. The survival times of trees in years are of interest. Since the inventories were only conducted annually, the actual survival time for each tree was not observed; hence the data set comprises censored survival times. Initial analysis of the survival of the Acacia mangium plantation suggested that there is a block-by-spacing interaction, and that a Weibull model gives a reasonable fit to the replicate survival times within each PSP, but that a standard Weibull regression model is inappropriate because the shape parameter differs between PSPs. In this paper we investigate the form of the non-constant Weibull shape parameter. Parsimonious models for the Weibull survival times have been derived using maximum likelihood methods. The factor selection for the parameters is based on a backward elimination procedure. The models are compared using likelihood ratio statistics. The results suggest that both Weibull parameters depend on spacing and block.
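As a rough illustration of a censored Weibull likelihood and a likelihood ratio comparison between nested models, the sketch below uses right censoring and two groups as stand-ins for the paper's design (simulated data and an assumed parametrization, not the authors' models):

```python
# Right-censored Weibull maximum likelihood and a likelihood ratio test
# of separate-per-group versus pooled Weibull parameters. Illustrative
# sketch with simulated data; not the paper's interval-censored models.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def weibull_negloglik(params, times, observed):
    """Negative log-likelihood for right-censored Weibull (log-parametrized)."""
    log_k, log_s = params
    k, s = np.exp(log_k), np.exp(log_s)
    z = times / s
    loglik = np.where(observed,
                      np.log(k / s) + (k - 1) * np.log(z) - z**k,  # death
                      -z**k)                                       # censored
    return -loglik.sum()

def fit(times, observed):
    """Return the minimized negative log-likelihood."""
    res = minimize(weibull_negloglik, x0=[0.0, np.log(times.mean())],
                   args=(times, observed))
    return res.fun

rng = np.random.default_rng(1)
t1 = rng.weibull(1.5, 200) * 8.0                 # group 1: shape 1.5
t2 = rng.weibull(3.0, 200) * 8.0                 # group 2: shape 3.0
obs1, obs2 = t1 < 10, t2 < 10                    # censor at 10 years
t1, t2 = np.minimum(t1, 10), np.minimum(t2, 10)

# LRT: separate fits per group versus one pooled fit (2 extra parameters).
nll_sep = fit(t1, obs1) + fit(t2, obs2)
nll_pooled = fit(np.concatenate([t1, t2]), np.concatenate([obs1, obs2]))
stat = 2 * (nll_pooled - nll_sep)
print(stat, chi2.sf(stat, df=2))
```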

Relevance: 80.00%

Abstract:

Dual Carrier Modulation (DCM) was chosen as the higher data rate modulation scheme for MB-OFDM (Multiband Orthogonal Frequency Division Multiplexing) in the UWB (Ultra-Wideband) radio platform ECMA-368. ECMA-368 has been chosen as the physical implementation for high data rate Wireless USB (W-USB) and Bluetooth 3.0. In this paper, different demapping methods for the DCM demapper are presented, namely Soft Bit, Maximum Likelihood (ML) Soft Bit and Log Likelihood Ratio (LLR). Frequency diversity and Channel State Information (CSI) are further techniques used to enhance these demapping methods. The system performance of these DCM demapping methods, simulated in realistic multipath environments, is presented and compared.
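Since DCM carries each data pair on two carriers separated in frequency, the frequency-diversity idea reduces, under an independent-noise assumption, to adding the per-bit LLRs computed on the two carriers; a minimal sketch (illustrative helper, not the paper's implementation):

```python
# Frequency-diversity combining of per-bit LLRs from the two DCM
# carriers, assuming independent noise on the carriers. Illustrative
# helper; per-carrier LLRs as in the earlier QPSK sketch.
def combine_llrs(llr_carrier_a, llr_carrier_b):
    """Sum per-bit LLRs from the two carriers (independent noise assumed)."""
    return [a + b for a, b in zip(llr_carrier_a, llr_carrier_b)]

print(combine_llrs([2.1, -0.4], [1.3, -1.9]))   # combined soft decisions
```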

Relevance: 80.00%

Abstract:

Most active-contour methods are based either on maximizing the image contrast under the contour or on minimizing the sum of squared distances between contour and image 'features'. The Marginalized Likelihood Ratio (MLR) contour model uses a contrast-based measure of goodness-of-fit for the contour and thus falls into the first class. The point of departure from previous models consists in marginalizing this contrast measure over unmodelled shape variations. The MLR model naturally leads to the EM Contour algorithm, in which pose optimization is carried out by iterated least-squares, as in feature-based contour methods. The difference with respect to other feature-based algorithms is that the EM Contour algorithm minimizes squared distances from Bayes least-squares (marginalized) estimates of contour locations, rather than from 'strongest features' in the neighborhood of the contour. Within the framework of the MLR model, alternatives to the EM algorithm can also be derived: one of these alternatives is the empirical-information method. Tracking experiments demonstrate the robustness of pose estimates given by the MLR model, and support the theoretical expectation that the EM Contour algorithm is more robust than either feature-based methods or the empirical-information method.
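A schematic of the iterated least-squares pose update described above, in assumed notation (not the paper's):

```latex
% E-step: Bayes least-squares (marginalized) estimates of the contour
% locations given the current pose; M-step: least-squares pose update.
% T_\theta maps model contour points c_i under pose \theta (assumed notation).
\hat{x}_i = \mathbb{E}\!\left[ x_i \mid \text{image}, \theta^{(k)} \right],
\qquad
\theta^{(k+1)} = \arg\min_{\theta} \sum_i
  \bigl\| \hat{x}_i - T_{\theta}(c_i) \bigr\|^{2}
```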