990 results for LIKELIHOOD RATIO STATISTICS


Relevance: 80.00%

Abstract:

The progesterone receptor (PR) is a candidate gene for the development of endometriosis, a complex disease with strong hormonal features, common in women of reproductive age. We typed the 306 base pair Alu insertion (AluIns) polymorphism in intron G of PR in 101 individuals, estimated linkage disequilibrium (LD) between five single-nucleotide polymorphisms (SNPs) across the PR locus in 980 Australian triads (endometriosis case and two parents) and used transmission disequilibrium testing (TDT) for association with endometriosis. The five SNPs showed strong pairwise LD, and the AluIns was highly correlated with proximal SNPs rs1042839 (Δ² = 0.877, D′ = 1.00, P < 0.0001) and rs500760 (Δ² = 0.438, D′ = 0.942, P < 0.0001). TDT showed weak evidence of allelic association between endometriosis and rs500760 (P = 0.027) but not in the expected direction. We identified a common susceptibility haplotype GGGCA across the five SNPs (P = 0.0167) in the whole sample, but likelihood ratio testing of haplotype transmission and non-transmission of the AluIns and flanking SNPs showed no significant pattern. Further, analysis of our results pooled with those from two previous studies suggested that neither the T2 allele of the AluIns nor the T1/T2 genotype was associated with endometriosis.
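The transmission disequilibrium test used above can be sketched as a McNemar-type count comparison: among heterozygous parents, compare how often the candidate allele is transmitted to the affected child versus not transmitted. A minimal stdlib sketch (the counts below are hypothetical, not from the study):

```python
import math

def tdt(transmitted, untransmitted):
    """McNemar-type TDT statistic for one biallelic marker.

    transmitted/untransmitted: counts of heterozygous parents who
    did / did not pass the candidate allele to the affected child.
    """
    b, c = transmitted, untransmitted
    chi2 = (b - c) ** 2 / (b + c)
    # chi-square upper tail with 1 df: P(X > x) = erfc(sqrt(x / 2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# hypothetical counts for illustration only
chi2, p = tdt(transmitted=120, untransmitted=90)
print(round(chi2, 3), round(p, 4))
```

The statistic has one degree of freedom because only the transmitted-versus-untransmitted asymmetry among heterozygous parents is informative.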

Relevance: 80.00%

Abstract:

We report the clinical characteristics of a schizophrenia sample of 409 pedigrees (263 of European ancestry [EA] and 146 of African American ancestry [AA]), together with the results of a genome scan (with a simple tandem repeat polymorphism interval of 9 cM) and follow-up fine mapping. A family was required to have a proband with schizophrenia (SZ) and one or more siblings of the proband with SZ or schizoaffective disorder. Linkage analyses included 403 independent full-sibling affected sibling pairs (ASPs) (279 EA and 124 AA) and 100 all-possible half-sibling ASPs (15 EA and 85 AA). Nonparametric multipoint linkage analysis of all families detected two regions with suggestive evidence of linkage at 8p23.3-q12 and 11p11.2-q22.3 (empirical Z likelihood-ratio score [Z(lr)] threshold >= 2.65) and, in exploratory analyses, two other regions at 4p16.1-p15.32 in AA families and at 5p14.3-q11.2 in EA families. The most significant linkage peak was in chromosome 8p; its signal was mainly driven by the EA families. Z(lr) scores >= 2.0 in 8p were observed from 30.7 cM to 61.7 cM (Center for Inherited Disease Research map locations). The maximum evidence in the full sample was a multipoint Z(lr) of 3.25 (equivalent Kong-Cox LOD of 2.30) near D8S1771 (at 52 cM); there appeared to be two peaks, both telomeric to neuregulin 1 (NRG1). There is a paracentric inversion common in EA individuals within this region, the effect of which on the linkage evidence remains unknown in this and in other previously analyzed samples. Fine mapping of 8p did not significantly alter the significance or length of the peak. We also performed fine mapping of 4p16.3-p15.2, 5p15.2-q13.3, 10p15.3-p14, 10q25.3-q26.3, and 11p13-q23.3. The highest increase in Z(lr) scores was observed for 5p14.1-q12.1, where the maximum Z(lr) increased from 2.77 initially to 3.80 after fine mapping in the EA families.

Relevance: 80.00%

Abstract:

Background Atrial fibrillation in the elderly is common and potentially life threatening. The classical sign of atrial fibrillation is an irregularly irregular pulse. Objective The objective of this research was to determine the accuracy of pulse palpation to detect atrial fibrillation. Methods We searched Medline, EMBASE, and the reference lists of review articles for studies that compared pulse palpation with the electrocardiogram (ECG) diagnosis of atrial fibrillation. Two reviewers independently assessed the search results to determine the eligibility of studies, extracted data, and assessed the quality of the studies. Results We identified 3 studies (2385 patients) that compared pulse palpation with ECG. The estimated sensitivity of pulse palpation ranged from 91% to 100%, while specificity ranged from 70% to 77%. Pooled sensitivity was 94% (95% confidence interval [CI], 84%-97%) and pooled specificity was 72% (95% CI, 69%-75%). The pooled positive likelihood ratio was 3.39, while the pooled negative likelihood ratio was 0.10. Conclusions Pulse palpation has a high sensitivity but relatively low specificity for atrial fibrillation. It is therefore useful for ruling out atrial fibrillation. It may also be a useful screen to apply opportunistically for previously undetected atrial fibrillation. Assuming a prevalence of 3% for undetected atrial fibrillation in patients older than 65 years, and given the test's sensitivity and specificity, opportunistic pulse palpation in this age group would detect an irregular pulse in 30% of screened patients, requiring further testing with ECG. Among screened patients, 0.2% would have atrial fibrillation undetected with pulse palpation.
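The screening arithmetic in the conclusion can be reproduced directly from the pooled sensitivity and specificity. Note that likelihood ratios computed this way differ slightly from the abstract's pooled values (3.39 and 0.10), which were pooled across studies rather than derived from the pooled sensitivity and specificity:

```python
# Pooled estimates and assumed prevalence from the abstract above
sens, spec, prev = 0.94, 0.72, 0.03

lr_pos = sens / (1 - spec)          # positive likelihood ratio
lr_neg = (1 - sens) / spec          # negative likelihood ratio

# Among screened patients older than 65:
irregular = prev * sens + (1 - prev) * (1 - spec)  # flagged for ECG
missed = prev * (1 - sens)          # AF presenting with a regular pulse

print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")
print(f"irregular pulse: {irregular:.1%}, missed AF: {missed:.2%}")
```

Running this gives the ~30% referral rate and ~0.2% missed-case rate stated in the conclusion.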

Relevance: 80.00%

Abstract:

We demonstrate that the process of generating smooth transitions can be viewed as a natural result of the filtering operations implied in the generation of discrete-time series observations from the sampling of data from an underlying continuous-time process that has undergone a process of structural change. In order to focus discussion, we utilize the problem of estimating the location of abrupt shifts in some simple time series models. This approach will permit us to address salient issues relating to distortions induced by the inherent aggregation associated with discrete-time sampling of continuous-time processes experiencing structural change. We also address the issue of how time-irreversible structures may be generated within the smooth transition processes. (c) 2005 Elsevier Inc. All rights reserved.

Relevance: 80.00%

Abstract:

An emerging issue in the field of astronomy is the integration, management and utilization of databases from around the world to facilitate scientific discovery. In this paper, we investigate application of the machine learning techniques of support vector machines and neural networks to the problem of amalgamating catalogues of galaxies as objects from two disparate data sources: radio and optical. Formulating this as a classification problem presents several challenges, including dealing with a highly unbalanced data set. Unlike the conventional approach to the problem (which is based on a likelihood ratio) machine learning does not require density estimation and is shown here to provide a significant improvement in performance. We also report some experiments that explore the importance of the radio and optical data features for the matching problem.

Relevance: 80.00%

Abstract:

A procedure for calculating the critical level and power of a likelihood ratio test, based on Monte Carlo simulation, is proposed. General principles for building software to implement it are given, and some examples of its application are shown.
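The Monte Carlo idea can be illustrated on a toy case: testing H0: mu = 0 for normal data with known unit variance, where the -2 log likelihood ratio reduces to n·x̄². The critical level is taken as an empirical quantile of simulated null statistics, and power is the rejection rate under an assumed alternative (the sample size, effect size, and replication counts below are illustrative choices, not from the paper):

```python
import random
import statistics

def lr_stat(sample):
    # -2 log likelihood ratio for H0: mu = 0 vs free mu,
    # normal data with known sigma = 1: reduces to n * xbar^2
    n = len(sample)
    return n * statistics.fmean(sample) ** 2

def mc_critical_value(n, alpha=0.05, reps=20000, rng=random.Random(1)):
    # empirical (1 - alpha) quantile of the statistic under H0
    stats = sorted(lr_stat([rng.gauss(0, 1) for _ in range(n)])
                   for _ in range(reps))
    return stats[int((1 - alpha) * reps)]

def mc_power(n, mu, crit, reps=20000, rng=random.Random(2)):
    # rejection rate when data are drawn under the alternative mu
    hits = sum(lr_stat([rng.gauss(mu, 1) for _ in range(n)]) > crit
               for _ in range(reps))
    return hits / reps

crit = mc_critical_value(n=30)            # close to 3.84, the chi-square(1) 95% point
power = mc_power(n=30, mu=0.5, crit=crit)
print(round(crit, 2), round(power, 2))
```

Here the simulated critical value can be checked against the asymptotic chi-square(1) quantile; the Monte Carlo route matters precisely when no such closed-form reference distribution is available.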

Relevance: 80.00%

Abstract:

2000 Mathematics Subject Classification: 62F25, 62F03.

Relevance: 80.00%

Abstract:

2000 Mathematics Subject Classification: 62P10, 92D10, 92D30, 62F03.

Relevance: 80.00%

Abstract:

2010 Mathematics Subject Classification: 65D18.

Relevance: 80.00%

Abstract:

2000 Mathematics Subject Classification: 62H15, 62H12.

Relevance: 80.00%

Abstract:

Crash reduction factors (CRFs) are used to estimate the potential number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach. This approach suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), which is a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop the SPFs for different functional classes of the Florida State Highway System. Crash data from years 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs for both rural and urban roadway categories were developed. The modeling data used were based on one-mile segments that contain homogeneous traffic and geometric conditions within each segment. Segments involving intersections were excluded. The scatter plots of data show that the relationships between crashes and traffic exposure are nonlinear, with crashes increasing with traffic exposure at an increasing rate. Four regression models, namely, Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for individual roadway categories. The best model was selected for each category based on a combination of the likelihood ratio test, the Vuong test, and Akaike's Information Criterion (AIC).
The NBRM was found to be appropriate for only one category, and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial distribution model generally provides a better fit for the data than the Poisson distribution model. In addition, the ZINB model was found to give the best fit when the count data exhibit excess zeros and over-dispersion, which was the case for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of the many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. However, with improved traffic and crash data quality, the crash prediction power of SPF models may be further improved.
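The nested-model part of the selection logic (likelihood ratio test plus AIC) can be sketched as follows. The log-likelihoods below are hypothetical, and note that for Poisson versus Negative Binomial the dispersion parameter sits on the boundary of its parameter space, so the nominal chi-square p-value is conservative:

```python
import math

def chi2_sf_1df(x):
    # upper tail of the chi-square distribution with 1 df
    return math.erfc(math.sqrt(x / 2))

def compare(ll_restricted, ll_full, k_restricted, k_full):
    """Likelihood ratio test and AIC for two nested count models,
    e.g. Poisson (restricted) vs Negative Binomial (full)."""
    lr = 2 * (ll_full - ll_restricted)
    df = k_full - k_restricted
    p = chi2_sf_1df(lr) if df == 1 else None
    aic = {"restricted": 2 * k_restricted - 2 * ll_restricted,
           "full": 2 * k_full - 2 * ll_full}
    return lr, p, aic

# hypothetical fitted log-likelihoods for illustration (not the study's values)
lr, p, aic = compare(ll_restricted=-1205.3, ll_full=-1189.8,
                     k_restricted=3, k_full=4)
print(round(lr, 1), f"{p:.2g}", aic)
```

A large LR statistic with a small p-value, together with a lower AIC for the fuller model, is the pattern that favored the Negative Binomial family here; the Vuong test covers the non-nested comparisons (e.g. NBRM versus ZINB).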

Relevance: 80.00%

Abstract:

When a suspect's DNA profile is admitted into court as a match to evidence, the probability of the perpetrator being another individual must be calculated from database allele frequencies. The two methods used for this calculation are phenotypic frequency and likelihood ratio. Neither of these calculations takes into account substructuring within populations. In these substructured populations the frequency of homozygotes increases and that of heterozygotes usually decreases. The departure from Hardy-Weinberg expectation in a sample population can be estimated using Sewall Wright's Fst statistic. Fst values were calculated in four populations of African descent by comparing allele frequencies at three short tandem repeat loci. This was done by amplifying the three loci in each sample using the polymerase chain reaction and separating these fragments using polyacrylamide gel electrophoresis. The gels were then silver stained and autoradiograms taken, from which allele frequencies were estimated. Fst values averaged 0.007 ± 0.005 within populations of African descent and 0.02 ± 0.01 between white and black populations.
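Wright's Fst for one locus can be computed from allele frequencies as the proportional reduction in heterozygosity, Fst = (Ht - Hs)/Ht, where Hs is the mean within-subpopulation expected heterozygosity and Ht is the expected heterozygosity of the pooled frequencies. A sketch with hypothetical frequencies (not the study's data):

```python
def heterozygosity(freqs):
    # expected heterozygosity under Hardy-Weinberg: 1 - sum(p_i^2)
    return 1 - sum(p * p for p in freqs)

def fst(subpop_freqs):
    """Wright's Fst at one locus from allele frequencies in several
    subpopulations (equal subpopulation weights assumed)."""
    n = len(subpop_freqs)
    k = len(subpop_freqs[0])
    h_s = sum(heterozygosity(f) for f in subpop_freqs) / n
    pooled = [sum(f[i] for f in subpop_freqs) / n for i in range(k)]
    h_t = heterozygosity(pooled)
    return (h_t - h_s) / h_t

# hypothetical two-allele frequencies at one locus in two subpopulations
value = fst([[0.60, 0.40], [0.55, 0.45]])
print(round(value, 4))
```

Small Fst values such as those reported above indicate that most allele-frequency variation lies within, rather than between, the sampled populations.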

Relevance: 80.00%

Abstract:

This dissertation focuses on two vital challenges in relation to whale acoustic signals: detection and classification.

In detection, we evaluated the influence of the uncertain ocean environment on the spectrogram-based detector, and derived the likelihood ratio of the proposed Short Time Fourier Transform detector. Experimental results showed that the proposed detector outperforms detectors based on the spectrogram. The proposed detector is more sensitive to environmental changes because it includes phase information.

In classification, our focus is on finding a robust and sparse representation of whale vocalizations. Because whale vocalizations can be modeled as polynomial phase signals, we can represent the whale calls by their polynomial phase coefficients. In this dissertation, we used the Weyl transform to capture chirp rate information, and used a two dimensional feature set to represent whale vocalizations globally. Experimental results showed that our Weyl feature set outperforms chirplet coefficients and MFCC (Mel Frequency Cepstral Coefficients) when applied to our collected data.

Since whale vocalizations can be represented by polynomial phase coefficients, it is plausible that the signals lie on a manifold parameterized by these coefficients. We also studied the intrinsic structure of high dimensional whale data by exploiting its geometry. Experimental results showed that nonlinear mappings such as Laplacian Eigenmap and ISOMAP outperform linear mappings such as PCA and MDS, suggesting that the whale acoustic data is nonlinear.

We also explored deep learning algorithms on whale acoustic data. We built each layer as convolutions with either a PCA filter bank (PCANet) or a DCT filter bank (DCTNet). With the DCT filter bank, each layer has a different time-frequency scale representation, and from this, one can extract different physical information. Experimental results showed that our PCANet and DCTNet achieve a high classification rate on the whale vocalization data set. The word error rate of the DCTNet feature is similar to that of the MFSC in speech recognition tasks, suggesting that the convolutional network is able to reveal the acoustic content of speech signals.

Relevance: 80.00%

Abstract:

There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question becomes how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose cannot jointly consider multiple performance measures and multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, since the number of studied cases is usually small in such comparisons. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
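As a toy version of the frequentist procedure, a generalized likelihood-ratio test on win/tie/loss counts over data sets (multinomial model, H0: P(win) = P(loss)) looks like this; the counts are hypothetical, and the paper's actual tests handle multiple measures and competitors jointly:

```python
import math

def glr_win_loss(wins, ties, losses):
    """Generalized likelihood-ratio test that two algorithms perform
    equally well, from win/tie/loss counts over data sets.

    Ties cancel out of the likelihood ratio: their estimated
    probability is the same under H0 and under the full model.
    """
    m = (wins + losses) / 2             # expected wins/losses under H0
    lr = 2 * sum(n * math.log(n / m) for n in (wins, losses) if n > 0)
    p = math.erfc(math.sqrt(lr / 2))    # chi-square(1) upper tail
    return lr, p

# hypothetical counts: algorithm A wins on 14 data sets, ties on 4,
# loses on 6
lr, p = glr_win_loss(wins=14, ties=4, losses=6)
print(round(lr, 2), round(p, 3))
```

With only win/loss asymmetry tested, the statistic has one degree of freedom; the paper's multi-measure extension enlarges the multinomial and adjusts the degrees of freedom accordingly.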

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08