990 results for LIKELIHOOD RATIO STATISTICS
Abstract:
This paper considers likelihood-based inference for the family of power distributions. Widely applicable results are presented which can be used to conduct inference for all three parameters of the general location-scale extension of the family. More specific results are given for the special case of the power normal model. The analysis of a large data set, formed from density measurements for a certain type of pollen, illustrates the application of the family and the results for likelihood-based inference. Throughout, comparisons are made with analogous results for the direct parametrisation of the skew-normal distribution.
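The abstract states the inference results but, naturally, not the formulas. As a hedged sketch, the code below assumes the power normal density (α/σ)φ(z)Φ(z)^(α-1) with z = (x-μ)/σ, a standard parametrisation of the family, and maximizes the three-parameter log-likelihood numerically; the data and starting values are illustrative only.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def power_normal_loglik(params, x):
    """Log-likelihood of a location-scale power normal model.

    Assumes density f(x) = (alpha/sigma) * phi(z) * Phi(z)**(alpha-1),
    z = (x - mu)/sigma; this parametrisation is an assumption made for
    illustration, not taken from the paper itself."""
    mu, log_sigma, log_alpha = params        # log-transforms keep sigma, alpha > 0
    sigma, alpha = np.exp(log_sigma), np.exp(log_alpha)
    z = (x - mu) / sigma
    return np.sum(np.log(alpha) - np.log(sigma)
                  + norm.logpdf(z)
                  + (alpha - 1.0) * norm.logcdf(z))

# Maximum likelihood on simulated data (alpha = 1 recovers the normal model).
rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.5, size=500)
fit = minimize(lambda p: -power_normal_loglik(p, x),
               x0=np.array([0.0, 0.0, 0.0]), method="Nelder-Mead")
mu_hat, sigma_hat, alpha_hat = fit.x[0], np.exp(fit.x[1]), np.exp(fit.x[2])
```

A likelihood-ratio test of α = 1 against the fitted model then compares the power normal with the plain normal, in the spirit of the comparisons discussed above.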
Abstract:
The number of record-breaking events expected to occur in a strictly stationary time-series depends only on the number of values in the time-series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time-series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time-series? We find a significant decreasing variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by time period? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time-series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
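A distribution-free baseline makes the first claim concrete: for an i.i.d. (hence stationary) continuous series of length n, the expected number of record highs is the harmonic number H_n = Σ 1/k, k = 1..n, whatever the distribution. The sketch below verifies this by simulation and shows how a mean trend breaks the high/low symmetry; the trend size is arbitrary.

```python
import numpy as np

def count_records(x):
    """Number of record highs in a sequence (the first value counts as a record)."""
    running_max = -np.inf
    n_rec = 0
    for v in x:
        if v > running_max:
            running_max = v
            n_rec += 1
    return n_rec

rng = np.random.default_rng(1)
n, trials = 100, 20000

# Stationary case: the mean record count matches H_n regardless of distribution.
iid = rng.normal(size=(trials, n))
h_n = np.sum(1.0 / np.arange(1, n + 1))
print(np.mean([count_records(row) for row in iid]), "vs H_n =", h_n)

# A linear trend in the mean inflates record highs and suppresses record lows.
trended = iid + 0.02 * np.arange(n)
highs = np.mean([count_records(row) for row in trended])
lows = np.mean([count_records(-row) for row in trended])   # lows = highs of -x
print("highs:", highs, "lows:", lows)
```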
Abstract:
Detecting lame cows is important in improving animal welfare. Automated tools are potentially useful for identifying and monitoring lame cows. The goal of this study was to evaluate the suitability of various physiological and behavioral parameters for automatically detecting lameness in dairy cows housed in a cubicle barn. Lame cows suffering from a claw horn lesion (sole ulcer or white line disease) of one claw of the same hind limb (n=32; group L) and 10 nonlame healthy cows (group C) were included in this study. Lying and standing behavior at night (three-dimensional accelerometers), weight distribution between hind limbs (4-scale weighing platform), feeding behavior at night (noseband sensor), and heart activity (Polar device; Polar Electro Oy, Kempele, Finland) were assessed. Either the entire data set or parts of the data collected over a 48-h period were used for statistical analysis, depending on the parameter in question. The standing time at night over 12 h and the limb weight ratio (LWR) were significantly higher in group C than in group L, whereas the lying time at night over 12 h, the mean limb difference (Δweight), and the standard deviation (SD) of the weight applied to the limb taking less weight were significantly lower in group C than in group L. No significant difference was noted between the groups for the parameters of heart activity and feeding behavior at night. The locomotion score of cows in group L was positively correlated with lying time and Δweight, and negatively correlated with LWR and SD. The highest sensitivity (0.97) for lameness detection was found for SD [specificity of 0.80 and an area under the curve (AUC) of 0.84]. The highest specificity (0.90) was found for Δweight (sensitivity=0.78; AUC=0.88) and LWR (sensitivity=0.81; AUC=0.87). The model combining SD with lying time at night was the best predictor of cows being lame, accounting for 40% of the variation in the likelihood of a cow being lame (sensitivity=0.94; specificity=0.80; AUC=0.86). In conclusion, the data derived from the 4-scale weighing platform, either alone or combined with the lying time at night over 12 h, represent the most valuable parameters for automated identification of lame cows suffering from a claw horn lesion of one hind limb.
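The reported sensitivity, specificity, and AUC values can be reproduced from any thresholded predictor. In the sketch below, hypothetical scores stand in for the SD parameter (the group sizes match the study's 32 lame and 10 sound cows; everything else is simulated), and the AUC is computed in its Mann-Whitney rank form.

```python
import numpy as np

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity of 'score >= threshold' as a lameness call."""
    pred = scores >= threshold
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    # Fraction of (lame, sound) pairs ranked correctly; ties count one half.
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))

# Hypothetical data: SD of weight on the less-loaded limb, 32 lame + 10 sound cows.
rng = np.random.default_rng(2)
labels = np.r_[np.ones(32, dtype=int), np.zeros(10, dtype=int)]
scores = np.r_[rng.normal(6.0, 1.5, 32), rng.normal(4.0, 1.0, 10)]
print(sens_spec(scores, labels, threshold=5.0), auc(scores, labels))
```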
Abstract:
Administration of gonadotropins or testosterone (T) will maintain qualitatively normal spermatogenesis and fertility in hypophysectomized (APX) rats. However, quantitative maintenance of the spermatogenic process in APX rats treated with T alone or in combination with follicle-stimulating hormone (FSH) has not been demonstrated. The studies reported here were conducted to determine whether intratesticular testosterone (ITT) levels in APX rats could be raised to those found in normal animals by administering appropriate amounts of testosterone propionate (TP) and, if so, whether spermatogenesis can be maintained quantitatively under these conditions. Quantitative analysis of spermatogenesis was performed on stages VI and VII of the spermatogenic cycle using the criteria of Leblond and Clermont (1952); all cell types were enumerated. In a series of experiments designed to investigate the effects of T on spermatogenesis, TP was administered to 60-day-old APX rats twice daily for 30 days in doses ranging from 0.6 to 15 mg/day, or from 0.6 to 6.0 mg/day in combination with FSH. The results of this study demonstrate that the efficiency of transformation of type A to type B spermatogonia and the efficacy of the meiotic prophase are related to ITT levels, and that quantitatively normal completion of the reduction division requires normal ITT levels. The ratio of spermatids to spermatocytes in the vehicle-treated APX rats was 1:1.38; in the APX rats treated with 15 mg of TP it was 1:4.0 (the theoretically expected number). This study is probably the first to demonstrate: (1) the pharmacokinetics of TP, (2) the profile and quantity of T-immunoactivity in both serum and testicular tissue of APX and IC rats as well as APX rats treated with TP alone or in combination with FSH, (3) the direct correlation of serum T and ITT levels in treated APX rats (r = 0.9, p < 0.001) as well as in the IC rats (r = 0.9, p < 0.001), (4) the significant increase in the number of type B spermatogonia, preleptotene and pachytene spermatocytes, and round spermatids in TP-treated APX rats, (5) the correlation of the number of round spermatids formed in IC rats with ITT levels (r = 0.9, p < 0.001), and (6) the correlation of the quantitative maintenance of spermatogenesis with ITT levels (r = 0.7, p < 0.001) in the testes of TP-treated APX rats. These results provide direct experimental evidence for the key role of T in the spermatogenic process.
Abstract:
Safer sex practices, such as consistent condom use, are essential to reduce HIV transmission. Determining causes and/or covariates related to the likelihood of participating in high-risk sexual behaviors may allow the content of interventions and treatments to minimize HIV transmission to be tailored more effectively. The goal of this study was to examine whether a relationship exists between consistent condom use among African American HIV-positive crack cocaine users and both (1) the use of antiretroviral therapy and (2) adherence to antiretroviral therapy regimens. The study population consisted of 390 participants. They were at least 18 years old, African American, HIV-positive, and had used crack cocaine within the month prior to an interview conducted between April 2004 and September 2007. Bivariate associations were examined using contingency tables and χ² statistics. The Mantel-Haenszel method was used to control for confounding. This study found neither a significant relationship between use of antiretroviral therapy and consistent condom use (odds ratio (OR) = 1.38; 95% confidence interval (95% CI) = 0.86–2.22), nor an association between antiretroviral therapy adherence and consistent condom use (OR = 1.02, 95% CI = 0.60–1.75). The exception was more consistent condom use when sex was traded for money or drugs among those on antiretroviral therapy compared with those not on such therapy (OR = 2.28, 95% CI = 1.08–4.85). Further studies examining condom use and HIV treatment adherence are recommended.
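The odds ratios quoted here follow standard 2×2-table arithmetic, and the Mantel-Haenszel adjustment pools such tables across strata of a confounder. A minimal sketch with hypothetical counts (the study's raw tables are not given in the abstract):

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR = ad/bc with a Woolf (log-scale) 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = np.exp(np.log(or_) - z * se), np.exp(np.log(or_) + z * se)
    return or_, lo, hi

def mantel_haenszel_or(tables):
    """Pooled OR over strata; each table is (a, b, c, d) with n = a+b+c+d."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical counts: rows = on ART / not on ART,
# columns = consistent / inconsistent condom use.
print(odds_ratio_ci(80, 120, 70, 120))
# Stratified by a confounder, two hypothetical strata:
print(mantel_haenszel_or([(30, 40, 20, 50), (50, 80, 50, 70)]))
```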
Abstract:
Ordinal outcomes are frequently employed in diagnosis and clinical trials. Clinical trials of Alzheimer's disease (AD) treatments are a case in point, using mild, moderate, or severe disease status as the outcome measure. As in many other outcome-oriented studies, the disease status may be misclassified. This study estimates the extent of misclassification in an ordinal outcome such as disease status. It also estimates the extent of misclassification of a predictor variable such as genotype status. An ordinal logistic regression model is commonly used to model the relationship between disease status, the effect of treatment, and other predictive factors. A simulation study was done. First, data were created based on a set of hypothetical parameters and hypothetical rates of misclassification. Next, the maximum likelihood method was employed to generate likelihood equations accounting for misclassification. The Nelder-Mead simplex method was used to solve for the misclassification and model parameters. Finally, the method was applied to an AD dataset to detect the amount of misclassification present. The estimates of the ordinal regression model parameters were close to the hypothetical parameters: β1 was hypothesized at 0.50 and the mean estimate was 0.488; β2 was hypothesized at 0.04 and the mean of the estimates was 0.04. Although the estimates for the rates of misclassification of X1 were not as close as those for β1 and β2, they validate the method. The 0-1 misclassification of X1 was hypothesized at 2.98% and the mean of the simulated estimates was 1.54%; in the best case, the misclassification of k from high to medium was hypothesized at 4.87% and had a sample mean of 3.62%. In the AD dataset, the estimated odds ratio for X1, having both copies of the APOE 4 allele, changed from 1.377 to 1.418, demonstrating that the estimate of the odds ratio changes when the analysis adjusts for misclassification.
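The estimation strategy described (a proportional-odds likelihood with misclassification folded in, solved by Nelder-Mead) can be sketched as follows. The three-category model, the misclassification matrix M (rows = true, columns = observed), and all rates below are illustrative assumptions; M is held fixed here for brevity, whereas the study estimates it jointly with the model parameters.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic function

def obs_probs(beta, cuts, M, X):
    """P(observed category) = sum_j P(true = j) * M[j, observed]."""
    eta = X @ beta
    # Proportional-odds model with 3 categories: P(Y <= j) = logistic(cut_j - eta).
    c1 = expit(cuts[0] - eta)
    c2 = expit(cuts[1] - eta)
    p_true = np.column_stack([c1, c2 - c1, 1.0 - c2])
    return p_true @ M                       # fold in misclassification

def neg_loglik(params, X, y, M):
    beta = params[:X.shape[1]]
    cuts = np.sort(params[X.shape[1]:])     # keep cutpoints ordered
    p = obs_probs(beta, cuts, M, X)
    return -np.sum(np.log(p[np.arange(len(y)), y]))

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 2))
M = np.array([[0.95, 0.05, 0.00],           # hypothetical misclassification rates
              [0.03, 0.94, 0.03],
              [0.00, 0.05, 0.95]])
# Simulate observed outcomes from beta = (0.5, 0.04), cutpoints (-0.5, 0.5).
p = obs_probs(np.array([0.5, 0.04]), np.array([-0.5, 0.5]), M, X)
y = np.array([rng.choice(3, p=pi) for pi in p])
fit = minimize(neg_loglik, x0=np.zeros(4), args=(X, y, M), method="Nelder-Mead")
```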
Abstract:
An extension of k-ratio multiple comparison methods to rank-based analyses is described. The new method is analogous to the Duncan-Godbold approximate k-ratio procedure for unequal sample sizes or correlated means. The close parallel of the new methods to the Duncan-Godbold approach is shown by demonstrating that they are based upon different parameterizations as starting points. A semi-parametric basis for the new methods is shown by starting from the Cox proportional hazards model, using Wald statistics. From there, the log-rank and Gehan-Breslow-Wilcoxon methods may be seen as score-statistic-based methods. Simulations and analysis of a published data set are used to show the performance of the new methods.
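Of the ingredients named, the log-rank statistic is the easiest to make concrete. Below is a minimal two-sample version in its standard observed-minus-expected (hypergeometric) form; it illustrates the score-statistic view only and is not the paper's k-ratio procedure.

```python
import numpy as np

def logrank_stat(time, event, group):
    """Two-sample log-rank chi-square: sum of (O - E) for group 1 over event
    times, squared and scaled by the summed hypergeometric variance."""
    o_minus_e, var = 0.0, 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n = at_risk.sum()                       # total at risk at t
        n1 = (at_risk & (group == 1)).sum()     # at risk in group 1
        d = ((time == t) & (event == 1)).sum()  # events at t
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        o_minus_e += d1 - d * n1 / n
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e**2 / var                   # ~ chi-square(1) under H0

# Hypothetical data: group 1 has shorter survival, ~20% censoring.
rng = np.random.default_rng(4)
group = np.repeat([0, 1], 100)
time = np.where(group == 1, rng.exponential(8, 200), rng.exponential(12, 200))
event = (rng.random(200) < 0.8).astype(int)
print(logrank_stat(time, event, group))
```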
Abstract:
Although 1–24% of T cells are alloreactive, i.e., respond to MHC molecules encoded by a foreign haplotype, it is generally believed that T cells cannot recognize foreign peptides bound to foreign MHC molecules. Using a quantitative model, we show that, if T cell selection and activation are affinity-driven, an alloreactivity of 1–24% is incompatible with the textbook notion that self MHC restriction is absolute. If an average of 1% of clones are alloreactive, then according to our model, at most 20-fold more clones should, on average, be activated by antigens presented on self MHC than by antigens presented on foreign MHC. This ratio is at best 5 if alloreactivity is 5%. These results describe average properties of the murine immune system, but not the outcome of individual experiments. Using supercomputer technology, we simulated 100,000 MHC restriction experiments. Although the average restriction ratio was 7.1, restriction was absolute in 10% of the simulated experiments, greater than 100 (although not absolute) in 29%, and below 6 in 24%. This extreme variability agrees with experimental estimates. Our analysis suggests that alloreactivity and average self MHC restriction cannot both be high, but that a low average restriction level is compatible with high levels in a significant number of experiments.
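The paper's model is not reproduced in the abstract, but an affinity-driven account can be caricatured. In the deliberately toy sketch below (the additive affinity decomposition and every threshold are invented for illustration), thymic selection keeps clones whose self-MHC affinity sits in a window just below the activation cutoff, which biases activation toward antigens presented on self MHC while still producing occasional "absolute" restriction when no selected clone responds on foreign MHC.

```python
import numpy as np

rng = np.random.default_rng(5)

def one_experiment(n_clones=3000, lo=0.95, hi=0.999, act=0.90):
    """Toy affinity-driven model; all numbers are illustrative assumptions.

    A clone's affinity for a peptide-MHC complex is the average of an MHC
    term and a peptide term. Selection keeps clones whose self-MHC term
    lies in [lo, hi); activation requires total affinity above act."""
    self_mhc = rng.random(n_clones)
    keep = (self_mhc >= lo) & (self_mhc < hi)
    self_mhc = self_mhc[keep]
    foreign_mhc = rng.random(keep.sum())        # same clones, foreign MHC term
    pep = rng.random(keep.sum())                # peptide term for the test antigen
    act_self = (0.5 * self_mhc + 0.5 * pep > act).sum()
    act_foreign = (0.5 * foreign_mhc + 0.5 * pep > act).sum()
    return act_self, act_foreign

ratios = []
for _ in range(10000):                          # 10,000 simulated experiments
    s, f = one_experiment()
    ratios.append(np.inf if f == 0 else s / f)
ratios = np.array(ratios)
print(np.median(ratios[np.isfinite(ratios)]),   # typical restriction ratio
      np.mean(np.isinf(ratios)))                # fraction 'absolutely' restricted
```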
Abstract:
In this thesis, we present the generation and study of a 87Rb Bose-Einstein condensate (BEC) perturbed by an oscillatory excitation. The atoms are trapped in a harmonic magnetic trap in which, after an evaporative cooling process, we produce the BEC. To study the effect of oscillatory excitations, a time-oscillatory quadrupole magnetic field is superimposed on the trapping potential. Through this perturbation, collective modes were observed. The dipole mode is excited even at low excitation amplitudes; however, a minimum excitation energy is needed to excite the quadrupole mode of the condensate. Observing the excited cloud in time-of-flight (TOF) expansion, we note that, for excitation amplitudes at which the quadrupole mode is excited, the cloud expands without inverting its aspect ratio. Examining these clouds after a long time-of-flight, it was possible to see vortices and, sometimes, a turbulent state in the condensed cloud. We calculated the momentum distribution of the perturbed BECs, and a power-law behavior, analogous to that of Kolmogorov turbulence, was observed. Furthermore, we show that, using the method we developed to calculate the momentum distribution, the distribution curve (including the power-law exponent) depends on the quadrupole-mode oscillation of the cloud. The random distribution of peaks and depletions in the density-distribution image of an expanded turbulent BEC is reminiscent of the intensity profile of a speckle light beam. The analogy between matter-wave speckle and light speckle is justified by showing the similarities in the spatial propagation (or time expansion) of the waves. In addition, the second-order correlation function is evaluated, and the same dependence on distance was observed for both waves. This creates the possibility of understanding the properties of quantum matter in a disordered state. The propagation of a three-dimensional speckle field (such as the matter-wave speckle described here) creates an opportunity to investigate speckle phenomena in dimensions higher than 2D (the case of light speckle).
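The power-law analysis of the momentum distribution can be illustrated generically: fit log n(k) against log k over an assumed inertial range. The synthetic data and the exponent below are placeholders, not the thesis values.

```python
import numpy as np

# Synthetic momentum distribution n(k) ~ k^(-delta) over an assumed inertial
# range, with multiplicative noise; delta = 2.9 is purely illustrative.
rng = np.random.default_rng(6)
k = np.logspace(-1, 1, 60)                  # momentum (arbitrary units)
n_k = k**-2.9 * np.exp(rng.normal(0, 0.05, k.size))

in_range = (k > 0.3) & (k < 3.0)            # fit window (an assumption)
slope, intercept = np.polyfit(np.log(k[in_range]), np.log(n_k[in_range]), 1)
print("fitted power-law exponent:", slope)  # close to -2.9
```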
Abstract:
"November 1982."
Abstract:
Testing for simultaneous vicariance across comparative phylogeographic data sets is a notoriously difficult problem, hindered by mutational variance, coalescent variance, and variability across pairs of sister taxa in parameters that affect genetic divergence. We simulate vicariance to characterize the behaviour of several commonly used summary statistics across a range of divergence times, and to characterize this behaviour in comparative phylogeographic data sets having multiple taxon pairs. We found Tajima's D to be relatively uncorrelated with other summary statistics across divergence times, and, using simple hypothesis testing of simultaneous vicariance given variable population sizes, we counter-intuitively found that the variance across taxon pairs in Nei and Li's net nucleotide divergence (π_net), a common measure of population divergence, is often inferior to the variance in Tajima's D across taxon pairs as a test statistic for distinguishing ancient simultaneous vicariance from variable vicariance histories. The opposite, more intuitive pattern is found when testing more recent simultaneous vicariance. Overall, we found that, depending on the timing of vicariance, one of these two test statistics can achieve high statistical power for rejecting simultaneous vicariance, given a reasonable number of intron loci (>5 loci, 400 bp) and a range of conditions. These results suggest that components of these two composite summary statistics should be used in future simulation-based methods that can simultaneously use a pool of summary statistics to test the comparative phylogeographic hypotheses we consider here.
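Both candidate test statistics are built from textbook summaries; Tajima's D in particular follows the standard Tajima (1989) constants, so the variance of D across taxon pairs is easy to compute once per-pair values of n, S, and π are in hand. The per-pair values below are hypothetical.

```python
import numpy as np

def tajimas_d(n, S, pi):
    """Tajima's D from sample size n, segregating sites S, and mean pairwise
    differences pi, using the standard Tajima (1989) constants."""
    i = np.arange(1, n)
    a1, a2 = np.sum(1.0 / i), np.sum(1.0 / i**2)
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1, e2 = c1 / a1, c2 / (a1**2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

# Variance of D across taxon pairs as the test statistic (hypothetical values).
d_values = [tajimas_d(10, 25, 6.1), tajimas_d(10, 30, 7.5), tajimas_d(10, 18, 4.0)]
print(np.var(d_values))
```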
Abstract:
The temperature dependence of the structure of the mixed-anion Tutton salt K₂[Cu(H₂O)₆](SO₄)₂ₓ(SeO₄)₂₋₂ₓ has been determined for crystals with 0, 17, 25, 68, 78, and 100% sulfate over the temperature range 85–320 K. In every case, the [Cu(H₂O)₆]²⁺ ion adopts a tetragonally elongated coordination geometry with an orthorhombic distortion. However, for the compounds with 0, 17, and 25% sulfate, the long and intermediate bonds occur on a different pair of water molecules from those with 68, 78, and 100% sulfate. A thermal equilibrium between the two forms is observed for each crystal, developing more readily as the proportions of the two counterions become more similar. Attempts to prepare a crystal with approximately equal amounts of sulfate and selenate were unsuccessful. The temperature dependence of the bond lengths has been analyzed using a model in which the Jahn-Teller potential surface of the [Cu(H₂O)₆]²⁺ ion is perturbed by a lattice-strain interaction. The magnitude and sign of the orthorhombic component of this strain interaction depend on the proportion of sulfate to selenate. Significant deviations from Boltzmann statistics are observed for those crystals exhibiting a large temperature dependence of the average bond lengths; this may be explained by cooperative interactions between neighboring complexes.
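The Boltzmann analysis can be made concrete: with two forms in thermal equilibrium, the observed average bond length is the population-weighted mean of the two forms' bond lengths, and systematic departures from this curve are what signal cooperative interactions. The energy gap and bond lengths in the sketch are hypothetical.

```python
import numpy as np

K_B = 0.695  # Boltzmann constant in cm^-1 per K (spectroscopic units)

def boltzmann_average(T, delta_e, r_low, r_high):
    """Average bond length when two forms interconvert thermally.

    p_high = exp(-delta_e/kT) / (1 + exp(-delta_e/kT)) is the fraction in
    the higher-energy form; delta_e in cm^-1, bond lengths in angstroms."""
    boltz = np.exp(-delta_e / (K_B * T))
    p_high = boltz / (1.0 + boltz)
    return (1.0 - p_high) * r_low + p_high * r_high

# Hypothetical values: 300 cm^-1 gap; long Cu-O bond 2.30 A in the low-energy
# form versus 2.10 A in the high-energy form.
for T in (85, 150, 225, 320):
    print(T, boltzmann_average(T, delta_e=300.0, r_low=2.30, r_high=2.10))
```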
Abstract:
This paper presents a scientific and technical description of the modelling framework and the main results of modelling the long-term average sediment delivery at hillslope to medium-scale catchments over the entire Murray Darling Basin (MDB). A theoretical development that relates long-term averaged sediment delivery to the statistics of rainfall and catchment parameters is presented. The derived flood frequency approach was adapted to investigate the problem of regionalizing the sediment delivery ratio (SDR) across the Basin. The SDR, a measure of catchment response to the upland erosion rate, was modelled by two lumped linear stores arranged in series: hillslope transport to the nearest streams, and flow routing in the channel network. The theory shows that the ratio of catchment sediment residence time (SRT) to average effective rainfall duration is the most important control on the sediment delivery process. In this study, catchment SRTs were estimated using the travel time for overland flow multiplied by an enlargement factor that is a function of particle size. Rainfall intensity and effective duration statistics were regionalized using long-term measurements from 195 pluviograph sites within and around the Basin. Finally, the model was implemented across the MDB using spatially distributed soil, vegetation, topographic, and land use properties within a Geographic Information System (GIS) environment. The results predict strong variations in SDR, from close to 0 in floodplains to 70% in the eastern uplands of the Basin.
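The two-store structure invites a numerical illustration. In the toy sketch below, sediment mobilized at the start of an effective-rainfall pulse of duration t_e is routed through two linear stores (hillslope and channel residence times t_h and t_c), and "SDR" is taken as the fraction delivered before the pulse ends. This delivery rule and all values are illustrative assumptions, not the paper's derived flood-frequency result, but they reproduce the controlling role of residence time relative to rainfall duration.

```python
import numpy as np

def toy_sdr(t_h, t_c, t_e, dt=0.01):
    """Fraction of mobilized sediment leaving two linear stores in series
    (residence times t_h, t_c) within the event duration t_e.

    The 'delivery = outflow during the event' rule is an illustrative
    assumption, not the paper's derived flood-frequency result."""
    t = np.arange(dt, t_e, dt)
    if abs(t_h - t_c) < 1e-9:
        out = t / t_h**2 * np.exp(-t / t_h)            # equal stores: gamma(2)
    else:
        out = (np.exp(-t / t_h) - np.exp(-t / t_c)) / (t_h - t_c)
    return np.sum(out) * dt                            # integral of outflow rate

# SDR falls as the sediment residence time grows relative to rainfall duration.
for t_h in (0.1, 0.5, 2.0, 10.0):
    print(t_h, round(toy_sdr(t_h=t_h, t_c=0.5, t_e=1.0), 3))
```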