77 results for robust estimation statistics
Abstract:
Finite mixture models are increasingly being used to model the distributions of a wide variety of random phenomena. While normal mixture models are often used to cluster data sets of continuous multivariate data, a more robust clustering can be obtained by considering the t mixture model-based approach. Mixtures of factor analyzers enable model-based density estimation to be undertaken for high-dimensional data, where the number of observations n is not very large relative to their dimension p. As the approach using the multivariate normal family of distributions is sensitive to outliers, it is more robust to adopt the multivariate t family for the component error and factor distributions. The computational aspects associated with robustness and high dimensionality in these approaches to cluster analysis are discussed and illustrated.
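To make the robustness argument concrete, here is a minimal sketch (not the authors' code, and with invented data) of the E-step weighting that lets a multivariate t component down-weight outliers, which a normal component cannot do.

```python
# In the EM algorithm for a multivariate t mixture, each observation receives a
# weight u_i = (nu + p) / (nu + d_i), where d_i is its squared Mahalanobis
# distance to the component. Outliers get small weights, hence the robustness.
import numpy as np

def t_weights(X, mu, Sigma, nu):
    """Per-observation E-step weights for one t component (illustrative only)."""
    p = X.shape[1]
    diff = X - mu
    d = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(Sigma), diff)  # squared Mahalanobis distances
    return (nu + p) / (nu + d)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), [[8.0, 8.0]]])   # 50 inliers plus one gross outlier
w = t_weights(X, X.mean(axis=0), np.cov(X, rowvar=False), nu=4.0)
print(w[-1], w[:-1].mean())  # the outlier's weight is far below the typical weight
```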
Abstract:
Despite its environmental (and financial) importance, there is no agreement in the literature as to which extractant most accurately estimates the phytoavailability of trace metals in soils. A large dataset was taken from the literature, and the effectiveness of various extractants in predicting the phytoavailability of Cd, Zn, Ni, Cu, and Pb was examined across a range of soil types and contamination levels. The data suggest that, generally, the total soil trace metal content and the trace metal concentrations determined by complexing agents (such as the widely used DTPA and EDTA extractants) or acid extractants (such as 0.1 M HCl and the Mehlich 1 extractant) are only poorly correlated with plant phytoavailability. Whilst there is no consensus, it would appear that neutral salt extractants (such as 0.01 M CaCl2 and 0.1 M NaNO3) provide the most useful indication of metal phytoavailability across a range of metals of interest, although further research is required.
Abstract:
We present a fast method for finding optimal parameters for a low-resolution (threading) force field intended to distinguish correct from incorrect folds for a given protein sequence. In contrast to other methods, the parameterization uses information from >10⁷ misfolded structures as well as a set of native sequence-structure pairs. In addition to testing the resulting force field's performance on the protein sequence threading problem, results are shown that characterize the number of parameters necessary for effective structure recognition.
Abstract:
Bioelectrical impedance analysis (BIA) was used to assess body composition in rats fed on either standard laboratory diet or on a high-fat diet designed to induce obesity. Bioelectrical impedance analysis predictions of total body water and thus fat-free mass (FFM) for the group mean values were generally within 5% of the measured values by tritiated water (³H₂O) dilution. The limits of agreement for the procedure were, however, large, approximately ±25%, limiting the applicability of the technique for measurement of body composition in individual animals.
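For readers unfamiliar with the "limits of agreement" quoted above, the sketch below shows the standard Bland-Altman calculation (mean difference ± 1.96 SD of the paired differences); the measurement arrays are invented placeholders, not the study's data.

```python
# Bland-Altman limits of agreement between two measurement methods.
# All values below are hypothetical, purely to show the arithmetic.
import numpy as np

bia_ffm      = np.array([310.0, 295.0, 402.0, 350.0, 288.0])  # BIA-predicted FFM (g), hypothetical
dilution_ffm = np.array([298.0, 301.0, 380.0, 361.0, 300.0])  # tritiated-water FFM (g), hypothetical

diff = bia_ffm - dilution_ffm
bias = diff.mean()
loa_low  = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.1f} g, limits of agreement = [{loa_low:.1f}, {loa_high:.1f}] g")
```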
Abstract:
This note considers the value of response surface equations which can be used to calculate critical values for a range of unit root and cointegration tests popular in applied economic research.
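As a rough illustration of what such an equation looks like in practice, the snippet below evaluates a response surface of the usual form cv(T) ≈ b_inf + b1/T + b2/T²; the coefficients are placeholders, not values taken from the paper.

```python
# A response-surface approximation to a test's critical value as a function of
# sample size T. The default coefficients are illustrative placeholders.
def critical_value(T, b_inf=-2.86, b1=-2.74, b2=-8.36):
    return b_inf + b1 / T + b2 / T**2

print(critical_value(50))    # small-sample critical value
print(critical_value(5000))  # close to the asymptotic value b_inf
```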
Abstract:
Krylov subspace techniques have been shown to yield robust methods for the numerical computation of large sparse matrix exponentials and especially the transient solutions of Markov chains. The attractiveness of these methods results from the fact that they allow us to compute the action of a matrix exponential operator on an operand vector without having to compute, explicitly, the matrix exponential in isolation. In this paper we compare a Krylov-based method with some of the current approaches used for computing transient solutions of Markov chains. After a brief synthesis of the features of the methods used, wide-ranging numerical comparisons are performed on a Power Challenge Array supercomputer on three different models. AMS Classification: 65F99; 65L05; 65U05.
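The central idea, computing exp(tQ) applied to a vector without ever forming exp(tQ), can be tried directly with SciPy. Note that scipy.sparse.linalg.expm_multiply uses a scaling-and-truncated-Taylor scheme rather than a Krylov method, so this is only an illustration of the "action of the exponential" idea, and the 3-state generator is a toy example, not one of the paper's models.

```python
# Transient solution pi(t) = pi(0) exp(tQ) of a small continuous-time Markov
# chain, computed without forming the matrix exponential explicitly.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import expm_multiply

Q = csr_matrix(np.array([[-2.0,  1.0,  1.0],
                         [ 3.0, -4.0,  1.0],
                         [ 0.0,  2.0, -2.0]]))   # generator matrix: rows sum to zero
pi0 = np.array([1.0, 0.0, 0.0])                   # initial probability distribution
t = 0.7

pi_t = expm_multiply(Q.T * t, pi0)                # exp(tQ^T) @ pi0, i.e. pi0 @ exp(tQ)
print(pi_t, pi_t.sum())                           # a probability vector summing to ~1
```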
Abstract:
An automated method for extracting brain volumes from three commonly acquired three-dimensional (3D) MR images (proton density, T1-weighted, and T2-weighted) of the human head is described. The procedure is divided into four levels: preprocessing, segmentation, scalp removal, and postprocessing. A user-provided reference point is the sole operator-dependent input required. The method's parameters were first optimized and then fixed and applied to 30 repeat data sets from 15 normal older adult subjects to investigate its reproducibility. Percent differences between total brain volumes (TBVs) for the subjects' repeated data sets ranged from 0.5% to 2.2%. We conclude that the method is both robust and reproducible and has the potential for wide application.
Abstract:
Background: From the mid-1980s to mid-1990s, the WHO MONICA Project monitored coronary events and classic risk factors for coronary heart disease (CHD) in 38 populations from 21 countries. We assessed the extent to which changes in these risk factors explain the variation in the trends in coronary-event rates across the populations. Methods: In men and women aged 35-64 years, non-fatal myocardial infarction and coronary deaths were registered continuously to assess trends in rates of coronary events. We carried out population surveys to estimate trends in risk factors. Trends in event rates were regressed on trends in risk score and in individual risk factors. Findings: Smoking rates decreased in most male populations but trends were mixed in women; mean blood pressures and cholesterol concentrations decreased, body-mass index increased, and overall risk scores and coronary-event rates decreased. The model of trends in 10-year coronary-event rates against risk scores and single risk factors showed a poor fit, but this was improved with a 4-year time lag for coronary events. The explanatory power of the analyses was limited by imprecision of the estimates and homogeneity of trends in the study populations. Interpretation: Changes in the classic risk factors seem to partly explain the variation in population trends in CHD. Residual variance is attributable to difficulties in measurement and analysis, including time lag, and to factors that were not included, such as medical interventions. The results support prevention policies based on the classic risk factors but suggest potential for prevention beyond these.
Abstract:
Hemichordates were traditionally allied to the chordates, but recent molecular analyses have suggested that hemichordates are a sister group to the echinoderms, a relationship that has important consequences for the interpretation of the evolution of deuterostome body plans. However, the molecular phylogenetic analyses to date have not provided robust support for the hemichordate + echinoderm clade. We use a maximum likelihood framework, including the parametric bootstrap, to reanalyze DNA data from complete mitochondrial genomes and nuclear 18S rRNA. This approach provides the first statistically significant support for the hemichordate + echinoderm clade from molecular data. This grouping implies that the ancestral deuterostome had features that included an adult with a pharynx and a dorsal nerve cord and an indirectly developing dipleurula-like larva.
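For readers unfamiliar with the parametric bootstrap used here, the sketch below shows its generic logic: fit the null hypothesis, simulate many data sets from that fitted model, recompute the test statistic on each, and compare the observed statistic with the simulated null distribution. The phylogenetic models and sequence data are replaced by a deliberately trivial Gaussian toy; nothing here comes from the paper.

```python
# Generic parametric bootstrap for a test statistic (toy stand-in example).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.3, 1.0, size=100)            # "observed" data (invented)

def stat(sample):
    return abs(sample.mean())                 # test statistic under the null "mean = 0"

observed = stat(x)
sims = np.array([stat(rng.normal(0.0, x.std(ddof=1), size=x.size))  # simulate under the fitted null
                 for _ in range(2000)])
p_value = (sims >= observed).mean()           # fraction of simulated statistics at least as extreme
print(observed, p_value)
```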
Abstract:
Dendritic cells (DC) are considered to be the major cell type responsible for induction of primary immune responses. While they have been shown to play a critical role in eliciting allosensitization via the direct pathway, there is evidence that maturational and/or activational heterogeneity between DC in different donor organs may be crucial to allograft outcome. Despite such an important perceived role for DC, no accurate estimates of their number in commonly transplanted organs have been reported. Therefore, leukocytes and DC were visualized and enumerated in cryostat sections of normal mouse (C57BL/10, B10.BR, C3H) liver, heart, kidney and pancreas by immunohistochemistry (CD45 and MHC class II staining, respectively). Total immunopositive cell number and MHC class II+ cell density (C57BL/10 mice only) were estimated using established morphometric techniques - the fractionator and disector principles, respectively. Liver contained considerably more leukocytes (~5-20 × 10⁶) and DC (~1-3 × 10⁶) than the other organs examined (pancreas: ~0.6 × 10⁶ and ~0.35 × 10⁶; heart: ~0.8 × 10⁶ and ~0.4 × 10⁶; kidney: ~1.2 × 10⁶ and ~0.65 × 10⁶, respectively). In liver, DC comprised a lower proportion of all leukocytes (~15-25%) than in the other parenchymal organs examined (~40-60%). Comparatively, DC density in C57BL/10 mice was heart > kidney > pancreas >> liver (~6.6 × 10⁶, 5 × 10⁶, 4.5 × 10⁶ and 1.1 × 10⁶ cells/cm³, respectively). When compared to previously published data on allograft survival, the results indicate that the absolute number of MHC class II+ DC present in a donor organ is a poor predictor of graft outcome. Survival of solid organ allografts is more closely related to the density of the donor DC network within the graft.
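As a reminder of how the fractionator principle mentioned above yields total cell numbers, here is a back-of-the-envelope sketch; every number in it is hypothetical and unrelated to the study's counts.

```python
# Fractionator principle: total number = raw count divided by the product of
# the sampling fractions applied at each stage. Values are hypothetical.
counted_cells = 420                 # immunopositive cells actually counted in sampled fields
section_sampling_fraction = 1 / 10  # every 10th section examined
area_sampling_fraction    = 1 / 25  # fraction of each section's area sampled

estimated_total = counted_cells / (section_sampling_fraction * area_sampling_fraction)
print(f"estimated total cells ~ {estimated_total:.2e}")   # 420 * 10 * 25 = 1.05e5
```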
Abstract:
The evolution of event time and size statistics in two heterogeneous cellular automaton models of earthquake behavior is studied and compared to the evolution of these quantities during observed periods of accelerating seismic energy release prior to large earthquakes. The two automata have different nearest neighbor laws, one of which produces self-organized critical (SOC) behavior (PSD model) and the other of which produces quasi-periodic large events (crack model). In the PSD model, periods of accelerating energy release before large events are rare. In the crack model, many large events are preceded by periods of accelerating energy release. When compared to randomized event catalogs, accelerating energy release before large events occurs more often than random in the crack model but less often than random in the PSD model; it is easier to tell the crack and PSD model results apart from each other than to tell either model apart from a random catalog. The evolution of event sizes during the accelerating energy release sequences in all models is compared to that of observed sequences. The accelerating energy release sequences in the crack model consist of an increase in the rate of events of all sizes, consistent with observations from a small number of natural cases but inconsistent with a larger number of cases in which there is an increase in the rate of only moderate-sized events. On average, no increase in the rate of events of any size is seen before large events in the PSD model.
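The "accelerating energy release" compared here is commonly quantified in that literature by a power-law time-to-failure curve for cumulative Benioff strain; the sketch below evaluates that standard form with made-up parameters, which may differ from the exact measure used in the paper.

```python
# Standard time-to-failure form for cumulative Benioff strain,
#   S(t) = A + B * (t_f - t)**m  with B < 0 and 0 < m < 1,
# which accelerates as t approaches the failure time t_f. Parameters are illustrative.
import numpy as np

def cumulative_benioff_strain(t, t_f=10.0, A=100.0, B=-30.0, m=0.3):
    return A + B * (t_f - t)**m

t = np.linspace(0.0, 9.9, 5)
print(cumulative_benioff_strain(t))   # values rise increasingly fast as t nears t_f
```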
Abstract:
The acquisition of HI Parkes All Sky Survey (HIPASS) southern sky data commenced at the Australia Telescope National Facility's Parkes 64-m telescope in 1997 February, and was completed in 2000 March. HIPASS is the deepest HI survey yet of the sky south of declination +2 degrees, and is sensitive to emission out to 170 h₇₅⁻¹ Mpc. The characteristic root mean square noise in the survey images is 13.3 mJy. This paper describes the survey observations, which comprise 23 020 eight-degree scans of 9-min duration, and details the techniques used to calibrate and image the data. The processing algorithms are successfully designed to be statistically robust to the presence of interference signals, and are particular to imaging point (or nearly point) sources. Specifically, a major improvement in image quality is obtained by designing a median-gridding algorithm which uses the median estimator in place of the mean estimator.
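The benefit of the median-gridding choice described above can be seen with a toy pixel: one interference spike among otherwise consistent scan samples badly biases the mean but barely moves the median. The values below are invented, not HIPASS data.

```python
# Median vs mean when combining the scan samples that fall in one image pixel.
import numpy as np

pixel_samples = np.array([12.1, 13.4, 11.8, 12.9, 13.0, 250.0])  # mJy; last value is an interference spike
print("mean estimate:  ", pixel_samples.mean())      # pulled far from the true ~13 mJy level
print("median estimate:", np.median(pixel_samples))  # essentially unaffected by the spike
```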