119 results for estimation method
in University of Queensland eSpace - Australia
Abstract:
The measurement of lifetime prevalence of depression in cross-sectional surveys is biased by recall problems. We estimated it indirectly for two countries using modelling, and quantified the underestimation in the empirical estimate for one. A microsimulation model was used to generate population-based epidemiological measures of depression. We fitted the model to 1- and 12-month prevalence data from the Netherlands Mental Health Survey and Incidence Study (NEMESIS) and the Australian Adult Mental Health and Wellbeing Survey. The lowest proportion of cases ever having an episode in their life is 30% of men and 40% of women, for both countries. This corresponds to a lifetime prevalence of 20% and 30%, respectively, in a cross-sectional setting (ages 15-65). The NEMESIS data were 38% lower than these estimates. We conclude that modelling enabled us to estimate lifetime prevalence of depression indirectly. This method is useful in the absence of direct measurement, but it also showed that direct estimates are underestimated because of recall bias and the cross-sectional setting.
Abstract:
The need for methods of indirectly estimating migration flows is particularly important in developing countries, where migration data are often incomplete and inaccurate. This paper focuses on the use of an indirect internal migration estimation method applied to Mexican and Indonesian census data. It shows that the mobility propensities of infants can be used to infer the corresponding propensities of all other age groups. However, the promise of this method is reduced in instances of inadequate data, and great care must be taken to identify outlying values in the data and to correct obviously erroneous patterns. Future work will increasingly be directed to this issue.
Abstract:
Mineralogical analysis is often used to assess the liberation properties of particles. A direct method of estimating liberation is to actually break particles and then directly obtain liberation information by applying mineralogical analysis to each size-class of the product. Another technique is to artificially apply random breakage to the feed particle sections to estimate the resultant distribution of product particle sections. This technique provides a useful alternative estimation method. Because this technique is applied to particle sections, the actual liberation properties for particles can only be estimated by applying stereological correction. A recent stereological technique has been developed that allows the discrepancy between the linear intercept composition distribution and the particle section composition distribution to be used as a guide for estimating the particle composition distribution. The paper shows results validating this new technique using numerical simulation. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Consider a network of unreliable links, modelling for example a communication network. Estimating the reliability of the network, expressed as the probability that certain nodes in the network are connected, is a computationally difficult task. In this paper we study how the Cross-Entropy method can be used to obtain more efficient network reliability estimation procedures. Three estimation techniques are considered: Crude Monte Carlo and the more sophisticated Permutation Monte Carlo and Merge Process. We show that the Cross-Entropy method yields a speed-up over all three techniques.
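The Crude Monte Carlo baseline mentioned above can be sketched in a few lines: sample the state of every link independently and count the fraction of samples in which the terminal nodes remain connected. The network topology, link reliability and sample size below are illustrative assumptions, not taken from the paper.

```python
import random

def connected(n, edges, up, s, t):
    """Depth-first search over surviving links to test s-t connectivity."""
    adj = {i: [] for i in range(n)}
    for (u, v), alive in zip(edges, up):
        if alive:
            adj[u].append(v)
            adj[v].append(u)
    stack, seen = [s], {s}
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return False

def crude_mc_reliability(n, edges, p_up, s, t, samples=20000, seed=1):
    """Crude Monte Carlo estimate of the s-t reliability: the probability
    that s and t stay connected when each link is up independently
    with probability p_up."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        up = [rng.random() < p_up for _ in edges]
        hits += connected(n, edges, up, s, t)
    return hits / samples

# 4-node "bridge" network: two parallel two-hop paths plus a cross link
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (1, 2)]
est = crude_mc_reliability(4, edges, p_up=0.9, s=0, t=3)
```

For highly reliable networks the failure event becomes rare and this crude estimator needs enormous sample sizes, which is exactly the inefficiency the paper's Cross-Entropy approach targets.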
Abstract:
Subsequent to the influential paper of [Chan, K.C., Karolyi, G.A., Longstaff, F.A., Sanders, A.B., 1992. An empirical comparison of alternative models of the short-term interest rate. Journal of Finance 47, 1209-1227], the generalised method of moments (GMM) has been a popular technique for estimation and inference relating to continuous-time models of the short-term interest rate. GMM has been widely employed to estimate model parameters and to assess the goodness-of-fit of competing short-rate specifications. The current paper conducts a series of simulation experiments to document the bias and precision of GMM estimates of short-rate parameters, as well as the size and power of the [Hansen, L.P., 1982. Large sample properties of generalised method of moments estimators. Econometrica 50, 1029-1054] J-test of over-identifying restrictions. While the J-test appears to have appropriate size and good power in sample sizes commonly encountered in the short-rate literature, GMM estimates of the speed of mean reversion are shown to be severely biased. Consequently, it is dangerous to draw strong conclusions about the strength of mean reversion using GMM. In contrast, the parameter capturing the levels effect, which is important in differentiating between competing short-rate specifications, is estimated with little bias. (c) 2006 Elsevier B.V. All rights reserved.
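The mean-reversion bias discussed above can be illustrated with a stylized sketch (not the paper's GMM experiments): ordinary least squares on a discretised Vasicek short-rate model, whose moment conditions coincide with a simple method-of-moments estimator, systematically overstates the speed of mean reversion in samples of typical size. All parameter values below are illustrative assumptions.

```python
import random, math

def simulate_vasicek(kappa, theta, sigma, r0, dt, n, rng):
    """Euler discretisation of the Vasicek short-rate model
    dr = kappa*(theta - r)*dt + sigma*dW."""
    r = [r0]
    for _ in range(n):
        r.append(r[-1] + kappa * (theta - r[-1]) * dt
                 + sigma * rng.gauss(0.0, math.sqrt(dt)))
    return r

def ols_kappa(r, dt):
    """Back out kappa from the regression r_{t+1} - r_t = a + b*r_t + e,
    whose slope satisfies b ~= -kappa*dt."""
    x = r[:-1]
    y = [r[i + 1] - r[i] for i in range(len(r) - 1)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return -b / dt

rng = random.Random(0)
true_kappa = 0.2   # illustrative: slow mean reversion, 25 years of monthly data
ests = [ols_kappa(simulate_vasicek(true_kappa, 0.06, 0.02, 0.06, 1 / 12, 300, rng),
                  1 / 12)
        for _ in range(200)]
mean_est = sum(ests) / len(ests)   # sits well above true_kappa: upward bias
```

The upward bias in the estimated speed of mean reversion is a well-known small-sample effect of near-unit-root dynamics, consistent with the abstract's warning against drawing strong conclusions about mean reversion.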
Abstract:
In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
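The tilted-measure estimation described above can be sketched for the embedded random walk of the M/M/1 queue: simulating with the arrival and service rates swapped (the classical asymptotically optimal exponential tilt) makes buffer overflows frequent, and the likelihood ratio on any path from level 1 to level B collapses to the constant (lam/mu)**(B-1). This is a minimal one-stage sketch, not the paper's three-stage adaptive procedure; the rates and buffer level are illustrative.

```python
import random

def mm1_overflow_is(lam, mu, B, samples=4000, seed=2):
    """Importance-sampling estimate of P(queue hits level B before
    emptying | start at 1) for a stable M/M/1 queue (lam < mu),
    simulating the embedded random walk under swapped rates."""
    rng = random.Random(seed)
    p_tilt = mu / (lam + mu)          # up-step probability under the tilt
    lr_hit = (lam / mu) ** (B - 1)    # likelihood ratio on any path 1 -> B
    total = 0.0
    for _ in range(samples):
        level = 1
        while 0 < level < B:
            level += 1 if rng.random() < p_tilt else -1
        if level == B:
            total += lr_hit
    return total / samples

est = mm1_overflow_is(lam=1.0, mu=2.0, B=15)
exact = 1.0 / (2 ** 15 - 1)   # gambler's-ruin formula for this rate pair
```

Under the original measure this event has probability about 3e-5 and crude simulation would rarely observe it; under the tilt roughly half the paths overflow, giving a low-variance estimate.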
Abstract:
An automated method for extracting brain volumes from three commonly acquired three-dimensional (3D) MR images (proton density, T1-weighted, and T2-weighted) of the human head is described. The procedure is divided into four levels: preprocessing, segmentation, scalp removal, and postprocessing. A user-provided reference point is the sole operator-dependent input required. The method's parameters were first optimized, then fixed and applied to 30 repeat data sets from 15 normal older adult subjects to investigate its reproducibility. Percent differences between total brain volumes (TBVs) for the subjects' repeated data sets ranged from 0.5% to 2.2%. We conclude that the method is both robust and reproducible and has the potential for wide application.
Abstract:
Lateral ventricular volumes based on segmented brain MR images can be significantly underestimated if partial volume effects are not considered. This is because a group of voxels in the neighborhood of the lateral ventricles is often mis-classified as gray matter voxels due to partial volume effects. This group of voxels is actually a mixture of ventricular cerebrospinal fluid and white matter, and therefore a portion of it should be included as part of the lateral ventricular structure. In this note, we describe an automated method for the measurement of lateral ventricular volumes on segmented brain MR images. Image segmentation was carried out using a combination of intensity correction and thresholding. The method features a procedure for addressing mis-classified voxels in the neighborhood of the lateral ventricles. A detailed analysis showed that lateral ventricular volumes could be underestimated by 10 to 30%, depending on the size of the lateral ventricular structure, if mis-classified voxels were not included. Validation of the method was done through comparison with averaged manually traced volumes. Finally, the merit of the method is demonstrated in the evaluation of the rate of lateral ventricular enlargement. (C) 2001 Elsevier Science Inc. All rights reserved.
Abstract:
Introduction: Bioelectrical impedance analysis (BIA) is a useful field measure to estimate total body water (TBW). No prediction formulae have been developed or validated against a reference method in patients with pancreatic cancer. The aim of this study was to assess the agreement between three prediction equations for the estimation of TBW in cachectic patients with pancreatic cancer. Methods: Resistance was measured at frequencies of 50 and 200 kHz in 18 outpatients (10 males and 8 females, age 70.2 +/- 11.8 years) with pancreatic cancer from two tertiary Australian hospitals. Three published prediction formulae were used to calculate TBW: TBWs, developed in surgical patients, and TBWca-uw and TBWca-nw, developed in underweight and normal-weight patients with end-stage cancer, respectively. Results: There was no significant difference in the TBW estimated by the three prediction equations (TBWs 32.9 +/- 8.3 L, TBWca-nw 36.3 +/- 7.4 L, TBWca-uw 34.6 +/- 7.6 L). At a population level, there is agreement between the predictions of TBW in patients with pancreatic cancer estimated from the three equations. The best combination of low bias and narrow limits of agreement was observed when TBW was estimated from the equation developed in the underweight cancer patients relative to the normal-weight cancer patients. Where no established BIA prediction equation exists, practitioners should use an equation developed in a population with similar critical characteristics, such as diagnosis, weight loss, body mass index and/or age. Conclusions: Further research is required to determine the accuracy of the BIA prediction technique against a reference method in patients with pancreatic cancer.
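The "bias and limits of agreement" criterion used above to compare equations is the standard Bland-Altman analysis, which can be sketched as follows. The paired TBW values below are made-up illustrative numbers, not the study's data.

```python
import math

def limits_of_agreement(a, b):
    """Bland-Altman bias and approximate 95% limits of agreement
    between two paired sets of measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical paired TBW estimates (litres) from two prediction equations
tbw_eq1 = [32.1, 35.4, 30.2, 38.8, 33.5, 29.9]
tbw_eq2 = [33.0, 35.9, 31.1, 39.5, 34.0, 30.8]
bias, lower, upper = limits_of_agreement(tbw_eq1, tbw_eq2)
```

A low bias with narrow limits, as reported for the underweight-versus-normal-weight cancer equations, indicates that the two equations could be used interchangeably at the population level.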
Abstract:
This study investigates whether different diurnal types (morning versus evening) differ in their estimation of time duration at different times of the day. Given that the performance of morning and evening types is typically best at their preferred times of day, and assuming different diurnal trends in subjective alertness (arousal?) for morning and evening types, the attentional gate model of time duration estimation predicts that morning types would tend to underestimate and be more accurate in the morning, with the opposite pattern expected for evening types. Nineteen morning types, 18 evening types and 18 intermediate types were drawn from a large sample (N=1175) of undergraduates administered the Early/Late Preference Scale. Groups performed a time duration estimation task, using the production method to estimate 20-s unfilled intervals, at two times of day (0800 and 1830). The median absolute error, median directional error and frequency of under- and overestimation were analysed using repeated-measures ANOVA. While all differences were statistically non-significant, the following trends were observed: morning types performed better than evening types; participants overestimated in the morning and underestimated in the evening; and participants were more accurate later in the day. It was concluded that the trends are inconsistent with a relationship between subjective alertness and time duration estimation but consistent with a possible relationship between time duration estimation and diurnal body temperature fluctuations. (C) 2002 Elsevier Ltd. All rights reserved.
Abstract:
We present a novel method, called the transform likelihood ratio (TLR) method, for estimation of rare-event probabilities with heavy-tailed distributions. Via a simple transformation (change of variables) technique, the TLR method reduces the original rare-event probability estimation problem with heavy-tailed distributions to an equivalent one with light-tailed distributions. Once this transformation has been established, we estimate the rare-event probability via importance sampling, using the classical exponential change of measure or the standard likelihood ratio change of measure. In the latter case the importance sampling distribution is chosen from the same parametric family as the transformed distribution. We estimate the optimal parameter vector of the importance sampling distribution using the cross-entropy method. We prove the polynomial complexity of the TLR method for certain heavy-tailed models and demonstrate numerically its high efficiency for various heavy-tailed models previously thought to be intractable. We also show that the TLR method can be viewed as a universal tool in the sense that it not only provides a unified view of heavy-tailed simulation but can also be used efficiently in simulation with light-tailed distributions. We present extensive simulation results which support the efficiency of the TLR method.
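A minimal sketch of the transformation idea for a single Pareto tail (an illustrative special case, not the paper's general algorithm): writing X = exp(Y/alpha) with Y ~ Exp(1) maps the heavy-tailed event {X > gamma} to the light-tailed event {Y > alpha*log(gamma)}, which is then estimated by importance sampling under an exponential change of measure. The tilting parameter below uses the cross-entropy solution for the exponential family.

```python
import random, math

def pareto_tail_tlr(alpha, gamma, samples=20000, seed=3):
    """TLR-style estimate of P(X > gamma) for a Pareto tail
    P(X > x) = x**(-alpha), x >= 1.  Since X = exp(Y/alpha) with
    Y ~ Exp(1), the target equals P(Y > c) with c = alpha*log(gamma),
    estimated by sampling Y from a tilted Exp(theta) density."""
    rng = random.Random(seed)
    c = alpha * math.log(gamma)
    theta = 1.0 / (c + 1.0)          # CE-optimal rate for the Exp family
    total = 0.0
    for _ in range(samples):
        y = rng.expovariate(theta)   # sample under the tilted density
        if y > c:
            # likelihood ratio f(y)/g(y) = exp(-y) / (theta*exp(-theta*y))
            total += math.exp(-(1.0 - theta) * y) / theta
    return total / samples

est = pareto_tail_tlr(alpha=1.0, gamma=1e6)
exact = 1e-6   # gamma**(-alpha) for these illustrative values
```

Direct importance sampling on the heavy-tailed X is delicate; after the change of variables the problem becomes a routine light-tailed exceedance estimate, which is the point of the TLR construction.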
Abstract:
Genetic assignment methods use genotype likelihoods to draw inference about where individuals were or were not born, potentially allowing direct, real-time estimates of dispersal. We used simulated data sets to test the power and accuracy of Monte Carlo resampling methods in generating statistical thresholds for identifying F0 immigrants in populations with ongoing gene flow, and hence for providing direct, real-time estimates of migration rates. The identification of accurate critical values required that resampling methods preserved the linkage disequilibrium deriving from recent generations of immigrants and reflected the sampling variance present in the data set being analysed. A novel Monte Carlo resampling method taking into account these aspects was proposed and its efficiency was evaluated. Power and error were relatively insensitive to the frequency assumed for missing alleles. Power to identify F0 immigrants was improved by using large sample sizes (up to about 50 individuals) and by sampling all populations from which migrants may have originated. A combination of plotting genotype likelihoods and calculating mean genotype likelihood ratios (DLR) appeared to be an effective way to predict whether F0 immigrants could be identified for a particular pair of populations using a given set of markers.
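The resampling idea can be sketched in a simplified form (illustrative only; it ignores the linkage-disequilibrium preservation the abstract emphasises): simulate genotypes from a population's allele frequencies under Hardy-Weinberg assumptions and take a low quantile of the simulated log-likelihoods as the critical value below which individuals are flagged as putative immigrants. All loci, alleles and frequencies below are hypothetical.

```python
import random, math

def genotype_log_likelihood(genotype, freqs):
    """Log-likelihood of a multilocus genotype in a population with the
    given per-locus allele frequencies, assuming Hardy-Weinberg
    proportions and independent loci."""
    ll = 0.0
    for (a1, a2), f in zip(genotype, freqs):
        p1, p2 = f[a1], f[a2]
        ll += math.log(2 * p1 * p2 if a1 != a2 else p1 * p1)
    return ll

def mc_threshold(freqs, n_sims=1000, alpha=0.01, seed=4):
    """Monte Carlo critical value: simulate resident genotypes from the
    population's own allele frequencies and return the alpha-quantile
    of their log-likelihoods."""
    rng = random.Random(seed)
    sims = []
    for _ in range(n_sims):
        g = [tuple(rng.choices(list(f), weights=list(f.values()), k=2))
             for f in freqs]
        sims.append(genotype_log_likelihood(g, freqs))
    sims.sort()
    return sims[int(alpha * len(sims))]

# hypothetical two-allele frequencies at five loci
freqs = [{"A": 0.7, "B": 0.3}] * 5
thr = mc_threshold(freqs)
common_resident = [("A", "A")] * 5   # a genotype common in this population
ll = genotype_log_likelihood(common_resident, freqs)
```

A resident carrying common alleles scores well above the threshold, while an immigrant's genotype, drawn from different allele frequencies, tends to fall below it.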