339 results for Statistical parameters
Abstract:
The method of generalized estimating equations (GEEs) has been criticized recently for a failure to protect against misspecification of working correlation models, which in some cases leads to loss of efficiency or infeasibility of solutions. However, the feasibility and efficiency of GEE methods can be enhanced considerably by using flexible families of working correlation models. We propose two ways of constructing unbiased estimating equations from general correlation models for irregularly timed repeated measures to supplement and enhance GEE. The supplementary estimating equations are obtained by differentiation of the Cholesky decomposition of the working correlation, or as score equations for decoupled Gaussian pseudolikelihood. The estimating equations are solved with computational effort equivalent to that required for a first-order GEE. Full details and analytic expressions are developed for a generalized Markovian model that was evaluated through simulation. Large-sample "sandwich" standard errors for working correlation parameter estimates are derived and shown to have good performance. The proposed estimating functions are further illustrated in an analysis of repeated measures of pulmonary function in children.
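To make the "sandwich" idea concrete, the following is a minimal sketch of a cluster-robust sandwich covariance for an ordinary least-squares fit; it illustrates only the general bread-meat-bread construction used in GEE inference, not the paper's estimating equations for working correlation parameters. All variable names and the simulated data are illustrative assumptions.

```python
import numpy as np

def sandwich_se(X, y, groups):
    """Cluster-robust ("sandwich") standard errors for OLS coefficients.
    bread = (X'X)^-1; meat = sum over clusters of score outer products.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(groups):
        idx = groups == g
        score = X[idx].T @ resid[idx]   # cluster-level score vector
        meat += np.outer(score, score)
    cov = bread @ meat @ bread          # the sandwich
    return beta, np.sqrt(np.diag(cov))

# Illustrative simulated clustered data: 20 clusters of 10 observations
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
groups = np.repeat(np.arange(20), 10)
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
beta, se = sandwich_se(X, y, groups)
```

The sandwich form is consistent under misspecified within-cluster correlation, which is why the abstract can report valid standard errors even for flexible working correlation models.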
Abstract:
This article develops a method for the analysis of growth data with multiple recaptures when the initial ages of all individuals are unknown. Existing approaches either impute the initial ages or model them as random effects. Assumptions about the initial age are not verifiable because all the initial ages are unknown. We present an alternative approach that treats all the lengths, including the length at first capture, as correlated repeated measures for each individual. Optimal estimating equations are developed using the generalized estimating equations approach, which requires only assumptions on the first two moments. Explicit expressions for the estimation of both mean growth parameters and variance components are given to minimize the computational complexity. Simulation studies indicate that the proposed method works well. Two real data sets are analyzed for illustration, one from whelks (Dicathais aegrota) and the other from southern rock lobster (Jasus edwardsii) in South Australia.
Abstract:
- Study Design Controlled laboratory study - Objective To investigate the effect of a 12-mm in-shoe orthotic heel lift on Achilles tendon loading during shod walking using transmission-mode ultrasonography. - Background Orthotic heel lifts are thought to lower tension in the Achilles tendon, but evidence for this effect is equivocal. - Methods The propagation speed of ultrasound, which is governed by the elastic modulus and density of tendon and is proportional to the tensile load to which it is exposed, was measured in the right Achilles tendon of twelve recreationally active males during shod treadmill walking at matched speeds (3.4±0.7 km/h), with and without the addition of a heel lift. Vertical ground reaction force and spatiotemporal gait parameters were simultaneously recorded. Data were acquired at 100 Hz during 10 s of steady-state walking. Statistical comparisons were made using paired t-tests (α=.05). - Results Ultrasound transmission speed in the Achilles tendon was characterized by two maxima (P1, P2) and two minima (M1, M2) during walking. Addition of a heel lift to footwear resulted in a 2% increase and a 2% decrease in the first vertical ground reaction force peak and the local minimum, respectively (P<.05). Peak ultrasonic velocity in the Achilles tendon (P1, P2, M2) was significantly lower with the addition of an orthotic heel lift (P<.05). - Conclusions Peak ultrasound transmission speed in the Achilles tendon was lower with the addition of a 12-mm orthotic heel lift, indicating the heel lift reduced tensile load in the Achilles tendon, thereby counteracting the effect of footwear. These findings support the addition of orthotic heel lifts to footwear in the rehabilitation of Achilles tendon disorders where management aims to lower tension within the tendon. - Level of Evidence Therapy, level 2a
Abstract:
The article describes a generalized estimating equations approach that was used to investigate the impact of technology on vessel performance in a trawl fishery during 1988-96, while accounting for spatial and temporal correlations in the catch-effort data. Robust estimation of parameters in the presence of several levels of clustering depended more on the choice of cluster definition than on the choice of correlation structure within the cluster. Models with smaller cluster sizes produced stable results, while models with larger cluster sizes, which may have had complex within-cluster correlation structures and which included within-cluster covariates, produced estimates that were sensitive to the correlation structure. The preferred model arising from this dataset assumed that catches from a vessel were correlated in the same years and the same areas, but independent in different years and areas. The model that assumed catches from a vessel were correlated in all years and areas, equivalent to a random-effects term for vessel, produced spurious results. This was an unexpected finding that highlighted the need to adopt a systematic strategy for modelling. The article proposes a modelling strategy of selecting the best cluster definition first and the working correlation structure (within clusters) second. The article discusses the selection and interpretation of the model in the light of background knowledge of the data and the utility of the model, and the potential for this modelling approach to apply in similar statistical situations.
Abstract:
Troxel, Lipsitz, and Brennan (1997, Biometrics 53, 857-869) considered parameter estimation from survey data with nonignorable nonresponse and proposed weighted estimating equations to remove the biases in the complete-case analysis that ignores missing observations. This paper suggests two alternative modifications for unbiased estimation of regression parameters when a binary outcome is potentially observed at successive time points. The weighting approach of Robins, Rotnitzky, and Zhao (1995, Journal of the American Statistical Association 90, 106-121) is also modified to obtain unbiased estimating functions. The suggested estimating functions are unbiased only when the missingness probability is correctly specified, and misspecification of the missingness model will result in biases in the estimates. Simulation studies are carried out to assess the performance of different methods when the covariate is binary or normal. For the simulation models used, the relative efficiency of the two new methods to the weighting methods is about 3.0 for the slope parameter and about 2.0 for the intercept parameter when the covariate is continuous and the missingness probability is correctly specified. All methods produce substantial biases in the estimates when the missingness model is misspecified or underspecified. Analysis of data from a medical survey illustrates the use and possible differences of these estimating functions.
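The weighting idea that this abstract builds on can be illustrated with a minimal inverse-probability-weighted mean: each observed outcome is weighted by the reciprocal of its (assumed known) probability of being observed. This is a generic sketch of the Robins-Rotnitzky-Zhao weighting principle, not the paper's modified estimating functions; the function name and the toy numbers are illustrative assumptions.

```python
import numpy as np

def ipw_mean(y, observed, p_obs):
    """Inverse-probability-weighted (Hajek) estimate of E[Y] under
    missingness.  Observed outcomes get weight 1 / P(observed); as the
    abstract stresses, the estimate is unbiased only when the missingness
    probabilities p_obs are correctly specified.
    """
    w = observed / p_obs
    return np.sum(w * np.where(observed > 0, y, 0.0)) / np.sum(w)

# Toy example: the third outcome is missing; observation probabilities differ
y = np.array([1.0, 2.0, 3.0, 4.0])
observed = np.array([1.0, 1.0, 0.0, 1.0])
p_obs = np.array([0.5, 0.8, 0.3, 0.8])
est = ipw_mean(y, observed, p_obs)
```

A complete-case mean would simply average the observed values; the weighting corrects for the fact that some subjects were less likely to be observed.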
Abstract:
James (1991, Biometrics 47, 1519-1530) constructed unbiased estimating functions for estimating the two parameters in the von Bertalanffy growth curve from tag-recapture data. This paper provides unbiased estimating functions for a class of growth models that incorporate stochastic components and explanatory variables. A simulation study using seasonal growth models indicates that the proposed method works well, while the least-squares methods that are commonly used in the literature may produce substantially biased estimates. The proposed model and method are also applied to real data from tagged rock lobsters to assess the possible seasonal effect on growth.
Abstract:
We consider estimation of mortality rates and growth parameters from length-frequency data of a fish stock when there is individual variability in the von Bertalanffy growth parameter L-infinity, and investigate the possible bias in the estimates when the individual variability is ignored. Three methods are examined: (i) the regression method based on Beverton and Holt's (1956, Rapp. P.V. Reun. Cons. Int. Explor. Mer, 140: 67-83) equation; (ii) the moment method of Powell (1979, Rapp. P.V. Reun. Cons. Int. Explor. Mer, 175: 167-169); and (iii) a generalization of Powell's method that estimates the individual variability and incorporates it into the estimation. It is found that the biases in the estimates from the existing methods are, in general, substantial, even when individual variability in growth is small and recruitment is uniform; the generalized method performs better in terms of bias but is subject to larger variation. There is a need to develop robust and flexible methods to deal with individual variability in the analysis of length-frequency data.
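For orientation, the classical Beverton-Holt estimator referred to in (i) computes total mortality Z from the mean length of fish above the smallest fully selected length L'. A minimal sketch, assuming knife-edge selection and constant recruitment (exactly the assumptions the abstract notes can bias the estimate when individual growth variability is present):

```python
import numpy as np

def beverton_holt_z(lengths, L_inf, K, L_prime):
    """Beverton-Holt mean-length mortality estimate:
        Z = K * (L_inf - Lbar) / (Lbar - L'),
    where Lbar is the mean length of fish at or above L', the smallest
    length fully represented in the sample.  Ignores individual
    variability in L_inf, which is the source of bias studied here.
    """
    above = lengths[lengths >= L_prime]
    Lbar = above.mean()
    return K * (L_inf - Lbar) / (Lbar - L_prime)

# Toy sample: Lbar = 45, so Z = 0.2 * (80 - 45) / (45 - 30)
lengths = np.array([30.0, 40.0, 50.0, 60.0])
z = beverton_holt_z(lengths, L_inf=80.0, K=0.2, L_prime=30.0)
```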
Abstract:
In the analysis of tagging data, it has been found that the least-squares method based on the growth increment function, known as the Fabens method, produces biased estimates because individual variability in growth is not allowed for. This paper modifies the Fabens method to account for individual variability in the length asymptote. Significance tests using t-statistics or log-likelihood ratio statistics may be applied to assess the level of individual variability. Simulation results indicate that the modified method reduces the biases in the estimates to negligible proportions. Tagging data from tiger prawns (Penaeus esculentus and Penaeus semisulcatus) and rock lobster (Panulirus ornatus) are analysed as an illustration.
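The basic Fabens estimator being modified here fits the von Bertalanffy increment equation dL = (L_inf - L1)(1 - exp(-K dt)) to release length L1, increment dL, and time at liberty dt. A minimal sketch of that baseline fit (not the paper's modified method), profiling a closed-form least-squares solution for L_inf over a grid of K values; the grid range and the synthetic data are illustrative assumptions:

```python
import numpy as np

def fabens_fit(L1, dL, dt, K_grid=np.linspace(0.05, 1.0, 200)):
    """Fabens least-squares fit of von Bertalanffy (L_inf, K) from
    tag-recapture increments.  For each candidate K, the optimal L_inf
    solves a linear least-squares problem in closed form; we keep the
    (L_inf, K) pair with the smallest residual sum of squares.
    """
    best = None
    for K in K_grid:
        a = 1.0 - np.exp(-K * dt)
        L_inf = np.sum(a * (dL + a * L1)) / np.sum(a * a)
        rss = np.sum((dL - (L_inf - L1) * a) ** 2)
        if best is None or rss < best[2]:
            best = (L_inf, K, rss)
    return best[0], best[1]

# Noiseless synthetic check: true L_inf = 100, K = 0.3
rng = np.random.default_rng(1)
L1 = rng.uniform(20.0, 80.0, 50)
dt = rng.uniform(0.5, 3.0, 50)
dL = (100.0 - L1) * (1.0 - np.exp(-0.3 * dt))
L_inf_hat, K_hat = fabens_fit(L1, dL, dt)
```

When L_inf varies between individuals, this estimator is biased, which is precisely the motivation for the modification described in the abstract.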
Abstract:
A modified conventional direct shear device was used to measure the unsaturated shear strength of two silty soils at low suction values (0-50 kPa) that were achieved by following the drying and wetting paths of the soil water characteristic curves (SWCCs). The results revealed that the internal friction angle of the soils was not significantly affected by either the suction or the drying-wetting SWCC path. The apparent cohesion of the soil increased at a decreasing rate as suction increased. Shear stress-shear displacement curves obtained from soil specimens subjected to the same net normal stress and different suction values showed a higher initial stiffness and a greater peak stress as suction increased. A soil in wetting exhibited a slightly higher peak shear stress and more contractive volume-change behavior than a soil in drying at the same net normal stress and suction.
Abstract:
In this paper, we tackle the problem of unsupervised domain adaptation for classification. In the unsupervised scenario where no labeled samples from the target domain are provided, a popular approach consists in transforming the data such that the source and target distributions be- come similar. To compare the two distributions, existing approaches make use of the Maximum Mean Discrepancy (MMD). However, this does not exploit the fact that prob- ability distributions lie on a Riemannian manifold. Here, we propose to make better use of the structure of this man- ifold and rely on the distance on the manifold to compare the source and target distributions. In this framework, we introduce a sample selection method and a subspace-based method for unsupervised domain adaptation, and show that both these manifold-based techniques outperform the cor- responding approaches based on the MMD. Furthermore, we show that our subspace-based approach yields state-of- the-art results on a standard object recognition benchmark.
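The MMD baseline that this abstract argues against is easy to state: it compares two samples via mean kernel similarities. A minimal sketch of the squared-MMD estimate with an RBF kernel (the kernel choice and bandwidth are illustrative assumptions; the paper's contribution is replacing this criterion with manifold distances):

```python
import numpy as np

def mmd_rbf(X, Y, gamma=1.0):
    """Biased (V-statistic) estimate of squared Maximum Mean Discrepancy
    between samples X and Y, using the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2).
    """
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

# Identical samples give MMD^2 = 0; a large shift gives a clearly positive value
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
same = mmd_rbf(X, X)
shifted = mmd_rbf(X, X + 5.0)
```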
Abstract:
Traffic incidents are recognised as one of the key sources of non-recurrent congestion that often leads to a reduction in travel time reliability (TTR), a key metric of roadway performance. A method is proposed here to quantify the impacts of traffic incidents on TTR on freeways. The method uses historical data to establish recurrent speed profiles and identifies non-recurrent congestion based on its negative impacts on speeds. The locations and times of incidents are used to identify incidents among non-recurrent congestion events. Buffer time is employed to measure TTR. Extra buffer time is defined as the extra delay caused by traffic incidents. This reliability measure indicates how much extra travel time is required by travellers to arrive at their destination on time with 95% certainty in the case of an incident, over and above the travel time that would have been required under recurrent conditions. An extra buffer time index (EBTI) is defined as the ratio of extra buffer time to recurrent travel time, with zero being the best case (no delay). A Tobit model is used to identify and quantify factors that affect EBTI, using a selected freeway segment in the Southeast Queensland, Australia, network. Both fixed- and random-parameter Tobit specifications are tested. The estimation results reveal that models with random parameters offer a superior statistical fit for all types of incidents, suggesting the presence of unobserved heterogeneity across segments. The factors that influence EBTI depend on the type of incident. In addition, changes in TTR as a result of traffic incidents are related to the characteristics of the incidents (multiple vehicles involved, incident duration, major incidents, etc.) and to traffic characteristics.
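The buffer-time arithmetic described above can be sketched in a few lines. Here the buffer time is the 95th-percentile travel time minus a typical (recurrent) travel time, and EBTI scales the incident-induced extra buffer by the recurrent travel time; the use of the median as the "recurrent travel time" and the toy numbers are illustrative assumptions, not the paper's exact operational definitions.

```python
import numpy as np

def extra_buffer_time_index(recurrent_tt, incident_tt):
    """EBTI sketch: buffer time = 95th-percentile travel time minus
    typical travel time; extra buffer time = additional buffer needed
    under incident conditions; EBTI = extra buffer / recurrent travel
    time (zero is the best case: the incident adds no delay).
    """
    typical = np.median(recurrent_tt)
    buffer_recurrent = np.percentile(recurrent_tt, 95) - typical
    buffer_incident = np.percentile(incident_tt, 95) - typical
    extra_buffer = buffer_incident - buffer_recurrent
    return extra_buffer / typical

# Toy travel times (minutes): an incident adds a flat 5-minute delay
recurrent = np.array([10.0, 10.0, 10.0, 10.0, 12.0])
incident = recurrent + 5.0
ebti = extra_buffer_time_index(recurrent, incident)
```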
Abstract:
Purpose A retrospective planning study comparing volumetric modulated arc therapy (VMAT) and stereotactic body radiotherapy (SBRT) treatment plans for non-small cell lung cancer (NSCLC). Methods and materials Five randomly selected early-stage lung cancer patients were included in the study. For each patient, four plans were created: the SBRT plan and three VMAT plans using different optimisation methodologies. A total of 20 different plans were evaluated. Dose conformity and target dose-constraint results were compared across these plans. Results The mean planning target volume (PTV) for all the plans (SBRT and VMAT) was 18.3 cm3, with a range from 15.6 to 20.1 cm3. The maximum dose to 1 cc in all the plans was within 140% (84 Gy) of the prescribed dose, and 95% of the PTV in all the plans received 100% of the prescribed dose (60 Gy). In all the plans, 99% of the PTV received a dose >90% of the prescribed dose, and the mean dose across the plans ranged from 67 to 72 Gy. The planning target dose conformity for the SBRT and the VMAT (0° and 15° collimator single-arc, and dual-arc) plans showed tight conformity of the prescription isodose to the target. Conclusions SBRT and VMAT are radiotherapy approaches that increase doses to small tumour targets without increasing doses to the organs at risk. Although VMAT offers an alternative to SBRT for NSCLC, and the potential advantage of VMAT is reduced treatment times relative to SBRT, the statistical results show that there was no significant difference between the SBRT and the VMAT optimised plans in terms of dose conformity and organ-at-risk sparing.
Abstract:
This paper conceptualizes a framework for bridging the BIM (building information modelling)-specifications divide by augmenting objects within BIM with specification parameters derived from a product library. We demonstrate how model information, enriched with data at various LODs (levels of development), can evolve simultaneously with design and construction, using different representations of a window object embedded in a wall as lifecycle-phase exemplars at different levels of granularity. The conceptual standpoint is informed by the need to explore a methodological approach that extends beyond the limitations of current modelling platforms in enhancing the information content of BIM models. This work thereby demonstrates that BIM objects can be augmented with construction specification parameters by leveraging product libraries.
Abstract:
The past decade has brought a proliferation of statistical genetic (linkage) analysis techniques, incorporating new methodology and/or improvement of existing methodology in gene mapping, specifically targeted towards the localization of genes underlying complex disorders. Most of these techniques have been implemented in user-friendly programs and made freely available to the genetics community. Although certain packages may be more 'popular' than others, a common question asked by genetic researchers is 'which program is best for me?'. To help researchers answer this question, the following software review aims to summarize the main advantages and disadvantages of the popular GENEHUNTER package.