234 results for Hartree Fock scheme correlation errors


Relevance: 20.00%

Abstract:

Rank-based inference is widely used because of its robustness. This article provides optimal rank-based estimating functions for the analysis of clustered data with random cluster effects. Extensive simulation studies carried out to evaluate the performance of the proposed method demonstrate that it is robust to outliers and highly efficient when strong cluster correlations are present. The performance of the proposed method remains satisfactory even when the correlation structure is misspecified or when heteroscedasticity is present. Finally, a real dataset is analyzed for illustration.

Relevance: 20.00%

Abstract:

A modeling paradigm is proposed for covariate, variance, and working correlation structure selection in longitudinal data analysis. Appropriate selection of covariates is a prerequisite for correct variance modeling, and selecting the appropriate covariates and variance function is in turn vital to correlation structure selection. This leads to a stepwise model selection procedure that deploys a combination of different model selection criteria. Although these criteria share a common theoretical root in approximating the Kullback-Leibler distance, they are designed to address different aspects of model selection and have different merits and limitations. For example, the extended quasi-likelihood information criterion (EQIC) with a covariance penalty performs well for covariate selection even when the working variance function is misspecified, but EQIC contains little information about correlation structures. The proposed model selection strategies are outlined and a Monte Carlo assessment of their finite-sample properties is reported. Two longitudinal studies are used for illustration.

Relevance: 20.00%

Abstract:

Objective: To discuss generalized estimating equations as an extension of generalized linear models by commenting on the paper by Ziegler and Vens, "Generalized Estimating Equations. Notes on the Choice of the Working Correlation Matrix". Methods: An international group of experts was invited to comment on this paper. Results: Several perspectives were taken by the discussants. Econometricians established parallels to the generalized method of moments (GMM). Statisticians discussed model assumptions and the issue of missing data. Applied statisticians commented on practical aspects of data analysis. Conclusions: In general, careful modeling of correlation is encouraged in view of estimation efficiency and other implications, and a comparison of the choice of instruments in GMM and in generalized estimating equations (GEE) would be worthwhile. Some theoretical drawbacks of GEE need to be addressed further and require careful analysis of the data; this applies in particular when data are missing at random.

Relevance: 20.00%

Abstract:

Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE), because an inappropriate choice leads to inefficient parameter estimation. We investigate the well-known QIC criterion for selecting a working correlation structure and find that the performance of QIC is degraded by a term that is theoretically independent of the correlation structures but must be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves on QIC. Extensive simulation studies indicate that CIC gives a marked improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
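For orientation, here is a hedged sketch of how the two criteria relate, in standard notation that is not taken from the abstract itself: let Q denote the quasi-likelihood evaluated under the independence working model at the GEE estimate obtained with working correlation R, let Omega_I denote the corresponding model-based information matrix, and let V_R denote the robust (sandwich) covariance estimate. Then

    \mathrm{QIC}(R) = -2\, Q(\hat{\beta}_R; I) + 2\, \operatorname{tr}\!\big(\hat{\Omega}_I \hat{V}_R\big),
    \qquad
    \mathrm{CIC}(R) = \operatorname{tr}\!\big(\hat{\Omega}_I \hat{V}_R\big).

The quasi-likelihood term is, in theory, unaffected by the choice of R, but its estimate varies with R and contributes noise to the comparison; ranking candidate structures by the trace penalty alone is what a correlation information criterion amounts to.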

Relevance: 20.00%

Abstract:

In treatment comparison experiments, the treatment responses are often correlated with some concomitant variables which can be measured before or at the beginning of the experiments. In this article, we propose schemes for the assignment of experimental units that may greatly improve the efficiency of the comparison in such situations. The proposed schemes are based on general ranked set sampling. The relative efficiency and cost-effectiveness of the proposed schemes are studied and compared. It is found that some proposed schemes are always more efficient than the traditional simple random assignment scheme when the total cost is the same. Numerical studies show promising results using the proposed schemes.
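A minimal sketch of the underlying idea, not the specific schemes proposed in the article: units are grouped into small sets, ranked on a concomitant variable x within each set, one unit per set is selected according to a cycling rank, and the treatments are then allocated to the selected units. The function name rss_select, the set size k, and the alternating allocation of two treatments are illustrative assumptions.

    import numpy as np

    def rss_select(x, k, rng):
        """Ranked set selection: from each set of k units, ranked on the
        concomitant variable x, keep the unit whose rank cycles through 1..k."""
        idx = rng.permutation(len(x))              # randomly partition units into sets
        selected = []
        for s in range(len(x) // k):
            members = idx[s * k:(s + 1) * k]
            ranked = members[np.argsort(x[members])]
            selected.append(ranked[s % k])         # s-th set contributes its (s mod k)-th rank
        return np.array(selected)

    rng = np.random.default_rng(0)
    x = rng.normal(size=120)                       # concomitant variable, cheap to measure
    chosen = rss_select(x, k=4, rng=rng)
    treatment = np.resize([0, 1], len(chosen))     # alternate the two treatments
    print(chosen[:8], treatment[:8])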

Relevance: 20.00%

Abstract:

Efficiency of analysis using generalized estimating equations is enhanced when the intracluster correlation structure is accurately modeled. We compare two existing criteria (a quasi-likelihood information criterion and the Rotnitzky-Jewell criterion) for identifying the true correlation structure via simulations with Gaussian or binomial response, covariates varying at the cluster or observation level, and exchangeable or AR(1) intracluster correlation structure. Rotnitzky and Jewell's approach performs better when the true intracluster correlation structure is exchangeable, while the quasi-likelihood criterion performs better for an AR(1) structure.
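A hedged sketch of this kind of comparison using statsmodels; the simulated data, variable names and simulation design are illustrative, the qic() method on GEE results is assumed to be available (as in recent statsmodels releases), and the Rotnitzky-Jewell criterion would have to be coded separately.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_clusters, m = 200, 5
    groups = np.repeat(np.arange(n_clusters), m)
    time = np.tile(np.arange(m), n_clusters)
    x = rng.normal(size=n_clusters * m)                         # observation-level covariate
    b = np.repeat(rng.normal(scale=0.7, size=n_clusters), m)    # shared cluster effect
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 * x + b))))   # exchangeable-type binary data
    X = sm.add_constant(x)

    for name, cs in [("exchangeable", sm.cov_struct.Exchangeable()),
                     ("AR(1)", sm.cov_struct.Autoregressive())]:
        res = sm.GEE(y, X, groups=groups, time=time,
                     family=sm.families.Binomial(), cov_struct=cs).fit()
        # res.qic() is assumed available; smaller values favour the structure.
        print(name, res.qic(), res.bse.round(4))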

Relevance: 20.00%

Abstract:

Adaptations of weighted rank regression to the accelerated failure time model for censored survival data have been successful in yielding asymptotically normal estimates and flexible weighting schemes that increase statistical efficiency. However, only for one simple weighting scheme, Gehan or Wilcoxon weights, are the estimating equations guaranteed to be monotone in the parameter components, and even in this case they are step functions, requiring the equivalent of linear programming for computation. The lack of smoothness makes standard error or covariance matrix estimation even more difficult. An induced smoothing technique has overcome these difficulties in various problems involving monotone but pure jump estimating equations, including conventional rank regression. The present paper applies induced smoothing to Gehan-Wilcoxon weighted rank regression for the accelerated failure time model in the more difficult case of survival times subject to censoring, where the inapplicability of permutation arguments necessitates a new method of estimating the null variance of the estimating functions. Smooth monotone parameter estimation and rapid, reliable standard error or covariance matrix estimation are obtained.
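As a hedged sketch of the kind of estimating function involved (the standard Gehan-weighted form, in our notation rather than the paper's): with residuals e_i(beta) = log T_i - x_i'beta and event indicators delta_i, the Gehan estimating function

    U_G(\beta) = n^{-1} \sum_{i} \sum_{j} \delta_i\, (x_i - x_j)\, I\{ e_j(\beta) \ge e_i(\beta) \}

is monotone in the components of beta but is a step function. Induced smoothing replaces the indicator by \Phi\{ (e_j(\beta) - e_i(\beta)) / r_{ij} \}, with r_{ij}^2 = n^{-1} (x_i - x_j)^{\top} \Gamma (x_i - x_j) and Gamma an approximation to the scaled covariance of the estimator, giving a smooth, still monotone estimating function whose derivative can feed a sandwich covariance estimate.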

Relevance: 20.00%

Abstract:

A 'pseudo-Bayesian' interpretation of standard errors yields a natural induced smoothing of statistical estimating functions. When applied to rank estimation, the lack of smoothness which prevents standard error estimation is remedied. Efficiency and robustness are preserved, while the smoothed estimation has excellent computational properties. In particular, convergence of the iterative equation for standard error is fast, and standard error calculation becomes asymptotically a one-step procedure. This property also extends to covariance matrix calculation for rank estimates in multi-parameter problems. Examples, and some simple explanations, are given.
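A hedged sketch of the construction, in generic notation: treating the estimator as approximately normal with covariance n^{-1} Gamma suggests replacing a non-smooth estimating function U by its expectation under that perturbation,

    \tilde{U}(\beta) = E_Z\big\{ U(\beta + n^{-1/2}\, \Gamma^{1/2} Z) \big\}, \qquad Z \sim N(0, I_p),

so that, for instance, an indicator term I(e > 0) inside U becomes \Phi(e/r), where r^2 is the induced variance of e. Because Gamma is itself the covariance being estimated, the standard error solves a fixed-point (sandwich) equation, which is why the iteration referred to above converges quickly and becomes asymptotically a one-step calculation.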

Relevance: 20.00%

Abstract:

Statistical methods are often used to analyse commercial catch and effort data to provide standardised fishing effort and/or a relative index of fish abundance for input into stock assessment models. Achieving reliable results has proved difficult in Australia's Northern Prawn Fishery (NPF), due to a combination of such factors as the biological characteristics of the animals, some aspects of the fleet dynamics, and the changes in fishing technology. For this set of data, we compared four modelling approaches (linear models, mixed models, generalised estimating equations, and generalised linear models) with respect to the outcomes of the standardised fishing effort or the relative index of abundance. We also varied the number and form of vessel covariates in the models. Within a subset of data from this fishery, modelling correlation structures did not alter the conclusions from simpler statistical models. The random-effects models also yielded similar results. This is because the estimators are all consistent even if the correlation structure is mis-specified, and the data set is very large. However, the standard errors from different models differed, suggesting that different methods have different statistical efficiency. We suggest that there is value in modelling the variance function and the correlation structure, to make valid and efficient statistical inferences and gain insight into the data. We found that fishing power was separable from the indices of prawn abundance only when we offset the impact of vessel characteristics at assumed values from external sources. This may be due to the large degree of confounding within the data, and the extreme temporal changes in certain aspects of individual vessels, the fleet and the fleet dynamics.

Relevance: 20.00%

Abstract:

The method of generalised estimating equations for regression modelling of clustered outcomes allows specification of a working matrix that is intended to approximate the true correlation matrix of the observations. We investigate the asymptotic relative efficiency of the generalised estimating equation for the mean parameters when the correlation parameters are estimated by various methods. The asymptotic relative efficiency depends on three features of the analysis: (i) the discrepancy between the working correlation structure and the unobservable true correlation structure, (ii) the method by which the correlation parameters are estimated, and (iii) the 'design', by which we refer to both the structure of the predictor matrices within clusters and the distribution of cluster sizes. Analytical and numerical studies of realistic data-analysis scenarios show that the choice of working covariance model has a substantial impact on regression estimator efficiency. Protection against avoidable loss of efficiency associated with covariance misspecification is obtained when a 'Gaussian estimation' pseudolikelihood procedure is used with an AR(1) structure.
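For reference, a hedged reminder of the quantity underlying these comparisons, in standard GEE notation (ours, not the paper's): with mean derivatives D_i = \partial\mu_i/\partial\beta and working covariance V_i for cluster i, the estimator based on a given working structure has asymptotic covariance

    \operatorname{Cov}(\hat{\beta}) \approx A^{-1} B A^{-1}, \qquad
    A = \sum_i D_i^{\top} V_i^{-1} D_i, \qquad
    B = \sum_i D_i^{\top} V_i^{-1} \operatorname{Cov}(Y_i)\, V_i^{-1} D_i,

and the asymptotic relative efficiency of two working structures is the ratio of the variances they yield for the parameter of interest. Both the estimated correlation parameters and the design enter through V_i, which is how items (ii) and (iii) above affect efficiency.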

Relevance: 20.00%

Abstract:

Some studies have suggested that adequate vitamin D might reduce inflammation in adults. However, little is known about this association in early life. We aimed to determine the relationship between cord blood 25-hydroxyvitamin D (25(OH)D) and C-reactive protein (CRP) in neonates. Cord blood levels of 25(OH)D and CRP were measured in 1491 neonates in Hefei, China. Potential confounders, including maternal sociodemographic characteristics, perinatal health status, lifestyle, and birth outcomes, were prospectively collected. The average values of cord blood 25(OH)D and CRP were 39.43 nmol/L (SD = 20.35) and 6.71 mg/L (SD = 3.07), respectively. After adjusting for potential confounders and stratifying by 25(OH)D level, CRP decreased by 1.42 mg/L (95% CI: 0.90, 1.95) per 10 nmol/L increase in 25(OH)D among neonates with 25(OH)D <25.0 nmol/L, and by 0.49 mg/L (95% CI: 0.17, 0.80) among neonates with 25(OH)D between 25.0 nmol/L and 49.9 nmol/L. However, no significant association between 25(OH)D and CRP was observed among neonates with 25(OH)D ≥50 nmol/L. Cord blood 25(OH)D and CRP levels showed a significant seasonal trend, with lower 25(OH)D and higher CRP during winter-spring than during summer-autumn. Stratified by season, a significant linear association of 25(OH)D with CRP was observed in neonates born in winter-spring (adjusted β = −0.11, 95% CI: −0.13, −0.10), but not in those born in summer-autumn. Among neonates born in winter-spring, those with 25(OH)D <25 nmol/L had a higher risk of CRP ≥10 mg/L (adjusted OR = 3.06, 95% CI: 2.00, 4.69) compared with neonates with 25(OH)D ≥25 nmol/L. Neonates with vitamin D deficiency had a higher risk of exposure to elevated inflammation at birth.

Relevance: 20.00%

Abstract:

Suboptimal restraint use, particularly the incorrect use of restraints, is a significant and widespread problem among child vehicle occupants, and it increases the risk of injury. Previous research has identified comfort as a potential factor influencing suboptimal restraint use. Both the real comfort experienced by the child and the parent's perception of the child's comfort are reported to influence the optimal use of restraints. Problems with real comfort may lead the child to misuse the restraint in an attempt to achieve better comfort, whilst parent-perceived discomfort has been reported as a driver of premature graduation and inappropriate restraint choice. However, this work has largely been qualitative. There has been no research that objectively studies either the association between real and parent-perceived comfort, or any association between comfort and suboptimal restraint use. One barrier to such studies is the absence of validated tools for quantifying real comfort in children.

We aimed to develop methods to examine both real and parent-perceived comfort and to examine their effects on suboptimal restraint use. We conducted online parent surveys (n=470) to explore what drives parental perceptions of their child's comfort in restraint systems (study 1) and used data from field observation studies (n=497) to examine parent-perceived comfort and its relationship with observed restraint use (study 2). We developed methods to measure comfort in children in a laboratory setting (n=14), using video analysis to estimate a Discomfort Avoidance Behaviour (DAB) score, pressure mapping, and adapted survey tools to differentiate between comfortable and induced-discomfort conditions (study 3). The DAB rate was then used to compare an integrated booster with an add-on booster (study 4).

Preliminary analysis of our recent online survey of Australian parents (study 1) indicates that 23% of parents report comfort as a consideration when deciding to change restraints. Logistic regression modelling of data collected during the field observation study (study 2) revealed that parent-perceived discomfort was not significantly associated with premature graduation. Contrary to expectation, children of parents who reported that their child was comfortable were almost twice as likely to have been incorrectly restrained (p<0.01, 95% CI 1.24-2.77). In the laboratory study (study 3) we found that our adapted survey tools did not provide a reliable measurement of real comfort among children. However, our DAB score was able to differentiate between comfortable and induced-discomfort conditions and correlated well with pressure mapping. Preliminary results from the laboratory comparison study (study 4) indicate a positive correlation between DAB rate and use errors. In experiments conducted to date, we have seen a significantly higher DAB rate in the integrated booster compared with the add-on booster (p<0.01). However, this needs to be confirmed in a naturalistic setting and in further experiments that take the length of time under observation into account.

Our results suggest that while some parents report concern about their child's comfort, parent-reported comfort levels were not associated with restraint choice. If comfort is important for optimal restraint use, it is likely to be the real comfort of the child rather than that reported by the parent. The method we have developed for studying real comfort can be used in naturalistic studies involving child occupants to further understand this relationship.
This work will be of interest to vehicle and child restraint manufacturers seeking to improve restraint design for young occupants, as well as to researchers and other stakeholders working to reduce the incidence of restraint misuse among children.

Relevance: 20.00%

Abstract:

Purpose: This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer. Methods and Materials: Using the NTCP algorithm of the CMS XiO Version 4.6 radiotherapy planning system (CMS Inc., St Louis, MO) and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resulting normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients. Results: Isocenter shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and the mean and maximum doses to the lung and the heart. Dose-volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCPs for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in all patients (n=12) before and after set-up error simulations. Conclusions: Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCPs and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
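For context, a hedged sketch of the LKB formulation commonly used for such calculations (generic parameter symbols, not the specific values used in this study): the organ DVH is first reduced to a generalised equivalent uniform dose

    \mathrm{gEUD} = \Big( \sum_i v_i\, D_i^{1/n} \Big)^{\!n},

where v_i is the fractional organ volume receiving dose D_i and n is the volume-effect parameter, and the complication probability is then

    \mathrm{NTCP} = \Phi(t), \qquad t = \frac{\mathrm{gEUD} - TD_{50}}{m \cdot TD_{50}},

with TD_50 the uniform whole-organ dose giving a 50% complication rate, m the slope parameter, and Phi the standard normal cumulative distribution function.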

Relevance: 20.00%

Abstract:

Images from cell biology experiments often indicate the presence of cell clustering, which can provide insight into the mechanisms driving the collective cell behaviour. Pair-correlation functions provide quantitative information about the presence, or absence, of clustering in a spatial distribution of cells, because the pair-correlation function describes the ratio of the abundance of pairs of cells separated by a particular distance relative to a randomly distributed reference population. Pair-correlation functions are often presented as a kernel density estimate in which the frequency of pairs of objects is grouped using a particular bandwidth (or bin width), Δ > 0. The choice of bandwidth has a dramatic impact: choosing Δ too large produces a pair-correlation function that contains insufficient information, whereas choosing Δ too small produces a pair-correlation signal dominated by fluctuations. Presently, there is little guidance available on how to make an objective choice of Δ. We present a new technique to choose Δ by analysing the power spectrum of the discrete Fourier transform of the pair-correlation function. Using synthetic simulation data, we confirm that our approach allows us to objectively choose Δ such that the appropriately binned pair-correlation function captures known features in uniform and clustered synthetic images. We also apply our technique to images from two different cell biology assays. The first assay corresponds to an approximately uniform distribution of cells, while the second involves a time series of images of a cell population that forms aggregates over time. The appropriately binned pair-correlation function allows us to make quantitative inferences about the average aggregate size, as well as quantifying how the average aggregate size changes with time.
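A minimal sketch of the ingredients described above, not the authors' implementation or their exact selection criterion: the pair-correlation function is estimated at bandwidth Δ by binning pair distances and normalising against a Monte Carlo reference of uniformly placed points, and the discrete Fourier transform of the fluctuation signal g(r) − 1 is then inspected. The domain size, the clustered test pattern, and the high-frequency power fraction printed at the end are illustrative assumptions.

    import numpy as np
    from scipy.spatial.distance import pdist

    def pair_correlation(pts, L, delta, r_max, rng, n_rep=50):
        """Bin pair distances with bandwidth delta and normalise by the mean
        pair counts of uniformly distributed reference populations."""
        edges = np.arange(0.0, r_max + delta, delta)
        counts, _ = np.histogram(pdist(pts), bins=edges)
        ref = np.zeros(len(edges) - 1)
        for _ in range(n_rep):                           # CSR reference by Monte Carlo
            u = rng.uniform(0.0, L, size=pts.shape)
            ref += np.histogram(pdist(u), bins=edges)[0]
        g = counts / np.maximum(ref / n_rep, 1e-12)      # pair-correlation estimate
        return 0.5 * (edges[:-1] + edges[1:]), g

    rng = np.random.default_rng(1)
    centres = rng.uniform(0, 1, size=(20, 2))            # clustered synthetic pattern
    pts = np.vstack([c + 0.02 * rng.standard_normal((15, 2)) for c in centres])
    for delta in (0.005, 0.02, 0.08):
        r, g = pair_correlation(pts, 1.0, delta, 0.5, rng)
        power = np.abs(np.fft.rfft(g - 1.0)) ** 2        # power spectrum of the DFT
        # Very small delta leaves much of the power at high frequencies (noise);
        # very large delta wipes out the low-frequency structure as well.
        print(delta, power[len(power) // 2:].sum() / power.sum())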

Relevance: 20.00%

Abstract:

A 59-year-old man was mistakenly prescribed Slow-Na instead of Slow-K due to incorrect selection from a drop-down list in the prescribing software. This error was identified by a pharmacist during a home medicine review (HMR) before the patient began taking the supplement. The reported error emphasizes the need for vigilance due to the emergence of novel look-alike, sound-alike (LASA) drug pairings. This case highlights the important role of pharmacists in medication safety.