893 results for "likelihood to publication"


Relevance: 30.00%

Abstract:

A significant problem in the collection of responses to potentially sensitive questions, such as those relating to illegal, immoral or embarrassing activities, is non-sampling error due to refusal to respond or false responses. Eichhorn & Hayre (1983) suggested the use of scrambled responses to reduce this form of bias. This paper considers a linear regression model in which the dependent variable is unobserved, but its sum or product with a scrambling random variable of known distribution is observed. The performance of two likelihood-based estimators is investigated: a Bayesian estimator obtained through a Markov chain Monte Carlo (MCMC) sampling scheme, and a classical maximum-likelihood estimator. These two estimators are compared with an estimator suggested by Singh, Joarder & King (1996). Monte Carlo results show that the Bayesian estimator outperforms the classical estimators in almost all cases, and its relative performance improves as the responses become more scrambled.
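
As a rough illustration of the scrambled-response setup, the sketch below simulates multiplicative scrambling and recovers the regression coefficients with a simple moment-based correction. The distributional choices and names are invented here, and the paper's Bayesian MCMC and maximum-likelihood estimators exploit the full distribution of the scrambling variable and are more efficient than this baseline.

```python
import numpy as np

rng = np.random.default_rng(0)

# True regression: y = X @ beta + eps (y is never observed directly).
n, beta = 1000, np.array([2.0, -1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Multiplicative scrambling: the respondent reports z = s * y, where the
# scrambling variable s has a known distribution with mean mu_s.
mu_s = 1.5
s = rng.gamma(shape=25.0, scale=mu_s / 25.0, size=n)  # E[s] = mu_s
z = s * y

# Since E[z | X] = mu_s * X @ beta, a simple moment-based estimator is
# OLS of z on X rescaled by 1 / mu_s.
beta_hat = np.linalg.lstsq(X, z, rcond=None)[0] / mu_s
print(beta_hat)  # close to [2.0, -1.0]
```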

Relevance: 30.00%

Abstract:

A mixture model for long-term survivors has been adopted in various fields, such as biostatistics and criminology, where some individuals may never experience the type of failure under study. It is directly applicable in situations where the only information available from follow-up on individuals who will never experience this type of failure is in the form of censored observations. In this paper, we consider a modification to the model so that it still applies when, during the follow-up period, it becomes known that an individual will never experience failure from the cause of interest. Unless a model allows for this additional information, a consistent survival analysis will not be obtained. A partial maximum likelihood (ML) approach is proposed that preserves the simplicity of the long-term survival mixture model and provides consistent estimators of the quantities of interest. Some simulation experiments are performed to assess the efficiency of the partial ML approach relative to the full ML approach for survival in the presence of competing risks.
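
For readers unfamiliar with the long-term survivor mixture model, here is a minimal sketch (of the standard mixture likelihood, not the paper's partial ML modification) that fits S(t) = pi + (1 - pi) * S_u(t) to simulated censored data, assuming exponential failure times for the uncured; all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulate: a fraction pi of subjects are immune (never fail); the rest
# fail at Exponential(rate) times; administrative censoring at t = 3.
pi_true, rate_true, n, t_cens = 0.3, 1.0, 2000, 3.0
immune = rng.random(n) < pi_true
t_fail = rng.exponential(1.0 / rate_true, size=n)
t_fail[immune] = np.inf
time = np.minimum(t_fail, t_cens)
event = (t_fail <= t_cens).astype(float)

def neg_loglik(params):
    # Mixture survival: S(t) = pi + (1 - pi) * exp(-rate * t).
    logit_pi, log_rate = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    rate = np.exp(log_rate)
    log_dens = np.log(1 - pi) + np.log(rate) - rate * time   # failures
    log_surv = np.log(pi + (1 - pi) * np.exp(-rate * time))  # censored
    return -np.sum(event * log_dens + (1 - event) * log_surv)

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
pi_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
print(pi_hat, np.exp(res.x[1]))  # roughly 0.3 and 1.0
```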

Relevance: 30.00%

Abstract:

In their letter, Gogarten et al. question the effectiveness of the epidural regimens across the trial centers. In our original publication (1), we clearly demonstrated that patients in the epidural group had a working epidural block intraoperatively (evidenced by significantly more hypotension) and postoperatively (evidenced by significantly improved pain scores for 3 days).

Relevance: 30.00%

Abstract:

Medication errors are a leading cause of unintended harm to patients in Australia and internationally. Research in this area has paid relatively little attention to how organisational factors interact with violations of procedures to produce errors, although violations have been found to increase the likelihood of such errors. This study investigated the role of organisational factors in contributing to violations by nurses when administering medications. Data were collected using a self-report questionnaire completed by 506 nurses working in rural or remote areas of Queensland, Australia. This instrument was used to develop a path model in which organisational variables predicted 21% of the variance in self-reported violations. Expectations of medical officers mediated the relationship between the working conditions of nursing staff and violation behaviour.
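
A minimal sketch of the kind of path model described, on synthetic data: working conditions predict violations through expectations of medical officers as a mediator. Variable names and effect sizes are invented for illustration and do not reproduce the study's instrument or model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic stand-ins: working conditions -> expectations of medical
# officers -> self-reported violations (names and coefficients invented).
n = 506
conditions = rng.normal(size=n)
expectations = 0.5 * conditions + rng.normal(size=n)
violations = 0.4 * expectations + 0.1 * conditions + rng.normal(size=n)

# Path a: predictor -> mediator.
fit_a = sm.OLS(expectations, sm.add_constant(conditions)).fit()
# Paths b and c': mediator and predictor -> outcome.
X = sm.add_constant(np.column_stack([expectations, conditions]))
fit_b = sm.OLS(violations, X).fit()

print("a path:", fit_a.params[1])
print("b path:", fit_b.params[1], "direct c':", fit_b.params[2])
print("indirect effect a*b:", fit_a.params[1] * fit_b.params[1])
print("variance explained in violations:", fit_b.rsquared)
```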

Relevance: 30.00%

Abstract:

Aortic valve calcium (AVC) can be quantified on the same computed tomographic scan as coronary artery calcium (CAC). Although CAC is an established predictor of cardiovascular events, limited evidence is available for an independent predictive value for AVC. We studied a cohort of 8,401 asymptomatic subjects (mean age 53 ± 10 years, 69% men), who were free of known coronary heart disease and were undergoing electron beam computed tomography for assessment of subclinical atherosclerosis. The patients were followed for a median of 5 years (range 1 to 7) for the occurrence of mortality from any cause. Multivariate Cox regression models were developed to predict all-cause mortality according to the presence of AVC. A total of 517 patients (6%) had AVC on electron beam computed tomography. During follow-up, 124 patients died (1.5%), for overall survival rates of 96.1% and 98.7% in those with and without AVC, respectively (hazard ratio 3.39, 95% confidence interval 2.09 to 5.49). After adjustment for age, gender, hypertension, dyslipidemia, diabetes mellitus, smoking, and a family history of premature coronary heart disease, AVC remained a significant predictor of mortality (hazard ratio 1.82, 95% confidence interval 1.11 to 2.98). Likelihood ratio chi-square statistics demonstrated that the addition of AVC contributed significantly to the prediction of mortality in a model adjusted for traditional risk factors (chi-square = 5.03, p = 0.03) as well as traditional risk factors plus the presence of CAC (chi-square = 3.58, p = 0.05). In conclusion, AVC was associated with increased all-cause mortality, independent of the traditional risk factors and the presence of CAC. © 2010 Published by Elsevier Inc. (Am J Cardiol 2010;106:1787-1791)
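
The nested-model likelihood ratio comparison can be sketched as follows, using lifelines on synthetic data; column names, effect sizes and follow-up times are placeholders, not the study's data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import chi2

rng = np.random.default_rng(3)

# Synthetic cohort: does adding AVC improve a model with age + CAC?
n = 2000
df = pd.DataFrame({
    "age": rng.normal(55, 10, n),
    "cac": rng.binomial(1, 0.3, n),
    "avc": rng.binomial(1, 0.06, n),
})
hazard = np.exp(0.03 * df["age"] + 0.5 * df["cac"] + 0.6 * df["avc"])
df["time"] = rng.exponential(50.0 / hazard)
df["event"] = (df["time"] < 7.0).astype(int)   # censor at 7 years
df["time"] = df["time"].clip(upper=7.0)

base = CoxPHFitter().fit(df[["age", "cac", "time", "event"]],
                         "time", "event")
full = CoxPHFitter().fit(df, "time", "event")

# Likelihood ratio chi-square for the added AVC term (1 df).
lr = 2 * (full.log_likelihood_ - base.log_likelihood_)
print("chi-square:", lr, "p:", chi2.sf(lr, df=1))
```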

Relevance: 30.00%

Abstract:

Background: The CAMCOG is a brief neuropsychological battery designed to assess global cognitive function and ascertain the impairments required for a diagnosis of dementia. To date, cut-off scores for mild cognitive impairment (MCI) have not been determined. Given the need for an earlier diagnosis of mild dementia, new cut-off values are also necessary, taking into account cultural and educational effects. Methods: One hundred and fifty-seven older adults (mean age: 69.6 ± 7.4 years) with 8 or more years of formal education (mean years of schooling: 14.2 ± 3.8) attending a memory clinic at the Institute of Psychiatry, University of Sao Paulo, were included. Subjects were divided into three groups according to their cognitive status, established through clinical and neuropsychological assessment: normal controls, n = 62; MCI, n = 65; and mild or moderate dementia, n = 30. ROC curve analyses were performed for dementia vs controls, MCI vs controls and MCI vs dementia. Results: The cut-off values were 92/93 for dementia vs controls (AUC = 0.99; sensitivity: 100%, specificity: 95%), 95/96 for MCI vs controls (AUC = 0.83; sensitivity: 64%, specificity: 88%), and 85/86 for MCI vs dementia (AUC = 0.91; sensitivity: 81%, specificity: 88%). The total CAMCOG score was more accurate than its subtests (Mini-Mental State Examination, Verbal Fluency Test and Clock Drawing Test) used separately. Conclusions: The CAMCOG discriminated controls and MCI from demented patients, but was less accurate in discriminating MCI from controls. The best cut-off value for differentiating controls from demented patients was higher than suggested in the original publication, probably because only cases of mild to moderate dementia were included. This is important given the need for diagnosis at earlier stages of Alzheimer's disease. Copyright © 2008 John Wiley & Sons, Ltd.
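
A hedged sketch of the ROC cut-off procedure on synthetic scores, using Youden's J to pick the threshold; the score distributions below are invented and do not reproduce the study's CAMCOG data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)

# Synthetic CAMCOG-like scores: controls score higher than MCI cases.
controls = rng.normal(97, 3, 62).clip(max=106)
mci = rng.normal(92, 4, 65).clip(max=106)
scores = np.concatenate([controls, mci])
is_mci = np.concatenate([np.zeros(62), np.ones(65)])

# Lower scores indicate impairment, so flip the sign for roc_curve.
fpr, tpr, thresholds = roc_curve(is_mci, -scores)
auc = roc_auc_score(is_mci, -scores)

# Youden's J picks the threshold maximizing sensitivity + specificity - 1.
j = tpr - fpr
best = np.argmax(j)
print("AUC:", auc, "cut-off:", -thresholds[best],
      "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
```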

Relevance: 30.00%

Abstract:

After outlining some comparative features of poverty in India, this article critically reviews recent literature on the dynamics of poverty. On economic efficiency grounds, it rejects the view that the chronically poor are more deserving of poverty assistance than the non-chronically poor. Mechanisms by which households and communities cope with poverty are discussed. The possibility is raised that, where poverty has been persistent, rational methods for coping with it are likely to be well established, and less suffering may occur than in households and communities thrown temporarily into poverty. However, situations can also be envisaged in which such rational behaviours deepen the poverty trap and create unfavourable externalities for poverty alleviation. Conflict can arise between programmes to alleviate poverty in poor communities and the sustainability of these communities and their local cultures; the problems this poses are discussed. Furthermore, the impact of market extension on poor landholders is considered. In contrast to the prevailing view that increased market extension and liberalisation favour poor farmers, it is argued that inescapable market transaction costs make it difficult for the poor to survive as landholders in a fluid and changing market system. The likelihood of poor landholders joining the landless poor rises, and if they migrate from the countryside to the city they face further adjustment hurdles. Consequently, poor landholders may be poorer after the extension of the market system, and only their offspring may reap benefits from market reforms.

Relevance: 30.00%

Abstract:

A mixture model incorporating long-term survivors has been adopted in the field of biostatistics where some individuals may never experience the failure event under study. The surviving fractions may be considered cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceivable that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to a data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright © 2001 John Wiley & Sons, Ltd.
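
The random-effects structure described can be sketched by simulation: clinic-level effects shift both the cured proportion (on the logit scale) and the failure risk of the uncured (on the log scale). This is only an illustration of the model's data-generating side, with invented parameter values, not the paper's GLMM estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(5)

# Clinic-level random effects act on both the cured proportion and the
# failure hazard of uncured patients.
n_clinics, per_clinic = 20, 50
u_cure = rng.normal(0, 0.5, n_clinics)   # random effect: cure fraction
u_risk = rng.normal(0, 0.3, n_clinics)   # random effect: failure hazard

for c in range(5):
    pi_c = 1 / (1 + np.exp(-(0.0 + u_cure[c])))   # clinic cure fraction
    rate_c = np.exp(np.log(0.8) + u_risk[c])      # clinic failure rate
    cured = rng.random(per_clinic) < pi_c
    t = np.where(cured, np.inf, rng.exponential(1 / rate_c, per_clinic))
    event = t <= 5.0                              # censor at t = 5
    print(f"clinic {c}: cure fraction {pi_c:.2f}, "
          f"observed failures {event.mean():.2f}")
```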

Relevance: 30.00%

Abstract:

Questionnaire surveys, while more economical, typically achieve poorer response rates than interview surveys. We used data from a national volunteer cohort of young adult twins, who were scheduled for assessment by questionnaire in 1989 and by interview in 1996-2000, to identify predictors of questionnaire non-response. Of a total of 8536 twins, 5058 completed the questionnaire survey (59% response rate), and 6255 completed a telephone interview survey conducted a decade later (73% response rate). Multinomial logit models were fitted to the interview data to identify socioeconomic, psychiatric and health behavior correlates of non-response in the earlier questionnaire survey. Male gender, education below University level, and being a dizygotic rather than monozygotic twin all predicted a reduced likelihood of participating in the questionnaire survey. Associations between questionnaire response status and psychiatric history and health behavior variables were modest, with a history of alcohol dependence and childhood conduct disorder predicting a decreased probability of returning a questionnaire, and a history of smoking and heavy drinking more weakly associated with non-response. Body-mass index showed no association with questionnaire non-response. Despite a poor response rate to the self-report questionnaire survey, we found only limited sampling biases for most variables. While not appropriate for studies where socioeconomic variables are critical, survey by questionnaire, with telephone administration to non-responders, appears to be a viable strategy for gene-mapping studies requiring that large numbers of relatives be screened.
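
A minimal sketch of fitting a multinomial logit of response status on candidate predictors, using statsmodels on synthetic data; the predictors, category coding and effect sizes are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)

# Synthetic twin records; the outcome has three response categories
# (0 = returned questionnaire, 1 = refused, 2 = not contactable).
n = 1000
df = pd.DataFrame({
    "male": rng.binomial(1, 0.5, n),
    "university": rng.binomial(1, 0.4, n),
    "monozygotic": rng.binomial(1, 0.5, n),
})
linpred = 0.5 * df["male"] - 0.4 * df["university"] - 0.3 * df["monozygotic"]
p_refuse = np.exp(linpred - 1.0)
p_lost = np.full(n, np.exp(-2.0))
denom = 1 + p_refuse + p_lost
probs = np.column_stack([1 / denom, p_refuse / denom, p_lost / denom])
outcome = np.array([rng.choice(3, p=p) for p in probs])

# Multinomial logit of response status on the candidate predictors.
X = sm.add_constant(df)
fit = sm.MNLogit(outcome, X).fit(disp=False)
print(fit.summary())
```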

Relevance: 30.00%

Abstract:

Numerous everyday tasks require the nervous system to program a prehensile movement towards a target object positioned in a cluttered environment. Adult humans are extremely proficient at avoiding contact with non-target objects (obstacles) while carrying out such movements. A number of recent studies have highlighted the importance of considering the control of reach-to-grasp (prehension) movements in the presence of such obstacles. The current study was designed to begin examining the relative impact on prehension as the position of obstacles is varied within the workspace. The experimental design ensured that the obstacles were positioned in locations where they did not interfere physically with the path taken by the hand when no obstacle was present. In all positions, the presence of an obstacle caused the hand to slow down and the maximum grip aperture to decrease. Nonetheless, the effect of the obstacle varied according to its position within the workspace. When an obstacle was located a small distance to the right of a target object, it had a large effect on maximum grip aperture but a relatively small effect on movement time. In contrast, an obstacle positioned in front and to the right of a target object had a large effect on movement speed but a relatively small effect on maximum grip aperture. The presence of two obstacles caused the system to decrease movement speed and maximum grip aperture further, with the positions of the two obstacles dictating the extent of the effect. These results show that the anticipated likelihood of a collision with potential obstacles affects the planning of movement duration and maximum grip aperture in prehension.

Relevance: 30.00%

Abstract:

When the data consist of certain attributes measured on the same set of items in different situations, they can be described as a three-mode three-way array. A mixture likelihood approach can be implemented to cluster the items (i.e., one of the modes) on the basis of both of the other modes simultaneously (i.e., the attributes measured in the different situations). In this paper, it is shown that this approach can be extended to handle three-mode three-way arrays in which some of the data values are missing at random in the sense of Little and Rubin (1987). The methodology is illustrated by clustering the genotypes in a three-way soybean data set where various attributes were measured on genotypes grown in several environments.
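
A rough sketch of the flattening idea on synthetic data: the n × p × q array is unfolded to n × (pq) and a normal mixture is fitted with sklearn. The naive mean imputation below merely stands in for the paper's proper handling of missing values inside the EM algorithm itself.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)

# Three-mode array: n genotypes x p attributes x q environments,
# generated from k clusters and stored in flattened n x (p*q) form.
n, p, q, k = 60, 4, 3, 3
centers = rng.normal(0, 3, size=(k, p * q))
labels = rng.integers(k, size=n)
data = centers[labels] + rng.normal(size=(n, p * q))

# Knock out ~10% of entries at random (missing at random).
mask = rng.random(data.shape) < 0.10
data[mask] = np.nan

# Crude stand-in for the within-EM treatment of missing values:
# fill each missing entry with its column mean, then fit the mixture.
col_means = np.nanmean(data, axis=0)
filled = np.where(np.isnan(data), col_means, data)

gm = GaussianMixture(n_components=k, random_state=0).fit(filled)
print("cluster sizes:", np.bincount(gm.predict(filled)))
```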

Relevance: 30.00%

Abstract:

Objective: To describe and analyse the study design and manuscript deficiencies in original research articles submitted to Emergency Medicine. Methods: This was a retrospective, analytical study. Articles were enrolled if the reports of the Section Editor and two reviewers were available. Data were extracted from these reports only. Outcome measures were the mean number and nature of the deficiencies and the mean reviewers' assessment score. Results: Fifty-seven articles were evaluated (28 accepted for publication, 19 rejected, 10 pending revision). The mean (± SD) number of deficiencies was 18.1 ± 6.9, 16.4 ± 6.5 and 18.4 ± 6.7 for all articles, articles accepted for publication and articles rejected, respectively (P = 0.31 between accepted and rejected articles). The mean assessment scores (0-10) were 5.5 ± 1.5, 5.9 ± 1.5 and 4.7 ± 1.4 for all articles, articles accepted and articles rejected, respectively. Accepted articles had a significantly higher assessment score than rejected articles (P = 0.006). For each group, there was a negative, though statistically non-significant, correlation between the number of deficiencies and the mean assessment score (P > 0.05). Significantly more rejected articles '… did not further our knowledge' (P = 0.0014) and '… did not describe background information adequately' (P = 0.049). Many rejected articles had '… findings that were not clinically or socially significant' (P = 0.07). Common deficiencies among all articles included ambiguity of the methods (77%) and results (68%), conclusions not warranted by the data (72%), poor referencing (56%), inadequate description of the study design (51%), unclear tables (49%), an overly long discussion (49%), failure to describe the study's limitations (51%), inadequate definition of terms (49%) and subject selection bias (40%). Conclusions: Researchers should undertake studies that are likely to further our knowledge and be clinically or socially significant. Deficiencies in manuscript preparation are more frequent than mistakes in study design and execution. Specific training or assistance in manuscript preparation is indicated.

Relevance: 30.00%

Abstract:

The 16S rRNA gene (16S rDNA) is currently the most widely used gene for estimating the evolutionary history of prokaryotes. To date, there are more than 30 000 16S rDNA sequences available from the core databases GenBank, EMBL and DDBJ. This great number may cause a dilemma when composing datasets for phylogenetic analysis, since the choice and number of reference organisms are known to affect the resulting tree topology. A group of sequences appearing monophyletic in one dataset may not be so in another. This can be especially problematic when establishing the relationships of distantly related sequences at the division (phylum) level. In this study, a multiple-outgroup approach to resolving division-level phylogenetic relationships is suggested using 16S rDNA data. The approach is illustrated by two case studies concerning the monophyly of two recently proposed bacterial divisions, OP9 and OP10.
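
The multiple-outgroup idea can be sketched with Biopython: re-root the same tree with each alternative outgroup and re-test whether the focal group remains monophyletic. The toy tree and taxon names below are invented; in practice the trees would be inferred from aligned 16S rDNA sequences.

```python
from io import StringIO
from Bio import Phylo

# Toy tree standing in for an inferred 16S rDNA phylogeny.
newick = "((op9_a,op9_b),(other_a,other_b),(outg1,(outg2,outg3)));"
tree = Phylo.read(StringIO(newick), "newick")

focal = {"op9_a", "op9_b"}            # candidate division
outgroups = ["outg1", "outg2", "outg3"]

# Re-root with each alternative outgroup; a robustly supported division
# should remain monophyletic regardless of the outgroup chosen.
for og in outgroups:
    tree.root_with_outgroup(og)
    tips = [t for t in tree.get_terminals() if t.name in focal]
    mono = tree.is_monophyletic(tips)  # MRCA clade if monophyletic, else False
    print(f"outgroup {og}: monophyletic = {bool(mono)}")
```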

Relevance: 30.00%

Abstract:

Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics 44(2), 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. A naive implementation of the procedure can be computationally inefficient. To reduce the computational cost, a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to the diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
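
One computational ingredient of the multivariate generalization is the probability mass of each Gaussian component over a rectangular bin, obtainable from the joint CDF by inclusion-exclusion. The sketch below shows this for a bivariate normal with illustrative parameters (loosely echoing the volume/hemoglobin application); it is not the paper's full EM implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal

# One Gaussian component; the parameter values are illustrative.
mean = np.array([100.0, 33.0])        # e.g. cell volume, Hb concentration
cov = np.array([[25.0, 5.0],
                [5.0, 4.0]])
mvn = multivariate_normal(mean, cov)

def bin_prob(lo, hi):
    """P(lo1 < X1 <= hi1, lo2 < X2 <= hi2) via inclusion-exclusion."""
    return (mvn.cdf([hi[0], hi[1]]) - mvn.cdf([lo[0], hi[1]])
            - mvn.cdf([hi[0], lo[1]]) + mvn.cdf([lo[0], lo[1]]))

# Probabilities over a small grid of bins; inside EM these would weight
# each bin's expected sufficient statistics for every mixture component.
edges_v = np.array([90.0, 100.0, 110.0])
edges_h = np.array([30.0, 33.0, 36.0])
for i in range(2):
    for j in range(2):
        p = bin_prob((edges_v[i], edges_h[j]),
                     (edges_v[i + 1], edges_h[j + 1]))
        print(f"bin ({i},{j}): {p:.3f}")
```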