914 results for Error Probability
Abstract:
We address the problem of selecting the best linear unbiased predictor (BLUP) of the latent value (e.g., fasting serum glucose level) of sample subjects with heteroskedastic measurement errors. Using a simple example, we compare the usual mixed model BLUP to a similar predictor based on a mixed model framed in a finite population (FPMM) setup with two sources of variability, the first corresponding to simple random sampling and the second to heteroskedastic measurement errors. Under this last approach, we show that when measurement errors are subject-specific, the BLUP shrinkage constants are based on a pooled measurement error variance, as opposed to the individual ones generally considered for the usual mixed model BLUP. In contrast, when the heteroskedastic measurement errors are measurement condition-specific, the FPMM BLUP involves different shrinkage constants. We also show that in this setup, when measurement errors are subject-specific, the usual mixed model predictor is biased but has a smaller mean squared error than the FPMM BLUP, which points to some difficulties in the interpretation of such predictors. (C) 2011 Elsevier B.V. All rights reserved.
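The shrinkage contrast described in this abstract can be sketched numerically. Everything below (function, variance values, data) is purely illustrative and not taken from the paper:

```python
# Hypothetical sketch of the two shrinkage rules contrasted in the abstract.
# The shrinkage constant k = var_latent / (var_latent + var_error) pulls each
# subject's observation toward the grand mean; all values are invented.

def blup(y_i, grand_mean, var_latent, var_error):
    """Shrink subject i's observed value toward the grand mean."""
    k = var_latent / (var_latent + var_error)
    return grand_mean + k * (y_i - grand_mean)

mu, tau2 = 100.0, 25.0               # grand mean, latent-value variance
y = [110.0, 95.0, 130.0]             # observed values
subject_vars = [1.0, 4.0, 9.0]       # subject-specific error variances

# Usual mixed-model BLUP: each subject's own error variance.
usual = [blup(yi, mu, tau2, v) for yi, v in zip(y, subject_vars)]

# FPMM-style BLUP with subject-specific errors: one pooled error variance.
pooled = sum(subject_vars) / len(subject_vars)
fpmm = [blup(yi, mu, tau2, pooled) for yi in y]
```

A subject whose individual error variance is below the pooled one is shrunk less by the usual BLUP than by the pooled-variance rule, which is the contrast the abstract examines.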
Abstract:
The main goal of this article is to consider influence assessment in models with error-prone observations and with measurement error variances that change across observations. The techniques make it possible to identify potentially influential elements and to quantify the effects of perturbations in these elements on some results of interest. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease.
Abstract:
The scope of this study was to estimate calibrated values for dietary data obtained by the Food Frequency Questionnaire for Adolescents (FFQA) and to illustrate the effect of this approach on food consumption data. The adolescents were assessed on two occasions, with an average interval of twelve months. In 2004, 393 adolescents participated, and 289 were reassessed in 2005. Dietary data obtained by the FFQA were calibrated using the regression coefficients estimated from the average of two 24-hour recalls (24HR) of the subsample. The calibrated values were similar to the 24HR reference measurement in the subsample. In 2004 and 2005, a significant difference was observed between the average consumption levels of the FFQA before and after calibration for all nutrients. With the use of calibrated data, the proportion of schoolchildren with fiber intake below the recommended level increased. Calibrated data can therefore be used to obtain adjusted associations due to the reclassification of subjects within the predetermined categories.
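The calibration step described above amounts to regression calibration: regress the reference measurement (mean of two 24HR recalls) on the FFQA value in the subsample, then apply the fitted line to the FFQA values. The numbers below are invented for illustration only:

```python
import numpy as np

# Invented intakes for a toy subsample; not data from the study.
ffqa = np.array([180.0, 220.0, 260.0, 300.0])    # FFQA-reported intake
recall = np.array([150.0, 170.0, 200.0, 215.0])  # mean of two 24HR recalls

# Fit recall = a + b * ffqa in the subsample (np.polyfit returns slope first),
# then apply the fitted line to obtain calibrated values.
b, a = np.polyfit(ffqa, recall, 1)
calibrated = a + b * ffqa
```

A slope below 1 compresses the FFQA values toward the reference scale, which is the kind of reclassification across intake categories the abstract reports.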
Abstract:
Since a genome is a discrete sequence, the elements of which belong to a set of four letters, the question as to whether or not there is an error-correcting code underlying DNA sequences is unavoidable. The most common approach to answering this question is to propose a methodology to verify the existence of such a code. However, none of the methodologies proposed so far, although quite clever, has achieved that goal. In a recent work, we showed that DNA sequences can be identified as codewords in a class of cyclic error-correcting codes known as Hamming codes. In this paper, we show that a complete intron-exon gene, and even a plasmid genome, can be identified as a Hamming code codeword as well. Although this does not constitute a definitive proof that there is an error-correcting code underlying DNA sequences, it is the first evidence in this direction.
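The codeword test this line of work relies on can be illustrated with the standard binary Hamming(7,4) code (not the DNA-specific cyclic code used in the paper): a word is a codeword exactly when its syndrome with respect to the parity-check matrix vanishes.

```python
import numpy as np

# Parity-check matrix of the binary Hamming(7,4) code; illustrative only.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def is_codeword(c):
    """A word c is a codeword iff its syndrome H @ c (mod 2) is zero."""
    return not np.any(H @ np.array(c) % 2)

print(is_codeword([1, 1, 1, 1, 1, 1, 1]))  # the all-ones word is a codeword
print(is_codeword([1, 0, 0, 0, 0, 0, 0]))  # a single set bit is not
```

Identifying a sequence as a codeword of a given code amounts to checking this membership condition, which is the kind of identification the abstract reports for DNA sequences.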
Abstract:
Workplace accidents involving machines are relevant both for their magnitude and for their impacts on worker health. Despite consolidated critical statements, explanations centered on operator error remain predominant among industry professionals, hampering preventive measures and the improvement of production-system reliability. Several initiatives were adopted by enforcement agencies in partnership with universities to stimulate the production and diffusion of analysis methodologies with a systemic approach. Starting from an accident that occurred with a worker operating a brake-clutch type mechanical press, the article explores cognitive aspects and the existence of traps in the operation of this machine. It deals with a large-sized press that, despite being equipped with a light curtain in the areas of access to the pressing zone, did not meet legal requirements. The safety devices gave rise to an illusion of safety, permitting activation of the machine while a worker was still inside the operational zone. Preventive interventions must stimulate the tailoring of systems to the characteristics of workers, minimizing the creation of traps and encouraging safety policies and practices that replace judgment of the behaviors involved in accidents with analysis of the reasons that lead workers to act as they do.
Abstract:
To estimate causal relationships, time series econometricians must be aware of spurious correlation, a problem first mentioned by Yule (1926). To deal with this problem, one can work either with differenced series or with multivariate models: VAR (VEC or VECM) models. These models usually include at least one cointegration relation. Although the Bayesian literature on VAR/VEC models is quite advanced, Bauwens et al. (1999) highlighted that "the topic of selecting the cointegrating rank has not yet given very useful and convincing results". The present article applies the Full Bayesian Significance Test (FBST), especially designed to deal with sharp hypotheses, to cointegration rank selection tests in VECM time series models. It illustrates the FBST implementation using both simulated data sets and data sets available in the literature. As an illustration, standard noninformative priors are used.
Abstract:
This paper analyzes concepts of independence and assumptions of convexity in the theory of sets of probability distributions. The starting point is Kyburg and Pittarelli's discussion of "convex Bayesianism" (in particular their proposals concerning E-admissibility, independence, and convexity). The paper offers an organized review of the literature on independence for sets of probability distributions; new results on graphoid properties and on the justification of "strong independence" (using exchangeability) are presented. Finally, the connection between Kyburg and Pittarelli's results and recent developments on the axiomatization of non-binary preferences, and its impact on "complete" independence, are described.
Abstract:
Objectives To evaluate the accuracy and probabilities of different fetal ultrasound parameters to predict neonatal outcome in isolated congenital diaphragmatic hernia (CDH). Methods Between January 2004 and December 2010, we evaluated prospectively 108 fetuses with isolated CDH (82 left-sided and 26 right-sided). The following parameters were evaluated: gestational age at diagnosis, side of the diaphragmatic defect, presence of polyhydramnios, presence of liver herniated into the fetal thorax (liver-up), lung-to-head ratio (LHR) and observed/expected LHR (o/e-LHR), observed/expected contralateral and total fetal lung volume (o/e-ContFLV and o/e-TotFLV) ratios, ultrasonographic fetal lung volume/fetal weight ratio (US-FLW), observed/expected contralateral and main pulmonary artery diameter (o/e-ContPA and o/e-MPA) ratios and the contralateral vascularization index (Cont-VI). The outcomes were neonatal death and severe postnatal pulmonary arterial hypertension (PAH). Results Neonatal mortality was 64.8% (70/108). Severe PAH was diagnosed in 68 (63.0%) cases, of which 63 died neonatally (92.6%) (P < 0.001). Gestational age at diagnosis, side of the defect and polyhydramnios were not associated with poor outcome (P > 0.05). LHR, o/e-LHR, liver-up, o/e-ContFLV, o/e-TotFLV, US-FLW, o/e-ContPA, o/e-MPA and Cont-VI were associated with both neonatal death and severe postnatal PAH (P < 0.001). Receiver-operating characteristics curves indicated that measuring total lung volumes (o/e-TotFLV and US-FLW) was more accurate than was considering only the contralateral lung sizes (LHR, o/e-LHR and o/e-ContFLV; P < 0.05), and Cont-VI was the most accurate ultrasound parameter to predict neonatal death and severe PAH (P < 0.001). Conclusions Evaluating total lung volumes is more accurate than is measuring only the contralateral lung size. Evaluating pulmonary vascularization (Cont-VI) is the most accurate predictor of neonatal outcome.
Estimating the probability of survival and severe PAH allows classification of cases according to prognosis. Copyright (C) 2011 ISUOG. Published by John Wiley & Sons, Ltd.
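The ROC-based comparison of predictors reported above rests on the area under the curve (AUC), which equals the probability that a randomly chosen case outranks a randomly chosen non-case (the Mann-Whitney form). A sketch with invented predictor values, not data from the study:

```python
# AUC as the rank probability: the chance a case scores above a non-case,
# counting ties as half. All values below are invented for illustration.
def auc(cases, controls):
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

deaths = [0.9, 0.8, 0.7]      # predictor values in neonates who died
survivors = [0.4, 0.5, 0.6]   # predictor values in survivors
print(auc(deaths, survivors))  # 1.0: perfect separation in this toy data
```

Comparing parameters such as Cont-VI against the lung-size ratios amounts to comparing AUCs of this kind.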
Abstract:
Abstract Background Smear-negative pulmonary tuberculosis (SNPTB) accounts for 30% of pulmonary tuberculosis (PTB) cases reported annually in developing nations. Polymerase chain reaction (PCR) may provide an alternative for the rapid detection of Mycobacterium tuberculosis (MTB); however, little data are available regarding the clinical utility of PCR in SNPTB in a setting with a high burden of TB/HIV co-infection. Methods To evaluate the performance of the PCR dot-blot in parallel with pretest probability (clinical suspicion) in patients suspected of having SNPTB, a prospective study of 213 individuals with clinical and radiological suspicion of SNPTB was carried out from May 2003 to May 2004 in a TB/HIV reference hospital. Respiratory specialists classified the pretest probability of active disease into high, intermediate and low categories. Expectorated sputum was examined by direct microscopy (Ziehl-Neelsen staining), culture (Löwenstein-Jensen) and PCR dot-blot. The gold standard was based on culture positivity combined with the clinical definition of PTB. Results In smear-negative and HIV-infected subjects, active PTB was diagnosed in 28.4% (43/151) and 42.2% (19/45), respectively. In the high, intermediate and low pretest probability categories, active PTB was diagnosed in 67.4% (31/46), 24% (6/25) and 7.5% (6/80), respectively. PCR had a sensitivity of 65% (95% CI: 50%–78%) and a specificity of 83% (95% CI: 75%–89%). There was no difference in the sensitivity of PCR in relation to HIV status. PCR sensitivity and specificity were, respectively, 69% and 85% among patients not previously treated for TB, and 43% and 80% among those treated in the past. High pretest probability, when used as a diagnostic test, had a sensitivity of 72% (95% CI: 57%–84%) and a specificity of 86% (95% CI: 78%–92%). Using the PCR dot-blot in parallel with high pretest probability as a diagnostic test, the sensitivity, specificity, and positive and negative predictive values were 90%, 71%, 75% and 88%, respectively.
Among patients not previously treated for TB and among HIV-infected subjects, this approach had sensitivity, specificity, and positive and negative predictive values of 91%, 79%, 81% and 90%, and of 90%, 65%, 72% and 88%, respectively. Conclusion The PCR dot-blot associated with a high clinical suspicion may provide an important contribution to the diagnosis of SNPTB, mainly in not previously treated patients attending a TB/HIV reference hospital.
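The "in parallel" combination reported in this abstract treats the combined test as positive when either component is positive. Under a conditional-independence assumption (a simplification; the study computed its figures from the actual data), the reported sensitivities and specificities combine as follows:

```python
# Parallel (either-positive) combination of two tests, assuming conditional
# independence between the component tests given disease status.
def parallel(sens1, spec1, sens2, spec2):
    sens = 1 - (1 - sens1) * (1 - sens2)  # combined test misses only if both miss
    spec = spec1 * spec2                  # combined test is negative only if both are
    return sens, spec

# PCR dot-blot (65% / 83%) combined with high pretest probability (72% / 86%).
sens, spec = parallel(0.65, 0.83, 0.72, 0.86)
print(round(sens, 2), round(spec, 2))  # close to the 90% / 71% reported
```

That the independence-based figures land near the reported 90% and 71% shows how the parallel design trades specificity for sensitivity.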
Abstract:
Abstract Background The generalized odds ratio (GOR) was recently suggested as a genetic model-free measure for association studies. However, its properties have not been extensively investigated. We used Monte Carlo simulations to investigate type-I error rates, power and bias in both effect size and between-study variance estimates of meta-analyses using the GOR as a summary effect, and compared these results to those obtained by the usual approaches of model specification. We further applied the GOR in a real meta-analysis of three genome-wide association studies in Alzheimer's disease. Findings For bi-allelic polymorphisms, the GOR performs virtually identically to a standard multiplicative model of analysis (e.g. per-allele odds ratio) for variants acting multiplicatively, but slightly increases the power to detect variants with a dominant mode of action, while reducing the probability of detecting recessive variants. Although there were differences between the GOR and the usual approaches in terms of bias and type-I error rates, both simulation- and real data-based results provided little indication that these differences will be substantial in practice for meta-analyses involving bi-allelic polymorphisms. However, the use of the GOR may be slightly more powerful for the synthesis of data from tri-allelic variants, particularly when susceptibility alleles are less common in the populations (≤10%). This gain in power may depend on knowledge of the direction of the effects. Conclusions For the synthesis of data from bi-allelic variants, the GOR may be regarded as a multiplicative-like model of analysis. The use of the GOR may be slightly more powerful in the tri-allelic case, particularly when susceptibility alleles are less common in the populations.
Abstract:
OBJECTIVE: To estimate the pretest probability of Cushing's syndrome (CS) diagnosis by a Bayesian approach using intuitive clinical judgment. MATERIALS AND METHODS: Physicians were requested, in seven endocrinology meetings, to answer three questions: "Based on your personal expertise, after obtaining clinical history and physical examination, without using laboratory tests, what is your probability of diagnosing Cushing's syndrome?"; "For how long have you been practicing endocrinology?"; and "Where do you work?". A Bayesian beta regression, fitted using the WinBUGS software, was employed. RESULTS: We obtained 294 questionnaires. The mean pretest probability of CS diagnosis was 51.6% (95%CI: 48.7-54.3). The probability was directly related to experience in endocrinology, but not to the place of work. CONCLUSION: The pretest probability of CS diagnosis was estimated using a Bayesian methodology. Although pretest likelihood can be context-dependent, experience based on years of practice may help the practitioner to diagnose CS. Arq Bras Endocrinol Metab. 2012;56(9):633-7
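Although the study only elicits the pretest probability, the way such a probability feeds into a Bayesian diagnosis can be sketched with likelihood ratios. The test accuracy figures below are hypothetical, not from the study:

```python
# Bayesian updating of a pretest probability via the positive likelihood
# ratio LR+ = sensitivity / (1 - specificity); test accuracy is hypothetical.
def posttest_probability(pretest, sensitivity, specificity):
    lr_positive = sensitivity / (1 - specificity)
    odds = pretest / (1 - pretest) * lr_positive  # posttest odds
    return odds / (1 + odds)

# The study's mean elicited pretest probability, updated by a hypothetical
# confirmatory test with 95% sensitivity and 90% specificity.
p = posttest_probability(0.516, 0.95, 0.90)
print(round(p, 3))
```

A high elicited pretest probability, as found among experienced endocrinologists, makes even a moderately accurate confirmatory test highly conclusive.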
Abstract:
OBJECTIVE: The objective of this study was to evaluate the frequencies of human platelet antigens in oncohematological patients with thrombocytopenia and to analyze the probability of their incompatibility with platelet transfusions. METHODS: Platelet antigen genotyping was performed by sequence-specific primer polymerase chain reaction (SSP-PCR) for the HPA-1a, HPA-1b, HPA-2a, HPA-2b, HPA-3a, HPA-3b, HPA-4a, HPA-4b, HPA-5a, HPA-5b, HPA-15a and HPA-15b alleles in 150 patients of the Hematology Service of the Hospital das Clínicas (FMUSP). RESULTS: The allele frequencies found were: HPA-1a: 0.837; HPA-1b: 0.163; HPA-2a: 0.830; HPA-2b: 0.170; HPA-3a: 0.700; HPA-3b: 0.300; HPA-4a: 1; HPA-4b: 0; HPA-5a: 0.887; HPA-5b: 0.113; HPA-15a: 0.457 and HPA-15b: 0.543. CONCLUSIONS: Data from the present study showed that the A allele is more common in the population than the B allele, except for HPA-15. This suggests that patients homozygous for the B allele are more predisposed to alloimmunization and refractoriness to platelet transfusions due to immune causes. Platelet genotyping could be of great value in the diagnosis of alloimmune thrombocytopenia and in providing compatible platelet concentrates for these patients.
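The incompatibility argument in the conclusion can be sketched under Hardy-Weinberg proportions (a simplification; the paper's own computation may differ). A recipient homozygous for the b allele, with frequency q, lacks antigen a and can be alloimmunized by any donor carrying a, so a rough mismatch probability for a random donor-recipient pairing is q²(1 − q²):

```python
# Rough random-pairing mismatch probability under Hardy-Weinberg proportions:
# P(recipient is bb) * P(donor carries a) = q**2 * (1 - q**2).
# The b-allele frequencies are those reported in the abstract.
hpa_b_freq = {"HPA-1b": 0.163, "HPA-2b": 0.170, "HPA-3b": 0.300,
              "HPA-5b": 0.113, "HPA-15b": 0.543}

def mismatch_probability(q):
    return q**2 * (1 - q**2)

for name, q in hpa_b_freq.items():
    print(name, round(mismatch_probability(q), 4))
```

Under this sketch the risk peaks when both alleles are common, which is why HPA-15, with nearly balanced allele frequencies, stands out in the abstract's conclusion.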
Abstract:
This paper discusses the theoretical and experimental results obtained for the excitonic binding energy (Eb) in a set of single and coupled double quantum wells (SQWs and CDQWs) of GaAs/AlGaAs with different Al concentrations (Al%) and inter-well barrier thicknesses. To obtain the theoretical Eb, the method proposed by Mathieu, Lefebvre and Christol (MLC) was used, which is based on the idea of fractional-dimension space, together with the approach proposed by Zhao et al., which extends the MLC method for application to CDQWs. Through magnetophotoluminescence (MPL) measurements performed at 4 K with magnetic fields ranging from 0 T to 12 T, the diamagnetic shift curves were plotted and fitted using two expressions: one appropriate for the range of low-intensity fields and another for the range of high-intensity fields, providing the experimental Eb values. The effects of increasing the Al% and the inter-well barrier thickness on Eb are discussed. The Eb reduction when going from the SQW to the CDQW with a 5 Å inter-well barrier is clearly observed experimentally for 35% Al concentration, and this trend can be noticed even for concentrations as low as 25% and 15%, although the Eb variations in these latter cases are within the error bars. As Zhao's approach is unable to describe this effect, the wave functions and the probability densities for electrons and holes were calculated, allowing us to explain this effect as being due to a decrease in the spatial superposition of the wave functions caused by the thin inter-well barrier.
Abstract:
This paper presents some possibilities for making use of an auditor's qualitative opinion. It is developed around a fictitious case that contains the basic ideas of the methodology presented.