904 results for Uncertainty bias


Relevance: 20.00%

Abstract:

OBJECTIVE: The aim of this study was to examine the differences between those who gave informed consent to a study on substance use and those who did not, and to analyze whether differences changed with varying nonconsent rates. METHOD: Cross-sectional questionnaire data on demographics, alcohol, smoking, and cannabis use were obtained for 6,099 French- and 5,720 German-speaking 20-year-old Swiss men. Enrollment took place over 11 months for the Cohort Study on Substance Use Risk Factors (C-SURF). Consenters and nonconsenters were asked to complete a short questionnaire. Data for nearly the entire population were available because 94% responded. Weekly differences in consent rates were analyzed. Regressions examined the associations of substance use with consent giving and consent rates and the interaction between the two. RESULTS: Nonconsenters had higher substance use patterns, although they were more often alcohol abstainers; differences were small and not always significant and did not decrease as consent rates increased. CONCLUSIONS: Substance use currently is a minor sensitive topic among young men, resulting in small differences between nonconsenters and consenters. As consent rates increase, additional individuals are similar to those observed at lower consent rates. Estimates of analytical studies looking at associations of substance use with other variables will not differ at reasonable consent rates of 50%-80%. Descriptive prevalence studies may be biased, but only at very low rates of consent.

Relevance: 20.00%

Abstract:

CONTEXT: Communication guidelines often advise physicians to disclose to their patients medical uncertainty regarding the diagnosis, origin of the problem, and treatment. However, studies of the effect of expressing such uncertainty on patient outcomes (e.g. satisfaction) have produced conflicting results, indicating either no effect or a negative effect. The differences in the results of past studies may be explained by the fact that potential gender effects on the link between physician-expressed uncertainty and patient outcomes have not been investigated systematically. OBJECTIVES: On the basis of previous research indicating that patients may judge female physicians by more severe criteria than they do male physicians, and that men are more prejudiced than women towards women, we predicted that physician-expressed uncertainty would have a more negative impact on patient satisfaction when the physician in question was female rather than male, and especially when the patient was a man. METHODS: We conducted two studies with complementary designs. Study 1 was a randomised controlled trial conducted in a simulated setting (120 analogue patients; analogue patients are healthy participants asked to put themselves in the shoes of real medical patients by imagining being the patients of the physicians shown in videos); Study 2 was a field study conducted in real medical interviews (36 physicians, 69 patients). In Study 1, participants were presented with vignettes that varied in terms of the physician's gender and physician-expressed uncertainty (high versus low). In Study 2, physicians were filmed during real medical consultations and the level of uncertainty they expressed was coded from the videos by an independent rater. In both studies, patient satisfaction was assessed using a questionnaire.
RESULTS: The results confirmed that expressed uncertainty was negatively related to patient satisfaction only when the physician was a woman (Studies 1 and 2) and when the patient was a man (Study 2). CONCLUSIONS: We believe that patients have the right to be fully informed of any medical uncertainties. If our results are confirmed in further research, the question of import will refer not to whether female physicians should communicate uncertainty, but to how they should communicate it. For instance, if it proves true that uncertainty negatively impacts on (male) patients' satisfaction, female physicians might want to counterbalance this impact by emphasizing other communication skills.

Relevance: 20.00%

Abstract:

MOTIVATION: Comparative analyses of gene expression data from different species have become an important component of the study of molecular evolution. Thus, methods are needed to estimate evolutionary distances between expression profiles, as well as a neutral reference to estimate selective pressure. Divergence between expression profiles of homologous genes is often calculated with Pearson's or Euclidean distance. Neutral divergence is usually inferred from randomized data. Despite being widely used, neither of these two steps has been well studied. Here, we analyze these methods formally and on real data, highlight their limitations and propose improvements. RESULTS: It has been demonstrated that Pearson's distance, in contrast to Euclidean distance, leads to underestimation of the expression similarity between homologous genes with a conserved uniform pattern of expression. Here, we first extend this study to genes with a conserved but specific pattern of expression. Surprisingly, we find that both Pearson's and Euclidean distances used as a measure of expression similarity between genes depend on the expression specificity of those genes. We also show that the Euclidean distance depends strongly on data normalization. Next, we show that the randomization procedure that is widely used to estimate the rate of neutral evolution is biased when broadly expressed genes are abundant in the data. To overcome this problem, we propose a novel randomization procedure that is unbiased with respect to expression profiles present in the datasets. Applying our method to mouse and human gene expression data suggests significant gene expression conservation between these species. CONTACT: marc.robinson-rechavi@unil.ch; sven.bergmann@unil.ch SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

Relevance: 20.00%

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
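A minimal numerical sketch of this workflow, using ordinary PCA as a stand-in for FPCA and linear regression as the learning step; the synthetic curves, dimensions, and the use of scikit-learn are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic "learning set": n realizations, each producing a response curve
# sampled at n_t time steps. The exact response is the proxy response plus
# a smooth, realization-dependent model error (all invented for illustration).
n, n_t = 200, 50
t = np.linspace(0, 1, n_t)
coeff = rng.normal(size=(n, 3))
basis = np.vstack([np.sin(np.pi * t), np.cos(np.pi * t), t])
proxy = coeff @ basis
exact = proxy + 0.3 * coeff[:, [0]] * t**2  # structured proxy error

# Dimensionality reduction of both sets of curves (PCA standing in for FPCA).
pca_p = PCA(n_components=3).fit(proxy)
pca_e = PCA(n_components=3).fit(exact)
scores_p = pca_p.transform(proxy)
scores_e = pca_e.transform(exact)

# Error model: map proxy scores to exact scores on the learning set.
error_model = LinearRegression().fit(scores_p, scores_e)

# Predict the exact response of a new realization from its proxy response alone.
new_coeff = rng.normal(size=(1, 3))
new_proxy = new_coeff @ basis
pred_scores = error_model.predict(pca_p.transform(new_proxy))
pred_exact = pca_e.inverse_transform(pred_scores)
```

Because the error model operates on the low-dimensional scores of the quantity of interest, it only requires the proxy response for new realizations, mirroring the purpose-oriented construction described above.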

Relevance: 20.00%

Abstract:

Next-generation sequencing (NGS) technologies have become the standard for data generation in studies of population genomics, such as the 1000 Genomes Project (1000G). However, these techniques are known to be problematic when applied to highly polymorphic genomic regions, such as the human leukocyte antigen (HLA) genes. Because accurate genotype calls and allele frequency estimations are crucial to population genomics analyses, it is important to assess the reliability of NGS data. Here, we evaluate the reliability of genotype calls and allele frequency estimates of the single-nucleotide polymorphisms (SNPs) reported by 1000G (phase I) at five HLA genes (HLA-A, -B, -C, -DRB1, and -DQB1). We take advantage of the availability of HLA Sanger sequencing of 930 of the 1092 1000G samples and use this as a gold standard to benchmark the 1000G data. We document that 18.6% of SNP genotype calls in HLA genes are incorrect and that allele frequencies are estimated with an error greater than ±0.1 at approximately 25% of the SNPs in HLA genes. We found a bias toward overestimation of reference allele frequency for the 1000G data, indicating mapping bias is an important cause of error in frequency estimation in this dataset. We provide a list of sites that have poor allele frequency estimates and discuss the outcomes of including those sites in different kinds of analyses. Because the HLA region is the most polymorphic in the human genome, our results provide insights into the challenges of using NGS data at other genomic regions of high diversity.
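The benchmarking logic can be illustrated with a toy comparison of NGS genotype calls against a gold standard at a single SNP; all genotype values below are invented for illustration, not the 1000G data:

```python
import numpy as np

# Hypothetical genotype calls (0/1/2 copies of the reference allele) for the
# same individuals at one SNP: a gold standard (e.g. Sanger-based typing) and
# an NGS call set to be benchmarked.
gold = np.array([2, 2, 1, 1, 1, 0, 2, 1, 0, 2])
ngs  = np.array([2, 2, 2, 1, 1, 0, 2, 2, 0, 2])  # two miscalls toward reference

# Fraction of incorrect genotype calls.
genotype_error = np.mean(gold != ngs)

# Reference allele frequency under each call set and the estimation error.
freq_gold = gold.sum() / (2 * len(gold))
freq_ngs = ngs.sum() / (2 * len(ngs))
freq_error = freq_ngs - freq_gold  # positive => reference allele overestimated
```

A positive `freq_error` across many sites is the kind of systematic shift the abstract attributes to mapping bias toward the reference allele.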

Relevance: 20.00%

Abstract:

Several observations support the hypothesis that differences in synaptic and regional cerebral plasticity between the sexes account for the high ratio of males to females in autism. First, males are more susceptible than females to perturbations in genes involved in synaptic plasticity. Second, sex-related differences in non-autistic brain structure and function are observed in highly variable regions, namely, the heteromodal associative cortices, and overlap with structural particularities and enhanced activity of perceptual associative regions in autistic individuals. Finally, functional cortical reallocations following brain lesions in non-autistic adults (for example, traumatic brain injury, multiple sclerosis) are sex-dependent. Interactions between genetic sex and hormones may therefore result in higher synaptic and consecutively regional plasticity in perceptual brain areas in males than in females. The onset of autism may largely involve mutations altering synaptic plasticity that create a plastic reaction affecting the most variable and sexually dimorphic brain regions. The sex ratio bias in autism may arise because males have a lower threshold than females for the development of this plastic reaction following a genetic or environmental event.

Relevance: 20.00%

Abstract:

Geophysical data may provide crucial information about hydrological properties, states, and processes that are difficult to obtain by other means. Large data sets can be acquired over widely different scales in a minimally invasive manner and at comparatively low costs, but their effective use in hydrology makes it necessary to understand the fidelity of geophysical models, the assumptions made in their construction, and the links between geophysical and hydrological properties. Geophysics has been applied for groundwater prospecting for almost a century, but it is only in the last 20 years that it has been used regularly together with classical hydrological data to build predictive hydrological models. A largely unexplored avenue for future work is to use geophysical data to falsify or rank competing conceptual hydrological models. A promising cornerstone for such a model selection strategy is the Bayes factor, but it can only be calculated reliably when considering the main sources of uncertainty throughout the hydrogeophysical parameter estimation process. Most classical geophysical imaging tools tend to favor models with smoothly varying property fields that are at odds with most conceptual hydrological models of interest. It is thus necessary to account for this bias or use alternative approaches in which proposed conceptual models are honored at all steps in the model building process.
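As a toy illustration of Bayes-factor-based model ranking (not the hydrogeophysical workflow itself), the evidence of each conceptual model can be approximated by brute-force prior sampling; the priors, the single datum, and the identity forward model below are all assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

# One observed datum d with Gaussian data error sigma; two competing models
# differ only in their prior on the predicted value (values invented).
d, sigma = 2.0, 0.5

def evidence(prior_samples):
    # p(d | M) estimated by averaging the Gaussian likelihood over prior draws.
    # The forward model is the identity here; a real case would run a
    # hydrogeophysical solver on each prior sample.
    lik = np.exp(-0.5 * ((d - prior_samples) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return lik.mean()

m1 = rng.normal(2.0, 1.0, size=100_000)  # model M1: prior centered on the datum
m2 = rng.normal(0.0, 1.0, size=100_000)  # model M2: prior centered away from it
bayes_factor = evidence(m1) / evidence(m2)  # > 1 favors M1
```

Brute-force prior sampling is workable in this one-dimensional toy but degrades quickly in high dimensions, which is one reason the reliable computation of Bayes factors discussed above remains challenging.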

Relevance: 20.00%

Abstract:

Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predicted the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement to other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for: (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints and (3) generation of multiple realizations that enables uncertainty assessment.
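The acceptance step of such post-processing might be sketched as follows, with invented traveltimes, misfit levels, and error thresholds standing in for the real forward simulations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical post-processing: keep only those conditional realizations whose
# forward-simulated traveltimes match the observed traveltimes within the
# expected data error (an RMS criterion normalized by the error level).
n_real, n_data = 500, 40
observed = rng.normal(100.0, 5.0, size=n_data)  # observed traveltimes (ns)
sigma = 1.0                                     # expected data error (ns)

# Simulated traveltimes per realization: some realizations fit within the
# data error, others do not (misfit scales invented for illustration).
noise_scale = np.where(rng.random(n_real) < 0.5, 0.8, 2.0)
simulated = observed + noise_scale[:, None] * rng.normal(size=(n_real, n_data))

# Normalized root-mean-square misfit per realization; accept if <= 1.
rms = np.sqrt(np.mean(((simulated - observed) / sigma) ** 2, axis=1))
accepted = simulated[rms <= 1.0]  # realizations consistent with the data
```

Only the realizations that predict the data within the error level survive, which is what drives the reduced transport-prediction uncertainty reported for the conditioned simulations.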

Relevance: 20.00%

Abstract:

Managers can craft effective integrated strategy by properly assessing regulatory uncertainty. Leveraging the existing political markets literature, we predict regulatory uncertainty from the novel interaction of demand- and supply-side rivalries across a range of political markets. We argue for two primary drivers of regulatory uncertainty: ideology-motivated interests opposed to the firm and a lack of competition for power among political actors supplying public policy. We align three previously disparate dimensions of nonmarket strategy - profile level, coalition breadth, and pivotal target - to levels of regulatory uncertainty. Through this framework, we demonstrate how and when firms employ different nonmarket strategies. To illustrate variation in nonmarket strategy across levels of regulatory uncertainty, we analyze several market entry decisions of foreign firms operating in the global telecommunications sector.

Relevance: 20.00%

Abstract:

ABSTRACT: Citriculture involves several environmental risks, such as weather changes and pests, as well as considerable financial risk, mainly due to the long period of return on the initial investment. This study was motivated by the need to assess the risks of a business activity such as citriculture. Our objective was to build a stochastic simulation model for the economic and financial analysis of an orange producer in the Midwest region of the state of Sao Paulo, under conditions of uncertainty. The parameters used were the Net Present Value (NPV), the Modified Internal Rate of Return (MIRR), and the Discounted Payback. To evaluate the risk conditions, we built a probabilistic model from pseudorandom numbers generated with the Monte Carlo method. The results showed that the activity analyzed carries a 42.8% risk of a negative NPV; however, the yield assessed by the MIRR was 7.7%, higher than the yield from the reapplication of the positive cash flows. The financial investment pays for itself after the fourteenth year of activity.
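A stripped-down version of such a Monte Carlo NPV analysis might look as follows; the cash-flow distribution, discount rate, horizon, and investment figures are invented, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical long-lived crop investment evaluated by Monte Carlo simulation.
n_sims, horizon = 10_000, 20          # simulations, project life in years
rate = 0.08                           # discount rate
initial_investment = 150_000.0

# Uncertain annual net cash flow: yield and price risk lumped into a single
# normal distribution (a deliberate simplification).
cash_flows = rng.normal(20_000.0, 12_000.0, size=(n_sims, horizon))

years = np.arange(1, horizon + 1)
discount = (1.0 + rate) ** -years
npv = cash_flows @ discount - initial_investment

risk_of_loss = np.mean(npv < 0)       # probability of a negative NPV

# Discounted payback: first year in which cumulative discounted flow covers
# the outlay; -1 marks runs that never pay back within the horizon.
cum = np.cumsum(cash_flows * discount, axis=1)
paid_back = cum >= initial_investment
payback_year = np.where(paid_back.any(axis=1), paid_back.argmax(axis=1) + 1, -1)
```

The `risk_of_loss` statistic plays the role of the study's 42.8% figure: the share of simulated futures in which the discounted cash flows fail to recover the initial investment.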

Relevance: 20.00%

Abstract:

In radionuclide metrology, Monte Carlo (MC) simulation is widely used to compute parameters associated with primary measurements or calibration factors. Although MC methods are used to estimate uncertainties, the uncertainty associated with radiation transport in MC calculations is usually difficult to estimate. Counting statistics is the most obvious component of MC uncertainty and has to be checked carefully, particularly when variance reduction is used. However, in most cases fluctuations associated with counting statistics can be reduced using sufficient computing power. Cross-section data have intrinsic uncertainties that induce correlations when apparently independent codes are compared. Their effect on the uncertainty of the estimated parameter is difficult to determine and varies widely from case to case. Finally, the most significant uncertainty component for radionuclide applications is usually that associated with the detector geometry. Recent 2D and 3D x-ray imaging tools may be utilized, but comparison with experimental data as well as adjustments of parameters are usually inevitable.
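The counting-statistics component mentioned above can be illustrated with a toy efficiency estimate, where the statistical uncertainty follows from the binomial standard error; the efficiency value and history count are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy Monte Carlo estimate of a detection efficiency: each simulated decay
# history is detected with some (here hypothetical) probability.
true_efficiency = 0.35
n_histories = 1_000_000

detected = rng.random(n_histories) < true_efficiency
eff_hat = detected.mean()

# Counting-statistics (type A) uncertainty of the estimate: binomial
# standard error, shrinking as 1/sqrt(n_histories).
sigma_eff = np.sqrt(eff_hat * (1.0 - eff_hat) / n_histories)
```

This is the component that, as the text notes, can usually be driven down with more computing power; the cross-section and geometry components do not shrink with the number of histories.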

Relevance: 20.00%

Abstract:

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
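A minimal sketch of this propagation approach, for a measurement model of the form A = N/(eps * t): input quantities are drawn from their assigned distributions and pushed through the model, and the Monte Carlo sample supplies the estimators of the measurand's expectation and standard deviation. The distributions and numbers are invented for illustration and are not the 103Pd example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                                    # number of Monte Carlo trials

# Hypothetical measurement model: activity A = N / (eps * t).
counts = rng.normal(1.2e6, 1.1e3, size=n)      # net counts, Gaussian
efficiency = rng.normal(0.92, 0.005, size=n)   # detection efficiency
live_time = rng.normal(600.0, 0.1, size=n)     # live time (s)

activity = counts / (efficiency * live_time)   # Bq, one value per trial

# Monte Carlo estimators of the measurand and its standard uncertainty.
a_mean = activity.mean()
a_std = activity.std(ddof=1)
```

Because the model is nonlinear in the efficiency, this sampling approach captures effects that a first-order (GUM law of propagation) treatment would approximate; here the dominant contribution comes from the efficiency's relative uncertainty.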