22 results for certainty and truth

at Université de Lausanne, Switzerland


Relevance: 90.00%

Abstract:

Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and this uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate non-normal trait distributions when necessary. The new methods adequately control the FPR and also have power equal to or better than all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires less than one computer-day for a typical genome-wide scan with 2.5 million single nucleotide polymorphisms and 5000 individuals.
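
The abstract does not spell out the exact maximum-likelihood procedure, so as an illustration of the problem setting only, here is a minimal sketch of the common baseline such methods improve upon: regressing the trait on the expected allele dosage computed from the genotype posterior probabilities. The function name and toy data are hypothetical, not the authors'.

```python
import numpy as np
from scipy import stats

def dosage_association_test(geno_probs, phenotype):
    """Baseline association test for imputed (uncertain) genotypes.

    geno_probs : (n, 3) posterior probabilities of carrying 0, 1 or 2
                 copies of the effect allele, one row per individual.
    phenotype  : (n,) quantitative trait values.
    """
    # Expected allele dosage E[g] = 0*P(g=0) + 1*P(g=1) + 2*P(g=2)
    dosage = geno_probs[:, 1] + 2.0 * geno_probs[:, 2]
    # Simple linear regression of the trait on dosage; the slope's
    # p-value serves as the association test statistic.
    result = stats.linregress(dosage, phenotype)
    return result.slope, result.pvalue

# Toy usage: 5000 individuals, one SNP, trait simulated under the null
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(3), size=5000)   # uncertain genotype calls
trait = rng.normal(size=5000)
beta, p = dosage_association_test(probs, trait)
```

This baseline collapses each genotype distribution to its mean, whereas the paper's exact maximum-likelihood methods model the full genotype uncertainty.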

Relevance: 90.00%

Abstract:

The family doctor facing complexity must make decisions in situations of low certainty and low agreement. Complexity is partly subjective but can also be measured. Changes in health systems aim to reduce health costs; they tend to give priority to simple situations and to neglect complexity. One role of an academic institute of family medicine is to present and promote the results of scientific research supporting the principles of family medicine, taking into account both the local context and health-system reforms. In Switzerland, the new challenge is the introduction of managed care.

Relevance: 30.00%

Abstract:

Starting from the observation that ghosts are strikingly recurrent and prominent figures in late-twentieth-century African diasporic literature, this dissertation proposes to account for this presence by exploring its various functions. It argues that, beyond the poetic function the ghost performs as metaphor, it also does cultural, theoretical and political work that is significant to the African diaspora in its dealings with issues of history, memory and identity. Toni Morrison's Beloved (1987) serves as a guide for introducing the many forms, qualities and significations of the ghost, which are then explored and analyzed in four chapters that look at Fred D'Aguiar's Feeding the Ghosts (1998), Gloria Naylor's Mama Day (1988), Paule Marshall's Praisesong for the Widow (1983) and a selection of novels, short stories and poetry by Michelle Cliff. Moving thematically through these texts, the discussion shifts from history through memory to identity as it examines how the ghost trope allows the writers to revisit sites of trauma; revise historical narratives that are constituted and perpetuated by exclusions and invisibilities; creatively and critically repossess a past marked by violence, dislocation and alienation and reclaim the diasporic culture it contributed to shaping; and destabilize and deconstruct the hegemonic, normative categories and boundaries that delimit race or sexuality, envisioning other, less limited and limiting definitions of identity. These diverse and interrelated concerns are identified and theorized as participating in a project of "re-vision," a critical project that constitutes an epistemological as much as a political gesture. The author-based structure allows for a detailed analysis of the texts and highlights the distinctive shapes the ghost takes and the particular concerns it serves to address in each writer's literary and political project. However, using the ghost as a guide into these texts, taken collectively, also throws into relief new connections between them and sheds light on the complex ways in which the interplay of history, memory and identity positions them as products of and contributions to an African diasporic (literary) culture. While it insists on the cultural specificity of African diasporic ghosts, tracing their origins to African cultures and spiritualities, the argument also follows gothic studies' common view that ghosts in literary and cultural productions, like other related figures of the living dead, respond to particular conditions and anxieties. Considering the historical and political context in which the texts under study were produced, the dissertation makes connections between the ghosts in them and African diasporic people's disillusionment with the broken promises of the civil rights movement in the United States and of postcolonial independence in the Caribbean. It reads the texts' theoretical concerns and narrative qualities alongside the contestation of traditional historiography by black and postcolonial studies, as well as the broader challenge to conventional notions such as truth, reality, meaning, power or identity by poststructuralism, postcolonialism and queer theory. Drawing on these various theoretical approaches and critical tools to elucidate the ghost's deconstructive power for African diasporic writers' concerns, this work ultimately offers a contribution to "spectrality studies," which is currently emerging as a new field of scholarship in cultural theory.

Relevance: 30.00%

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methods minimizing the ℓ2 norm or the ℓ1 total-variation norm of the prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil-combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of an extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. Although based on distinct features of the data, R2* and susceptibility maps calculated from the same datasets have a comparable ability to distinguish deep gray matter structures.
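
The MCF solution itself is not detailed in the abstract. As a hedged illustration of what a closed-form, single-orientation dipole inversion looks like, the sketch below implements thresholded k-space division (TKD), a standard closed-form QSM method; it is not the authors' MCF, and all function names are ours.

```python
import numpy as np

def dipole_kernel(shape, b0_dir=(0.0, 0.0, 1.0)):
    """Unit dipole kernel D(k) = 1/3 - (k.b0)^2 / |k|^2 in k-space."""
    kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(n) for n in shape],
                             indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = np.inf                    # avoid 0/0 at the DC term
    kb = kx*b0_dir[0] + ky*b0_dir[1] + kz*b0_dir[2]
    return 1.0/3.0 - kb**2 / k2

def tkd_qsm(field_ppm, threshold=0.2):
    """Closed-form susceptibility map by thresholded k-space division.

    field_ppm : 3-D background-corrected tissue field map (ppm).
    threshold : |D(k)| values below this are clipped before inversion,
                trading streaking artifacts against over-regularization.
    """
    D = dipole_kernel(field_ppm.shape)
    sign = np.where(D >= 0, 1.0, -1.0)
    D_safe = np.where(np.abs(D) < threshold, sign * threshold, D)
    chi_k = np.fft.fftn(field_ppm) / D_safe
    return np.real(np.fft.ifftn(chi_k))

# Toy usage; real input would be a masked 7T brain field map
field = np.random.default_rng(1).normal(scale=0.01, size=(64, 64, 64))
chi = tkd_qsm(field)
```

The threshold plays the same role as the regularization parameters varied in the study: too low and streaking from the dipole's zero cone dominates, too high and the map is over-regularized.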

Relevance: 30.00%

Abstract:

This dissertation focuses on the practice of regulatory governance, through the study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetical-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible, to a certain extent, to make scientific inferences and draw general conclusions, according to a Bayesian conception of knowledge, in order to update prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). In an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration have hardly led to the crumbling of the state; instead, they have promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996). Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated by political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up to regulate very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, raising normative and empirical concerns about their accountability and legitimacy. On the other hand, some hard questions about their role as political actors, as well as about their performance, remain unaddressed, even though, together with regulatory competencies, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions.

Relevance: 30.00%

Abstract:

Ultrasound segmentation is a challenging problem due to the inherent speckle and artifacts such as shadows, attenuation and signal dropout. Existing methods need to include strong priors, such as shape priors or analytical intensity models, to succeed in the segmentation. However, such priors tend to limit these methods to a specific target or imaging setting, and they are not always applicable to pathological cases. This work introduces a semi-supervised segmentation framework for ultrasound imaging that alleviates this limitation of fully automatic segmentation, that is, it is applicable to any kind of target and imaging setting. Our methodology uses a graph of image patches to represent the ultrasound image, with user-assisted initialization providing labels that act as soft priors. The segmentation problem is formulated as a continuous minimum-cut problem and solved with an efficient optimization algorithm. We validate our segmentation framework on clinical ultrasound imaging (prostate, fetus, and tumors of the liver and eye). We obtain high similarity agreement with the ground truth provided by medical expert delineations in all applications (94% Dice value on average), and the proposed algorithm compares favorably with the literature.
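
The continuous minimum-cut formulation and the patch graph are specific to the paper, but the idea of seeded graph-based segmentation can be illustrated with a discrete s-t minimum cut on a pixel graph. The sketch below is a simplified stand-in (pixel nodes instead of patch nodes, hard seeds instead of soft priors), using networkx; all names are ours.

```python
import numpy as np
import networkx as nx

def seeded_min_cut(image, fg_seeds, bg_seeds, sigma=0.1):
    """Binary segmentation of a 2-D image via a discrete s-t minimum cut.

    image    : 2-D float array, intensities in [0, 1].
    fg_seeds : (row, col) pixels the user labelled as foreground.
    bg_seeds : (row, col) pixels the user labelled as background.
    """
    h, w = image.shape
    G = nx.Graph()
    # Neighbourhood edges: a Gaussian on intensity difference, so the
    # cut prefers to pass between dissimilar neighbouring pixels.
    for r in range(h):
        for c in range(w):
            for rr, cc in ((r, c + 1), (r + 1, c)):   # 4-connectivity
                if rr < h and cc < w:
                    diff = float(image[r, c] - image[rr, cc])
                    G.add_edge((r, c), (rr, cc),
                               capacity=np.exp(-diff**2 / (2 * sigma**2)))
    # Terminal edges encode the user labels as (here, hard) constraints.
    for p in fg_seeds:
        G.add_edge("src", p, capacity=1e9)
    for p in bg_seeds:
        G.add_edge(p, "sink", capacity=1e9)
    _, (reachable, _) = nx.minimum_cut(G, "src", "sink")
    mask = np.zeros((h, w), dtype=bool)
    for node in reachable:
        if node != "src":
            mask[node] = True
    return mask

# Toy usage: bright square on a dark background, one seed each
img = np.zeros((32, 32)); img[10:22, 10:22] = 1.0
segmentation = seeded_min_cut(img, fg_seeds=[(16, 16)], bg_seeds=[(2, 2)])
```

Operating on patches instead of pixels, and relaxing the hard labels to a continuous cut, as the paper does, keeps the same structure while scaling to full ultrasound images.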

Relevance: 30.00%

Abstract:

PURPOSE: To evaluate the accuracy and reproducibility of flow velocity and volume measurements in a phantom and in human coronary arteries using breath-hold velocity-encoded (VE) MRI with spiral k-space sampling at 3 Tesla. MATERIALS AND METHODS: Flow velocity assessment was performed using VE MRI with spiral k-space sampling. The accuracy of VE MRI was tested in vitro at five constant flow rates. Reproducibility was investigated in 19 healthy subjects (mean age 25.4 ± 1.2 years, 11 men) by repeated acquisition in the right coronary artery (RCA). RESULTS: MRI-measured flow rates correlated strongly with volumetric collection (Pearson correlation r = 0.99; P < 0.01). Due to limited sample resolution, VE MRI overestimated the flow rate by 47% on average when non-constricted region-of-interest segmentation was used. Using constricted region-of-interest segmentation with lumen size equal to the ground-truth luminal size, less than 13% error in flow rate was found. In vivo RCA flow velocity assessment was successful in 82% of the studies. High interscan, intra-observer and inter-observer agreement was found for almost all indices describing coronary flow velocity. Reproducibility for repeated acquisitions varied by less than 16% for peak velocity values and by less than 24% for flow volumes. CONCLUSION: 3 Tesla breath-hold VE MRI with spiral k-space sampling enables accurate and reproducible assessment of RCA flow velocity.
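
For readers unfamiliar with how a velocity map becomes a flow volume, here is a minimal worked conversion (hypothetical names; not the study's code). It also shows why the ROI choice matters: every extra pixel in the mask adds its velocity to the sum, which is how a non-constricted ROI can overestimate flow.

```python
import numpy as np

def flow_rate_ml_per_min(velocity_map, lumen_mask, pixel_area_mm2):
    """Instantaneous flow rate from one velocity-encoded frame.

    velocity_map   : 2-D through-plane velocities in cm/s.
    lumen_mask     : boolean array selecting the vessel lumen (the ROI).
    pixel_area_mm2 : in-plane area of one pixel in mm^2.

    Q = sum over lumen pixels of v_i * A_pixel.
    Unit check: (cm/s) * mm^2 = 10 mm^3/s = 0.6 mL/min.
    """
    q = float(np.sum(velocity_map[lumen_mask])) * pixel_area_mm2
    return q * 0.6

def flow_volume_ml(frames, masks, pixel_area_mm2, frame_dt_s):
    """Flow volume per cardiac cycle: integrate Q over all cine frames."""
    rates_ml_s = [flow_rate_ml_per_min(v, m, pixel_area_mm2) / 60.0
                  for v, m in zip(frames, masks)]
    return float(np.sum(rates_ml_s) * frame_dt_s)
```

Shrinking `lumen_mask` to the true luminal size corresponds to the "constricted region-of-interest segmentation" that reduced the error from 47% to under 13% in the phantom experiments.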

Relevance: 30.00%

Abstract:

The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision-making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analysts' accuracy and variation in feature selection and comparison.

The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.

Relevance: 30.00%

Abstract:

Bioterrorism literally means using microorganisms or infected samples to cause terror and panic in populations. Bioterrorism started as early as 14 centuries before Christ, when the Hittites sent infected rams to their enemies. However, apart from some rare well-documented events, it is often very difficult for historians and microbiologists to differentiate natural epidemics from alleged biological attacks, because: (i) little information is available for times before the advent of modern microbiology; (ii) the truth may be manipulated for political reasons, especially for a hot topic such as a biological attack; and (iii) the passage of time may also have distorted the reality of the past. Nevertheless, we have tried to provide clinical microbiologists with an overview of some likely episodes of biological warfare that occurred before the 18th century and that included the intentional spread of epidemic diseases such as tularaemia, plague, malaria, smallpox, yellow fever, and leprosy. We also summarize the main events that occurred during the modern microbiology era, from World War I to the recent 'anthrax letters' that followed the World Trade Center attack of September 2001. Again, the political polemic surrounding the use of infectious agents as a weapon may distort the truth. This is nicely exemplified by the Sverdlovsk accident, which was initially attributed by the authorities to a natural foodborne outbreak, and was officially recognized as having a military cause only 13 years later.

Relevance: 30.00%

Abstract:

BACKGROUND: Truth-telling is a complex task requiring multiple skills in communication, understanding, and empathy. Its application in the context of breaking bad news (BBN) is distressing and problematic if conducted with insufficient skills. PURPOSE: We investigated the long-term influence of a simulated-patient-based teaching intervention, which integrates the learning of communication skills within an ethical reflection, on students' ethical attitudes towards truth-telling and their perceived competence and comfort in BBN. METHODS: We followed two cohorts of medical students from their preclinical third year to their clinical rotations (fifth year). We analysed their ethical attitudes and their levels of comfort and competence in BBN before and after the intervention, and during clinical rotations. RESULTS: Students' ethical attitudes towards truth-telling remained stable. Students feeling uncomfortable or incompetent improved their level of perceived comfort or competence after the intervention, but those feeling comfortable or competent became more aware of the difficulty of the situation and consequently reported lower levels of comfort and competence. CONCLUSIONS: Confronting students with a realistic situation and integrating the practice of communication skills within an ethical reflection may be effective in maintaining ethical attitudes towards truth-telling, developing new skills, and increasing awareness of the difficulty and challenges of a BBN situation.

Relevance: 30.00%

Abstract:

We propose an in-depth study of tissue modelling and classification techniques on T1-weighted MR images. Three approaches have been taken into account to perform this validation study. Two of them are based on the finite Gaussian mixture (FGM) model: the first uses only pure Gaussian distributions (FGM-EM), and the second uses a different model for partial volume (PV) effects (FGM-GA). The third is based on a hidden Markov random field (HMRF) model. All methods have been tested on a digital brain phantom image considered as the ground truth. Noise and intensity non-uniformities have been added to simulate real image conditions. The effect of an anisotropic filter is also considered. Results demonstrate that methods relying on both intensity and spatial information are in general more robust to noise and inhomogeneities. However, in some cases there are no significant differences between the presented methods.
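
As a concrete illustration of the simplest of the three approaches, the pure-Gaussian FGM fitted by EM, the sketch below classifies voxel intensities with scikit-learn's GaussianMixture. It covers only the FGM-EM variant; the partial-volume and HMRF models are not reproduced, and the function name and toy data are ours.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fgm_em_labels(intensities, n_tissues=3, seed=0):
    """Label voxels with a finite Gaussian mixture fitted by EM.

    intensities : 1-D array of T1-weighted voxel intensities
                  (e.g. brain voxels after masking out background).
    n_tissues   : number of pure tissue classes (CSF, GM, WM).
    """
    x = np.asarray(intensities, dtype=float).reshape(-1, 1)
    gmm = GaussianMixture(n_components=n_tissues, random_state=seed)
    return gmm.fit(x).predict(x)

# Toy usage: three synthetic tissue classes blurred by noise
rng = np.random.default_rng(0)
voxels = np.concatenate(
    [rng.normal(mu, 0.05, 2000) for mu in (0.2, 0.5, 0.8)])
labels = fgm_em_labels(voxels)
```

Because it classifies each voxel from its intensity alone, this variant is exactly the kind of method the study finds less robust to noise and inhomogeneities than spatially informed models such as the HMRF.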

Relevance: 30.00%

Abstract:

BACKGROUND: Dermatophytes are the main cause of onychomycoses, but various nondermatophyte filamentous fungi are often isolated from abnormal nails. The correct identification of the aetiological agent of nail infections is necessary in order to recommend appropriate treatment. OBJECTIVE: To evaluate a rapid polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) assay based on 28S rDNA for fungal identification in nails, on a large number of samples, in comparison with cultures. METHODS: Infectious fungi were analysed using PCR-RFLP in 410 nail samples in which fungal elements were observed in situ by direct mycological examination (positive samples). The results were compared with those previously obtained by culture of fungi on Sabouraud agar from the same nail samples. RESULTS: PCR-RFLP identification of fungi in nails allowed validation of the results obtained in culture when Trichophyton spp. grew from infected samples. In addition, nondermatophyte filamentous fungi could be identified with certainty as the infectious agents in onychomycosis, and discriminated from dermatophytes as well as from transient contaminants. The specificity of the culture results relative to PCR-RFLP appeared to be 81%, 71%, 52% and 63% when Fusarium spp., Scopulariopsis brevicaulis, Aspergillus spp. and Candida spp., respectively, grew on Sabouraud agar. It was also possible to identify the infectious agent when direct mycological examination of the nails showed fungal elements but fungal culture gave negative results. CONCLUSIONS: Improved sensitivity for the detection of fungi in nails was obtained using the PCR-RFLP assay. Rapid and reliable molecular identification of the infectious fungus can be used routinely and presents several important advantages over culture in expediting the choice of appropriate antifungal therapy.