869 results for Evidence evaluation
Abstract:
Introduction: Colonic lesions are predominant in patients with schistosomiasis. However, carbohydrate alterations in colonic schistosomiasis remain unclear. Lectin ligands allow us to identify changes in the saccharide patterns of cells. Methods: Biopsies of the descending colon and rectosigmoid of patients were subjected to WGA and Con A lectin histochemistry. Results: In schistosomiasis specimens, WGA stained stroma and gland cells of descending colon and rectosigmoid tissues in a strong granular cytoplasmic pattern, differing from normal controls, whereas Con A failed to recognize any of the samples analyzed. Conclusions: WGA ligands are expressed differently in patients with hepatosplenic schistosomiasis, even in the absence of the egg-granuloma system.
Abstract:
Forensic scientists working in 12 state or private laboratories participated in collaborative tests to improve the reliability of the presentation of DNA data at trial. These tests were motivated by growing criticism of the power of DNA evidence. The experts' conclusions in the tests are presented and discussed in the context of the Bayesian approach to interpretation. The use of a Bayesian approach and subjective probabilities in trace evaluation permits, in an easy and intuitive manner, the integration into the decision procedure of any revision of the measure of uncertainty in the light of new information. Such integration is especially useful with forensic evidence. Furthermore, we believe that this probabilistic model is a useful tool (a) to assist scientists in assessing the value of scientific evidence, (b) to help jurists interpret judicial facts, and (c) to clarify the respective roles of scientists and members of the court. Respondents to the survey were reluctant to apply this methodology in the assessment of DNA evidence.
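The Bayesian updating step underlying this approach can be made concrete with a short sketch. The Python snippet below is a minimal illustration and is not taken from the study; the prior odds and likelihood ratio are hypothetical values chosen only to show how the odds form of Bayes' theorem revises uncertainty in the light of new evidence.

```python
# Bayesian updating in odds form: posterior odds = prior odds x likelihood ratio.
# All numbers below are hypothetical, for illustration only.

def update_odds(prior_odds, likelihood_ratio):
    """Revise the odds on a proposition in the light of new evidence."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    return odds / (1.0 + odds)

prior_odds = 1 / 1000      # assumed prior odds that the suspect is the source
lr_dna = 100_000           # assumed likelihood ratio for the DNA comparison

posterior_odds = update_odds(prior_odds, lr_dna)
print(f"posterior odds: {posterior_odds:.0f}")                              # 100
print(f"posterior probability: {odds_to_probability(posterior_odds):.4f}")  # 0.9901
```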
Abstract:
This study was commissioned by the European Committee on Crime Problems at the Council of Europe to describe and discuss the standards used to assess the admissibility and appraisal of scientific evidence in various member countries. After documenting cases in which faulty forensic evidence seems to have played a critical role, the authors describe the legal foundations of the issues of admissibility and assessment of the probative value in the field of scientific evidence, contrasting criminal justice systems of accusatorial and inquisitorial tradition and the various risks that they pose in terms of equality of arms. Special attention is given to communication issues between lawyers and scientific experts. Finally, the authors investigate possible ways of improving the system. Among these mechanisms, emphasis is put on the adoption of a common terminology for expressing the weight of evidence. It is also proposed to adopt a harmonized interpretation framework among forensic experts rooted in good practices of logical inference.
Abstract:
We report results from a randomized policy experiment designed to test whether increased audit risk deters rent extraction in local public procurement and service delivery in Brazil. Our estimates suggest that temporarily increasing annual audit risk by about 20 percentage points reduced the proportion of irregular local procurement processes by about 17 percentage points. This reduction was driven entirely by irregularities involving mismanagement or corruption. In contrast, we find no evidence that increased audit risk affected the quality of publicly provided preventive and primary health care services (measured based on user satisfaction surveys) or compliance with national regulations of the conditional cash transfer program "Bolsa Família".
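As a rough sketch of how a treatment effect of this kind can be estimated, the snippet below computes a difference in the proportion of irregular procurement processes between treated and control municipalities, with a normal-approximation confidence interval. The counts are hypothetical and are not the study's data; they are chosen only to mirror the headline 17 percentage point figure.

```python
import math

# Hypothetical counts of irregular procurement processes (not the study's data).
irregular_t, n_t = 330, 1000   # municipalities facing increased audit risk
irregular_c, n_c = 500, 1000   # control municipalities

p_t, p_c = irregular_t / n_t, irregular_c / n_c
effect = p_t - p_c  # difference in proportions; -0.17 here, i.e. -17 pp

# Normal-approximation standard error and 95% confidence interval.
se = math.sqrt(p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c)
low, high = effect - 1.96 * se, effect + 1.96 * se
print(f"estimated effect: {effect:+.3f} (95% CI {low:+.3f} to {high:+.3f})")
```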
Abstract:
Fingerprint practitioners rely on level 3 features to make decisions in relation to the source of an unknown friction ridge skin impression. This research proposes to assess the strength of evidence associated with pores when shown in (dis)agreement between a mark and a reference print. Based upon an algorithm designed to automatically detect pores, a metric is defined in order to compare different impressions. From this metric, the weight of the findings is quantified using a likelihood ratio. The results obtained on four configurations and 54 donors show the significant contribution of the pore features and translate into statistical terms what latent fingerprint examiners have developed holistically through experience. The system provides LRs that are indicative of the true state under both the prosecution and the defense propositions. Not only does such a system bring transparency regarding the weight to assign to such features, but it also forces a discussion of the risks that such a model may mislead.
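A common way to turn a comparison score into a likelihood ratio, in the spirit of the approach summarized above, is to model the score distributions under the prosecution and defense propositions and take the ratio of their densities at the observed score. The sketch below is a simplified illustration with assumed Gaussian score models and hypothetical parameters; it is not the authors' actual algorithm.

```python
from statistics import NormalDist

# Assumed score distributions for a pore-based comparison metric:
# same-source (prosecution, Hp) and different-source (defense, Hd).
# Parameters are hypothetical, e.g. as if fitted on scores of known pairs.
same_source = NormalDist(mu=0.80, sigma=0.08)
diff_source = NormalDist(mu=0.40, sigma=0.12)

def likelihood_ratio(score):
    """LR = density of the score under Hp / density under Hd."""
    return same_source.pdf(score) / diff_source.pdf(score)

observed = 0.72  # hypothetical score between a mark and a reference print
lr = likelihood_ratio(observed)
print(f"LR at score {observed}: {lr:.1f}")  # > 1 supports Hp, < 1 supports Hd
```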
Abstract:
OBJECTIVE: To extract and to validate a brief version of the DISCERN which could identify mental health-related websites with good content quality. METHOD: The present study is based on the analysis of data issued from six previous studies which used DISCERN and a standardized tool for the evaluation of content quality (evidence-based health information) of 388 mental health-related websites. After extracting the Brief DISCERN, several psychometric properties (content validity through a factor analysis, internal consistency by Cronbach's alpha index, predictive validity through diagnostic tests, concurrent validity by the strength of association between the Brief DISCERN and the original DISCERN scores) were investigated to ascertain its general applicability. RESULTS: A Brief DISCERN composed of two factors and six items was extracted from the original 16-item version of the DISCERN. Cronbach's alpha coefficients were more than acceptable for the complete questionnaire (alpha=0.74) and for the two distinct domains: treatments information (alpha=0.87) and reliability (alpha=0.83). Sensitivity and specificity of the Brief DISCERN cut-off score ≥16 in the detection of good content quality websites were 0.357 and 0.945, respectively. Its positive and negative predictive values were 0.98 and 0.83, respectively. A statistically significant linear correlation was found between the total scores of the Brief DISCERN and those of the original DISCERN (r=0.84, p<0.0005). CONCLUSION: The Brief DISCERN seems to be a reliable and valid instrument able to discriminate between websites with good and poor content quality. PRACTICE IMPLICATIONS: The Brief DISCERN is a simple tool which could facilitate the identification of good information on the web by patients and general consumers.
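The diagnostic quantities reported above all derive from a 2x2 confusion matrix. The sketch below shows the standard computation with hypothetical counts; the numbers are illustrative only and are not the study's 388 websites.

```python
# Hypothetical 2x2 confusion matrix for a cut-off score (counts are illustrative).
tp, fp = 40, 5    # sites classified as good: truly good / actually poor
fn, tn = 10, 45   # sites classified as poor: truly good / actually poor

sensitivity = tp / (tp + fn)   # share of good-quality sites that are detected
specificity = tn / (tn + fp)   # share of poor-quality sites correctly rejected
ppv = tp / (tp + fp)           # probability a site flagged as good really is good
npv = tn / (tn + fn)           # probability a site flagged as poor really is poor

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f}")
```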
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
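To give a sense of what an automated, objective comparison of HPTLC ink profiles can look like, the sketch below compares two hypothetical densitometric traces with a Pearson correlation and reports the score. This is an editorial illustration under assumed data; the authors' actual comparison algorithms and probabilistic model are those described in Parts I and II and are not reproduced here.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equally sampled HPTLC profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical densitometric traces of a questioned and a reference ink.
questioned = [0.1, 0.4, 0.9, 0.5, 0.2, 0.1]
reference  = [0.1, 0.5, 0.8, 0.5, 0.3, 0.1]

score = pearson(questioned, reference)
print(f"similarity score: {score:.3f}")  # close to 1 suggests a common source
```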
Abstract:
This thesis studies formal methods to assist the forensic expert in managing the factors that influence the evaluation of scientific evidence, while respecting established and acceptable inference procedures. According to a view advocated by a majority of the forensic and legal literature, adopted here without reservation as a starting point, the conceptualization of an evaluative procedure is said to be 'coherent' when it rests on a systematic implementation of probability theory. Often, however, the implementation of probabilistic reasoning does not follow automatically and can run into problems of complexity, due, for example, to limited knowledge of the domain in question or to the large number of factors that may come into play. To manage such complications, the present work proposes to investigate a formalization of probability theory by means of a graphical environment known as Bayesian networks. The main hypothesis this research sets out to examine is that Bayesian networks, together with certain accessory concepts (such as qualitative and sensitivity analyses), constitute a key resource available to the forensic expert for approaching inference problems coherently, both conceptually and in practice. From this working hypothesis, individual problems were extracted, articulated and addressed in a series of distinct but interconnected studies, whose results, published in peer-reviewed journals, are presented as appendices. From a general point of view, this work contributes three categories of results. A first group of results demonstrates, on the basis of numerous examples drawn from diverse forensic domains, the compatibility and complementarity between Bayesian network models and existing probabilistic evaluation procedures. Building on these indications, the other two categories of results show, respectively, that Bayesian networks also make it possible to approach domains previously largely unexplored from a probabilistic point of view, and that the availability of so-called 'hard' numerical data is not an indispensable condition for implementing the approaches proposed in this work. The thesis discusses these results with respect to the current literature and concludes by proposing Bayesian networks as a means of exploring new research avenues, such as the study of various forms of evidence combination and the analysis of decision making. For this last aspect, the assessment of probabilities, as advocated in this work, constitutes both a fundamental preliminary step and an operational means.
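As a minimal illustration of the kind of model the thesis advocates, the sketch below hand-codes the smallest possible Bayesian network (a source proposition H with an evidence node E as its child) and computes the posterior on H by enumeration. The structure and all probabilities are hypothetical; real forensic networks would have many more nodes.

```python
# Minimal Bayesian network: H (suspect is the source) -> E (matching trace found).
# All probabilities are hypothetical, for illustration only.

p_h = 0.01               # prior probability of the source proposition
p_e_given_h = 0.95       # probability of the evidence if H is true
p_e_given_not_h = 0.001  # probability of the evidence if H is false

# Posterior by enumeration over the single parent node (Bayes' theorem).
joint_h = p_h * p_e_given_h
joint_not_h = (1 - p_h) * p_e_given_not_h
posterior_h = joint_h / (joint_h + joint_not_h)

print(f"P(H | E) = {posterior_h:.3f}")  # ~0.906 with these numbers
```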
Abstract:
The electrochemical behavior of the interaction of amodiaquine with DNA on a carbon paste electrode was studied using voltammetric techniques. In an acid medium, an electroactive adduct is formed when amodiaquine interacts with DNA. The anodic peak is dependent on pH, scan rate, and the concentration of the pharmaceutical. Adduct formation is irreversible in nature and occurs preferentially through interaction of amodiaquine with the guanine group. Theoretical calculations for geometry optimization, together with DFT analyses and the electrostatic potential map (EPM), were used to investigate adduct formation between amodiaquine and DNA.