68 results for Probabilistic methodology
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and that they be compared in an objective and automated way; the latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is entirely based on expert opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
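The probabilistic assignment of evidential value described above can be sketched with a score-based likelihood ratio. This is an illustrative outline, not the algorithms of the cited papers: the similarity measure, the Gaussian score models and all numbers are assumptions.

```python
import numpy as np
from scipy.stats import norm

def profile_similarity(a, b):
    """Pearson correlation between two HPTLC intensity profiles
    (a stand-in for the comparison metrics developed in Part II)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.corrcoef(a, b)[0, 1])

def score_based_lr(score, same_mu, same_sd, diff_mu, diff_sd):
    """Score-based likelihood ratio: density of the observed similarity
    under a 'same ink' model divided by its density under a
    'different inks' model (Gaussian models assumed for illustration)."""
    return norm.pdf(score, same_mu, same_sd) / norm.pdf(score, diff_mu, diff_sd)

# Two near-identical profiles give a similarity close to 1 and an LR
# above 1, i.e. support for a common source.
questioned = [0.10, 0.80, 0.30, 0.05, 0.60]
reference = [0.12, 0.79, 0.31, 0.06, 0.58]
s = profile_similarity(questioned, reference)
lr_value = score_based_lr(s, same_mu=0.95, same_sd=0.03, diff_mu=0.5, diff_sd=0.2)
```

The same score distributions could equally be estimated from within-source and between-source comparisons in a digital ink library.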
Abstract:
Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, and (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree this is remarkable, since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items, if not masses, of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in a viable way. Based on a review of the main existing contributions in this area, the article aims at presenting instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation and their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change, which distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
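A minimal numerical illustration of joint evaluation (with made-up probabilities, not case data): under the conditional-independence assumptions a Bayesian network typically encodes, the joint likelihood ratio of two items factorises into the product of their individual ratios.

```python
# Hp: prosecution-type proposition; Hd: the alternative proposition.
# E1, E2: two distinct items of evidence, assumed conditionally
# independent given the propositions. All numbers are illustrative.

def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    return p_e_given_hp / p_e_given_hd

lr1 = likelihood_ratio(0.9, 0.1)   # item 1: strong support for Hp (LR = 9)
lr2 = likelihood_ratio(0.6, 0.3)   # item 2: weak support for Hp (LR = 2)
joint_lr = lr1 * lr2               # factorises under conditional independence

# Bayes' theorem in odds form: posterior odds = prior odds x joint LR.
prior_odds = 1 / 1000
posterior_odds = prior_odds * joint_lr
```

When items are not conditionally independent, as in the redundancy and synergy interactions mentioned above, this factorisation no longer holds, and the network computes the joint value from the full conditional probability tables instead.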
Abstract:
This chapter describes the profile of the HIA, provides insight into the process and gives an example of how political decisions may be made on behalf of a concerned population through an HIA approach. [Introduction p. 284]
Abstract:
Well developed experimental procedures currently exist for retrieving and analyzing particle evidence from the hands of individuals suspected of being associated with the discharge of a firearm. Although analytical approaches (e.g. automated Scanning Electron Microscopy with Energy Dispersive X-ray microanalysis, SEM-EDS) allow the determination of the presence of elements typically found in gunshot residue (GSR) particles, such analyses provide no information about a given particle's actual source. Possible origins for which scientists may need to account are a primary exposure to the discharge of a firearm or a secondary transfer due to a contaminated environment. In order to approach such sources of uncertainty in the context of evidential assessment, this paper studies the construction and practical implementation of graphical probability models (i.e. Bayesian networks). These can assist forensic scientists in making the issue tractable within a probabilistic perspective. The proposed models focus on likelihood ratio calculations at various levels of detail, as well as case pre-assessment.
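A minimal sketch of the kind of likelihood-ratio reasoning such a network supports, with invented probabilities; the paper's actual models are more detailed and also cover case pre-assessment.

```python
# Hp: the person discharged a firearm; Hd: the person did not, but may
# have acquired particles through secondary transfer from a
# contaminated environment. All probabilities are illustrative.

def gsr_likelihood_ratio(p_found_given_discharge, p_found_given_secondary):
    """LR for the finding of characteristic GSR particles on the hands."""
    return p_found_given_discharge / p_found_given_secondary

# Transfer-and-persistence probability after a discharge, versus the
# chance of picking up comparable particles from the environment.
lr_value = gsr_likelihood_ratio(0.8, 0.02)  # LR = 40, support for Hp
```

In a full Bayesian network, the secondary-transfer probability would itself be conditioned on further nodes, such as occupational exposure or contact with a contaminated surface.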
Abstract:
Citalopram, a new bicyclic antidepressant, is the most selective serotonin reuptake inhibitor. In a number of double-blind controlled studies, citalopram was compared to placebo and to known tricyclic antidepressants. These studies have shown its efficacy and good safety. The inefficacy of a psychotropic treatment in at least 20% of depressives has led a number of authors to propose original drug combinations and associations, such as antidepressant/lithium (Li), antidepressant/sleep deprivation (agrypnia), antidepressant/ECT, or antidepressant/LT3. The aim of this investigation is to evaluate the clinical effectiveness and safety of a combined citalopram/lithium treatment in therapy-resistant patients, taking into account serotonergic functions, as tested by the fenfluramine/prolactin test, as well as drug pharmacokinetics and the pharmacogenetics of metabolism. DESIGN OF THE STUDY: A washout period of 3 days before initiating the treatment is included. After an open treatment phase of 28 days (D) with citalopram (20 mg D1-D3; 40 mg D4-D14; 40 or 60 mg D15-D28; concomitant medication allowed: chloral, chlorazepate), the nonresponding patients [less than 50% improvement in the total score on the 21-item Hamilton Depression Rating Scale (HDRS)] are selected and treated with or without Li (randomized in double-blind conditions: citalopram/Li or citalopram/placebo) during the treatment (D29-D35). Thereafter, all patients included in the double-blind phase subsequently receive an open treatment with citalopram/Li for 7 days (D36-D42). The hypothesis of a relationship between patients' serotonergic functions, as measured by the fenfluramine/prolactin test (D1), and the clinical response to citalopram (and Li) is assessed. Moreover, it is evaluated whether the pharmacogenetic status of the patients, as determined by the mephenytoin/dextromethorphan test (D0-D28), is related to the metabolism of fenfluramine and citalopram, and also to the clinical response.
CLINICAL ASSESSMENT: Patients with a diagnosis of major depressive disorder according to DSM-III undergo a clinical assessment at D1, D7, D14, D28, D35 and D42: HDRS (21 items), CGI (clinical global impression), VAS (visual analog scales for self-rating of depression) and UKU (side-effects scale), as well as clinical laboratory examinations, ECG, and control of weight, pulse and blood pressure at D1, D28 and D35. Fenfluramine/prolactin test: a butterfly needle is inserted in a forearm vein at 7 h 45 and kept patent with Liquemine. Samples for plasma prolactin and d- and l-fenfluramine determinations are drawn at 8 h 15 (baseline). Patients are given 60 mg fenfluramine (as a racemate) at 8 h 30. Kinetic points are determined at 9 h 30, 10 h 30, 11 h 30, 12 h 30 and 13 h 30. Plasma levels of d- and l-fenfluramine are determined by gas chromatography and prolactin by IRMA. Mephenytoin/dextromethorphan test: patients empty their bladders before the test; they are then given 25 mg dextromethorphan and 100 mg mephenytoin (as a racemate) at 8 h 00 and collect all urine during the following 8 hours. The metabolic ratios are determined by gas chromatography (metabolic ratio dextromethorphan/dextrorphan > 0.3 = PM (poor metabolizer); mephenytoin/4-OH-mephenytoin > 5.6, or mephenytoin S/R > 0.8 = PM). Citalopram plasma levels: plasma levels of citalopram, desmethylcitalopram and didesmethylcitalopram are determined by gas chromatography-mass spectrometry. RESULTS OF THE PILOT STUDY: The investigation was preceded by a pilot study including 14 patients, using the above-mentioned protocol, except that all nonresponders were medicated with citalopram/Li from D28 to D42. The mean total score (n = 14) on the 21-item Hamilton scale was significantly reduced after the treatment, i.e. from 26.93 ± 5.80 on D1 to 8.57 ± 6.90 on D35 (p < 0.001).
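The poor-metabolizer cut-offs quoted in the protocol translate directly into a small classifier. The thresholds are the ones stated above; the "EM" (extensive metabolizer) label for the complementary case is an assumption added for illustration.

```python
def dextromethorphan_phenotype(dextromethorphan, dextrorphan):
    """Poor metabolizer (PM) if the urinary metabolic ratio
    dextromethorphan/dextrorphan exceeds 0.3 (cut-off from the protocol)."""
    return "PM" if dextromethorphan / dextrorphan > 0.3 else "EM"

def mephenytoin_phenotype(mephenytoin, four_oh_mephenytoin, s_r_ratio):
    """PM if mephenytoin/4-OH-mephenytoin exceeds 5.6, or if the
    mephenytoin S/R enantiomer ratio exceeds 0.8 (cut-offs from the protocol)."""
    if mephenytoin / four_oh_mephenytoin > 5.6 or s_r_ratio > 0.8:
        return "PM"
    return "EM"
```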
Abstract:
BACKGROUND: The psychiatric arm of the population-based CoLaus study (PsyCoLaus) is designed to: 1) establish the prevalence of threshold and subthreshold psychiatric syndromes in the 35- to 66-year-old population of the city of Lausanne (Switzerland); 2) test the validity of postulated definitions for subthreshold mood and anxiety syndromes; 3) determine the associations between psychiatric disorders, personality traits and cardiovascular diseases (CVD); and 4) identify genetic variants that can modify the risk for psychiatric disorders and determine whether genetic risk factors are shared between psychiatric disorders and CVD. This paper presents the method as well as the somatic and sociodemographic characteristics of the sample. METHODS: All 35- to 66-year-old persons previously selected for the population-based CoLaus survey on risk factors for CVD were asked to participate in a substudy assessing psychiatric conditions. This investigation included the Diagnostic Interview for Genetic Studies to elicit diagnostic criteria for threshold disorders according to DSM-IV and algorithmically defined subthreshold syndromes. Complementary information was gathered on potential risk and protective factors for psychiatric disorders and migraine, and on the morbidity of first-degree family members, whereas the collection of DNA and plasma samples was part of the original somatic study (CoLaus). RESULTS: A total of 3,691 individuals completed the psychiatric evaluation (67% participation). The gender distribution of the sample did not differ significantly from that of the general population in the same age range. Although the youngest 5-year band of the cohort was underrepresented and the oldest 5-year band overrepresented, participants of PsyCoLaus and individuals who refused to participate revealed comparable scores on the General Health Questionnaire, a self-rating instrument completed at the somatic exam.
CONCLUSIONS: Despite limitations resulting from the relatively low participation in the context of a comprehensive and time-consuming investigation, the PsyCoLaus study should significantly contribute to the current understanding of psychiatric disorders and comorbid somatic conditions by: 1) establishing the clinical relevance of specific psychiatric syndromes below the DSM-IV threshold; 2) determining comorbidity between risk factors for CVD and psychiatric disorders; 3) assessing genetic variants associated with common psychiatric disorders and 4) identifying DNA markers shared between CVD and psychiatric disorders.
Abstract:
BACKGROUND: The proportion of surgery performed as a day case varies greatly between countries. Low rates suggest a large growth potential in many countries. Measuring the potential development of one day surgery should be grounded on a comprehensive list of eligible procedures, based on a priori criteria, independent of local practices. We propose an algorithmic method, using only routinely available hospital data, to identify surgical hospitalizations that could have been performed as one day treatment. METHODS: Moving inpatient surgery to one day surgery was considered feasible if at least one surgical intervention was eligible for one day surgery and if none of the following criteria were present: an intervention or affection requiring an inpatient stay, patient transferred or died, or length of stay greater than four days. The eligibility of a procedure to be treated as a day case was mainly established on three a priori criteria: the surgical access (endoscopic or not), the invasiveness of the procedure and the size of the operated organ. A few overrides of these criteria occurred when procedures were associated with a risk of immediate complications, slow physiological recovery or pain treatment requiring hospital infrastructure. The algorithm was applied to a random sample of one million US inpatient stays and more than 600,000 Swiss inpatient stays from the year 2002. RESULTS: The validity of our method was demonstrated by the few discrepancies between the a priori criteria-based list of eligible procedures and a state list used for reimbursement purposes, the low proportion of hospitalizations eligible for one day care found in the US sample (4.9%, versus 19.4% in the Swiss sample), and the distribution of the elective procedures found eligible in Swiss hospitals, which is well supported by the literature. There were large variations in the proportion of candidates for one day surgery among elective surgical hospitalizations between Swiss hospitals (3% to 45.3%).
CONCLUSION: The proposed approach allows the monitoring of the proportion of inpatient stays that are candidates for one day surgery. It could be used for infrastructure planning, resource negotiation and the surveillance of appropriate resource utilization.
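The screening logic described in the Methods can be sketched as follows. The record fields and the example procedure set are placeholders for illustration, not the published criteria lists.

```python
# Illustrative stand-in for the a-priori list of procedures eligible
# for one day surgery (the real list is derived from surgical access,
# invasiveness and organ size, with a few overrides).
ELIGIBLE_PROCEDURES = {"knee_arthroscopy", "cataract_extraction"}

def day_surgery_candidate(stay):
    """A hospitalization qualifies if at least one procedure is eligible
    for one day surgery and no exclusion applies: a condition requiring
    an inpatient stay, transfer or death, or a length of stay greater
    than four days."""
    if not any(p in ELIGIBLE_PROCEDURES for p in stay["procedures"]):
        return False
    if stay["requires_inpatient"] or stay["transferred_or_died"]:
        return False
    return stay["length_of_stay_days"] <= 4

example = {"procedures": ["knee_arthroscopy"], "requires_inpatient": False,
           "transferred_or_died": False, "length_of_stay_days": 2}
```

Applied over all stays in a routine discharge dataset, the fraction of `True` results gives the monitored proportion of day-surgery candidates.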
Abstract:
In Switzerland, organ procurement is well organized at the national level, but transplant outcomes have not been systematically monitored so far. Therefore, a novel project, the Swiss Transplant Cohort Study (STCS), was established. The STCS is a prospective multicentre study, designed as a dynamic cohort, which enrolls all solid organ recipients at the national level. The features of the STCS are a flexible patient-case system that allows capturing all transplant scenarios and the collection of patient-specific and allograft-specific data. Beyond comprehensive clinical data, specific focus is directed at psychosocial and behavioral factors, infectious disease development, and bio-banking. Between May 2008 and the end of 2011, the six Swiss transplant centers recruited 1,677 patients involving 1,721 transplantations, with a total of 1,800 organs implanted in 15 different transplantation scenarios. 10% of all patients underwent re-transplantation and 3% had a second transplantation, either in the past or during follow-up. 34% of all kidney allografts originated from living donation. Until the end of 2011 we observed 4,385 infection episodes in our patient population. The STCS has shown the operational capability to collect high-quality data and to adequately reflect the complexity of the post-transplantation process. The STCS represents a promising novel project for comparative effectiveness research in transplantation medicine.
Abstract:
Water is often considered to be an ordinary substance since it is transparent, odourless, tasteless and very common in nature. As a matter of fact, it can be argued that it is the most remarkable of all substances. Without water, life on Earth would not exist.
Water is the major component of cells, typically forming 70 to 95% of cellular mass, and it provides an environment for innumerable organisms to live in, since it covers 75% of the Earth's surface. Water is a simple molecule made of two hydrogen atoms and one oxygen atom, H2O. The small size of the molecule stands in contrast with its unique physical and chemical properties. Among those, the fact that, at the triple point, liquid water is denser than ice is especially remarkable. Despite its special importance in life science, water is systematically removed from biological specimens investigated by electron microscopy. This is because the high vacuum of the electron microscope requires that the biological specimen be observed in dry conditions. For 50 years the science of electron microscopy has addressed this problem, resulting in numerous preparation techniques presently in routine use. Typically these techniques consist in fixing the sample (chemically or by freezing) and replacing its water by a plastic which is transformed into a rigid block by polymerisation. The block is then cut into thin sections (c. 50 nm) with an ultramicrotome at room temperature. Usually, these techniques introduce several artefacts, most of them due to water removal. In order to avoid these artefacts, the specimen can be frozen, cut and observed at low temperature. However, liquid water crystallizes into ice upon freezing, thus causing severe damage. Ideally, liquid water is solidified into a vitreous state. Vitrification consists in solidifying water so rapidly that ice crystals have no time to form. A breakthrough took place when the vitrification of pure water was discovered. Since this discovery, the thin-film vitrification method has been used with success for the observation of biological suspensions of small particles. Our work was to extend the method to bulk biological samples, which have to be vitrified, cryosectioned into vitreous sections and observed in a cryo-electron microscope.
This technique is called cryo-electron microscopy of vitreous sections (CEMOVIS). It is now believed to be the best way to preserve the ultrastructure of biological tissues and cells very close to the native state for electron microscopic observation. Recently, CEMOVIS has become a practical method achieving excellent results. It has, however, some severe limitations, the most important of them certainly being due to cutting artefacts. They are the consequence of the nature of the vitreous material and the fact that vitreous sections cannot be floated on a liquid, as is the case for plastic sections cut at room temperature. The aim of the present work has been to improve our understanding of the cutting process and of cutting artefacts, thus finding optimal conditions to minimise or prevent these artefacts. An improved model of the cutting process and redefinitions of cutting artefacts are proposed. Results obtained with CEMOVIS under these conditions are presented and compared with results obtained with conventional methods.
Abstract:
Electrical resistivity tomography (ERT) is a well-established method for geophysical characterization and has shown potential for monitoring geologic CO2 sequestration, due to its sensitivity to electrical resistivity contrasts generated by liquid/gas saturation variability. In contrast to deterministic inversion approaches, probabilistic inversion provides the full posterior probability density function of the saturation field and accounts for the uncertainties inherent in the petrophysical parameters relating the resistivity to saturation. In this study, the data are from benchtop ERT experiments conducted during gas injection into a quasi-2D brine-saturated sand chamber with a packing that mimics a simple anticlinal geological reservoir. The saturation fields are estimated by Markov chain Monte Carlo inversion of the measured data and compared to independent saturation measurements from light transmission through the chamber. Different model parameterizations are evaluated in terms of the recovered saturation and petrophysical parameter values. The saturation field is parameterized (1) in Cartesian coordinates, (2) by means of its discrete cosine transform coefficients, and (3) by fixed saturation values in structural elements whose shape and location are assumed known or represented by an arbitrary Gaussian bell structure. Results show that the estimated saturation fields are in overall agreement with saturations measured by light transmission, but differ strongly in terms of parameter estimates, parameter uncertainties and computational intensity. Discretization in the frequency domain (as in the discrete cosine transform parameterization) provides more accurate models at a lower computational cost compared to spatially discretized (Cartesian) models. A priori knowledge about the expected geologic structures allows for non-discretized model descriptions with markedly reduced degrees of freedom.
Constraining the solutions to the known injected gas volume improved estimates of saturation and parameter values of the petrophysical relationship.
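The discrete cosine transform parameterization discussed above can be sketched in a few lines: keeping only a low-frequency block of DCT coefficients represents a smooth saturation field with far fewer parameters than a full Cartesian grid. The field and the truncation level below are illustrative stand-ins, not the study's data.

```python
import numpy as np
from scipy.fft import dctn, idctn

# Stand-in for a smooth 2-D saturation field on a 32 x 32 grid.
field = np.outer(np.hanning(32), np.hanning(32))

# Full set of DCT coefficients (orthonormal transform).
coeffs = dctn(field, norm="ortho")

# Keep only an 8 x 8 low-frequency block: 64 parameters instead of 1024.
k = 8
truncated = np.zeros_like(coeffs)
truncated[:k, :k] = coeffs[:k, :k]

# Inverse transform yields a smooth low-parameter reconstruction.
approx = idctn(truncated, norm="ortho")
rel_err = np.linalg.norm(approx - field) / np.linalg.norm(field)
```

In an MCMC inversion, the sampler would then propose values for the retained coefficients only, which is what reduces the computational cost relative to sampling every grid cell.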
Abstract:
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity.
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events, including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ™ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty because of the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits are applied to currently available non-radioisotopic assays to measure IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays.
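A hedged sketch of a decision limit set at 99.99% specificity from a normal model of marker scores in a reference population. The real GH-2000 score combines IGF-I and P-III-NP in a published discriminant function, and its decision limits also carry an allowance for finite-sample uncertainty, which is omitted here; the data below are simulated.

```python
import numpy as np
from scipy.stats import norm

# Simulated scores for a reference population of elite athletes
# (standard normal by construction; real scores would be assay-derived).
rng = np.random.default_rng(0)
scores = rng.normal(loc=0.0, scale=1.0, size=998)

# One-sided limit at 99.99% specificity: only 1 in 10,000 clean
# athletes is expected to exceed it under the fitted normal model.
z = norm.ppf(0.9999)  # about 3.72
limit = scores.mean() + z * scores.std(ddof=1)
```

In the projects described above, larger reference samples tighten the estimate of this limit, which is why analyzing more elite-athlete samples improves the reliability of the published decision limits.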