56 results for Logical positivism.
Abstract:
This case report outlines research undertaken as a result of a document examination case involving two envelopes. The combination of the circumstances of the case and the results of the examination allows a simple application of a logical approach to pre-assess the probability that an envelope (or a package) potentially recovered at a suspect's home comes from the same batch (same source) as the questioned envelopes. This highlights the value of examining envelopes.
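To make the logical approach concrete, here is a minimal sketch, not taken from the case report, of how prior odds and a likelihood ratio combine into a posterior probability that a recovered envelope shares its source with the questioned ones; all figures are invented for illustration.

```python
# Sketch of a Bayesian pre-assessment of a common-source proposition.
# All numbers are invented for illustration; they are not from the case report.

prior_odds = 1 / 99                 # assumed prior: 1 in 100 envelopes share the batch
p_features_if_same_batch = 0.95     # assumed P(observed features | same batch)
p_features_if_diff_batch = 0.01     # assumed P(observed features | different batch)

likelihood_ratio = p_features_if_same_batch / p_features_if_diff_batch
posterior_odds = prior_odds * likelihood_ratio
posterior_prob = posterior_odds / (1 + posterior_odds)

print(f"LR = {likelihood_ratio:.1f}")                      # 95.0
print(f"posterior P(same batch) = {posterior_prob:.2f}")   # ~0.49
```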
Abstract:
Undernutrition is a widespread problem in intensive care and is associated with worse clinical outcomes. Enteral nutrition is the recommended nutritional support in ICU patients. However, enteral nutrition is frequently insufficient to cover protein-energy needs. Initiating supplemental parenteral nutrition when enteral nutrition is insufficient could optimize nutritional therapy. Such a combination could reduce morbidity, shorten length of stay and recovery time, improve quality of life, and lower health care costs. Prospective studies are currently underway to test this hypothesis.
Abstract:
Estimating the time since discharge of a spent cartridge or a firearm can be useful in criminal situations involving firearms. The analysis of volatile gunshot residue remaining after shooting using solid-phase microextraction (SPME) followed by gas chromatography (GC) was proposed to meet this objective. However, current interpretative models suffer from several conceptual drawbacks which render them inadequate to assess the evidential value of a given measurement. This paper aims to fill this gap by proposing a logical approach based on the assessment of likelihood ratios. A probabilistic model was thus developed and applied to a hypothetical scenario where alternative hypotheses about the discharge time of a spent cartridge found at a crime scene were put forward. In order to estimate the parameters required to implement this solution, a non-linear regression model was proposed and applied to real published data. The proposed approach proved to be a valuable method for interpreting aging-related data.
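As a hedged illustration of this kind of approach (the model form, parameters, and data below are invented, not those of the paper), a non-linear decay curve can be fitted to aging data and a likelihood ratio computed for two competing discharge-time hypotheses:

```python
# Hedged sketch of an LR-based interpretation of volatile GSR aging data,
# in the spirit of the approach described above. Model form, parameters,
# and data are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def decay(t, a, k):
    """Assumed non-linear aging model: residue amount decays exponentially."""
    return a * np.exp(-k * t)

# Invented calibration data: hours since discharge vs. measured peak area.
t_obs = np.array([1, 4, 8, 16, 24, 48], dtype=float)
y_obs = np.array([9.1, 7.0, 5.2, 3.1, 1.9, 0.6])

(a_hat, k_hat), _ = curve_fit(decay, t_obs, y_obs, p0=(10.0, 0.1))
sigma = 0.5  # assumed measurement standard deviation

# Competing hypotheses about the discharge time of the questioned cartridge.
y_measured = 2.0        # residue level measured on the questioned cartridge
t_prosecution = 24.0    # Hp: discharged ~24 h before seizure
t_defence = 4.0         # Hd: discharged ~4 h before seizure

lr = (norm.pdf(y_measured, decay(t_prosecution, a_hat, k_hat), sigma)
      / norm.pdf(y_measured, decay(t_defence, a_hat, k_hat), sigma))
print(f"LR (Hp vs Hd) = {lr:.2f}")
```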
Abstract:
BACKGROUND: Multiple interventions were made to optimize the medication process in our intensive care unit (ICU): (1) transcriptions from the medical order form to the administration plan were eliminated by merging both into a single document; (2) the new form was built in a logical sequence and was highly structured to promote completeness and standardization of information; (3) frequently used drug names, approved units, and fixed routes were pre-printed; and (4) physicians and nurses were trained in the correct use of the new form. This study aimed to evaluate the impact of these interventions on clinically significant types of medication errors. METHODS: Eight types of medication errors were measured by prospective chart review before and after the interventions in the ICU of a public tertiary care hospital. We used an interrupted time-series design to control for secular trends. RESULTS: Over 85 days, 9298 lines of drug prescription and/or administration to 294 patients, corresponding to 754 patient-days, were collected and analysed for the three series before and the three series following the intervention. The global error rate decreased from 4.95% to 2.14% (−56.8%, P < 0.001). CONCLUSIONS: The safety of the medication process in our ICU was improved by simple and inexpensive interventions. In addition to optimizing the prescription writing process, the documentation of intravenous preparation, and the scheduling of administration, the elimination of transcription combined with user training contributed to reducing errors and holds further potential to increase safety.
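For readers unfamiliar with the design named in the Methods, the following is a minimal sketch (with an invented six-period series, not the study's data) of a segmented regression for an interrupted time series, estimating the secular trend together with the level and slope changes at the intervention:

```python
# Hedged sketch of an interrupted time-series (segmented regression) analysis
# like the one named in the Methods; the series below is invented.
import numpy as np
import statsmodels.api as sm

# Six observation periods: three before, three after the intervention.
error_rate = np.array([5.1, 4.9, 4.9, 2.3, 2.1, 2.0])   # % of prescription lines
time = np.arange(1, 7, dtype=float)                      # period index
post = (time > 3).astype(float)                          # 1 after the intervention
time_after = np.where(post == 1, time - 3, 0.0)          # slope-change term

X = sm.add_constant(np.column_stack([time, post, time_after]))
fit = sm.OLS(error_rate, X).fit()
print(fit.params)  # intercept, secular trend, level change, slope change
```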
Abstract:
This paper extends previous research and discussion on the use of multivariate continuous data, which are about to become more prevalent in forensic science. As an illustrative example, attention is drawn here to the area of comparative handwriting examinations. Multivariate continuous data can be obtained in this field by analysing the contour shape of loop characters through Fourier analysis. This methodology, based on existing research in this area, allows one to describe in detail the morphology of character contours through a set of variables. This paper uses data collected from female and male writers to conduct a comparative analysis of likelihood ratio based evidence assessment procedures in both evaluative and investigative proceedings. While the use of likelihood ratios in the former situation is now rather well established (typically, in order to discriminate between propositions of authorship of a given individual versus another, unknown individual), the investigative setting has so far received little attention in practice. This paper seeks to highlight that investigative settings, too, can represent an area of application for which the likelihood ratio can offer logical support. As an example, the inference of the gender of the writer of a questioned handwritten text is presented, analysed and discussed. The more general viewpoint according to which likelihood ratio analyses can be helpful for investigative proceedings is supported here through various simulations, which characterise the robustness of the proposed likelihood ratio methodology.
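A minimal sketch of how such multivariate continuous data can be obtained (using a synthetic contour rather than real handwriting images) is to summarise a closed character outline by a few normalised Fourier coefficients:

```python
# Hedged sketch: describing the contour of a loop character with Fourier
# coefficients, yielding multivariate continuous data of the kind discussed
# above. The contour here is synthetic; real data would come from images.
import numpy as np

# Synthetic closed contour sampled at 64 points (stand-in for a loop character).
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
x = np.cos(theta) + 0.1 * np.cos(3 * theta)   # slightly non-elliptic outline
y = 0.6 * np.sin(theta)
contour = x + 1j * y                           # encode (x, y) as complex numbers

coeffs = np.fft.fft(contour) / len(contour)

# Normalise: drop position (c0) and divide by |c1| so the descriptors are
# comparable across writing sizes; keep a few harmonics as the feature vector.
features = np.abs(coeffs[2:6]) / np.abs(coeffs[1])
print(features)  # low-dimensional shape descriptor for LR-based comparison
```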
Abstract:
The completion of the human genome sequence is the prerequisite to fully understanding the biology of human beings. With this project achieved, scientists faced another challenging task: understanding the meaning of the 3 billion letters composing this genome. As a logical continuation of the human genome project, the ENCODE (ENCyclopedia Of DNA Elements) consortium was formed with the aim of annotating all of its functional elements. These elements include transcribed regions, transcription factor binding sites, DNase I hypersensitive sites, and histone modification marks. In the frame of my PhD thesis, I was involved in two sub-projects of ENCODE. First, I developed and optimized a high-throughput method to validate gene models, which allowed me to assess the quality of the most recent manually curated annotation. This novel experimental validation pipeline is extremely effective, far more so than transcriptome profiling through RNA sequencing, which is becoming the norm. This RT-PCR-seq targeted approach is likewise particularly efficient at identifying novel exons: we discovered unannotated exons in about 10% of the interrogated loci. Second, I participated in a study aiming to identify the boundaries of all genes on human chromosomes 21 and 22. This study led to the large-scale identification of chimeric transcripts composed of sequences coming from two distinct genes that can map far away from each other.
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model of the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
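As a toy stand-in for this workflow (GINsim is the actual tool; the two-gene network below is invented for illustration), one can enumerate the asynchronous state transition graph of a small Boolean model and extract its attractors as terminal strongly connected components:

```python
# Minimal stand-in (not GINsim) for the logical-modelling workflow above:
# build the asynchronous state transition graph of a toy Boolean network
# and find its attractors as terminal strongly connected components.
import networkx as nx

# Toy two-gene network: A activates B, B inhibits A (negative feedback loop).
rules = {0: lambda s: s[1] == 0,   # next A = not B
         1: lambda s: s[0] == 1}   # next B = A

states = [(a, b) for a in (0, 1) for b in (0, 1)]
stg = nx.DiGraph()
stg.add_nodes_from(states)
for s in states:
    for i, rule in rules.items():      # asynchronous updating: change one
        target = int(rule(s))          # component at a time
        if target != s[i]:
            t = list(s)
            t[i] = target
            stg.add_edge(s, tuple(t))

# Attractors = terminal SCCs (no edge leaving the component).
condensed = nx.condensation(stg)
attractors = [condensed.nodes[n]["members"]
              for n in condensed if condensed.out_degree(n) == 0]
print(attractors)  # expect one cyclic attractor for this negative feedback loop
```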
Abstract:
This study was commissioned by the European Committee on Crime Problems at the Council of Europe to describe and discuss the standards used to assess the admissibility and appraisal of scientific evidence in various member countries. After documenting cases in which faulty forensic evidence seems to have played a critical role, the authors describe the legal foundations of the issues of admissibility and assessment of probative value in the field of scientific evidence, contrasting criminal justice systems of accusatorial and inquisitorial tradition and the various risks that they pose in terms of equality of arms. Special attention is given to communication issues between lawyers and scientific experts. The authors finally investigate possible ways of improving the system. Among these mechanisms, emphasis is put on the adoption of a common terminology for expressing the weight of evidence. It is also proposed to adopt a harmonized interpretation framework among forensic experts rooted in good practices of logical inference.
Abstract:
In principle, we should be glad that Eric Kmiec and his colleagues published in Science's STKE (1) a detailed experimental protocol of their gene repair method (2, 3). However, a careful reading of their contribution raises more doubts about the method. The research published in Science five years ago by Kmiec and his colleagues was said to demonstrate that chimeric RNA-DNA oligonucleotides could correct the mutation responsible for sickle cell anemia with 50% efficiency (4). Such a remarkable result prompted many laboratories to attempt to replicate the research or apply the method to their own systems. However, if the method worked at all, which it rarely did, the achieved efficiency was usually lower by several orders of magnitude. Now, in the Science STKE protocol, we are given crucial information about the method and why it is supposedly so important to use these expensive chimeric RNA-DNA constructs. In the introduction we are told that the RNA-DNA duplex is more stable than a DNA-DNA duplex and so extends the half-life of the complexes formed between the targeted DNA and the chimeric RNA-DNA oligonucleotides. This logical explanation, however, conflicts with the statement in the section entitled "Transfection with Oligonucleotides and Plasmid DNA" that Kmiec and colleagues have recently demonstrated that classical single-stranded DNA oligonucleotides with a few protective phosphorothioate linkages have a "gene repair conversion frequency rivaling that of the RNA/DNA chimera". Indeed, the research cited for that result actually states that single-stranded DNA oligonucleotides are several-fold more efficient (3.7-fold) than the RNA-DNA chimeric constructs (5). If that is the case, it raises the question of why Kmiec and colleagues emphasize the importance of the RNA in their original chimeric constructs: their own new results show that modified single-stranded DNA oligonucleotides are more effective than the expensive RNA-DNA hybrids. Moreover, the current efficiency of gene repair by RNA-DNA hybrids, according to Kmiec and colleagues in their recent paper, is only 4 × 10⁻⁴, even after several hours of pre-selection permitting multiplication of bacterial cells carrying the corrected plasmid (5). This efficiency is more than three orders of magnitude lower than the 50% value reported five years ago, but is assuredly much closer to reality.
Abstract:
Risk management is often approached through linear methods that stress positioning and causal reasoning: to a given event correspond a given risk and a given consequence. The interrelationships between risks are often overlooked, and risks are rarely analyzed in their dynamic and nonlinear components. This work shows what systemic methods, including the study of complex systems, can bring to the understanding, management, and anticipation of business risks, both conceptually and in applied terms. Starting from the definitions of systems and risks in various domains, as well as the methods used to control risks, this work confronts these concepts with the approaches of systems analysis and complex-systems modeling. By highlighting the sometimes reductive effects of business risk analysis methods, as well as the limitations of risk universes caused in particular by ill-suited definitions, this work also provides executive management with a range of tools and approaches that better account for complexity: for managing risks, for aligning strategy and risk management, and for assessing the firm's level of maturity in risk management.