141 results for solution types
Abstract:
Recognition and identification processes for deceased persons. Determining the identity of deceased persons is a routine task performed mainly by police departments and forensic experts. This thesis highlights the processes necessary for the proper and transparent determination of the civil identities of deceased persons. The identity of a person is defined as the establishment of a link between that person ("the source") and information pertaining to the same individual ("identifiers"). Various forms of identity can emerge, depending on the nature of the identifiers. There are two distinct types of identity, namely civil identity and biological identity. The thesis examines four processes: identification by witnesses (the recognition process) and comparisons of fingerprints, dental data and DNA profiles (the identification processes). For the recognition process, the functioning of memory is examined, which helps to clarify the circumstances that may give rise to errors. To make the process more rigorous, a body presentation procedure is proposed for investigators. Before the other processes are examined, three general concepts specific to forensic science are considered with regard to the identification of a deceased person: matter divisibility (Inman and Rudin), transfer (Locard) and uniqueness (Kirk). These concepts can be applied to the task at hand, although some require a slightly broader scope of application. A cross comparison of common forensic fields and the identification of deceased persons reveals certain differences, including (1) reverse positioning of the source (i.e. the source is not sought from traces; rather, the identifiers are obtained from the source); (2) the need for civil identity determination in addition to the individualisation stage; and (3) a more restricted population (a closed set rather than an open one).
For fingerprint, dental and DNA data, intravariability and intervariability are examined, as well as post-mortem (PM) changes in these identifiers. Ante-mortem (AM) identifiers are located and AM-PM comparisons are made. For DNA, it is shown that direct identifiers (taken from the person whose civil identity has been alleged) tend to lead to the determination of a civil identity, whereas indirect identifiers (obtained from a close relative) point towards the determination of a biological identity. For each process, a Bayesian model is presented that includes the sources of uncertainty deemed relevant. The results of the different processes are combined to structure and summarise an overall outcome and methodology. The modelling of dental data presents a specific difficulty with respect to intravariability, which is not in itself quantifiable. The concept of "validity" is therefore suggested as a possible solution to this problem. Validity draws on various parameters that have an acknowledged impact on dental intravariability. In cases where identifying deceased persons proves extremely difficult because of the limited discrimination of certain procedures, a Bayesian approach is of great value in providing a transparent, synthesised result. RESUME: Title: Recognition and identification processes for deceased persons. The individualisation of deceased persons is a routine task shared mainly by police departments, odontologists and genetics laboratories. The objective of this research is to present processes for validly determining, with controlled uncertainty, the civil identities of deceased persons. The notion of identity is examined first. The identity of a person is defined as the establishment of a link between that person and information concerning that person; this information is referred to as "identifiers".
Two distinct forms of identity are retained: civil identity and biological identity. Four main processes are examined: testimony, and those involving comparisons of fingerprints, dental data and DNA profiles. Concerning the recognition process, the way memory functions is examined, an approach that makes it possible to identify the parameters that can lead to errors. In order to provide a rigorous framework for this process, a body presentation procedure is proposed for investigators. Before the other processes are examined, the general concepts specific to the forensic fields are considered from the particular angle of the identification of deceased persons: the divisibility of matter (Inman and Rudin), transfer (Locard) and uniqueness (Kirk). It is found that these concepts can be applied, although some require a slight broadening of their principles. A cross comparison between the usual forensic fields and the identification of deceased persons shows differences such as an inverted positioning of the source (the source is no longer sought starting from traces; rather, identifiers are sought starting from the source), the need to determine a civil identity in addition to carrying out an individualisation, and a population of interest that is limited rather than open. For fingerprints, teeth and DNA, intra- and then inter-variability are examined, as well as their post-mortem (PM) modifications, the location of ante-mortem (AM) identifiers and the AM-PM comparisons. For DNA, it is shown that direct identifiers (from the person whose civil identity is supposed) tend to determine a civil identity, whereas indirect identifiers (from a close relative) tend to determine a biological identity.
A synthesis of the results from the different processes is then produced by means of Bayesian models. For each process, a model is presented that integrates the parameters recognised as relevant. At this stage a difficulty appears: that of quantifying dental intra-variability, for which no precise rule exists. The recommended solution is to integrate a concept of validity that incorporates various parameters with a known impact on intra-variability. The possibility of formulating a synthesised value through the Bayesian approach proves to be of great help in very difficult cases in which each of the processes has limited discriminating potential.
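The combination of results from several processes via Bayesian modelling, as described above, can be illustrated with a minimal odds-form sketch. The prior, the independence assumption and the likelihood-ratio values below are illustrative only and are not taken from the thesis: each process contributes a likelihood ratio that multiplies the prior odds that a given candidate in a closed set is the source.

```python
# Minimal Bayesian-combination sketch (illustrative values, not the
# thesis's actual model). Each identification process contributes a
# likelihood ratio (LR); assuming independence, the LRs multiply the
# prior odds for one candidate drawn from a closed set.

def posterior_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by the LR of each (assumed independent) process."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_probability(odds):
    """Convert odds back to a probability."""
    return odds / (1.0 + odds)

# Closed set of 100 candidate identities -> prior odds of 1/99 for one of them.
prior = 1.0 / 99.0
# Illustrative LRs: one process strongly supports, another weakly supports.
lrs = [1000.0, 5.0]
post = odds_to_probability(posterior_odds(prior, lrs))
```

The closed-set structure matters: the prior odds come from the restricted population the abstract contrasts with the open sets of usual forensic casework.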
Abstract:
OBJECTIVE: Enteral glutamine supplementation and antioxidants have been shown to be beneficial in some categories of critically ill patients. This study investigated the impact on organ function and clinical outcome of an enteral solution enriched with glutamine and antioxidant micronutrients in patients with trauma and with burns. METHODS: This was a prospective study with a historical control group including critically ill burned and major-trauma patients (n = 86: 40 patients with burns and 46 with trauma; 43 in each group) on admission to an intensive care unit in a university hospital (matched for severity, age, and sex). The intervention aimed to deliver a 500-mL enteral solution containing 30 g of glutamine per day, plus selenium, zinc, and vitamin E (Gln-AOX), for a maximum of 10 d, in addition to control treatment consisting of enteral nutrition in all patients and intravenous trace elements in all burn patients. RESULTS: Patients were comparable at baseline, except for more inhalation injuries in the burn-Gln-AOX group (P = 0.10) and greater neurologic impairment in the trauma-Gln-AOX group (P = 0.022). Intestinal tolerance was good. The full 500-mL dose was rarely delivered, resulting in a low mean daily glutamine dose (22 g for burn patients and 16 g for trauma patients). In burn patients, intravenous trace element delivery exceeded the enteral dose. The evolution of the Sequential Organ Failure Assessment score and other outcome variables did not differ significantly between groups. C-reactive protein decreased faster in the Gln-AOX group. CONCLUSION: The Gln-AOX supplement was well tolerated in critically ill injured patients but did not significantly improve outcome. The delivery of glutamine below the recommended dose of 0.5 g/kg, in association with high intravenous trace element substitution doses in burn patients, is likely to have blunted the impact by not reaching an effective treatment dose. Further trials testing higher doses of glutamine are required.
Abstract:
Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, the data format is often the first obstacle. The lack of standardized ways of exploring different data layouts means the problem must be solved from scratch each time. The ability to access data in a rich, uniform manner, e.g. using Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) is one of the most common data storage formats. Despite its simplicity, handling it becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if the horizontal dimension reaches thousands of columns. Most databases are optimized for handling a large number of rows rather than columns, so performance on datasets with atypical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach (data stay mostly in the CSV files); "zero configuration" (no need to specify a database schema); small size and no installation required (it is written in C++, with boost [1], SQLite [2] and Qt [3]); efficient plan execution through query rewriting, dynamic creation of indices for appropriate columns, and static data retrieval directly from the CSV files; effortless support for millions of columns; easy handling of mixed text/number data thanks to per-value typing; and a very simple network protocol that provides an efficient interface for MATLAB and reduces the implementation time for other languages. The software is available as freeware, along with educational videos, on its website [4]. It needs no prerequisites to run, as all of the libraries are included in the distribution package.
I test it against existing database solutions using a battery of benchmarks and discuss the results.
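The "query a CSV with SQL, zero configuration" workflow described above can be sketched in a few lines. This is only an illustration of the idea using Python's standard library (csv plus an in-memory sqlite3 database), not the C++ system itself, which avoids copying the data and rewrites queries; the table name, column names and data here are made up.

```python
# Sketch of querying CSV data with SQL and no predeclared schema, using
# only the Python standard library. The real system described above keeps
# the data in the CSV files; this toy version copies them into an
# in-memory SQLite table to show the workflow.
import csv
import io
import sqlite3

csv_text = "id,score\n1,10\n2,30\n3,20\n"  # stand-in for a CSV file

rows = list(csv.reader(io.StringIO(csv_text)))
header, data = rows[0], rows[1:]

con = sqlite3.connect(":memory:")
# Schema is derived from the CSV header -- nothing declared by the user.
cols = ", ".join(f'"{c}"' for c in header)
con.execute(f"CREATE TABLE t ({cols})")
con.executemany(f"INSERT INTO t VALUES ({','.join('?' * len(header))})", data)
# Index created on demand for the queried column, echoing the tool's
# dynamic index creation.
con.execute('CREATE INDEX idx_score ON t("score")')

result = con.execute(
    'SELECT "id" FROM t ORDER BY CAST("score" AS INTEGER) DESC'
).fetchall()
```

The `CAST` in the query mirrors the per-value typing issue: CSV cells arrive as text, so numeric ordering has to be made explicit in this sketch.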
Abstract:
We previously showed that exposure of 3D organotypic rat brain cell cultures to 1 mM 2-methylcitrate (2-MCA) or 3-hydroxyglutarate (3-OHGA) every 12 h over three days (DIV11-DIV14) results in ammonium accumulation and cell death. The aim of this study was to define the time course (every 24 h) of the observed effects. Ammonium in the culture medium increased as early as DIV12 and remained stable on the following days under 3-OHGA exposure, while it increased successively to much higher levels under 2-MCA exposure. A lactate increase and a glucose decrease were observed from DIV13 and DIV14, respectively. We conclude that ammonium accumulation precedes alterations of energy metabolism. As observed by immunohistochemistry, glial cells were the predominant dying cells. Immunoblotting and immunohistochemistry with cell-death-specific markers (caspase-3, alpha-fodrin, LC3) showed that 2-MCA exposure significantly increased apoptosis at DIV14, but did not alter autophagy or necrosis. In contrast, 3-OHGA exposure substantially increased necrosis already from DIV13, while no change was observed for apoptosis and autophagy. In conclusion, ammonium accumulation, secondary disturbance of energy metabolism and glial cell death are involved in the neuropathogenesis of methylmalonic aciduria and glutaric aciduria type I. Interestingly, brain cells die by necrosis under 3-OHGA exposure and by apoptosis under 2-MCA exposure.
Abstract:
Depth-dose curves in LiF detectors of different effective thicknesses, together with their responses, were calculated for typical nuclear medicine radiation fields with 99mTc, 18F and 90Y sources. Responses were analysed as a function of the radionuclide, detector effective thickness and irradiation geometry. In addition, the results of the nuclear medicine measurement campaign of the ORAMED project are presented, focusing on the dose distribution across the hand and on the appropriate position to wear the dosimeter. According to the results, thin LiF detectors provide better responses in all cases. Their use is essential for 18F, since thick dosimeters can underestimate Hp(0.07) by up to 50% because of the very inhomogeneous dose deposition in the active layer. The preliminary results of the measurement campaign showed that the index tip of the non-dominant hand is usually the most exposed of the 22 monitored positions. It was also found that, on average, wrist dosimeters are likely to underestimate the maximum skin dose by a factor of the order of 20. This factor is reduced to around 6 for a ring dosimeter worn on the base of the index finger of the non-dominant hand. Thus, for typical nuclear medicine procedures, the base of the index finger of the non-dominant hand is recommended as the best monitoring option.
Abstract:
The ecological relevance of behavioural syndromes is little studied in cooperative breeding systems where it is assumed that the behavioural type might influence individual decisions on helping and dispersal (e.g. shy, nonaggressive and nonexplorative individuals remain philopatric and helpful, whereas bold, aggressive, explorative individuals compete for vacancies outside their group and disperse). We measured the behavioural type of 19 subordinates in the cooperatively breeding cichlid fish Neolamprologus pulcher in their natural environment by quantifying six behavioural traits up to four times ('trials') in three different contexts, by presenting them with a conspecific intruder, a predator or nothing inside a tube. We found only moderate within-context repeatability (intraclass correlation coefficients) of the focal individual's behaviour, except for attacking either the conspecific or the predator inside the tube. The focal individual's attack rate of the tube was also positively affected by its group size. Averaging traits per context removed the between-trial variation, and consequently the across-context repeatability was very high for all six traits, except for territory maintenance. Trait values depended significantly on the context, except for territory defence. Consequently, individuals could be classified into different behavioural types based on their reaction towards the tube, but surprisingly, and opposite to laboratory studies in this species, ranging propensity and territory maintenance were not included in this behavioural syndrome. We suggest that more studies are needed to compare standardized focal personality tests (e.g. exploration propensity) with actual behaviour observed in nature (e.g. ranging and dispersal).
The relationship between Lamb weather types and long-term changes in flood frequency, River Eden, UK
Abstract:
Research has found that both flood magnitude and frequency in the UK may have increased over the last five decades. However, evaluating whether or not this is a systematic trend is difficult because of the lack of longer records. Here we compile and consider an extreme flood record that extends back to 1770. Since 1770, there have been 137 recorded extreme floods. However, over this period there is not a unidirectional trend of rising extreme flood risk over time. Instead, there are clear flood-rich and flood-poor periods. Three main flood-rich periods were identified: 1873-1904, 1923-1933, and 1994 onwards. To provide a first analysis of what is driving these periods, and given the paucity of more sophisticated datasets that extend back to the 18th century, objective Lamb weather types were used. Of the 27 objective Lamb weather types, only 11 could be associated with the extreme floods during the gauged period, and only 5 of these accounted for > 80% of recorded extreme floods. The importance of these five weather types over a longer timescale for flood risk in Carlisle was assessed by calculating the proportion of each hydrological year classified as being associated with these flood-generating weather types. Two periods clearly had more than the average proportion of the year classified as one of the flood-causing weather types, 1900-1940 and 1983-2007, and these two periods both contained flood-rich hydrological records. Thus, the analysis suggests that systematic organisation of the North Atlantic climate system may be manifest as periods of elevated and reduced flood risk, an observation that has major implications for analyses that assume that climatic drivers of flood risk are either statistically stationary or follow a simple trend. Copyright (c) 2011 Royal Meteorological Society
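The year-proportion calculation described above can be sketched as follows. The weather-type codes and the five-type set are placeholders rather than the study's actual objective Lamb classification; the function simply computes the fraction of days in a hydrological year whose circulation type falls in the flood-generating set.

```python
# Sketch of the proportion metric: given a daily series of weather-type
# codes for one hydrological year, return the fraction of days classified
# as a flood-generating type. Codes below are illustrative placeholders,
# not the study's actual five flood-generating objective Lamb types.
FLOOD_TYPES = {"C", "CW", "CSW", "W", "SW"}  # hypothetical code set

def flood_type_proportion(daily_types, flood_types=FLOOD_TYPES):
    """Fraction of days whose weather type is in the flood-generating set."""
    days = len(daily_types)
    if days == 0:
        return 0.0
    return sum(t in flood_types for t in daily_types) / days

# Toy "year" of 10 days, 4 of which fall in flood-generating types.
sample = ["A", "C", "W", "N", "A", "SW", "E", "C", "A", "N"]
```

Comparing this proportion per hydrological year against its long-term average is what identifies the flood-rich periods mentioned above.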
Abstract:
Various site-specific recombination enzymes produce different types of knots or catenanes while acting on circular DNA in vitro and in vivo. By analysing the types of knots or links produced, it is possible to reconstruct the order of events during the reaction and to deduce the molecular "architecture" of the complexes that different enzymes form with DNA. Until recently it was necessary to use laborious electron microscopy methods to identify the types of knots or catenanes that migrate in different bands on the agarose gels used to analyse the products of the reaction. We reported recently that electrophoretic migration of different knots and catenanes formed on the same size DNA molecules is simply related to the average crossing number of the ideal representations of the corresponding knots and catenanes. Here we explain this relation by demonstrating that the expected sedimentation coefficient of randomly fluctuating knotted or catenated DNA molecules in solution shows approximately linear correlation with the average crossing number of ideal configurations of the corresponding knots or catenanes.
Abstract:
RATIONALE: Induction of oxidative stress and impairment of the antioxidant defense are considered important biological responses following nanoparticle (NP) exposure. The acellular in vitro dithiothreitol (DTT) assay has been proposed to measure the oxidative potential of NP. In addition, DTT can be considered a model compound for sulfur-containing antioxidants. The objective of this work is to evaluate the surface reactivity in solution of a panel of NP toward DTT. METHOD: The NP panel was composed of four carbonaceous particles, six types of metal oxides and silver, with primary sizes ranging from 7 to 300 nm. Suspensions were prepared in surfactant solution with 30 min of sonication. DTT was used as a reductant to evaluate the oxidative properties of the different NP. The determination of the ability of the NP to catalyze electron transfer from DTT to oxygen was carried out as described in Sauvain et al., Nanotoxicology, 2008, 2:3, 121-129. RESULTS: All the carbonaceous NP catalyzed the oxidation of DTT by oxygen in the following mass-based order: carbon black > diesel exhaust particles > nanotubes > fullerene. A contrasting reactivity was observed for the metallic NP. Except for nickel oxide and metallic silver, which reacted similarly to the carbonaceous NP, all other metal oxides hindered the oxidation of DTT by oxygen, with ZnO being the most effective. CONCLUSIONS: DTT was stabilized against oxidation in the presence of metal oxide NP in solution. This suggests that different chemical interactions take place compared with carbonaceous NP. To explain these differences, we hypothesize that DTT could form complexes with the metal oxide surface (or with dissolved metal ions), rendering it less susceptible to oxidation. By analogy, such a process could apply in biological systems to sulfur-containing antioxidants, reducing their buffering capacity. Such NP could thus contribute to oxidative stress by an alternative mechanism.
Abstract:
This paper presents a statistical model for quantifying the weight of fingerprint evidence. In contrast to previous (generative and score-based) models, our model estimates the probability distributions of the spatial relationships, directions and types of minutiae observed on fingerprints for any given fingermark. Our model relies on an AFIS algorithm provided by 3M Cogent and on a dataset of more than 4,000,000 fingerprints to represent a sample from a relevant population of potential sources. The performance of our model was tested using several hundred minutiae configurations observed on a set of 565 fingermarks. In particular, the effects of various sub-populations of fingers (i.e., finger number, finger general pattern) on the expected evidential value of our test configurations were investigated. The performance of our model indicates that the spatial relationships between minutiae carry more evidential weight than their type or direction. Our results also indicate that the AFIS component of our model directly enables us to assign weight to fingerprint evidence without the additional layer of complex statistical modeling involved in estimating the probability distributions of fingerprint features. In fact, the AFIS component appears to be more sensitive to the sub-population effects than the other components of the model. Overall, the data generated during this research project support the idea that fingerprint evidence is a valuable forensic tool for the identification of individuals.
Abstract:
Critical study of Charles Larmore, Modernité et morale (Paris, PUF, 1993). This article presents and discusses the author's project of defending the idea of a "pragmatist" and "intuitionist" morality. Restating the author's position, it sets out the arguments in favour of a pragmatist conception of moral truth and those in favour of appealing to intuition to discover the content of our moral obligations. In a brief final critical note, it suggests that pragmatism seems ill-equipped to escape entirely the charge of relativism.
Abstract:
Motivation: Genome-wide association studies have become widely used tools to study the effects of genetic variants on complex diseases. While it is of great interest to extend existing analysis methods by considering interaction effects between pairs of loci, the large number of possible tests presents a significant computational challenge. The number of computations is further multiplied in the study of gene expression quantitative trait mapping, in which tests are performed for thousands of gene phenotypes simultaneously. Results: We present FastEpistasis, an efficient parallel solution extending the PLINK epistasis module, designed to test for epistasis effects when analyzing continuous phenotypes. Our results show that the algorithm scales with the number of processors and offers a reduction in computation time when several phenotypes are analyzed simultaneously. FastEpistasis is capable of testing the association of a continuous trait with all single nucleotide polymorphism (SNP) pairs from 500 000 SNPs, totaling 125 billion tests, in a population of 5000 individuals in 29, 4 or 0.5 days using 8, 64 or 512 processors.
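The per-pair computation that such a tool parallelizes can be sketched as a linear regression with an interaction term. This is a generic illustration, not FastEpistasis's actual implementation: for one SNP pair, the continuous phenotype is regressed on the two genotypes and their product, and the interaction coefficient is the quantity of interest. The data below are synthetic and the fit uses plain-stdlib normal equations.

```python
# Sketch of one epistasis test: fit y = b0 + b1*g1 + b2*g2 + b3*(g1*g2)
# for a single SNP pair and inspect the interaction term b3. A real tool
# adds significance testing and repeats this over billions of pairs.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def interaction_beta(g1, g2, y):
    """Least-squares fit of y ~ 1 + g1 + g2 + g1*g2; returns the b3 term."""
    X = [[1.0, a, b, a * b] for a, b in zip(g1, g2)]
    # Normal equations: (X^T X) beta = X^T y
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(4)] for i in range(4)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(4)]
    return solve(XtX, Xty)[3]

# Genotypes coded 0/1/2; synthetic phenotype with a true interaction of 0.5.
g1 = [0, 1, 2, 0, 1, 2, 0, 1, 2]
g2 = [0, 0, 0, 1, 1, 1, 2, 2, 2]
y = [1.0 + 0.2 * a - 0.3 * b + 0.5 * a * b for a, b in zip(g1, g2)]
b3 = interaction_beta(g1, g2, y)
```

With 500 000 SNPs there are 500 000 x 499 999 / 2, i.e. about 125 billion, such pairs, which is why distributing the loop over processors is the core of the problem.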