980 results for Identification algorithms
Abstract:
Aim: The jaguar, Panthera onca, is a species of global conservation concern. In Mexico, the northernmost part of its distribution range, its conservation status is particularly critical, and its potential and actual distribution is poorly known. We propose an ensemble model (EM) of the potential distribution of the jaguar in Mexico and identify priority areas for conservation. Location: Mexico. Methods: We generated our EM from three presence-only methods (Ecological Niche Factor Analysis, Mahalanobis distance, Maxent), considering environmental, biological and anthropogenic factors. We used this model to evaluate the efficacy of the existing Mexican protected areas (PAs), to assess the adequacy of the jaguar conservation units (JCUs) and to propose new areas that should be considered for conservation and management of the species in Mexico. Results: Our results indicate that 16% of Mexico (c. 312,000 km²) can be considered suitable for the presence of the jaguar. Furthermore, 13% of the suitable area is included in existing PAs and 14% in JCUs (Sanderson et al., 2002). Main conclusions: Clearly, much more needs to be done to establish a proactive conservation strategy. Based on our results, we propose new jaguar conservation and management areas that are important for a nationwide conservation blueprint.
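The ensemble step itself reduces to combining the suitability surfaces produced by the three presence-only methods. A minimal sketch, assuming the three model outputs have already been rasterised to a common grid and rescaled to [0, 1]; the array names, the unweighted-mean consensus and the threshold are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

# Hypothetical suitability rasters from the three presence-only methods,
# already aligned on the same grid and scaled to [0, 1].
enfa = np.random.rand(100, 100)        # Ecological Niche Factor Analysis
mahalanobis = np.random.rand(100, 100)
maxent = np.random.rand(100, 100)

# Unweighted mean as a simple consensus; weighted schemes are also common.
ensemble = np.mean([enfa, mahalanobis, maxent], axis=0)

# Binarise with an illustrative threshold to get a suitable/unsuitable map.
threshold = 0.5
suitable = ensemble >= threshold

# Fraction of the study area classed as suitable (16% of Mexico in the paper).
print(f"Suitable fraction: {suitable.mean():.1%}")
```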
Abstract:
BACKGROUND: Analysis of DNA sequence polymorphisms can provide valuable information on the evolutionary forces shaping nucleotide variation and insight into the functional significance of genomic regions. Recent and ongoing genome projects will radically improve our ability to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analysis. RESULTS: We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated data and on actual data files from mouse and human, have been implemented in the VariScan software package version 2.0. We have also incorporated a graphical user interface. The main features of this software are: i) exhaustive population-genetic analyses, including those based on coalescent theory; ii) analysis adapted to the shallow data generated by high-throughput genome projects; iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; iv) identification of relevant genomic regions by sliding-window and wavelet-multiresolution approaches; v) visualization of the results integrated with current genome annotations in commonly available genome browsers. CONCLUSION: VariScan is a powerful and flexible suite of software for the analysis of DNA polymorphisms. The current version implements new algorithms, methods and capabilities, providing an important tool for exhaustive exploratory analysis of genome-wide DNA polymorphism data.
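As an illustration of the sliding-window idea in point iv), nucleotide diversity π can be computed in overlapping windows along an alignment. This sketch uses a toy alignment and the standard average-pairwise-difference estimator; it is a schematic of the general technique, not VariScan's actual implementation:

```python
from itertools import combinations

def nucleotide_diversity(block):
    """Average pairwise difference per site over a block of aligned sequences."""
    n_sites = len(block[0])
    pairs = list(combinations(block, 2))
    diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
    return diffs / (len(pairs) * n_sites)

def sliding_window_pi(alignment, window, step):
    """Yield (window start, pi) along the alignment."""
    length = len(alignment[0])
    for start in range(0, length - window + 1, step):
        yield start, nucleotide_diversity([s[start:start + window] for s in alignment])

# Toy alignment of four sequences.
aln = ["ACGTACGTAC", "ACGTACGTAA", "ACGAACGTAC", "ACGTACGCAC"]
for start, pi in sliding_window_pi(aln, window=5, step=2):
    print(f"window {start}-{start + 5}: pi = {pi:.3f}")
```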
Abstract:
Two methods of differential isotopic coding of carboxylic groups have been developed to date. The first approach uses d0- or d3-methanol to convert carboxyl groups into the corresponding methyl esters. The second relies on the incorporation of two ¹⁸O atoms into the C-terminal carboxylic group during tryptic digestion of proteins in H₂¹⁸O. However, both methods have limitations, such as the chromatographic separation of ¹H and ²H derivatives or the overlap of the isotopic distributions of the light and heavy forms due to small mass shifts. Here we present a new tagging approach based on the specific incorporation of sulfanilic acid into carboxylic groups. The reagent was synthesized in a heavy form (¹³C-labelled phenyl ring), showing no chromatographic shift and an optimal isotopic separation with a 6 Da mass shift. Moreover, sulfanilic acid simplifies fragmentation in matrix-assisted laser desorption/ionization (MALDI) owing to the charge fixation of the sulfonate group at the C-terminus of the peptide. The derivatization is simple and specific, and it minimizes the number of sample-treatment steps that can strongly alter the sample composition. The quantification is reproducible within an order of magnitude and can be analyzed by either electrospray ionization (ESI) or MALDI. Finally, the method can specifically identify the C-terminal peptide of a protein by using GluC as the proteolytic enzyme.
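The quantification principle rests on pairing light and heavy derivatives separated by the fixed 6 Da shift from the ¹³C₆ phenyl ring and taking intensity ratios. A minimal sketch over a hypothetical centroided peak list; singly charged ions are assumed, and the peak values and tolerance are invented for illustration:

```python
MASS_SHIFT = 6.0   # Da, from the six 13C atoms of the phenyl ring
TOLERANCE = 0.02   # Da, illustrative matching tolerance

# Hypothetical centroided peak list: (m/z, intensity), singly charged assumed.
peaks = [(1000.50, 4.0e5), (1006.50, 3.8e5), (1250.70, 1.2e5), (1256.71, 2.5e5)]

def light_heavy_ratios(peaks, shift=MASS_SHIFT, tol=TOLERANCE):
    """Pair peaks separated by the label mass shift and report L/H ratios."""
    ratios = []
    for mz_l, int_l in peaks:
        for mz_h, int_h in peaks:
            if abs((mz_h - mz_l) - shift) <= tol:
                ratios.append((mz_l, mz_h, int_l / int_h))
    return ratios

for mz_l, mz_h, r in light_heavy_ratios(peaks):
    print(f"pair {mz_l:.2f}/{mz_h:.2f}: L/H = {r:.2f}")
```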
Abstract:
An ammonium chloride procedure was used to prepare a bacterial pellet from positive blood cultures, which was used for direct inoculation of VITEK 2 cards. Correct identification reached 99% for Enterobacteriaceae and 74% for staphylococci. For antibiotic susceptibility testing, very major and major errors were 0.1 and 0.3% for Enterobacteriaceae, and 0.7 and 0.1% for staphylococci, respectively. Thus, bacterial pellets prepared with ammonium chloride allow direct inoculation of VITEK cards with excellent accuracy for Enterobacteriaceae and a lower accuracy for staphylococci.
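The reported error rates follow the standard categorical-agreement definitions for susceptibility testing: a very major error is a resistant reference result called susceptible by the test method, and a major error is a susceptible reference called resistant. A small sketch of that bookkeeping; the example calls are invented for illustration:

```python
# Paired (reference, direct-inoculation) susceptibility calls: 'S' or 'R'.
# Invented example data for illustration.
results = [("S", "S"), ("R", "R"), ("R", "S"), ("S", "R"), ("S", "S"), ("R", "R")]

n = len(results)
very_major = sum(ref == "R" and test == "S" for ref, test in results)
major = sum(ref == "S" and test == "R" for ref, test in results)

print(f"very major error rate: {very_major / n:.1%}")
print(f"major error rate: {major / n:.1%}")
```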
Abstract:
For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees to the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on the one hand, the matrix representation with parsimony (MRP), MinFlip and MinCut methods performed well according to our criteria; on the other, the average consensus, split fit and most similar supertree methods showed poorer performance, or at least did not behave in the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting correct behaviour of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data. Results also showed that MRP analyses can reach a high level of quality even with a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would be particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
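MRP, the best-known of the methods above, recodes each input tree as binary characters: every internal clade becomes a column scoring taxa inside the clade 1, other taxa present in that tree 0, and absent taxa '?'. A minimal sketch of the Baum/Ragan coding, with clades given directly as taxon sets rather than parsed from Newick strings:

```python
def mrp_matrix(trees, all_taxa):
    """Baum/Ragan coding: one binary column per non-trivial clade per tree."""
    columns = []
    for tree_taxa, clades in trees:
        for clade in clades:
            col = {}
            for taxon in all_taxa:
                if taxon not in tree_taxa:
                    col[taxon] = "?"   # taxon missing from this source tree
                else:
                    col[taxon] = "1" if taxon in clade else "0"
            columns.append(col)
    return columns

taxa = ["A", "B", "C", "D", "E"]
# Each source tree: (taxa present, list of internal clades as sets).
trees = [
    ({"A", "B", "C", "D"}, [{"A", "B"}, {"A", "B", "C"}]),
    ({"B", "C", "D", "E"}, [{"D", "E"}]),
]

matrix = mrp_matrix(trees, taxa)
for taxon in taxa:
    print(taxon, "".join(col[taxon] for col in matrix))
```

The resulting matrix is then handed to a standard parsimony program, which is where the heuristic-search issues discussed in the abstract come into play.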
Abstract:
Matrix attachment regions (MARs) are DNA sequences found throughout eukaryotic genomes that are believed to define boundaries between heterochromatin and euchromatin domains, thereby acting as epigenetic regulators. When included in expression vectors, MARs can improve and sustain transgene expression, and a search for more potent novel elements is therefore being actively pursued to further improve recombinant protein production. Here we describe the isolation of new MARs from the mouse genome using a modified in silico analysis. One of these MARs was found to be a powerful activator of transgene expression in stable transfections. Interestingly, this MAR also increased GFP and/or immunoglobulin expression from some, but not all, expression vectors in transient transfections. This effect was attributed to the presence or absence of elements on the vector backbone, providing an explanation for earlier discrepancies regarding the ability of this class of elements to affect transgene expression under such conditions.
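The abstract does not detail the "modified in silico analysis"; MAR predictors typically combine several sequence rules, of which AT-richness over a sliding window is the most basic. A deliberately simple sketch of that single heuristic, with window size and threshold as illustrative values rather than the authors' criteria:

```python
def at_rich_windows(seq, window=200, step=50, threshold=0.70):
    """Yield window starts whose AT fraction exceeds a MAR-like threshold."""
    seq = seq.upper()
    for start in range(0, len(seq) - window + 1, step):
        chunk = seq[start:start + window]
        at_fraction = (chunk.count("A") + chunk.count("T")) / window
        if at_fraction >= threshold:
            yield start, at_fraction

# Toy sequence: an AT-rich island inside a GC-balanced context.
toy = "GC" * 200 + "AT" * 300 + "GC" * 200
for start, frac in at_rich_windows(toy):
    print(f"candidate window at {start}: AT = {frac:.0%}")
```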
Abstract:
Recognition and identification processes for deceased persons. Determining the identity of deceased persons is a routine task performed primarily by police departments and forensic experts. This thesis highlights the processes necessary for the proper and transparent determination of the civil identity of deceased persons. The identity of a person is defined as the establishment of a link between that person ("the source") and information pertaining to the same individual ("identifiers"). Various forms of identity can emerge, depending on the nature of the identifiers; two distinct types are considered, namely civil identity and biological identity. The thesis examines four processes: identification by witnesses (the recognition process) and comparisons of fingerprints, dental data and DNA profiles (the identification processes). For the recognition process, the workings of memory are examined, which helps to clarify the circumstances that may give rise to errors. To make the process more rigorous, a body-presentation procedure is proposed for investigators. Before examining the other processes, three general concepts specific to forensic science are considered with regard to the identification of a deceased person: divisibility of matter (Inman and Rudin), transfer (Locard) and uniqueness (Kirk). These concepts can be applied to the task at hand, although some require a slightly broader scope of application. A cross-comparison of common forensic fields with the identification of deceased persons reveals certain differences, including (1) reverse positioning of the source (the source is not sought from traces; rather, the identifiers are obtained from the source); (2) the need for civil identity determination in addition to the individualisation stage; and (3) a more restricted population (a closed set rather than an open one). For fingerprint, dental and DNA data, intravariability and intervariability are examined, as well as post-mortem (PM) changes in these identifiers. Ante-mortem (AM) identifiers are located and AM-PM comparisons are made. For DNA, it is shown that direct identifiers (taken from the person whose civil identity is alleged) tend to establish civil identity, whereas indirect identifiers (obtained from a close relative) tend to establish biological identity. For each process, a Bayesian model is presented that includes the sources of uncertainty deemed relevant. The results of the different processes are then combined into an overall outcome and methodology. The modelling of dental data presents a specific difficulty with respect to intravariability, which is not in itself quantifiable; the concept of "validity" is therefore suggested as a possible solution, drawing on various parameters with an acknowledged impact on dental intravariability. In cases where identifying deceased persons proves extremely difficult because of the limited discriminating power of certain procedures, a Bayesian approach is of great value in providing a transparent, synthesised assessment.
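The Bayesian machinery the thesis relies on can be reduced to odds-form updating: a prior on the candidate being the source, natural in a closed set of missing persons, is multiplied by the likelihood ratio contributed by each comparison process. A schematic sketch under invented numbers, assuming the processes can be treated as independent:

```python
def posterior_probability(prior, likelihood_ratios):
    """Combine a prior with independent likelihood ratios in odds form."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Closed set of 20 candidates -> uniform prior of 1/20 (illustrative).
prior = 1 / 20
# Invented likelihood ratios from, say, dental comparison and a DNA profile.
lrs = [50.0, 1.0e4]

print(f"posterior: {posterior_probability(prior, lrs):.6f}")
```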
Abstract:
This research focuses on the glacial witnesses of the Chablais area in four of their dimensions: geoheritage, objective knowledge, geosite inventory, and promotion. It is organised around a heritage-making process in which it both participates and which it questions. In 2009, the 123 Chablais project was launched for a period of four years. It covered the entire Chablais territory, spread over two countries (France and Switzerland) and three administrative entities (the département of Haute-Savoie and the cantons of Vaud and Valais). This project, developed within the Interreg IV France-Switzerland programme, aimed to boost local economic development by drawing on regional heritage. Geoheritage, identified as one of these resources, was therefore the subject of several actions, including this research. In parallel, the French Chablais was preparing its application to join the European Geopark Network (EGN). Its admission, effective in 2012, made the area the fifth French geopark in the network. The Chablais Geopark bases its geological identity on water and ice, two themes closely linked to the glacial witnesses. In this context of interest in the regional geoheritage, and in the glacial heritage in particular, this research was assigned several missions: to improve objective knowledge of the glacial witnesses, to inventory the glacial geosites, and to promote the glacial heritage.
The first objective was to acquire a synthetic view of the glacial witnesses. It required a bibliographic synthesis and its spatialisation in order to identify gaps in knowledge and how this work could help fill them. On this basis, several methods were implemented: geomorphological mapping, reconstruction of glacial equilibrium-line altitudes, and dating of erratic boulders using in situ produced cosmogenic isotopes. Geomorphological maps were produced in particular for glacial cirques and valleys. Cosmogenic dating focused on two stages of the Rhône glacier: the Last Local Glacial Maximum (LLGM) and the Monthey stage. At the end of this step, the specificities of the regional glacial heritage emerged as (1) a wide variety of landforms closely linked to various other geomorphological processes, and (2) the attribution of the glacial witnesses to ten major stages of the deglaciation of the Lake Geneva basin.
The second objective centred on the inventory of glacial geosites. We focused on the selection of geoheritage by developing an approach based on the two axes (time and space) identified in the previous component, and thus produced a thematic inventory of 32 geosites. The structure of the inventory was also explored so as to integrate criteria describing the use of these geosites. This approach, supported by a reflection on the values attributed to geoheritage and on how to assess them, allowed us to highlight the anthropo- and scientifico-centred point of view that clearly prevails in European geoheritage research.
The analysis of the inventory results revealed several characteristics of the Chablais glacial heritage: discreet, diverse, and with two features that can be exploited for scientific mediation, namely its status as the "cradle of the glacial theory" and its close links with activities of daily life, as raw material, recreational resource, or risk factor. The research led to a travelling exhibition on the glacial heritage of the Chablais, a geotourism product designed to raise local awareness of the impact of glaciers on the territory. It presents a series of seven maps of glacial stages, framed by the two themes mentioned above: the history of glacial knowledge, and glacial witnesses and society.
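The cosmogenic dating mentioned above rests on a simple production-decay balance: for a surface exposed for time t with no erosion or burial, the measured nuclide concentration N relates to the local production rate P and decay constant λ. A minimal sketch of the resulting age equation under those idealised assumptions; all sample values are invented:

```python
import math

LAMBDA_BE10 = math.log(2) / 1.387e6   # 10Be decay constant (1/yr)

def exposure_age(concentration, production_rate, lam=LAMBDA_BE10):
    """Exposure age (yr) from nuclide concentration (atoms/g) and local
    production rate (atoms/g/yr), ignoring erosion and burial."""
    return -math.log(1.0 - concentration * lam / production_rate) / lam

# Invented values for an erratic boulder: concentration and scaled production.
n10 = 1.5e5   # atoms of 10Be per gram of quartz
p10 = 10.0    # atoms/g/yr at the site, after scaling

print(f"apparent exposure age: {exposure_age(n10, p10):,.0f} yr")
```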
Abstract:
OBJECTIVE: Accurate identification of major trauma patients in the prehospital setting positively affects survival and resource utilization. Triage algorithms using predictive criteria of injury severity have been identified in paramedic-based prehospital systems. Our rescue system is based on prehospital paramedics and emergency physicians. The aim of this study was to evaluate the accuracy of prehospital triage performed by physicians and to identify the predictive factors leading to triage errors. METHODS: Retrospective study of trauma patients triaged by physicians. Prehospital triage was analyzed using criteria defining major trauma victims (MTVs: Injury Severity Score >15, admission to ICU, need for immediate surgery, or death within 48 h). Adequate triage was defined as MTVs oriented to the trauma centre and non-MTVs (NMTVs) oriented to regional hospitals. RESULTS: One thousand six hundred and eighty-five patients (blunt trauma 96%) were included (558 MTVs and 1127 NMTVs). Triage was adequate in 1455 patients (86.4%). Overtriage occurred in 171 cases (10.1%) and undertriage in 59 cases (3.5%). Sensitivity and specificity were 90% and 85%, respectively, whereas the positive predictive value and negative predictive value were 75% and 94%, respectively. Using logistic regression analysis, significant (P<0.05) predictors of undertriage were head or thorax injuries (odds ratio >2.5). Predictors of overtriage were the paediatric age group and pedestrian or two-wheel vehicle road traffic accidents (odds ratio >2.0). CONCLUSION: Physicians using clinical judgement provide effective prehospital triage of trauma patients. Only a few factors predicting triage errors were identified in this study.
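The reported accuracy figures follow directly from the 2x2 triage table: with 558 MTVs of whom 59 were undertriaged, and 1127 NMTVs of whom 171 were overtriaged, the standard definitions reproduce (to rounding) the stated sensitivity, specificity, PPV and NPV. A quick check in a few lines:

```python
mtv, nmtv = 558, 1127            # major trauma victims / non-MTVs
undertriage, overtriage = 59, 171

tp = mtv - undertriage           # MTVs correctly sent to the trauma centre
tn = nmtv - overtriage           # NMTVs correctly sent to regional hospitals
fn, fp = undertriage, overtriage

print(f"sensitivity: {tp / (tp + fn):.1%}")   # ~ the reported 90%
print(f"specificity: {tn / (tn + fp):.1%}")   # ~ the reported 85%
print(f"PPV:         {tp / (tp + fp):.1%}")   # ~ the reported 75%
print(f"NPV:         {tn / (tn + fn):.1%}")   # ~ the reported 94%
```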
Abstract:
We describe the use of dynamic combinatorial chemistry (DCC) to identify ligands for the stem-loop structure located at the exon 10-5'-intron junction of Tau pre-mRNA, which is involved in the onset of several tauopathies, including frontotemporal dementia with Parkinsonism linked to chromosome 17 (FTDP-17). A series of ligands combining the small aminoglycoside neamine with heteroaromatic moieties (an azaquinolone and two acridines) were identified by DCC. These compounds effectively bind the stem-loop RNA target (concentrations required for 50% RNA response, EC50, of 2-58 μM), as determined by fluorescence titration experiments. Importantly, most of them are able to stabilize both the wild-type sequence and the +3 and +14 mutated sequences associated with the development of FTDP-17 without producing a significant change in the overall structure of the RNA (as analyzed by circular dichroism (CD) spectroscopy), which is a key factor for recognition by the splicing regulatory machinery. A good correlation was found between the affinity of the ligands for the target and their ability to stabilize the RNA secondary structure.
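EC50 values of the kind reported here are typically extracted by fitting a Hill-type dose-response curve to the fluorescence titration data. A generic sketch with scipy on synthetic data, not the authors' fitting protocol:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, n):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) * conc**n / (ec50**n + conc**n)

# Synthetic titration: ligand concentrations (uM) and fluorescence response.
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100], dtype=float)
true = hill(conc, 0.0, 1.0, ec50=8.0, n=1.2)
rng = np.random.default_rng(0)
response = true + rng.normal(0, 0.02, conc.size)

params, _ = curve_fit(hill, conc, response, p0=[0, 1, 10, 1])
print(f"fitted EC50: {params[2]:.1f} uM")
```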
Abstract:
Background: The hepatitis C virus (HCV) NS3-4A protease is not only an essential component of the viral replication complex and a prime target for antiviral intervention but also a key player in the persistence and pathogenesis of HCV. It cleaves and thereby inactivates two crucial adaptor proteins in viral RNA sensing and innate immunity (MAVS and TRIF) as well as a phosphatase involved in growth factor signaling (TCPTP). The aim of this study was to identify novel cellular substrates of the NS3-4A protease and to investigate their role in the replication and pathogenesis of HCV. Methods: Cell lines inducibly expressing the NS3-4A protease were analyzed in basal as well as interferon-α-stimulated states by stable isotopic labeling using amino acids in cell culture (SILAC) coupled with protein separation and mass spectrometry. Candidates fulfilling stringent criteria for potential substrates or products of the NS3-4A protease were further investigated in different experimental systems as well as in liver biopsies from patients with chronic hepatitis C. Results: SILAC coupled with protein separation and mass spectrometry yielded >5000 proteins, of which 18 candidates were selected for further analyses. These allowed us to identify GPx8, a membrane-associated peroxidase involved in disulfide bond formation in the endoplasmic reticulum, as a novel cellular substrate of the HCV NS3-4A protease. Cleavage occurs at the cysteine in position 11, removing the cytosolic tip of GPx8, and was observed in different experimental systems as well as in liver biopsies from patients with chronic hepatitis C. Further functional studies, involving overexpression and RNA silencing, revealed that GPx8 is a proviral factor involved in viral particle production but not in HCV entry or HCV RNA replication. Conclusions: GPx8 is a proviral host factor cleaved by the HCV NS3-4A protease. Studies investigating the consequences of GPx8 cleavage for protein function are underway. The identification of novel cellular substrates of the HCV NS3-4A protease should yield new insights into the HCV life cycle and the pathogenesis of hepatitis C and may reveal novel targets for antiviral intervention.
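In a SILAC screen of this kind, candidate selection ultimately reduces to computing heavy/light intensity ratios per protein between the two labeling states and flagging proteins whose log-ratio passes a cut-off. A schematic sketch with invented numbers; the state-to-channel assignment and the threshold are illustrative assumptions, not the study's actual criteria:

```python
import math

# Hypothetical summed peptide intensities per protein:
# heavy = NS3-4A-expressing cells, light = control (assignment illustrative).
intensities = {
    "GPx8": {"heavy": 2.1e6, "light": 9.8e6},   # shifted -> candidate substrate
    "MAVS": {"heavy": 1.0e6, "light": 4.2e6},
    "ACTB": {"heavy": 5.0e6, "light": 5.1e6},   # unchanged housekeeping protein
}

CUTOFF = 1.0  # |log2 ratio| threshold, illustrative

for protein, vals in intensities.items():
    log2_ratio = math.log2(vals["heavy"] / vals["light"])
    flag = "candidate" if abs(log2_ratio) >= CUTOFF else "-"
    print(f"{protein}: log2(H/L) = {log2_ratio:+.2f} {flag}")
```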
Abstract:
The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way, the latter requirement arising from the large number of comparisons necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) assessment of the evidential value of ink. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, high-performance thin-layer chromatography (HPTLC), despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in the search for ink specimens in ink databases and in the interpretation of their evidential value.
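The comparison algorithms evaluated in Part II reduce each HPTLC lane to a numerical profile and score pairs of profiles; a Pearson correlation between densitometric intensity profiles is one of the simplest such scores. A toy sketch of that idea, with invented profiles; the cited papers test several more sophisticated metrics:

```python
import numpy as np

def profile_similarity(profile_a, profile_b):
    """Pearson correlation between two densitometric HPTLC profiles."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

rng = np.random.default_rng(1)
questioned = rng.random(50)                       # questioned ink profile
same_ink = questioned + rng.normal(0, 0.05, 50)   # replicate of the same ink
other_ink = rng.random(50)                        # unrelated reference ink

print(f"same ink:  {profile_similarity(questioned, same_ink):.3f}")
print(f"other ink: {profile_similarity(questioned, other_ink):.3f}")
```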