58 results for Feature-extraction
Abstract:
Estimating the time since the last discharge of firearms and/or spent cartridges may be a useful piece of information in forensic firearm-related cases. The current approach consists of studying the diffusion of selected volatile organic compounds (such as naphthalene) released during the shooting, using solid-phase micro-extraction (SPME). However, this technique works poorly on handgun cartridges because the extracted quantities quickly fall below the limit of detection. In order to find more effective solutions and to further investigate the aging of organic gunshot residue after the discharge of handgun cartridges, an extensive study was carried out in this work using a novel approach based on high-capacity headspace sorptive extraction (HSSE). By adopting this technique, 51 gunshot residue (GSR) volatile organic compounds could for the first time be simultaneously detected from fired handgun cartridge cases. Application to aged specimens showed that many of those compounds presented significant and complementary aging profiles. Compound-to-compound ratios were also tested and proved to be beneficial both in reducing the variability of the aging curves and in enlarging the time window useful from a forensic casework perspective. The obtained results are thus particularly promising for the development of a new complete forensic dating methodology.
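The benefit of compound-to-compound ratios described above can be illustrated with a minimal sketch. All compound names, peak areas and time points below are hypothetical, chosen only to show how a ratio of a fast-decaying to a slow-decaying compound yields a monotonic aging curve that is insensitive to shot-to-shot yield differences; this is not the paper's data.

```python
import numpy as np

# Hypothetical peak areas (arbitrary units) for two GSR volatiles at
# several aging times; values are illustrative only.
hours = np.array([0, 24, 48, 96, 168])
fast_compound = np.array([100.0, 62.0, 40.0, 22.0, 10.0])  # e.g. a fast-decaying volatile
slow_compound = np.array([50.0, 45.0, 41.0, 36.0, 30.0])   # e.g. a slow-decaying volatile

# A compound-to-compound ratio cancels the overall extracted quantity,
# which varies from shot to shot, leaving a cleaner aging profile.
ratio = fast_compound / slow_compound
for t, r in zip(hours, ratio):
    print(f"t = {t:3d} h  ratio = {r:.2f}")
```

Because both numerator and denominator scale with the total residue deposited, dividing them removes that common factor, which is why ratios reduce aging-curve variability.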
Abstract:
Mammalian genomes contain highly conserved sequences that are not functionally transcribed. These sequences are single copy and comprise approximately 1-2% of the human genome. Evolutionary analysis strongly supports their functional conservation, although their potentially diverse, functional attributes remain unknown. It is likely that genomic variation in conserved non-genic sequences is associated with phenotypic variability and human disorders. So how might their function and contribution to human disorders be examined?
Abstract:
Retained T-tubes are rare complications after biliary surgery. The authors present three cases of retained T-tubes in liver-transplant patients that could not be removed by standard manual traction. The authors describe a new, simple percutaneous method that allows removal of these T-tubes without complication.
Abstract:
Continuous advances in knowledge in the field of oncohematology over the past few decades have considerably improved the prognosis of most forms of cancer. However, morbidity and mortality attributable to infections currently appear to be the main factors limiting the aggressiveness of treatment of the cancerous disease, and better control of these infections has become one of the essential elements in the management of this type of patient. Intensive clinical research has identified the main principles involved, which are presented in this article. A therapeutic algorithm is then proposed to guide the clinician faced with the development of a febrile state, always suspicious of infection in the neutropenic cancer patient.
Abstract:
Solid-phase extraction (SPE) in tandem with dispersive liquid-liquid microextraction (DLLME) has been developed for the determination of mononitrotoluenes (MNTs) in several aquatic samples using gas chromatography with flame ionization detection (GC-FID). In the hyphenated SPE-DLLME procedure, the MNTs were first extracted from a large volume of aqueous sample (100 mL) onto 500 mg of octadecyl silane (C18) sorbent. After elution of the analytes from the sorbent with acetonitrile, the obtained solution was subjected to the DLLME procedure so that extra preconcentration factors could be achieved. The parameters influencing the extraction efficiency, such as breakthrough volume, type and volume of the elution solvent (disperser solvent) and extracting solvent, as well as salt addition, were studied and optimized. The calibration curves were linear in the range of 0.5-500 μg/L and the limit of detection for all analytes was found to be 0.2 μg/L. The relative standard deviations (for 0.75 μg/L of MNTs) without internal standard varied from 2.0 to 6.4% (n=5). The relative recoveries of the well, river and sea water samples, spiked at a concentration level of 0.75 μg/L of the analytes, were in the range of 85-118%.
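The calibration and relative-recovery steps mentioned above can be sketched numerically. The signals, noise values and instrument response below are simulated, not the paper's data; only the linear range (0.5-500 μg/L) and the 0.75 μg/L spike level come from the abstract.

```python
import numpy as np

# Simulated GC-FID calibration: standards across the reported linear range.
conc = np.array([0.5, 5.0, 50.0, 250.0, 500.0])  # ug/L
signal = 12.0 * conc + 3.0 + np.array([0.5, -1.0, 2.0, -3.0, 1.5])  # fake peak areas

# Least-squares linear calibration curve.
slope, intercept = np.polyfit(conc, signal, 1)

# Relative recovery of a hypothetical spiked water sample: back-calculated
# concentration divided by the spiked concentration.
spiked = 0.75                             # ug/L, the study's spike level
observed_signal = 12.1 * spiked + 3.0     # pretend response from the spiked sample
measured = (observed_signal - intercept) / slope
recovery = 100.0 * measured / spiked
print(f"slope = {slope:.2f}, recovery = {recovery:.1f}%")
```

A recovery inside the 85-118% window reported above indicates that the matrix does not strongly suppress or enhance the analyte signal.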
Abstract:
Since its creation, the Internet has permeated our daily life. The web is omnipresent for communication, research and organization, and this widespread use has driven the Internet's rapid development. Nowadays, the Internet is the biggest container of resources. Information databases such as Wikipedia, Dmoz and the open data available on the net represent a great informational resource for mankind. Easy, free web access is one of the major features characterizing the Internet culture. Ten years ago, the web was completely dominated by English; today, the web community is no longer only English-speaking but is becoming a genuinely multilingual community. The availability of content is intertwined with the availability of logical organizations (ontologies), for which multilinguality plays a fundamental role. In this work we introduce a very high-level logical organization fully based on semiotic assumptions. We present the theoretical foundations as well as the ontology itself, named Linguistic Meta-Model. The most important feature of the Linguistic Meta-Model is its ability to support the representation of different knowledge sources developed according to different underlying semiotic theories. This is possible because most knowledge representation schemata, either formal or informal, can be put into the context of the so-called semiotic triangle. In order to show the main characteristics of the Linguistic Meta-Model from a practical point of view, we developed VIKI (Virtual Intelligence for Knowledge Induction). VIKI is a work-in-progress system aimed at exploiting the Linguistic Meta-Model structure for knowledge expansion. It is a modular system in which each module accomplishes a natural language processing task, from terminology extraction to knowledge retrieval. VIKI is a supporting system for the Linguistic Meta-Model; its main task is to give some empirical evidence regarding the use of the Linguistic Meta-Model, without claiming to be thorough.
Abstract:
In sexual assault cases, autosomal DNA analysis of gynecological swabs is a challenge, as the presence of a large quantity of female material may prevent the detection of the male DNA. A solution to this problem is differential DNA extraction, but as there are different protocols, it was decided to test their efficiency on simulated casework samples. Four difficult samples were sent to the nine Swiss laboratories active in forensic genetics. They used their routine protocols to separate the epithelial-cell fraction, enriched in non-sperm DNA, from the sperm fraction. DNA extracts were then sent to the organizing laboratory for analysis. Estimates of the male-to-female DNA ratio without differential extraction ranged from 1:38 to 1:339, depending on the semen used to prepare the samples. After differential DNA extraction, most of the ratios ranged from 1:12 to 9:1, allowing the detection of the male DNA. Compared to direct DNA extraction, cell separation resulted in losses of 94-98% of the male DNA. As expected, more male DNA was generally present in the sperm fraction than in the epithelial-cell fraction. However, for about 30% of the samples, the reverse trend was observed. The recovery of male and female DNA was highly variable depending on the laboratory. An experimental design similar to the one used in this study may help with local protocol testing and improvement.
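The ratio arithmetic above can be made concrete with a small sketch. All DNA quantities (in ng) are hypothetical, chosen only to fall inside the ranges the abstract reports; they are not the study's measurements.

```python
# Illustrative male:female DNA ratios before and after differential extraction.
male_ng, female_ng = 0.5, 60.0          # hypothetical direct extraction of the swab
ratio_before = female_ng / male_ng      # 1:120, within the reported 1:38 to 1:339
print(f"before separation, male:female = 1:{ratio_before:.0f}")

# Differential lysis enriches the sperm fraction in male DNA, but the
# abstract reports that cell separation loses 94-98% of the male DNA.
male_sperm = male_ng * 0.04             # assume 4% of the male DNA is recovered
female_sperm = 0.01                     # assumed female carry-over (ng)
print(f"sperm fraction, male:female = {male_sperm / female_sperm:.0f}:1")
```

Even with 96% of the male DNA lost, the sperm fraction ends up male-dominated because the female carry-over is reduced far more strongly, which is the point of the differential protocol.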
Abstract:
In most pathology laboratories worldwide, formalin-fixed paraffin-embedded (FFPE) samples are the only tissue specimens available for routine diagnostics. Although commercial kits for diagnostic molecular pathology testing are becoming available, most of the current diagnostic tests are laboratory-based assays. Thus, there is a need for standardized procedures in molecular pathology, starting from the extraction of nucleic acids. To evaluate the current methods for extracting nucleic acids from FFPE tissues, 13 European laboratories, participating in the European FP6 program IMPACTS (www.impactsnetwork.eu), isolated nucleic acids from four diagnostic FFPE tissues using their routine methods, followed by quality assessment. The DNA-extraction protocols ranged from homemade protocols to commercial kits. Except for one homemade protocol, the majority gave comparable results in terms of the quality of the extracted DNA, measured by the ability to amplify differently sized control gene fragments by PCR. For array applications or tests that require an accurately determined DNA input, we recommend using silica-based adsorption columns for DNA recovery. For RNA extraction, the best results were obtained using chromatography-column-based commercial kits, which yielded the highest quantity and best assayable RNA. Quality testing using RT-PCR gave successful amplification of 200-250 bp PCR products from most tested tissues. Modifications of the proteinase K digestion time led to better results, even when commercial kits were used. The results of the study emphasize the need for quality control of nucleic acid extracts with standardised methods to prevent false-negative results and to allow data comparison among different diagnostic laboratories.
Abstract:
The aim of this study was to evaluate the forensic protocol recently developed by Qiagen for the QIAsymphony automated DNA extraction platform. Samples containing low amounts of DNA were specifically considered, since they represent the majority of samples processed in our laboratory. The analysis of simulated blood and saliva traces showed that the highest DNA yields were obtained with the maximal elution volume available for the forensic protocol, that is 200 µl. The resulting DNA extracts were too diluted for successful DNA profiling and required a concentration step. This additional step is time-consuming and potentially increases inversion and contamination risks. The 200 µl DNA extracts were concentrated to 25 µl, and the DNA recovery was estimated with real-time PCR as well as with the percentage of SGM Plus alleles detected. Results using our manual protocol, based on the QIAamp DNA mini kit, and the automated protocol were comparable. Further tests will be conducted to determine more precisely DNA recovery, contamination risk and PCR-inhibitor removal once a definitive procedure allowing the concentration of DNA extracts from low-yield samples is available for the QIAsymphony.
Abstract:
Objectives: To describe a new technique for removing extravertebral cement that has leaked accidentally during withdrawal of the equipment. Materials and methods: We inject the cement when its density is that of toothpaste, to avoid vascular passage. Once the vertebra is completely filled, we wait a few minutes to avoid cement leakage along the needle track (about 4 to 6 cc on average are injected per vertebra). Despite these precautions, cement leakage can occur when the trocar is withdrawn. This complication is rare when the necessary precautions are taken. However, if the intravertebral pressure is high, cement can be drawn out of the vertebra. Results: The trocar needle is withdrawn and replaced by a 13-gauge endoscopic forceps. Under fluoroscopic guidance, the extraction is performed under continuous control, the forceps removing the leaked cement. Conclusion: Knowledge of this procedure can be very useful for interventional radiology, orthopedic and neurosurgical teams performing vertebroplasties, who may be confronted with this type of problem.
Abstract:
The Radiello Passive Air Sampler is one of the latest innovations developed for the passive headspace sampling of airborne pollutants. Its properties have been reported to provide enhanced sensitivity, reproducibility and adsorption capacity. It therefore appears to be of interest for the extraction of potential residues of ignitable liquids present in fire debris when arson is suspected. A theoretical approach and several laboratory tests made it possible to precisely characterize, from a forensic perspective, the potential of the device for extracting and concentrating the vapors of ignitable liquids found in fire debris. Despite some advantages, the Radiello device appears to be less efficient than traditional axial-symmetry samplers.
Endoscopic extraction of a prevertebral migrated guidewire after posterior cervical instrumentation.
Abstract:
Images of Spine Care
Abstract:
In Neo-Darwinism, variation and natural selection are the two evolutionary mechanisms that propel biological evolution. Our previous reports presented a histogram model to simulate the evolution of populations of individuals classified into bins according to an unspecified, quantifiable phenotypic character, whose number in each bin changed generation after generation under the influence of fitness while the total population was kept constant. The histogram model also allowed Shannon entropy (SE) to be monitored continuously as the information content of the total population decreased or increased. Here, a simple Perl (Practical Extraction and Reporting Language) application was developed to carry out these computations, with the critical feature of an added random factor in the percentage of individuals whose offspring moved to a vicinal bin. The results of the simulations demonstrate that the random factor mimicking variation considerably increased the range of values covered by Shannon entropy, especially when the percentage of changed offspring was high. This increase in information content is interpreted as facilitated adaptability of the population.
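The histogram model described above can be sketched in a few lines. The paper's own tool was written in Perl; the Python version below is a minimal illustration under assumed parameters (5 bins, 1000 individuals, a 30% ceiling on the randomized fraction of offspring that move to a vicinal bin), not a reproduction of the authors' code.

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (bits) of a binned population."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def step(counts, move_pct, rng):
    """One generation: a randomized share of each bin's offspring moves to a
    vicinal (adjacent) bin; the total population stays constant."""
    new = counts[:]
    for i, c in enumerate(counts):
        moved = int(c * move_pct * rng.random())  # random factor mimicking variation
        if moved:
            j = max(0, min(len(counts) - 1, i + rng.choice((-1, 1))))
            new[i] -= moved
            new[j] += moved
    return new

rng = random.Random(0)
pop = [0, 0, 1000, 0, 0]                 # everyone starts in the middle bin
for _ in range(50):
    pop = step(pop, move_pct=0.3, rng=rng)
print(pop, round(shannon_entropy(pop), 3))
```

Starting from a single occupied bin (entropy 0), the random migration spreads individuals across bins, so the entropy rises toward its maximum of log2(5) ≈ 2.32 bits, illustrating how the random factor widens the range of entropy values the population can reach.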
Abstract:
In this paper we propose an innovative methodology for automated profiling of illicit tablets by their surface granularity, a feature previously unexamined for this purpose. We make use of the tiny inconsistencies at the tablet surface, referred to as speckles, to generate a quantitative granularity profile of tablets. Euclidean distance is used as a measurement of (dis)similarity between granularity profiles. The frequency of observed distances is then modelled by kernel density estimation in order to generalize the observations and to calculate likelihood ratios (LRs). The resulting LRs are used to evaluate the potential of granularity profiles to differentiate between same-batch and different-batch tablets. Furthermore, we use the LRs as a similarity metric to refine database queries. We are able to derive reliable LRs within a scope that represents the true evidential value of the granularity feature. These metrics are used to refine candidate hit-lists from a database containing physical features of illicit tablets. We observe improved or identical ranking of candidate tablets in 87.5% of cases when granularity is considered.
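The distance-then-KDE-then-LR pipeline described above can be sketched as follows. The two distance distributions are simulated (the paper's real same-batch and different-batch distances are not available here), and the bandwidth and distribution parameters are assumptions chosen only for illustration.

```python
import numpy as np

# Simulated Euclidean distances between granularity profiles:
# one sample of same-batch comparisons, one of different-batch comparisons.
rng = np.random.default_rng(1)
same_batch = rng.normal(1.0, 0.3, 200)   # same-batch pairs tend to be close
diff_batch = rng.normal(3.0, 0.8, 200)   # different-batch pairs tend to be far

def kde(samples, x, bandwidth=0.2):
    """Gaussian kernel density estimate of the distance distribution at x."""
    z = (x - samples) / bandwidth
    return np.exp(-0.5 * z * z).sum() / (len(samples) * bandwidth * np.sqrt(2 * np.pi))

def likelihood_ratio(distance):
    """LR = p(distance | same batch) / p(distance | different batches)."""
    return kde(same_batch, distance) / kde(diff_batch, distance)

print(likelihood_ratio(1.0))   # small distance: LR > 1, supports same batch
print(likelihood_ratio(3.5))   # large distance: LR < 1, supports different batches
```

Candidate hit-lists can then be re-ranked by sorting database entries on their LR against the questioned tablet instead of on raw distance, which is the refinement step the abstract reports.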