66 results for analytical techniques
Abstract:
"How old is this fingermark?" This question is relatively often raised in trials or during investigations, when suspects admit that they left their fingermarks at a crime scene but allege that the contact occurred at a time different from that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the forensic community as a whole. Nevertheless, the review of past American cases conducted in this research showed that experts do give courtroom testimony about the age of fingermarks, even though it is mostly based on subjective and poorly documented parameters.
It was relatively easy to access fully documented American cases, which explains the origin of the given examples. However, fingermark dating issues are encountered worldwide, and the lack of consensus among the answers given highlights the need to conduct research on the subject. The present work thus aims at studying the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, different attempts have already been described in the literature. This research proposes a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in practice. Nevertheless, some approaches based on the evolution over time of intrinsic compounds detected in fingermark residue appear promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was chosen to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of the fingermark residue was significantly lower than the inter-variability, but that both kinds of variability could be reduced using different statistical pre-treatments inspired by the field of drug profiling. It was also possible to propose an objective donor classification model allowing donors to be grouped into two main classes based on the initial lipid composition of their fingermarks. These classes correspond to what is rather subjectively called "good" or "bad" donors. The potential of such a model is high for the fingermark research field, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue was conducted. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing), but that some influence factors affected these models more than others. In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other factors tested (deposition moment, pressure, temperature and illumination) also affected the residue and its aging, but models combining different values of these factors still proved robust in well-defined situations. Furthermore, test-fingermarks were analysed by GC/MS in order to be dated using some of the generated models. Correct estimations were obtained for more than 60% of the dated test-fingermarks, and for up to 100% when the storage conditions were known. These results are interesting, but further research should be conducted to evaluate whether these models could be used under uncontrolled casework conditions.
From a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, the ability of this technique to highlight influence factors and aging effects over large fingermark areas was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use for studying natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, the technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed the proposition and discussion of a formal and pragmatic framework for approaching fingermark dating questions. This framework identifies the type of information the scientist is currently able to provide to investigators and/or the courts. Furthermore, it also describes the different iterative development steps that research should follow in order to achieve the validation of an objective fingermark dating methodology whose capacities and limits are known and properly documented.
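To illustrate the kind of statistical pre-treatment and objective two-class donor grouping described above, the following minimal Python sketch applies a normalization and log pre-treatment (common in drug profiling) followed by a two-cluster grouping; the data, compound count and clustering choice are hypothetical and do not reproduce the exact procedure of the thesis.

```python
# Illustrative sketch only: groups fingermark donors into two classes ("good"/"poor"
# lipid donors) from GC/MS peak areas, using pre-treatments common in drug profiling.
# Data, compound set and the clustering choice are assumptions, not the thesis method.
import numpy as np
from sklearn.cluster import KMeans

# rows = donors, columns = target lipid peak areas (hypothetical values)
peak_areas = np.array([
    [1200., 340., 80., 15.],
    [1100., 300., 90., 12.],
    [300.,  90.,  20., 4.],
    [280.,  70.,  25., 3.],
])

# pre-treatment: normalise each profile to its sum, then log-transform
relative = peak_areas / peak_areas.sum(axis=1, keepdims=True)
pretreated = np.log10(relative + 1e-6)

# objective two-class grouping of donors based on initial composition
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pretreated)
print(labels)  # e.g. [0 0 1 1] -> two donor classes
```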
Abstract:
Since the first anti-doping tests in the 1960s, the analytical aspects of testing have remained challenging. The evolution of the analytical process in doping control is discussed in this paper, with particular emphasis on separation techniques such as gas chromatography and liquid chromatography. These approaches are improving in parallel with the requirements of increasing sensitivity and selectivity for detecting prohibited substances in biological samples from athletes. Moreover, fast analyses are mandatory to deal with the growing number of doping control samples and the short response times required during particular sport events. Recent developments in mass spectrometry and the expansion of accurate mass determination have improved anti-doping strategies by making it possible to use elemental composition and isotope patterns for structural identification. These techniques must be able to distinguish unequivocally between negative and suspicious samples, with no false-negative or false-positive results. Therefore, a high degree of reliability must be reached for the identification of the major metabolites corresponding to suspected analytes. In line with current trends in the pharmaceutical industry, the analysis of proteins and peptides remains an important issue in doping control, and sophisticated analytical tools are still needed to improve their distinction from endogenous analogs. Finally, indirect approaches are discussed in the context of anti-doping, in which recent advances aim to examine the biological response to a doping agent in a holistic way.
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to the risk assessment of ENMs, which encompass the key parameters for characterising ENMs, appropriate methods of analysis and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Due to the high batch-to-batch variability in the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) studies or risk assessment-based studies, widespread availability (and thus high expected volumes of use) or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore, source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data from the material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally, these methods are only able to determine one single characteristic, and some of them can be rather expensive. In practice, it is currently not feasible to fully characterise ENMs. Many techniques available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics that are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that exchange of protocols take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with the current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
Propane can be responsible for several types of lethal intoxication and for explosions. Quantifying it would, in some cases, be very helpful for determining the cause of death. Some gas chromatography-mass spectrometry (GC-MS) methods for propane measurement already exist. The main drawback of the GC-MS methods described in the literature is the absence of a specific propane internal standard, which is necessary for accurate quantitative analysis. The main outcome of the present study is an innovative headspace GC-MS (HS-GC-MS) method applicable to the routine determination of propane concentrations in forensic toxicology laboratories. To date, no stable isotope-labelled propane is commercially available; the development of an in situ generation of the standard is thus presented. An isotope-labelled internal standard gas (C3DH7) is generated in situ by the stoichiometric formation of deuterated propane through the reaction of deuterated water (D2O) with the Grignard reagent propylmagnesium chloride (C3H7MgCl). The method uses this internal standard to quantify propane concentrations and therefore to obtain precise measurements. Consequently, a complete validation with an accuracy profile according to two different guidelines, those of the French Society of Pharmaceutical Sciences and Techniques (SFSTP) and the Gesellschaft für Toxikologische und Forensische Chemie (GTFCh), is presented.
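For illustration, internal-standard quantification of this kind typically interpolates the analyte/internal-standard peak-area ratio on a calibration line built from spiked standards; the sketch below uses hypothetical values and does not reproduce the validated parameters of the study.

```python
# Generic internal-standard quantification sketch (hypothetical values, not the
# validated HS-GC-MS parameters of the study): the analyte/IS peak-area ratio is
# interpolated on a calibration line built from spiked standards.
import numpy as np

# calibration: known propane concentrations (mg/L) vs. area ratio analyte/IS
cal_conc  = np.array([5., 10., 25., 50., 100.])
cal_ratio = np.array([0.11, 0.21, 0.52, 1.05, 2.08])
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

# unknown sample: measured peak areas for propane and the deuterated IS
area_propane, area_is = 15400.0, 21000.0
ratio = area_propane / area_is
concentration = (ratio - intercept) / slope
print(f"estimated propane concentration: {concentration:.1f} mg/L")
```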
Abstract:
The fight against doping in sports has been governed since 1999 by the World Anti-Doping Agency (WADA), an independent institution behind the implementation of the World Anti-Doping Code (Code). The intent of the Code is to protect clean athletes through the harmonization of anti-doping programs at the international level, with special attention to the detection, deterrence and prevention of doping. A new version of the Code came into force on January 1st 2015, introducing, among other improvements, longer periods of sanctioning for athletes (up to four years) and measures to strengthen the role of anti-doping investigations and intelligence. To ensure optimal harmonization, five International Standards covering different technical aspects of the Code are also currently in force: the List of Prohibited Substances and Methods (List), Testing and Investigations, Laboratories, Therapeutic Use Exemptions (TUE) and Protection of Privacy and Personal Information. Adherence to these standards is mandatory for all anti-doping stakeholders to be compliant with the Code. Among these documents, the eighth version of the International Standard for Laboratories (ISL), which also came into effect on January 1st 2015, includes regulations for WADA and ISO/IEC 17025 accreditations and their application to urine and blood sample analysis by anti-doping laboratories. Specific requirements are also described in several Technical Documents or Guidelines in which various topics are highlighted, such as the identification criteria for gas chromatography (GC) and liquid chromatography (LC) coupled to mass spectrometry (MS) techniques (IDCR), the measurement and reporting of endogenous anabolic androgenic steroids (EAAS) and the analytical requirements for the Athlete Biological Passport (ABP).
Abstract:
The potential and applicability of UHPSFC-MS/MS for anti-doping screening in urine samples were tested for the first time. For this purpose, a group of 110 doping agents with diverse physicochemical properties was analyzed using two separation techniques, namely UHPLC-MS/MS and UHPSFC-MS/MS, in both ESI+ and ESI- modes. The two approaches were compared in terms of selectivity, sensitivity, linearity and matrix effects. As expected, very diverse retentions and selectivities were obtained in UHPLC and UHPSFC, proving a good complementarity of these analytical strategies. Under both conditions, acceptable peak shapes and MS detection capabilities were obtained within a 7 min analysis time, enabling the application of both methods for screening purposes. Method sensitivity was found to be comparable for 46% of the tested compounds, while higher sensitivity was observed for 21% of the tested compounds in UHPLC-MS/MS and for 32% in UHPSFC-MS/MS. The latter demonstrated a lower susceptibility to matrix effects, which were mostly observed as signal suppression. In the case of UHPLC-MS/MS, more serious matrix effects were observed, typically leading to signal enhancement; the matrix effect was also concentration-dependent, i.e., more significant matrix effects occurred at the lowest concentrations.
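As a reminder of how matrix effects are commonly expressed (signal in matrix versus signal in neat solvent), the short sketch below shows the generic calculation; it is an assumption, not necessarily the exact protocol used in the study.

```python
# Sketch of a common way to express matrix effects (post-extraction spike vs. neat
# standard); the formula is generic and the values below are illustrative only.
def matrix_effect_percent(area_in_matrix: float, area_in_solvent: float) -> float:
    """>0 means signal enhancement, <0 means signal suppression."""
    return (area_in_matrix / area_in_solvent - 1.0) * 100.0

print(matrix_effect_percent(8200.0, 10000.0))   # -18.0 -> suppression (UHPSFC-like)
print(matrix_effect_percent(12500.0, 10000.0))  #  25.0 -> enhancement (UHPLC-like)
```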
Abstract:
PURPOSE: The Cancer Vaccine Consortium of the Cancer Research Institute (CVC-CRI) conducted a multicenter HLA-peptide multimer proficiency panel (MPP) with a group of 27 laboratories to assess the performance of the assay. EXPERIMENTAL DESIGN: Participants used commercially available HLA-peptide multimers and a well-characterized common source of peripheral blood mononuclear cells (PBMC). The frequency of CD8+ T cells specific for two HLA-A2-restricted model antigens was measured by flow cytometry. The panel design allowed participants to use their preferred staining reagents and locally established protocols for cell labeling, data acquisition and analysis. RESULTS: We observed significant differences across laboratories in both the performance characteristics of the assay and the reported frequencies of specific T cells. These results emphasize the need to identify the critical variables responsible for the observed variability in order to allow harmonization of the technique across institutions. CONCLUSIONS: Three key recommendations emerged that would likely reduce assay variability and thus move toward harmonization of this assay: (1) use of more than two colors for staining, (2) collection of at least 100,000 CD8+ T cells, and (3) use of a background control sample to appropriately set the analytical gates. We also provide more insight into the limitations of the assay and identify additional protocol steps that potentially impact the quality of the data generated and therefore should serve as primary targets for systematic analysis in future panels. Finally, we propose initial guidelines for harmonizing assay performance, which include the introduction of standard operating protocols to allow for adequate training of technical staff and auditing of test analysis procedures.
Abstract:
Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Some gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been previously described. The main drawbacks of these methods are a lack of sensitivity for forensic applications and an inability to quantitatively determine the concentration of the gas present. The following study provides a validated HS-GC-MS method that incorporates hydrogen sulfide as a suitable internal standard, allowing the quantification of nitrous oxide. Upon analysis, the sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, thereby providing rapid data collection whilst preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, indicating the concentrations found in a mono-intoxication.
Abstract:
A generic optical biosensing strategy was developed that relies on the absorbance enhancement phenomenon occurring in a multiple-scattering matrix. Experimentally, inserts made of glass fiber membrane were placed into microplate wells in order to significantly lengthen the trajectory of the incident light through the sample and therefore increase the corresponding absorbance. The enhancement factor was calculated by comparing the absorbance values measured for a given amount of dye with and without the absorbance-enhancing inserts in the wells. Moreover, the dilution of the dye in solutions with different refractive indices (RI) clearly revealed that the enhancement factor increased with the ΔRI between the membrane and the surrounding medium, reaching a maximum value (EF > 25) when the membranes were dried. On this basis, two H2O2-biosensing systems were developed, based on the biofunctionalization of the glass fiber inserts with either cytochrome c or horseradish peroxidase (HRP), and their analytical performances were systematically compared with those of the corresponding bioassays in solution. The efficiency of the absorbance-enhancement approach was particularly clear in the case of the cytochrome c-based biosensor, with a 40-fold sensitivity gain and a wider dynamic range. The developed strategy therefore represents a promising way to convert standard colorimetric bioassays into optical biosensors with improved sensitivity.
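A minimal sketch of the enhancement-factor calculation described above (absorbance measured with the insert divided by absorbance measured without it); the numerical values are hypothetical.

```python
# Minimal sketch of the enhancement-factor computation described above: the same dye
# amount is read with and without the scattering insert (values are hypothetical).
def enhancement_factor(abs_with_insert: float, abs_without_insert: float) -> float:
    return abs_with_insert / abs_without_insert

print(enhancement_factor(0.52, 0.02))  # EF = 26, consistent with the reported EF > 25
```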
Abstract:
This work focuses on the development of a methodology for using the chemical characteristics of tire traces to help answer the following question: "Is the offending tire at the origin of the trace found at the crime scene?". This methodology covers everything from sampling the trace on the road to the statistical analysis of its chemical characteristics. Knowledge about the composition and manufacture of tire treads, as well as a review of the instrumental techniques used for the analysis of polymeric materials, led to the selection of pyrolysis coupled to a gas chromatograph with a mass spectrometry detector (Py-GC/MS) as the analytical technique for this research. An analytical method was developed and optimized to obtain the lowest variability between replicates of the same sample. The within-variability of the tread was evaluated across its width and circumference using several samples taken from twelve tires of different brands and/or models. The variability within each tread (within-variability) and between treads (between-variability) could thus be quantified. Different statistical methods showed that the within-variability is lower than the between-variability, which made it possible to differentiate these tires. Ten tire traces were produced with tires of different brands and/or models by means of braking tests. These traces were adequately sampled using sheets of gelatine. Particles of each trace were analysed using the same methodology as for the tires at their origin. The general chemical profile of a trace or of a tire was characterized by eighty-six compounds. Based on a statistical comparison of the chemical profiles obtained, it was shown that a tire trace is not differentiable from the tire at its origin but is generally differentiable from tires that are not at its origin. Thereafter, a sample of sixty tires was analysed to assess the discrimination potential of the developed methodology. The statistical results showed that most tires of different brands and models are differentiable, so the developed methodology shows good discrimination potential. However, tires of the same brand and model with identical characteristics, such as country of manufacture, size and DOT number, are not differentiable. A model based on a likelihood ratio approach was chosen to evaluate the results of the comparisons between the chemical profiles of the traces and the tires. The methodology developed was finally blind-tested using three simulated scenarios. Each scenario involved a trace from an unknown tire as well as two tires possibly at its origin. The correct results obtained for the three scenarios served to validate the developed methodology. The different steps of this work provided the information required to test and validate the underlying assumption that it is possible to help determine whether an offending tire is or is not at the origin of a trace by means of a statistical comparison of their chemical profiles. This aid was formalized by a measure of the probative value of the evidence, which is represented by the chemical profile of the tire trace.
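The evaluative step based on a likelihood ratio can be illustrated with a toy score-based model; the distributions and numbers below are hypothetical and do not reproduce the evaluation model actually developed in this work.

```python
# Toy likelihood-ratio sketch for the evaluative step described above: a comparison
# score between trace and tire chemical profiles is evaluated under Hp (same source)
# and Hd (different source). Distributions and numbers are hypothetical.
from math import exp, pi, sqrt

def normal_pdf(x: float, mean: float, std: float) -> float:
    return exp(-0.5 * ((x - mean) / std) ** 2) / (std * sqrt(2.0 * pi))

score = 0.92                      # e.g. correlation between the two chemical profiles
lr = normal_pdf(score, 0.95, 0.03) / normal_pdf(score, 0.60, 0.15)
print(f"LR = {lr:.1f}")           # LR > 1 supports Hp over Hd, LR < 1 the opposite
```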
Abstract:
An active, solvent-free solid sampler was developed for the collection of 1,6-hexamethylene diisocyanate (HDI) aerosol and prepolymers. The sampler was made of a filter impregnated with 1-(2-methoxyphenyl)piperazine contained in a filter holder. Interferences with HDI were observed when a set of cellulose acetate filters and a polystyrene filter holder were used; a glass fiber filter and a polypropylene filter cassette gave better results. The applicability of the sampling and analytical procedure was validated with a test chamber constructed for the dynamic generation of HDI aerosol and prepolymers in commercial two-component spray paints (Desmodur® N75) used in car refinishing. The particle size distribution, temporal stability, and spatial uniformity of the simulated aerosol were established in order to test the sampler. The monitoring of aerosol concentrations was conducted with the solid sampler paired with the reference impinger technique (impinger flasks containing 10 mL of 0.5 mg/mL 1-(2-methoxyphenyl)piperazine in toluene) under a controlled atmosphere in the test chamber. Analyses of derivatized HDI and prepolymers were carried out using high-performance liquid chromatography with ultraviolet detection. The correlation between the solvent-free and impinger techniques appeared fairly good (Y = 0.979X - 0.161; R = 0.978) when the tests were conducted in the range of 0.1 to 10 times the threshold limit value (TLV) for HDI monomer and up to 60 µg/m3 (3 U.K. TLVs) for total -N=C=O groups.
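The reported comparison line (Y = 0.979X - 0.161; R = 0.978) is the kind of result obtained by regressing paired sampler/impinger measurements; the sketch below shows the calculation with illustrative data only, not the study's actual measurements.

```python
# Sketch of how a sampler-vs-impinger comparison line (Y = aX + b, with R) is
# obtained from paired measurements; the data points below are illustrative only.
import numpy as np

impinger     = np.array([0.5, 1.2, 2.4, 5.1, 9.8])   # reference technique (X)
solvent_free = np.array([0.4, 1.0, 2.2, 4.9, 9.5])   # solvent-free sampler (Y)

slope, intercept = np.polyfit(impinger, solvent_free, 1)
r = np.corrcoef(impinger, solvent_free)[0, 1]
print(f"Y = {slope:.3f}X + {intercept:.3f}, R = {r:.3f}")
```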
Abstract:
Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. For quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in chiral (R,S)-methadone plasma quantification is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process was established over 52 series of routine analyses, using both the intermediate precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within the ±15% variability around the target value, and mainly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated during method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
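The routine control-chart check described above can be sketched as follows, using the ±15% FDA criterion and a two-standard-deviation limit; the nominal value, standard deviation and QC results are hypothetical.

```python
# Sketch of the routine QC check described above: a QC result is accepted if it lies
# within +/-15% of the nominal value and is flagged against a +/-2 SD control-chart
# limit. Nominal value, SD and measurements are hypothetical.
nominal, sd_intermediate = 200.0, 8.0          # ng/mL, from method validation
qc_results = [195.0, 212.0, 176.0, 231.0]      # ng/mL, routine series

for qc in qc_results:
    within_15pct = abs(qc - nominal) / nominal <= 0.15
    within_2sd = abs(qc - nominal) <= 2 * sd_intermediate
    print(f"{qc:6.1f} ng/mL  FDA +/-15%: {within_15pct}  control chart 2SD: {within_2sd}")
```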