994 results for Chromatography techniques
Abstract:
Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on the coronary artery disease status of patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images, affecting the diagnostic information provided. This thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and new tissue information. 
The proposed approaches help advance coronary magnetic resonance angiography (MRA) in the direction of an easy-to-use, multipurpose tool that can be translated to the clinical environment. The first part of the thesis focused on the study of coronary artery motion through the gold-standard imaging technique (x-ray angiography) in patients, in order to measure the precision with which the coronary arteries assume the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur in peak systole and in mid-diastole, and we responded with a new pulse sequence (T2-post) that is able to provide peak-systolic imaging. This sequence was tested in healthy volunteers and, from the image quality comparison, we learned that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR). The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires the data continuously (free-running), instead of being triggered, thus increasing the efficiency of the acquisition and providing four-dimensional (4D) images of the whole heart, while respiratory self-navigation allows for the scan to be performed in free breathing. This enabling technology allows for anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle-based 3D radial trajectory, which allows for continuous sampling of k-space and a retrospective selection of the timing parameters of the reconstructed dataset. 
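The golden-angle idea can be illustrated with a small sketch. The following is a generic spiral-phyllotaxis-style construction of 3D radial readout directions, not the exact trajectory implemented in the thesis; all parameters are illustrative:

```python
import math

def golden_angle_spokes_3d(n_spokes):
    """Generate unit-vector readout directions for a 3D radial
    acquisition using a golden-angle (spiral phyllotaxis) pattern:
    the polar coordinate sweeps the sphere uniformly in cos(theta)
    while successive spokes advance in azimuth by the golden angle."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))  # ~137.51 degrees
    spokes = []
    for i in range(n_spokes):
        z = 1.0 - 2.0 * (i + 0.5) / n_spokes   # uniform in cos(theta)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        phi = i * golden_angle
        spokes.append((r * math.cos(phi), r * math.sin(phi), z))
    return spokes

spokes = golden_angle_spokes_3d(1000)
# Each direction is a unit vector; retrospective binning simply selects
# the subset of spokes whose acquisition times fall in a given cardiac phase.
norms = [math.sqrt(x * x + y * y + z * z) for x, y, z in spokes]
print(min(norms), max(norms))  # both ~1.0
```

Because successive spokes never repeat an orientation, any temporally contiguous subset (e.g. all spokes falling in one cardiac phase) still samples k-space roughly uniformly, which is what makes the retrospective selection of timing parameters possible.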
The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset, while at the same time increasing the overall image quality by removing undersampling artifacts. The obtained 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow for the assessment of cardiac function with a single free-breathing scan. The quality of the coronary arteries provided by the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched that measured with gold-standard stacks of 2D cine acquisitions. Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows for the detection of coronary artery plaque calcification ex vivo, since it is able to detect the short-T2 components of the calcification. Cardiac motion, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated 3D radial triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are simultaneously acquired to extract the short-T2 components of the calcification, while a water-fat separation technique allows for proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence showed great potential for the in vivo visualization of coronary artery calcification. In conclusion, the thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. 
These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization for coronary artery plaques.
Abstract:
Prediction filters are well-known models for signal estimation in communications, control and many other areas. The classical method for deriving linear prediction coding (LPC) filters is often based on the minimization of a mean square error (MSE). Consequently, only second-order statistics are required, but the estimation is optimal only if the residue is independent and identically distributed (iid) Gaussian. In this paper, we derive the ML estimate of the prediction filter. Relationships with robust estimation of auto-regressive (AR) processes, with blind deconvolution and with source separation based on mutual information minimization are then detailed. The algorithm, based on the minimization of a high-order statistics criterion, uses on-line estimation of the residue statistics. Experimental results emphasize the interest of this approach.
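For reference, the classical MSE/second-order-statistics solution that the paper takes as its starting point can be computed with the Levinson-Durbin recursion on the autocorrelation sequence. A minimal sketch (the paper's ML, high-order-statistics refinement is not reproduced here):

```python
import random

def autocorr(x, maxlag):
    """Biased sample autocorrelation r[0..maxlag]."""
    n = len(x)
    return [sum(x[t] * x[t - k] for t in range(k, n)) for k in range(maxlag + 1)]

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations for the LPC coefficients via
    the Levinson-Durbin recursion. Returns (a, e): the prediction
    error filter a[0..order] with a[0] = 1, and the final prediction
    error power e."""
    a = [1.0] + [0.0] * order
    e = r[0]
    for m in range(1, order + 1):
        acc = r[m] + sum(a[k] * r[m - k] for k in range(1, m))
        k_m = -acc / e                      # reflection coefficient
        new_a = a[:]
        for k in range(1, m):
            new_a[k] = a[k] + k_m * a[m - k]
        new_a[m] = k_m
        a = new_a
        e *= (1.0 - k_m * k_m)
    return a, e

# Example: estimate an AR(1) process x[t] = 0.9 x[t-1] + w[t]
random.seed(0)
x, prev = [], 0.0
for _ in range(20000):
    prev = 0.9 * prev + random.gauss(0.0, 1.0)
    x.append(prev)
r = autocorr(x, 1)
a, e = levinson_durbin(r, 1)
print(a[1])  # close to -0.9 (predictor x_hat[t] = -a[1] * x[t-1])
```

The ML approach discussed in the paper replaces the quadratic MSE cost with a criterion adapted to the actual (possibly non-Gaussian) residue distribution, estimated on-line.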
Abstract:
This paper analyzes applications of cumulant analysis in speech processing. A special focus is placed on different second-order statistics. A dominant role is played by an integral representation of cumulants in terms of integrals involving cyclic products of kernels.
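As a reminder of the quantities involved, the first four cumulants of a sample can be estimated from its central moments; the sketch below is a generic illustration and does not reproduce the paper's kernel-integral representation:

```python
import random

def sample_cumulants(x):
    """Estimate the first four cumulants from central moments:
    k1 = mean, k2 = m2, k3 = m3, k4 = m4 - 3*m2**2.
    For Gaussian data k3 and k4 vanish, which is why higher-order
    cumulants isolate non-Gaussian structure in a signal."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return mean, m2, m3, m4 - 3.0 * m2 * m2

random.seed(1)
g = [random.gauss(0.0, 2.0) for _ in range(100000)]
k1, k2, k3, k4 = sample_cumulants(g)
print(k2)        # close to the variance, 4.0
print(k3, k4)    # both close to 0 for Gaussian data
```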
Abstract:
The present study evaluates the performance of four methods for estimating the regression coefficients used to make statistical decisions about intervention effectiveness in single-case designs. Ordinary least squares estimation is compared to two correction techniques dealing with general trend and one eliminating autocorrelation whenever it is present. Type I error rates and statistical power are studied for experimental conditions defined by the presence or absence of treatment effect (change in level or in slope), general trend, and serial dependence. The results show that empirical Type I error rates do not approximate the nominal ones in the presence of autocorrelation or general trend when ordinary and generalized least squares are applied. The techniques controlling for trend show lower false-alarm rates, but prove to be insufficiently sensitive to existing treatment effects. Consequently, using the statistical significance of the regression coefficients to detect treatment effects is not recommended for short data series.
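One standard autocorrelation correction of the kind compared in the study is Cochrane-Orcutt quasi-differencing; a minimal sketch on simulated single-case-style data (all numbers illustrative, not the study's conditions):

```python
import random

def ols(x, y):
    """Ordinary least squares fit y = b0 + b1*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx
    return my - b1 * mx, b1

def cochrane_orcutt(x, y):
    """One Cochrane-Orcutt iteration: estimate the lag-1
    autocorrelation rho from the OLS residuals, quasi-difference the
    series, and refit. This removes most of the serial dependence
    that distorts Type I error rates when plain OLS is applied to
    autocorrelated data."""
    b0, b1 = ols(x, y)
    res = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    num = sum(res[t] * res[t - 1] for t in range(1, len(res)))
    rho = num / sum(r * r for r in res)
    xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
    ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
    a0, a1 = ols(xs, ys)
    return a0 / (1.0 - rho), a1, rho  # back-transform the intercept

# Simulated short series: level 5, slope 0.2, AR(1) errors (rho = 0.6)
random.seed(2)
t = list(range(60))
err, prev = [], 0.0
for _ in t:
    prev = 0.6 * prev + random.gauss(0.0, 1.0)
    err.append(prev)
y = [5.0 + 0.2 * ti + e for ti, e in zip(t, err)]
b0, b1, rho = cochrane_orcutt(t, y)
print(b1, rho)  # slope near 0.2, rho near 0.6
```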
Abstract:
A headspace-gas chromatography-tandem mass spectrometry (HS-GC-MS/MS) method for the trace measurement of perfluorocarbon compounds (PFCs) in blood was developed. Because of the oxygen-carrying capabilities of PFCs, their misuse in sports doping is suspected. The study was therefore extended to validate the method for F-tert-butylcyclohexane (Oxycyte(®)), perfluoro(methyldecalin) (PFMD) and perfluorodecalin (PFD). The limit of detection was established at 1.2 µg/mL blood for F-tert-butylcyclohexane, 4.9 µg/mL blood for PFMD and 9.6 µg/mL blood for PFD. The limit of quantification was estimated to be 12 µg/mL blood (F-tert-butylcyclohexane), 48 µg/mL blood (PFMD) and 96 µg/mL blood (PFD). The HS-GC-MS/MS technique thus allows detection at levels 1000 to 10,000 times lower than the dose estimated to be required to produce a biological effect for the investigated PFCs. This technique could therefore be used to identify PFC misuse several hours, possibly days, after the injection or the sporting event. Clinical trials with these compounds are still required to evaluate the validation parameters against the calculated estimations.
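Detection and quantification limits like those reported above are commonly derived from a calibration curve via the 3.3σ/slope and 10σ/slope convention; a minimal sketch with hypothetical calibration data (not the paper's values):

```python
def fit_line(x, y):
    """Least-squares calibration line y = b0 + b1*x, returning the
    intercept, slope and the residual standard deviation s_res."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    b0 = my - b1 * mx
    res = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    s_res = (sum(r * r for r in res) / (n - 2)) ** 0.5
    return b0, b1, s_res

def lod_loq(x, y):
    """ICH-style estimates: LOD = 3.3*s/slope, LOQ = 10*s/slope."""
    _, slope, s = fit_line(x, y)
    return 3.3 * s / slope, 10.0 * s / slope

# Hypothetical spiked-blood calibration (conc in ug/mL vs. peak area)
conc = [0.0, 2.0, 5.0, 10.0, 20.0, 50.0]
area = [1.0, 42.0, 98.0, 205.0, 399.0, 1003.0]
lod, loq = lod_loq(conc, area)
print(lod, loq)
```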
Abstract:
To date, no effective chiral capillary electrophoresis-mass spectrometry (CE-MS) method has been reported for the simultaneous enantioseparation of the antidepressant drug venlafaxine (VX) and its structurally similar major metabolite, O-desmethylvenlafaxine (O-DVX). This is mainly due to the difficulty of identifying an MS-compatible chiral selector that provides both high enantioselectivity and sensitive MS detection. In this work, poly-sodium N-undecenoyl-L,L-leucylalaninate (poly-L,L-SULA) was employed as the chiral selector after screening several dipeptide polymeric chiral surfactants. Baseline separation of both O-DVX and VX enantiomers was achieved in 15 min after optimizing the buffer pH, poly-L,L-SULA concentration, nebulizer pressure and separation voltage. Calibration curves in spiked plasma (recoveries higher than 80%) were linear over the concentration range 150-5000 ng/mL for both VX and O-DVX. The limit of detection (LOD) was as low as 30 ng/mL and 21 ng/mL for O-DVX and VX, respectively. The method was successfully applied to measure plasma concentrations in human volunteers receiving VX or O-DVX orally, with and without co-administered indinavir therapy. The results suggest that micellar electrokinetic chromatography electrospray ionization-tandem mass spectrometry (MEKC-ESI-MS/MS) is an effective, low-cost alternative technique for pharmacokinetic and pharmacodynamic studies of both O-DVX and VX enantiomers. The technique has the potential to identify drug-drug interactions involving VX and O-DVX enantiomers during indinavir therapy.
Abstract:
Post-testicular sperm maturation occurs in the epididymis. The ion concentrations and proteins secreted into the epididymal lumen, together with testicular factors, are believed to be responsible for the maturation of spermatozoa. Disrupting the maturation of spermatozoa in the epididymis provides a promising strategy for generating a male contraceptive. However, little is known about the proteins involved. For drug development, it is also essential to have tools to study the function of these proteins in vitro. One approach for screening novel targets is to study the secretory products of the epididymis or the G protein-coupled receptors (GPCRs) that are involved in the maturation process of the spermatozoa. A modified Ca2+ imaging technique for monitoring release from PC12 pheochromocytoma cells can also be applied to monitor secretory products involved in the maturational processes of spermatozoa. PC12 pheochromocytoma cells were chosen for evaluation of this technique as they release catecholamines from their cell body, thus behaving like endocrine secretory cells. The results of the study demonstrate that depolarisation of nerve growth factor-differentiated PC12 cells releases factors which activate nearby, randomly distributed HEL erythroleukemia cells. Thus, during the release process, the ligands reach concentrations high enough to activate receptors even in cells some distance from the release site. This suggests that communication between randomly dispersed cells is possible even if the actual quantities of transmitter released are extremely small. The development of a novel method to analyse GPCR-dependent Ca2+ signalling in living slices of mouse caput epididymis provides an additional tool for screening for drug targets. With this technique it was possible to analyse functional GPCRs in the epithelial cells of the ductus epididymis. 
The results revealed that both P2X- and P2Y-type purinergic receptors are responsible for the rapid and transient Ca2+ signal detected in the epithelial cells of caput epididymides. Immunohistochemical and reverse transcriptase-polymerase chain reaction (RT-PCR) analyses showed the expression of at least P2X1, P2X2, P2X4 and P2X7, and P2Y1 and P2Y2 receptors in the epididymis. Searching for epididymis-specific promoters for transgene delivery into the epididymis is of key importance for the development of specific models for drug development. We used EGFP as the reporter gene to identify proper promoters for delivering transgenes into the epithelial cells of the mouse epididymis in vivo. Our results revealed that the 5.0 kb murine Glutathione peroxidase 5 (GPX5) promoter can be used to target transgene expression into the epididymis, while the 3.8 kb Cysteine-rich secretory protein-1 (CRISP-1) promoter can be used to target transgene expression into the testis. Although the visualisation of EGFP in living cells in culture usually poses few problems, the detection of EGFP in tissue sections can be more difficult because soluble EGFP molecules can be lost if the cell membrane is damaged by freezing, sectioning, or permeabilisation. Furthermore, the fluorescence of EGFP is dependent on its conformation. Therefore, fixation protocols that immobilise EGFP may also destroy its usefulness as a fluorescent reporter. We therefore developed novel tissue preparation and preservation techniques for EGFP. In addition, fluorescence spectrophotometry with epididymal epithelial cells in suspension revealed the expression of functional purinergic, adrenergic, cholinergic and bradykinin receptors in these cell lines (mE-Cap27 and mE-Cap28). In conclusion, we developed new tools for studying the role of the epididymis in sperm maturation. We developed a new technique to analyse GPCR-dependent Ca2+ signalling in living slices of mouse caput epididymis. 
In addition, we improved the method of detecting reporter gene expression. Furthermore, we characterised two epididymis-specific gene promoters, analysed the expression of GPCRs in epididymal epithelial cells and developed a novel technique for measurement of secretion from cells.
Abstract:
"How old is this fingermark?" This question is relatively often raised in trials when suspects admit that they have left their fingermarks at a crime scene but allege that the contact occurred at a time different from that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the forensic community as a whole. Nevertheless, a review of past American cases highlighted that experts have actually given testimony in court about the age of fingermarks, even though it was mostly based on subjective and poorly documented parameters. It was relatively easy to access fully described American cases, which explains the origin of the given examples. However, fingermark dating issues are encountered worldwide, and the lack of consensus among the answers given highlights the necessity of conducting research on the subject. The present work therefore aims at studying the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, different attempts have already been described in the literature. This research proposes a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice. 
Nevertheless, some approaches based on the evolution of intrinsic compounds detected in fingermark residue over time appear to be promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was chosen to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of the fingermark residue was significantly lower than the inter-variability, but that both kinds of variability could be reduced using different statistical pre-treatments inspired by the drug profiling field. It was also possible to propose an objective donor classification model allowing the grouping of donors into two main classes based on their initial lipid composition. These classes correspond to what is, relatively subjectively, called "good" or "bad" donors. The potential of such a model is high for the fingermark research field, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue was conducted. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing), but that some influence factors affected these models more than others. 
In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other tested factors (deposition moment, pressure, temperature and illumination) also affected the residue and its aging, but models combining different values of these factors still proved to be robust. Furthermore, test fingermarks were analysed with GC/MS in order to be dated using some of the generated models. It turned out that correct estimations were obtained for 60% of the dated test fingermarks, and up to 100% when the storage conditions were known. These results are interesting, but further research should be conducted to evaluate whether these models could be used in uncontrolled casework conditions. In a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, its ability to highlight influence factors and aging effects over large areas of fingermarks was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use to study natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, the technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed the proposition and discussion of a formal and pragmatic framework for approaching fingermark dating questions. This framework identifies which type of information the scientist is currently able to bring to investigators and/or the courts. 
Furthermore, this proposed framework also describes the different iterative development steps that the research should follow in order to achieve the validation of an objective fingermark dating methodology, whose capacities and limits are well known and properly documented.
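The donor-classification idea above can be caricatured in a few lines: sum-normalize each lipid profile (a pre-treatment borrowed from drug profiling, as in the thesis) and split donors into two classes with a simple two-means step. Data, feature choice and class boundary here are entirely hypothetical:

```python
def sum_normalize(profile):
    """Express each lipid as a relative proportion of the total,
    a pre-treatment that reduces intra-donor variability."""
    total = sum(profile)
    return [v / total for v in profile]

def two_means_1d(values, iters=50):
    """Tiny 2-means clustering on one feature: returns 0/1 labels
    splitting donors into two classes (e.g. the 'good' vs 'bad'
    lipid donors discussed above)."""
    c = [min(values), max(values)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - c[0]) <= abs(v - c[1]) else 1 for v in values]
        for j in (0, 1):
            members = [v for v, l in zip(values, labels) if l == j]
            if members:
                c[j] = sum(members) / len(members)
    return labels

print(sum_normalize([5.0, 3.0, 2.0]))  # relative proportions summing to 1

# Hypothetical donors: total lipid signal per fingermark (arbitrary units)
totals = [12.0, 14.5, 11.8, 13.2, 41.0, 38.5, 44.2, 40.1]
labels = two_means_1d(totals)
print(labels)  # low-signal donors in one class, high-signal in the other
```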
Abstract:
Chemical analysis is a well-established procedure for the provenancing of archaeological ceramics. Various analytical techniques are routinely used, and large amounts of data have been accumulated so far in data banks. However, in order to exchange results obtained by different laboratories, the respective analytical procedures need to be tested in terms of their inter-comparability. In this study, the schemes of analysis used in four laboratories that are routinely involved in archaeological pottery studies were compared. The techniques investigated were neutron activation analysis (NAA), X-ray fluorescence analysis (XRF), inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS). For this comparison, series of measurements on different geological standard reference materials (SRMs) were carried out and the results were statistically evaluated. An attempt was also made to establish calibration factors between pairs of analytical setups in order to smooth out the systematic differences among the results.
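A minimal sketch of the calibration-factor idea: for each element measured on a shared SRM, the ratio of mean results rescales one laboratory's data onto the other's scale. Numbers and element choices are hypothetical, not the study's data:

```python
def calibration_factors(lab_a, lab_b):
    """For each element measured on the same standard reference
    material, the factor mean(A)/mean(B) rescales lab B's results
    onto lab A's scale, smoothing out systematic inter-lab offsets."""
    factors = {}
    for element, a_vals in lab_a.items():
        b_vals = lab_b[element]
        factors[element] = (sum(a_vals) / len(a_vals)) / (sum(b_vals) / len(b_vals))
    return factors

def apply_factors(results, factors):
    """Rescale a single measurement dict with the per-element factors."""
    return {el: v * factors[el] for el, v in results.items()}

# Hypothetical replicate SRM measurements (ppm) from two setups
naa = {"Cr": [102.0, 98.0, 100.0], "Fe": [49.5, 50.5, 50.0]}
xrf = {"Cr": [91.0, 89.0, 90.0],   "Fe": [52.0, 53.0, 52.5]}
f = calibration_factors(naa, xrf)

sherd_xrf = {"Cr": 88.0, "Fe": 51.0}
corrected = apply_factors(sherd_xrf, f)
print(corrected)  # sherd values expressed on the NAA scale
```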
Abstract:
Thyroid fine-needle aspiration (FNA) cytology is a fast-growing field. One of the fastest-developing areas is molecular testing applied to cytological material. The patients who could benefit most from these tests are those diagnosed as 'indeterminate' on FNA. They could be better stratified in terms of malignancy risk and thus oriented with more confidence towards the appropriate management. Taking into consideration the need to improve and maintain the high yield of thyroid FNA, professionals from various fields (i.e. molecular biologists, endocrinologists, nuclear medicine physicians and radiologists) are refining and fine-tuning their diagnostic instruments. In particular, all these developments aim at increasing the negative predictive value of FNA to improve the selection of patients for diagnostic surgery. These advances involve terminology, the application of next-generation sequencing to thyroid FNA, the use of immunocyto- and histochemistry, the development of new sampling techniques and the increasing use of nuclear medicine as well as molecular imaging in the management of patients with a thyroid nodule. Herein, we review the recent advances in thyroid FNA cytology that could be of interest to the 'thyroid-care' community, with particular focus on the indeterminate diagnostic category.
Abstract:
Coherent anti-Stokes Raman scattering (CARS) is a powerful laser spectroscopy method in which significant successes have been achieved. However, the non-linear nature of CARS complicates the analysis of the measured spectra. The objective of this thesis is to develop a new phase retrieval algorithm for CARS. It utilizes the maximum entropy method and a new wavelet approach for spectroscopic background correction of the phase function. The method was designed to be easily automated and used on large numbers of spectra of different substances. The algorithm was successfully tested on experimental data.
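As a rough illustration of spectroscopic background correction, the sketch below uses a simple iterative smooth-and-clip baseline estimate. This is a deliberately simplified stand-in for the wavelet-based correction developed in the thesis, shown only to convey the idea of separating a slowly varying background from narrow spectral features:

```python
import math

def moving_average(y, half_width):
    """Edge-aware moving average over a (2*half_width + 1) window."""
    n = len(y)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out.append(sum(y[lo:hi]) / (hi - lo))
    return out

def estimate_background(y, half_width=15, iterations=30):
    """Iterative smooth-and-clip baseline estimation: at each pass the
    curve is smoothed and every point is replaced by the smaller of
    itself and the smoothed value, so narrow peaks are eroded while
    the slowly varying background survives."""
    base = y[:]
    for _ in range(iterations):
        smoothed = moving_average(base, half_width)
        base = [min(b, s) for b, s in zip(base, smoothed)]
    return base

# Synthetic "spectrum": broad background plus one narrow line at i = 250
n = 400
background = [5.0 + 2.0 * math.exp(-((i - 200) / 150.0) ** 2) for i in range(n)]
peak = [3.0 * math.exp(-((i - 250) / 4.0) ** 2) for i in range(n)]
spectrum = [b + p for b, p in zip(background, peak)]

base = estimate_background(spectrum)
corrected = [s - b for s, b in zip(spectrum, base)]
print(max(corrected))  # the narrow line survives; the background is removed
```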
Abstract:
The fight against doping in sports has been governed since 1999 by the World Anti-Doping Agency (WADA), an independent institution behind the implementation of the World Anti-Doping Code (Code). The intent of the Code is to protect clean athletes through the harmonization of anti-doping programs at the international level, with special attention to the detection, deterrence and prevention of doping. A new version of the Code came into force on January 1st, 2015, introducing, among other improvements, longer periods of sanctioning for athletes (up to four years) and measures to strengthen the role of anti-doping investigations and intelligence. To ensure optimal harmonization, five International Standards covering different technical aspects of the Code are also currently in force: the List of Prohibited Substances and Methods (List), Testing and Investigations, Laboratories, Therapeutic Use Exemptions (TUE), and Protection of Privacy and Personal Information. Adherence to these standards is mandatory for all anti-doping stakeholders to be compliant with the Code. Among these documents, the eighth version of the International Standard for Laboratories (ISL), which also came into effect on January 1st, 2015, includes regulations for WADA and ISO/IEC 17025 accreditations and their application to urine and blood sample analysis by anti-doping laboratories. Specific requirements are also described in several Technical Documents or Guidelines covering various topics, such as the identification criteria for gas chromatography (GC) and liquid chromatography (LC) coupled to mass spectrometry (MS) techniques (IDCR), measurement and reporting of endogenous anabolic androgenic steroids (EAAS) and analytical requirements for the Athlete Biological Passport (ABP).