49 results for Ultrasonic non-destructive testing
at Université de Lausanne, Switzerland
Abstract:
Recently, a number of cases of smuggling dissolved cocaine in wine bottles have been reported. The aim of the present study was to determine whether cocaine dissolved in wine can be detected by proton magnetic resonance spectroscopy (¹H MRS) on a standard clinical MR scanner, in intact (i.e. unopened) wine bottles. ¹H MRS experiments were performed with a 3 Tesla clinical scanner on wine phantoms with or without cocaine contamination. The aromatic protons of cocaine displayed resonance peaks in the 7-8 ppm region of the spectrum, where no overlapping resonances of wine were present. Additional cocaine resonances were detected in the 2-3 ppm region of the spectrum, between the resonances of ethanol and other wine constituents. Detection of cocaine in wine (at 5 mM, i.e. ∼1.5 g/L) was feasible in a scan time of 1 min. We conclude that dissolved cocaine can be detected in intact wine bottles, on a standard clinical MR scanner. Thus, ¹H MRS is the technique of choice to examine this type of suspicious cargo, since it allows for a non-destructive and rapid content characterization. Copyright © 2010 John Wiley & Sons, Ltd.
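As a point of reference for the 5 mM detection limit quoted above, the corresponding mass concentration follows directly from the molar mass of cocaine (about 303.35 g/mol); the short Python sketch below reproduces the ∼1.5 g/L figure. The function name and everything other than the figures quoted in the abstract are illustrative only.

# Convert a molar concentration to a mass concentration.
# Illustrative sketch; only the 5 mM / ~1.5 g/L figures come from the abstract.

COCAINE_MOLAR_MASS_G_PER_MOL = 303.35  # C17H21NO4

def molar_to_mass_concentration(concentration_mol_per_l: float,
                                molar_mass_g_per_mol: float) -> float:
    """Return the mass concentration in g/L."""
    return concentration_mol_per_l * molar_mass_g_per_mol

if __name__ == "__main__":
    c = molar_to_mass_concentration(5e-3, COCAINE_MOLAR_MASS_G_PER_MOL)
    print(f"5 mM cocaine ~ {c:.2f} g/L")  # ~1.52 g/L, i.e. the ~1.5 g/L quoted above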
Abstract:
ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico prediction of intestinal absorption and blood-brain barrier passage are also considered. This approach makes it possible to reduce the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
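The heuristic two-step strategy described above can be pictured as a simple decision rule: a cytotoxicity estimate from the 3T3 NRU assay proposes a class, which is then made more conservative when a neurotoxicity alert combined with predicted blood-brain barrier passage suggests systemic availability. The sketch below only illustrates that flow; the cut-off value, class labels and field names are hypothetical and are not taken from the ACuteTox paper.

# Hedged sketch of a two-step tiered classification strategy.
# All cut-off values and class labels below are hypothetical placeholders;
# the ACuteTox project defines its own strategies and thresholds.
from dataclasses import dataclass

@dataclass
class InVitroResult:
    nru_ic50_mg_per_l: float      # 3T3 neutral red uptake IC50
    neurotoxicity_alert: bool     # alert from rat brain aggregate cultures
    predicted_bbb_passage: bool   # in silico blood-brain barrier prediction

def classify(result: InVitroResult) -> str:
    # Step 1: provisional class from basal cytotoxicity (hypothetical cut-off).
    provisional = "not classified" if result.nru_ic50_mg_per_l > 1000 else "classified"
    # Step 2: refine with organ-specific and kinetic information.
    if provisional == "not classified" and result.neurotoxicity_alert and result.predicted_bbb_passage:
        return "classified"  # avoid a false 'not classified' (LD50 > 2000 mg/kg b.w.)
    return provisional

print(classify(InVitroResult(2500.0, True, True)))  # -> "classified"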
Abstract:
As part of the ACuteTox project aimed at the development of non-animal testing strategies for predicting human acute oral toxicity, aggregating brain cell cultures (AGGR) were examined for their capability to detect organ-specific toxicity. Previous multicenter evaluations of in vitro cytotoxicity showed that some 20% of the tested chemicals exhibited significantly lower in vitro toxicity than expected from in vivo toxicity data. This was presumed to be due to toxicity at supracellular (organ or system) levels. To examine the capability of AGGR to alert for potential organ-specific toxicants, concentration-response studies were carried out in AGGR for 86 chemicals, taking as endpoints the mRNA expression levels of four selected genes. The lowest observed effect concentration (LOEC) determined for each chemical was compared with the IC20 reported for the 3T3/NRU cytotoxicity assay. A LOEC lower than the IC20 by at least a factor of 5 was taken as an alert for organ-specific toxicity. The results showed that the frequency of alerts increased with the level of toxicity observed in AGGR. Among the chemicals identified as alerts were many compounds known for their organ-specific toxicity. These findings suggest that AGGR are suitable for the detection of organ-specific toxicity and that they could, in conjunction with the 3T3/NRU cytotoxicity assay, improve the predictive capacity of in vitro toxicity testing.
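The alert criterion used above (a LOEC at least 5-fold lower than the 3T3/NRU IC20) is easy to state as code; the sketch below assumes both concentrations are expressed in the same unit and uses invented example values.

# Alert rule from the abstract: LOEC (aggregating brain cell cultures)
# lower than the 3T3/NRU IC20 by at least a factor of 5.

def organ_specific_alert(loec: float, ic20: float, factor: float = 5.0) -> bool:
    """Return True if the chemical is flagged for potential organ-specific toxicity."""
    return loec * factor <= ic20

# Invented example values (mg/L):
print(organ_specific_alert(loec=2.0, ic20=15.0))  # True: the LOEC is more than 5x lower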
Abstract:
X-ray microtomography has become a new tool in the earth sciences for obtaining non-destructive 3D image data from geological objects in which variations in mineralogy, chemical composition and/or porosity create sufficient X-ray density contrasts. We present here the first, preliminary results of an application to the external and internal morphology of Permian to Recent Larger Foraminifera. We use a SkyScan-1072 high-resolution desktop micro-CT system. The system has a conical X-ray source with a spot size of about 5 µm that runs at 20-100 kV and 0-250 µA, resulting in a maximal resolution of 5 µm. X-ray transmission images are captured by a scintillator coupled via fibre optics to a 1024x1024 pixel 12-bit CCD. The object is placed between the X-ray source and the scintillator on a stub that rotates 360° around its vertical axis in steps as small as 0.24 degrees. Sample size is limited to 2 cm because of the X-ray absorption of geological material. The transmission images are back-projected using a Feldkamp algorithm into a vertical stack of up to 1000 1024x1024 images that represent horizontal cuts of the object. This calculation takes two to several hours on a dual-processor 2.4 GHz PC. The stack of images (.bmp) can be visualized with any 3D-imaging software and used to produce cuts of Larger Foraminifera. Among other applications, the 3D-imaging software furnished by SkyScan can produce 3D models by defining a threshold density value to distinguish "solid" from "void". Several models with variable threshold values and colors can be overlaid, rotated and cut together. The best results were obtained with microfossils devoid of chamber-filling cements (Permian, Eocene, Recent). However, even slight differences in cement mineralogy/composition can result in surprisingly good X-ray density contrasts. X-ray microtomography may develop into a powerful tool for larger microfossils with a complex internal structure, because it is non-destructive, requires no preparation of the specimens, and produces a true 3D image data set. We will use these data sets in the future to produce cuts in any direction and compare them with arbitrary cuts of complex microfossils in thin sections. Many groups of benthic and planktonic foraminifera may become more easily determinable in thin section in this way.
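The "solid versus void" segmentation mentioned above amounts to applying a grey-value threshold to the stack of reconstructed slices. The following sketch shows the idea with NumPy and Pillow; the file pattern, threshold value and voxel size are placeholders, not values from the SkyScan software.

# Hedged sketch: threshold a stack of reconstructed micro-CT slices (.bmp)
# into a binary solid/void volume. File names and threshold are illustrative.
import glob
import numpy as np
from PIL import Image

slice_files = sorted(glob.glob("recon/slice_*.bmp"))               # hypothetical file pattern
volume = np.stack([np.array(Image.open(f)) for f in slice_files])  # shape (n_slices, 1024, 1024)

THRESHOLD = 90              # grey value separating "solid" from "void" (placeholder)
solid = volume > THRESHOLD  # boolean 3D model

VOXEL_SIZE_UM = 5.0         # nominal resolution quoted above
solid_volume_um3 = solid.sum() * VOXEL_SIZE_UM ** 3
print(f"solid voxels: {solid.sum()}, volume: {solid_volume_um3:.3e} um^3")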
Abstract:
The pigments and plasters of the Roman frescoes discovered at the House of Diana (Cosa, Grosseto, Italy) were analysed using non-destructive and destructive mineralogical and chemical techniques. The characterization of both pigments and plasters was performed by optical microscopy, scanning electron microscopy and electron microprobe analysis. The pigments were identified by Raman spectroscopy and submitted to stable isotope analysis. The results were integrated with the archaeological data in order to reconstruct the provenance, trade patterns and use of the raw materials employed in producing the frescoes.
Abstract:
A transportable Raman spectrometer was tested for the detection of illicit drugs seized during border controls. In a first step, the analysis methodology was optimized using reference substances such as diacetylmorphine (heroin), cocaine and amphetamine (in powder or liquid form). Appropriate focusing distance and analysis times, the influence of daylight and artificial light sources, repeatability and limits of detection were studied. In a second step, the applicability and limitations of the technique for detecting illicit substances in different mixtures and containers were evaluated. Transportable Raman spectroscopy was found to be suitable for rapid screening of liquids and powders for the detection and identification of controlled substances. Additionally, it had the advantage over other portable techniques, such as ion mobility spectrometry, of being non-destructive and capable of rapid analysis of large quantities of substances through containers such as plastic bags and glass bottles.
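Identification against reference substances, as described above, is typically done by comparing the measured spectrum with a library of reference spectra; one common, simple score is the cosine (correlation) similarity between intensity vectors sampled on a common wavenumber axis. The sketch below illustrates that idea only; it is not the matching algorithm of the instrument used in the study, and the library contents are placeholders.

# Hedged sketch: rank library spectra by cosine similarity to a measured spectrum.
# Spectra are assumed to be resampled onto a common wavenumber axis beforehand.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_matches(measured: np.ndarray, library: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    scores = [(name, cosine_similarity(measured, ref)) for name, ref in library.items()]
    return sorted(scores, key=lambda t: t[1], reverse=True)

# Placeholder data: three fake reference spectra and a noisy "measurement".
rng = np.random.default_rng(0)
library = {name: rng.random(500) for name in ("cocaine", "heroin", "amphetamine")}
measured = library["cocaine"] + 0.05 * rng.random(500)
print(best_matches(measured, library)[0])  # -> ('cocaine', ~1.0)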
Abstract:
RÉSUMÉ Starting in spring 2004, the construction of a second metro line was undertaken in the city of Lausanne, Switzerland. Linking Ouchy, on the shore of Lake Geneva (alt. 373 m), to Epalinges (alt. 711 m), the new "M2" metro will cross the Lausanne conurbation from south to north over a distance of 6 km from 2008 onwards. Since the preliminary project in 1999, a large amount of geological data has been collected and numerous boreholes have been drilled on the site. This gave us a unique opportunity to undertake a detailed study of urban microgravimetry. The tunnelling method depends strongly on the materials to be excavated, and it is traditionally the task of the geologist, drawing on knowledge of the regional geology and of the borehole stratigraphy, to provide the engineer with a geological model. In this case, the model indicates the thickness of the loose deposits covering the rocky bedrock. The spatial representativeness of highly localized information, such as that from a borehole, becomes all the more problematic as the level of detail sought becomes finer. This is where geophysical prospecting, and gravimetry in particular, can provide complementary information that is decisive for regionalizing the pointwise borehole data. Microgravimetry in an urban environment requires careful correction of the gravity perturbations caused by topography, buildings and cellars in order to isolate the gravity effect due exclusively to the thickness of the loose-soil filling. Given the intensity of the topographic corrections in an urban environment, we paid particular attention to basements, whose gravity effects can reach the order of a tenth of a mGal. We therefore integrated these corrections into the topographic correction and treated the effects of the buildings independently. We included the road surface and the basements in the digital terrain model (DTM) in order to build an urban digital terrain model, for which we use the new acronym "MNTU". We propose to establish preliminary topographic correction maps, based on the data provided by the land register, by making assumptions about the depth of the basements and the height of the buildings. The two test zones chosen are characteristic of the different types of urbanization present in Lausanne and therefore proved very useful for developing an overall methodology for urban microgravimetry. The aim was to evaluate the thickness of the morainic filling over the molassic bedrock, which lies at a depth varying from a few metres to about thirty metres, and to establish a cross-section along the axis of the future metro line. The modelling results proved very convincing, detecting zones that differ appreciably from the geological model of the preliminary project. We also showed that the application of this non-destructive geophysical method can limit the number of mechanical boreholes required during the preliminary and final design phases, which can reduce both the costs and the disturbance caused by such surface work. The adaptability of the gravimetric technique allows it to be used in all phases of a civil engineering project such as the construction of an underground metro. KURZFASSUNG Since spring 2004, the new "M2" metro has been under construction in the city of Lausanne (Switzerland).
It will cross the Lausanne conurbation from south to north over a length of 6 km. The technical planning underlying the project foresees that the line will be located mainly in the Molasse. Since the preliminary design (1999), a large amount of geological data has been collected. This provided a unique opportunity to extend and complement the information from the many associated boreholes into a detailed microgravimetric study of the city of Lausanne. The goal was to estimate the thickness of the moraine deposits covering the Molasse in order to establish a corresponding geological profile along the future line. It was further to be shown that the application of this non-invasive geophysical method makes it possible to reduce the number of boreholes required both in the pilot phase and in the final project, which would contribute substantial financial savings in the execution of the work. The two test zones treated in this study are located in the northern part and in the city centre of Lausanne and are characterized by different types of urbanization. The bedrock lies at various depths, from a few metres to about thirty metres. These zones present all the difficulties of a densely built-up urban area with heavy traffic and were therefore decisive in the development of an overall microgravimetric methodology for the city of Lausanne. The technique developed in this way makes it possible to carefully correct the disturbing effects of topography, buildings, cellars and public infrastructure, so as to isolate the effects attributable exclusively to the thickness of the loose deposits. In view of the intensity of the topographic corrections in the urban area, particular importance was given to basements, since their gravity effects can reach about a tenth of a mGal. We therefore propose to produce preliminary topographic correction maps. These corrections are based on the data supplied by the cadastral plan and on assumptions about the depth of the basements and the height of the buildings. The availability of such a map before the actual gravimetric survey would allow the positions of the measuring stations to be chosen more effectively. We also saw that an a priori filter can be used when the shape and intensity of an anomaly can clearly be attributed to the corresponding building. This strategy must, however, be applied with caution, because if further anomalies are involved, significant shifts can arise through the superposition of the gravity effects of different structures. The modelling results proved very convincing, as they correctly identified previously unknown sensitive zones. The gravimetric technique developed in this work can be applied during all phases of a major construction project, such as the construction of an underground metro. ABSTRACT Since spring 2004 a new metro line has been under construction in the city of Lausanne in Switzerland. The new line, the M2, will be 6 km long and will traverse the city from south to north. The civil engineering project determined that the line would be located primarily in the Molasse.
Since the preparatory project in 1999, a great quantity of geological data has been collected, and the many drillings made on the site have provided a unique opportunity to undertake a study of urban microgravimetry. The goal was to evaluate the thickness of the morainic filling over the molassic bedrock, and to establish a section along the axis of the future line. It then had to be shown that the application of this non-destructive geophysical method could reduce the number of mechanical surveys required both for a preparatory and a definitive project, which would lead to real savings in the realization of a civil engineering project. The two test zones chosen, one in the northern part of the city and one in the city centre, are characterised by various types of urbanisation. Bedrock is at a depth varying from a few metres to about thirty metres. These zones well exemplify the various difficulties encountered in an urban environment and are therefore very interesting for the development of an overall methodology of urban microgravimetry. Microgravimetry in an urban environment requires careful corrections for gravity disturbances due to the effects of topography, buildings, cellars, and the infrastructure of distribution networks, in order to isolate the gravity effect due exclusively to the thickness of the loose soil filling. Bearing in mind the intensity of the topographic corrections in an urban environment, we gave particular importance to basements. Their gravity effects can reach the order of one tenth of a mGal, and can influence above all the precision of the Bouguer anomaly. We propose to establish preliminary topographic correction charts based on data provided by the land register, by making assumptions on the depths of basements and the heights of buildings. Availability of this chart prior to a gravimetry campaign would enable us to choose optimum measuring sites. We have also seen that an a priori filter can be used when the form and the intensity of the anomaly correspond visually to the corresponding building. This strategy must be used with caution because, if several anomalies are superimposed, important shifts can be generated by the superposition of the effects of different structures. The results of the model have proved to be very convincing in detecting previously unknown sensitive zones. The adaptability of the gravimetry technique allows for application in all phases of a civil engineering project such as the construction of an underground metro line. RIASSUNTO Since spring 2004 a new metro line has been under construction in the city of Lausanne, Switzerland. The new "M2" metro will cross the urban centre of Lausanne from south to north over a length of 6 km. The civil engineering project foresaw a route located essentially in the Tertiary arenaceous bedrock (molasse). Since the drafting of the preliminary project in 1999, a large quantity of geological data has been collected and numerous boreholes have been drilled. This presented a unique opportunity to set up a microgravimetric study in an urban environment, with the aim of evaluating the thickness of the loose deposits of glacial origin covering the molasse bedrock and of showing how the application of this non-destructive geophysical method can limit the number of mechanical boreholes in the preliminary and final design phases, with a consequent real economic saving in the realization of such a work.
The two test zones are located one in the northern part and the other in the historic centre of Lausanne, and are characterized by different architectural styles. The bedrock lies at a depth varying from a few metres to about thirty. These two zones appear to represent well all the difficulties of an urban environment and lend themselves well to developing an overall methodology for urban microgravimetry. The application of this technique in such an environment requires careful correction of the perturbations of the measured gravitational acceleration caused by topography, buildings, cellars and underground utility infrastructure, in order to isolate the signal caused exclusively by the thickness of the loose deposits. Given the intensity of the topographic corrections, we paid great attention to cellars, since their effect on the measurements can reach a tenth of a mGal. We therefore propose to draw up a topographic correction map before data acquisition, making assumptions about the depth of the cellars and the height of the buildings on the basis of the cadastral plans. The analysis of this map makes it possible to choose the most suitable positions for the gravity stations. We also observed that an a priori filter can be effective when the shape and intensity of the anomaly can easily be attributed visually to a building. However, this strategy must be used with caution, since it can introduce a bias when several anomalies due to different structures overlap. The modelling results proved convincing, highlighting sensitive zones not known beforehand. The adaptability of the gravimetric technique has shown that it can be applied in the various phases of a civil engineering project such as an underground work.
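The statement above that a basement can perturb the gravity measurement by about a tenth of a mGal can be checked with the classical closed-form expression for the vertical attraction of a right rectangular prism (Nagy, 1966; Blakely, 1995). The sketch below is an independent illustration of that correction, not the code used in the thesis; the basement geometry and density contrast are assumed values.

# Hedged sketch: vertical gravity effect of a rectangular prism (e.g. an air-filled
# basement replacing moraine), observation point at the origin, z positive downward.
# Geometry and density contrast below are assumptions for illustration only.
import math

G = 6.674e-11  # m^3 kg^-1 s^-2

def prism_gz(x_lim, y_lim, z_lim, density):
    """Vertical attraction (m/s^2) of a prism bounded by x_lim, y_lim, z_lim (metres)."""
    gz = 0.0
    for i, x in enumerate(x_lim, start=1):
        for j, y in enumerate(y_lim, start=1):
            for k, z in enumerate(z_lim, start=1):
                r = math.sqrt(x * x + y * y + z * z)
                mu = (-1) ** (i + j + k)
                gz += mu * (z * math.atan2(x * y, z * r)
                            - x * math.log(r + y)
                            - y * math.log(r + x))
    return G * density * gz

# Assumed basement: 8 m x 10 m, roof 0.5 m and floor 3.0 m below the station,
# slightly offset horizontally; density contrast -1800 kg/m^3 (air instead of moraine).
effect = prism_gz((-4.0, 4.0), (-3.0, 7.0), (0.5, 3.0), -1800.0)
print(f"{effect / 1e-5:.3f} mGal")  # magnitude on the order of 0.1 mGal for this geometry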
Abstract:
RÉSUMÉ Raman spectroscopy is a chemical analysis technique based on the exploitation of the phenomenon of light scattering. This phenomenon was observed for the first time in 1928 by Raman and Krishnan, and these observations earned Raman the Nobel Prize in Physics in 1930. Raman spectroscopy was applied to the analysis of the dyes of blue, red and black acrylic, cotton and wool textile fibres. We were thus able to confirm that the technique is suited to the in situ analysis of traces of microscopic size. In addition, it can be described as rapid and non-destructive, and it requires no particular sample preparation. However, the phenomenon of fluorescence proved to be its most important drawback. During the analysis of the fibres, various analytical conditions were tested, and it became apparent that they depended above all on the laser chosen. The potential of the technique for the detection and identification of the dyes impregnated in the fibres was confirmed in this study. A spectral database comprising sixty reference dyes was built in order to identify the main dye impregnated in the collected fibres. In addition, the analysis of various colour blocks, consisting of samples of unknown origin requested from various people, made it possible to divide them into several groups and to evaluate the rarity of the configurations of the Raman spectra obtained. The ability of the Raman technique to differentiate these samples was evaluated and compared with that of the conventional methods for the analysis of textile fibres, namely UV-Vis microspectrophotometry (MSP) and thin-layer chromatography (TLC). The Raman technique proved to be less discriminating than MSP for all the colour blocks considered. This is why, within an analytical sequence, we recommend using Raman after the colour analysis method, with as many laser sources as possible. Finally, the availability of instruments equipped with several excitation wavelengths, besides their ability to reduce fluorescence, allows a greater number of samples to be exploited. ABSTRACT Raman spectroscopy allows for the measurement of the inelastic scattering of light due to the vibrational modes of a molecule when irradiated by an intense monochromatic source such as a laser. Such a phenomenon was observed for the first time by Raman and Krishnan in 1928. For this observation, Raman was awarded the Nobel Prize in Physics in 1930. The application of Raman spectroscopy has been undertaken for the dye analysis of textile fibers. Blue, black and red acrylics, cottons and wools were examined. The Raman technique presents advantages such as its non-destructive nature, fast analysis time, and the possibility of performing microscopic in situ analyses. However, the problem of fluorescence was often encountered. Several aspects were investigated to determine the best analytical conditions for every type/color fiber combination. The potential of the technique for the detection and identification of dyes was confirmed. A spectral database of 60 reference dyes was built to detect the main dyes used for the coloration of fiber samples. Particular attention was placed on the discriminating power of the technique.
Based on the results of the Raman analysis of the different blocks of color submitted for analysis, it was possible to group the fibers into classes according to the general shape of their spectra. The ability of Raman spectroscopy to differentiate samples was compared to that of the conventional techniques used for the analysis of textile fibers, namely UV-Vis microspectrophotometry (UV-Vis MSP) and thin-layer chromatography (TLC). The Raman technique proved to be less discriminating than MSP for every block of color considered in this study. Thus, within an analytical sequence, it is recommended to use Raman spectroscopy after MSP and light microscopy. It was shown that using several laser wavelengths allowed for the reduction of fluorescence and for the exploitation of a higher number of samples.
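The comparison of discriminating power mentioned above is usually quantified, in forensic fibre work, as the fraction of all possible sample pairs that a technique can distinguish. The short sketch below computes that figure from group labels; the example labels are invented and do not correspond to the fibre blocks of the study.

# Hedged sketch: discriminating power = discriminated pairs / total pairs,
# where samples falling in the same spectral class are considered undiscriminated.
from collections import Counter
from math import comb

def discriminating_power(class_labels: list[str]) -> float:
    n = len(class_labels)
    total_pairs = comb(n, 2)
    undiscriminated = sum(comb(k, 2) for k in Counter(class_labels).values())
    return 1.0 - undiscriminated / total_pairs

# Invented example: 6 samples of one colour block falling into 3 spectral classes.
print(discriminating_power(["A", "A", "B", "B", "B", "C"]))  # 0.733...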
Abstract:
Buccal swabs have recently been used as a minimally invasive sampling method in genetic studies of wild populations, including amphibian species. Yet the level of reliability of microsatellite genotypes obtained from such samples has not been established to date. Allelic dropout and false alleles may affect the genotypes derived from buccal samples. Here we quantified the success of microsatellite amplification and the rates of genotyping errors using buccal swabs in two amphibian species, the Alpine newt Triturus alpestris and the Green tree frog Hyla arborea, and we estimated two parameters important for downstream analyses, namely the number of repetitions required to achieve typing reliability and the probability of identity among genotypes. Amplification success was high, and only one of the loci tested required two to three repetitions to achieve reliable genotypes, showing that buccal swabbing is a very efficient approach allowing good-quality DNA retrieval. This sampling method, which avoids the controversial practice of toe-clipping, will likely prove very useful in the context of amphibian conservation.
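The probability of identity estimated in the study is, in its standard form, the chance that two individuals drawn at random from the population share the same multilocus genotype, computed per locus from allele frequencies and multiplied across independent loci. The sketch below implements that textbook formula with invented allele frequencies; it is not the authors' dataset or software.

# Hedged sketch of the probability of identity:
# PI_locus = sum_i p_i^4 + sum_{i<j} (2 p_i p_j)^2, PI = product over loci.
from math import prod

def pi_single_locus(freqs: list[float]) -> float:
    homo = sum(p ** 4 for p in freqs)
    hetero = sum((2 * freqs[i] * freqs[j]) ** 2
                 for i in range(len(freqs)) for j in range(i + 1, len(freqs)))
    return homo + hetero

def probability_of_identity(loci: list[list[float]]) -> float:
    return prod(pi_single_locus(f) for f in loci)

# Invented allele frequencies for three microsatellite loci:
loci = [[0.5, 0.3, 0.2], [0.4, 0.4, 0.2], [0.25, 0.25, 0.25, 0.25]]
print(f"PI = {probability_of_identity(loci):.2e}")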
Abstract:
OBJECTIVE: HIV-1 post-exposure prophylaxis (PEP) is frequently prescribed after exposure to source persons with an undetermined HIV serostatus. To reduce unnecessary use of PEP, we implemented a policy including active contacting of source persons and the availability of free, anonymous HIV testing ('PEP policy'). METHODS: All consultations for potential non-occupational HIV exposures (i.e. outside the medical environment) were prospectively recorded. The impact of the PEP policy on PEP prescription and costs was analysed and modelled. RESULTS: Among 146 putative exposures, 47 involved a source person already known to be HIV positive and 23 had no indication for PEP. The remaining 76 exposures involved a source person of unknown HIV serostatus. Of the 33 (43.4%) exposures for which the source person could be contacted and tested, PEP was avoided in 24 (72.7%), initiated and discontinued in seven (21.2%), and prescribed and completed in two (6.1%). In contrast, of the 43 (56.6%) exposures for which the source person could not be tested, PEP was prescribed in 35 (81.4%), P < 0.001. Upon modelling, the PEP policy allowed a 31% reduction in the cost of managing exposures to source persons of unknown HIV serostatus. The policy was cost-saving for an HIV prevalence of up to 70% in the source population. The availability of all source persons for testing would have reduced costs by 64%. CONCLUSION: In the management of non-occupational HIV exposures, active contacting and free, anonymous testing of source persons proved feasible. This policy resulted in a decrease in PEP prescription, proved to be cost-saving, and presumably helped to avoid unnecessary toxicity and psychological stress.
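The cost comparison reported above can be framed as a simple expected-cost model: for each exposure to a source of unknown serostatus, either the source can be contacted and tested (in which case PEP is usually avoided or stopped early) or not (in which case PEP is usually prescribed). The sketch below shows that kind of comparison with entirely hypothetical unit costs and probabilities; it is not the model used in the paper.

# Hedged sketch: expected cost per exposure with and without a source-testing policy.
# Every number below is a hypothetical placeholder, not a figure from the study.

COST_FULL_PEP = 1200.0     # hypothetical cost of a completed 4-week PEP course
COST_STARTER_PEP = 150.0   # hypothetical cost of a started-then-stopped course
COST_SOURCE_TEST = 50.0    # hypothetical cost of rapid testing of the source person

def expected_cost(p_source_testable: float, p_source_negative: float) -> float:
    # Source testable and negative -> PEP avoided or stopped early.
    tested = COST_SOURCE_TEST + (p_source_negative * COST_STARTER_PEP
                                 + (1 - p_source_negative) * COST_FULL_PEP)
    # Source not testable -> PEP prescribed by default.
    untested = COST_FULL_PEP
    return p_source_testable * tested + (1 - p_source_testable) * untested

print(expected_cost(p_source_testable=0.43, p_source_negative=0.95))   # with the policy
print(expected_cost(p_source_testable=0.0,  p_source_negative=0.95))   # without the policy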
Abstract:
This paper contains a joint ESHG/ASHG position document with recommendations regarding responsible innovation in prenatal screening with non-invasive prenatal testing (NIPT). By virtue of its greater accuracy and safety with respect to prenatal screening for common autosomal aneuploidies, NIPT has the potential of helping the practice better achieve its aim of facilitating autonomous reproductive choices, provided that balanced pretest information and non-directive counseling are available as part of the screening offer. Depending on the health-care setting, different scenarios for NIPT-based screening for common autosomal aneuploidies are possible. The trade-offs involved in these scenarios should be assessed in light of the aim of screening, the balance of benefits and burdens for pregnant women and their partners and considerations of cost-effectiveness and justice. With improving screening technologies and decreasing costs of sequencing and analysis, it will become possible in the near future to significantly expand the scope of prenatal screening beyond common autosomal aneuploidies. Commercial providers have already begun expanding their tests to include sex-chromosomal abnormalities and microdeletions. However, multiple false positives may undermine the main achievement of NIPT in the context of prenatal screening: the significant reduction of the invasive testing rate. This document argues for a cautious expansion of the scope of prenatal screening to serious congenital and childhood disorders, only following sound validation studies and a comprehensive evaluation of all relevant aspects. A further core message of this document is that in countries where prenatal screening is offered as a public health programme, governments and public health authorities should adopt an active role to ensure the responsible innovation of prenatal screening on the basis of ethical principles. Crucial elements are the quality of the screening process as a whole (including non-laboratory aspects such as information and counseling), education of professionals, systematic evaluation of all aspects of prenatal screening, development of better evaluation tools in the light of the aim of the practice, accountability to all stakeholders including children born from screened pregnancies and persons living with the conditions targeted in prenatal screening and promotion of equity of access.
Abstract:
Pain is frequent in the intensive care setting, and its management is one of the nurses' missions. Its assessment is an indispensable prerequisite for its relief. However, when the patient is unable to report his or her pain, nurses must rely on external signs to assess it. Clinical practice guidelines recommend, for non-communicative persons, the use of an instrument validated for the given population and based on the observation of behaviours. At present, the available pain assessment instruments are only partially suited to brain-injured persons, insofar as these persons display behaviours that are specific to them. This study therefore aims to identify, describe and validate indicators, and descriptors, of pain in brain-injured persons. A multiphase mixed-methods design with a quantitative dominant was chosen for this study. A first phase consisted of identifying indicators and descriptors of pain in non-communicative brain-injured persons in intensive care by combining three data sources: an integrative literature review, a consultative approach using the nominal group technique with 18 experienced clinicians (6 physicians and 12 nurses), and the results of an observational pilot study of 10 traumatic brain-injury patients. The results identified 6 indicators and 47 behavioural, vocal and physiological descriptors suitable for inclusion in a pain assessment instrument intended for non-communicative brain-injured persons in intensive care. A second, sequential phase tested the psychometric properties of the previously identified indicators and descriptors. Content validity was tested with 10 clinical experts and 4 scientific experts using a structured questionnaire designed to evaluate the pertinence and the clarity/comprehensibility of each descriptor. This process led to the selection of 33 of the 47 descriptors and the validation of 6 indicators. In a second step, the psychometric properties of these indicators and descriptors were studied at rest, during non-nociceptive stimulation and during a nociceptive stimulation (turning the patient onto the side) in 116 brain-injured intensive care patients hospitalized in two university hospitals. The results show substantial variation in the descriptors observed during nociceptive stimulation, probably due to the heterogeneity of the patients' levels of consciousness. Ten descriptors were eliminated because their frequency during nociceptive stimulation was below 5% or their reliability was insufficient. The physiological descriptors were all removed because of their low variability and problematic inter-rater reliability. The results show that concurrent validity, i.e. the correlation between the patient's self-assessment and the measurements made with the descriptors, is satisfactory during nociceptive stimulation (rs = 0.527, p = 0.003, n = 30).
In contrast, convergent validity, which tested the association between the pain assessment of the nurse in charge of the patient and the measurements made with the descriptors, and divergent validity, which tested whether the indicators discriminate between nociceptive stimulation and rest, showed results that varied according to the patients' level of consciousness. These results underline the need to study pain descriptors in brain-injured patients according to their level of consciousness and to take the heterogeneity of this population into account when designing a pain assessment instrument for non-communicative brain-injured intensive care patients. - Pain is frequent in the intensive care unit (ICU) and its management is a major issue for nurses. The assessment of pain is a prerequisite for appropriate pain management. However, pain assessment is difficult when patients are unable to communicate about their experience and nurses have to base their evaluation on external signs. Clinical practice guidelines highlight the need to use behavioral scales that have been validated for nonverbal patients. Current behavioral pain tools for ICU patients unable to communicate may not be appropriate for nonverbal brain-injured ICU patients, as they demonstrate specific responses to pain. This study aimed to identify, describe and validate pain indicators and descriptors in brain-injured ICU patients. A multiphase mixed-methods design with a quantitative dominant was chosen for this study. The first phase aimed to identify indicators and descriptors of pain for nonverbal brain-injured ICU patients using data from three sources: an integrative literature review, a consultation using the nominal group technique with 18 experienced clinicians (12 nurses and 6 physicians) and the results of an observational pilot study with 10 traumatic brain-injured patients. The results of this first phase identified 6 indicators and 47 behavioral, vocal and physiological descriptors of pain that could be included in a pain assessment tool for this population. The sequential phase two tested the psychometric properties of the list of previously identified indicators and descriptors. Content validity was tested with 10 clinical and 4 scientific experts for pertinence and comprehensibility using a structured questionnaire. This process resulted in the selection of 33 of the 47 previously identified descriptors and the validation of six indicators. Then, the psychometric properties of the descriptors and indicators were tested at rest, during non-nociceptive stimulation and nociceptive stimulation (turning) in a sample of 116 brain-injured ICU patients who were hospitalized in two university centers. Results showed important variations in the descriptors observed during the nociceptive stimulation, probably due to the heterogeneity of patients' levels of consciousness. Ten descriptors were excluded, as they were observed less than 5% of the time or their reliability was insufficient. All physiological descriptors were deleted as they showed little variability and inter-observer reliability was lacking. Concomitant validity, testing the association between patients' self-report of pain and measures performed using the descriptors, was acceptable during nociceptive stimulation (rs = 0.527, p = 0.003, n = 30).
However, convergent validity (testing for an association between the nurses' pain assessment and measures done with the descriptors) and divergent validity (testing the ability of the indicators to discriminate between rest and nociceptive stimulation) varied according to the level of consciousness. These results highlight the need to study pain descriptors in brain-injured patients with different levels of consciousness and to take the heterogeneity of this population into account in the design of a pain assessment tool for nonverbal brain-injured ICU patients.
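The concomitant (concurrent) validity figure quoted above (rs = 0.527, p = 0.003, n = 30) is a Spearman rank correlation between patients' self-reports and the behavioural scores. A minimal sketch of that type of analysis with SciPy is shown below; the data are invented.

# Hedged sketch: Spearman correlation between self-reported pain (e.g. 0-10)
# and a behavioural descriptor score, as used for concurrent validity. Data invented.
from scipy.stats import spearmanr

self_report = [0, 2, 3, 5, 6, 7, 8, 2, 4, 9]
behaviour_score = [1, 1, 2, 4, 5, 5, 7, 2, 3, 8]

rs, p_value = spearmanr(self_report, behaviour_score)
print(f"rs = {rs:.3f}, p = {p_value:.4f}")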
Abstract:
OBJECTIVES: To assess attitudes to HIV risk and the acceptability of rapid HIV testing among clients of street-based female sex workers (FSW) in Lausanne, Switzerland, where HIV prevalence in the general population is 0.4%. METHODS: The authors conducted a cross-sectional study in the red light district of Lausanne over five nights in September of 2008, 2009 and 2010. Clients of FSW were invited to complete a questionnaire in the street assessing demographic characteristics, attitudes to HIV risk and HIV testing history. All clients interviewed were then offered anonymous finger-stick rapid HIV testing in a van parked on-site. RESULTS: The authors interviewed 112, 127 and 79 clients in 2008, 2009 and 2010, respectively. All were men, with an average age of 32-37 years; 40-60% were in a stable relationship. A history of unprotected sex was more frequent with non-commercial partners (33-50%) than with FSW (6-11%); 29-46% of clients had never undergone an HIV test. Anonymous rapid HIV testing was accepted by 45-50% of clients. Of the 109 HIV tests conducted during the three study periods, none was reactive. CONCLUSIONS: On-site HIV counselling and testing is acceptable among clients of FSW in this urban setting. These individuals represent an unquantified population, a proportion of which has an incomplete understanding of HIV risk in the face of high-risk behaviour, with implications for potential onward transmission to non-commercial sexual partners.
Abstract:
Today, the problems of infectious diseases concern the emergence of infections that are difficult to treat, such as implant-associated infections and invasive fungal infections in immunocompromised patients. The objective of this thesis was to develop strategies for the eradication of bacterial biofilms (part 1) and to investigate innovative methods for microbial detection, with a view to establishing new susceptibility tests (part 2). The treatment of implant-associated infections is difficult because bacterial biofilms can withstand high levels of antibiotics. To date, there is no defined optimal treatment for infections caused by less prevalent bacteria such as Enterococcus faecalis or Propionibacterium acnes. We first demonstrated an excellent in vitro activity of gentamicin against an E. faecalis strain in the stationary growth phase. We then confirmed the activity of gentamicin against an early biofilm in an experimental foreign-body animal model, with a cure rate of 50%. In addition, time-kill curves and calorimetry results showed that the addition of gentamicin improved the in vitro activity of daptomycin as well as that of vancomycin. In vivo, the most effective regimen was the daptomycin/gentamicin combination, with a cure rate of 55%. By establishing a new method for evaluating the activity of antimicrobials against microorganisms in biofilms, we demonstrated that the antibiotic most active against P. acnes biofilms was rifampicin, followed by penicillin G, daptomycin and ceftriaxone. Studies conducted in an experimental animal model confirmed the activity of rifampicin alone, with a cure rate of 36%. The best regimen was ultimately the rifampicin/daptomycin combination, with a cure rate of 63%. The combinations of rifampicin with vancomycin or levofloxacin gave cure rates of 46% and 25%, respectively. We then studied the in vitro emergence of rifampicin resistance in P. acnes. We observed a mutation rate of 10⁻⁹. The molecular characterization of resistance in the resistant mutants revealed the involvement of 5 point mutations in domains I and II of the rpoB gene. This type of mutation has already been described in other bacterial species, corroborating the validity of our results. The second part of this thesis describes a new method for evaluating the efficacy of antifungals based on isothermal microcalorimetry measurements. Using a microcalorimeter, the heat produced by microbial growth can be measured in real time with high precision. We evaluated the activity of amphotericin B, triazoles and echinocandins against different Aspergillus spp. strains by microcalorimetry. The presence of amphotericin B or a triazole delayed heat production in a concentration-dependent manner. For the echinocandins, by contrast, only a decrease in the heat-flow peak was observed. The agreement between the minimal heat inhibition concentration (MHIC) and the MIC or MEC (defined by CLSI M38-A), within two 2-fold dilutions, was 90% for amphotericin B, 100% for voriconazole, 90% for posaconazole and 70% for caspofungin.
The method was then used to determine antifungal susceptibility for other types of filamentous fungi. By microcalorimetric determination, amphotericin B proved to be the most active agent against Mucorales and Fusarium spp., and voriconazole the most active against Scedosporium spp. Finally, we evaluated the activity of antifungal combinations against Aspergillus spp. Improved antifungal activity was found for amphotericin B or voriconazole when they were combined with echinocandins against A. fumigatus. The echinocandin/amphotericin B combination showed synergistic antifungal activity against A. terreus, in contrast to the echinocandin/voriconazole combination, which showed no significant improvement in antifungal activity. - The diagnosis and treatment of infectious diseases are today increasingly challenged by the emergence of difficult-to-manage situations, such as infections associated with medical devices and invasive fungal infections, especially in immunocompromised patients. The aim of this thesis was to address these challenges by developing new strategies for eradication of biofilms of difficult-to-treat microorganisms (treatment, part 1) and investigating innovative methods for microbial detection and antimicrobial susceptibility testing (diagnosis, part 2). The first part of the thesis investigates antimicrobial treatment strategies for infections caused by two less investigated microorganisms, Enterococcus faecalis and Propionibacterium acnes, which are important pathogens causing implant-associated infections. The treatment of implant-associated infections is difficult in general due to reduced susceptibility of bacteria when present in biofilms. We demonstrated an excellent in vitro activity of gentamicin against E. faecalis in stationary growth phase and were able to confirm the activity against "young" biofilms (3 hours) in an experimental foreign-body infection model (cure rate 50%). The addition of gentamicin improved the activity of daptomycin and vancomycin in vitro, as determined by time-kill curves and microcalorimetry. In vivo, the most efficient combination regimen was daptomycin plus gentamicin (cure rate 55%). Despite a short duration of infection, the cure rates were low, highlighting that enterococcal biofilms remain difficult to treat despite administration of newer antibiotics, such as daptomycin. By establishing a novel in vitro assay for evaluation of anti-biofilm activity (microcalorimetry), we demonstrated that rifampin was the most active antimicrobial against P. acnes biofilms, followed by penicillin G, daptomycin and ceftriaxone. In animal studies we confirmed the anti-biofilm activity of rifampin (cure rate 36% when administered alone), as well as in combination with daptomycin (cure rate 63%), whereas in combination with vancomycin or levofloxacin it showed lower cure rates (46% and 25%, respectively). We further investigated the emergence of rifampin resistance in P. acnes in vitro. Rifampin resistance progressively emerged during exposure to rifampin if the bacterial concentration was high (10⁸ CFU/ml), with a mutation rate of 10⁻⁹. In resistant isolates, five point mutations of the rpoB gene were found in clusters I and II, as previously described for staphylococci and other bacterial species.
The second part of the thesis describes a novel real-time method for evaluation of antifungals against molds, based on measurements of the growth-related heat production by isothermal microcalorimetry. Current methods for evaluation of antifungal agents against molds have several limitations, especially when combinations of antifungals are investigated. We evaluated the activity of amphotericin B, triazoles (voriconazole, posaconazole) and echinocandins (caspofungin and anidulafungin) against Aspergillus spp. by microcalorimetry. The presence of amphotericin B or a triazole delayed the heat production in a concentration-dependent manner, and the minimal heat inhibition concentration (MHIC) was determined as the lowest concentration inhibiting 50% of the heat produced at 48 h. Due to the different mechanism of action of echinocandins, the MHIC for this antifungal class was determined as the lowest concentration lowering the heat-flow peak by 50%. Agreement within two 2-fold dilutions between MHIC and MIC or MEC (determined by CLSI M38-A) was 90% for amphotericin B, 100% for voriconazole, 90% for posaconazole and 70% for caspofungin. We further evaluated our assay for antifungal susceptibility testing of non-Aspergillus molds. As determined by microcalorimetry, amphotericin B was the most active agent against Mucorales and Fusarium spp., whereas voriconazole was the most active agent against Scedosporium spp. Finally, we evaluated the activity of antifungal combinations against Aspergillus spp. Against A. fumigatus, an improved activity of amphotericin B and voriconazole was observed when combined with an echinocandin. Against A. terreus, an echinocandin showed a synergistic activity with amphotericin B, whereas in combination with voriconazole no considerable improvement in activity was observed.
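The endpoint defined above (the MHIC, i.e. the lowest drug concentration inhibiting 50% of the heat produced by the drug-free growth control at 48 h) can be expressed compactly in code. The sketch below assumes cumulative-heat values already integrated from the heat-flow curves; the numbers are invented and the function is only an illustration of the definition, not the authors' analysis script.

# Hedged sketch: minimal heat inhibition concentration (MHIC) at 48 h, defined as
# the lowest concentration giving >= 50% inhibition of the cumulative heat of the
# drug-free growth control. All values below are invented.

def mhic(heat_by_concentration: dict[float, float], control_heat: float,
         inhibition: float = 0.5) -> float | None:
    """Return the lowest concentration (mg/L) reaching the inhibition threshold."""
    for conc in sorted(heat_by_concentration):
        if heat_by_concentration[conc] <= (1.0 - inhibition) * control_heat:
            return conc
    return None  # no tested concentration reached the threshold

# Invented 48 h cumulative heat (J) for a two-fold dilution series:
heat_48h = {0.03: 5.9, 0.06: 5.5, 0.125: 4.1, 0.25: 2.4, 0.5: 0.6, 1.0: 0.2}
print(mhic(heat_48h, control_heat=6.0))  # -> 0.25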
Abstract:
PURPOSE: We report on the in vivo testing of a novel noninvasively adjustable glaucoma drainage device (AGDD), which features an adjustable outflow resistance, and assess the safety and efficiency of this implant. METHODS: Under general anesthesia, the AGDD was implanted in seven New Zealand white rabbits for a duration of 4 months under a scleral flap, in a way analogous to the Ex-PRESS device, and set in an operationally closed position. The IOP was measured on a regular basis in the operated and control eyes using a rebound tonometer. Once a month the AGDD was adjusted noninvasively from its fully closed to its fully open position and the resulting pressure drop was measured. The contralateral eye was not operated on and served as the control. After euthanasia, the eyes were collected for histological evaluation. RESULTS: The mean preoperative IOP was 11.1 ± 2.4 mm Hg. The IOP was significantly lower in the operated eye (6.8 ± 2 mm Hg) than in the nonoperated eye (13.1 ± 1.6 mm Hg) during the first 8 days after surgery. When the AGDD was opened from its fully closed to its fully open position, the IOP dropped significantly from 11.2 ± 2.9 to 4.8 ± 0.9 mm Hg (P < 0.05). CONCLUSIONS: Implanting the AGDD is a safe and uncomplicated surgical procedure. The fluidic resistance of the AGDD was noninvasively adjustable during the postoperative period between its fully closed and fully open positions.
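The pressure drop reported when opening the AGDD (11.2 ± 2.9 to 4.8 ± 0.9 mm Hg, P < 0.05) is a paired before/after comparison on the operated eyes. The abstract does not state which statistical test was used; the sketch below simply illustrates one common choice for such small paired samples (a paired t-test, with a Wilcoxon signed-rank test as a non-parametric alternative), using invented per-animal IOP values.

# Hedged sketch: paired comparison of IOP before and after opening the valve.
# The per-rabbit values below are invented; only the group means echo the abstract.
from scipy.stats import ttest_rel, wilcoxon

iop_closed = [10.0, 10.5, 10.0, 11.5, 11.0, 12.5, 13.0]  # mm Hg, AGDD fully closed
iop_open   = [ 6.0,  5.5,  4.5,  5.0,  4.0,  4.5,  4.0]  # mm Hg, AGDD fully open

t_stat, p_t = ttest_rel(iop_closed, iop_open)
w_stat, p_w = wilcoxon(iop_closed, iop_open)
print(f"paired t-test: p = {p_t:.4f}; Wilcoxon: p = {p_w:.4f}")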