48 results for Application method
at Université de Lausanne, Switzerland
Abstract:
Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Some gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been previously described. The main drawbacks of these methods include a lack of sensitivity for forensic applications and an inability to quantitatively determine the concentration of gas present. The following study provides a validated HS-GC-MS method that incorporates hydrogen sulfide as a suitable internal standard, allowing the quantification of nitrous oxide. Upon analysis, the sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, thereby providing rapid data collection while preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, yielding quantitative concentrations in a mono-intoxication.
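The internal-standard quantification described above can be sketched as follows; the calibration slope and peak areas below are hypothetical illustrations, not values from the study:

```python
def quantify_by_internal_standard(area_analyte, area_is, slope, intercept=0.0):
    """Convert an analyte/internal-standard peak-area ratio into a
    concentration via a linear calibration: ratio = slope*conc + intercept."""
    ratio = area_analyte / area_is
    return (ratio - intercept) / slope

# Hypothetical calibration (ratio = 0.02 * concentration) and peak areas.
n2o_conc = quantify_by_internal_standard(area_analyte=1500.0, area_is=3000.0, slope=0.02)
```

Ratioing against the internal standard cancels run-to-run variations in injection volume and detector response, which is why a co-eluting species like hydrogen sulfide is attractive.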
Abstract:
BACKGROUND AND OBJECTIVE: In bladder cancer, conventional white-light endoscopic examination of the bladder does not provide adequate information about the presence of "flat" urothelial lesions such as carcinoma in situ. In the present investigation, we examine a new technique for the photodetection of such lesions by imaging protoporphyrin IX (PpIX) fluorescence following topical application of 5-aminolevulinic acid (ALA). STUDY DESIGN/MATERIALS AND METHODS: Several hours after bladder instillation of an aqueous solution of ALA in 34 patients, a krypton ion laser or a filtered xenon arc lamp was used to excite PpIX fluorescence. Tissue samples for histological analysis were taken while observing the bladder wall either by means of a video camera or by direct endoscopic observation. RESULTS: A good correlation was found between the PpIX fluorescence and the histopathological diagnosis. Of a total of 215 biopsies, 143 in fluorescent and 72 in nonfluorescent areas, all tumors visible on white-light cystoscopy appeared as bright red fluorescence with the photodetection technique. In addition, this method permitted the discovery of 47 carcinomatous lesions that were unsuspected on white-light observation, of which 40% were carcinoma in situ. CONCLUSION: PpIX fluorescence induced by instillation of 5-ALA into the bladder is an efficient method of mapping the mucosa in bladder carcinoma.
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking to support both design and operation adequately. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation in a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, contributing actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large complex systems made of multiple hierarchical layers. The proposed approach appears appropriate for assessing the global risk and availability level of the system as well as for identifying its vulnerabilities. This enriched FMECA approach makes it possible to overcome some of the limitations and pitfalls previously reported for classical FMECA approaches.
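The criticality ranking at the heart of an FMECA can be illustrated with the classic risk-priority-number (RPN) calculation; the failure modes and 1-10 ratings below are invented for illustration, not taken from the signalling study:

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMECA criticality score on 1-10 scales (higher = worse)."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must lie on a 1-10 scale")
    return severity * occurrence * detection

# Hypothetical failure modes: (severity, occurrence, detection difficulty).
failure_modes = {
    "signal lamp failure": (8, 3, 2),
    "track-circuit false clear": (10, 2, 6),
    "point machine jam": (7, 4, 3),
}
# Rank by decreasing criticality to prioritise mitigation effort.
ranked = sorted(failure_modes, key=lambda m: risk_priority_number(*failure_modes[m]), reverse=True)
```

An enriched FMECA of the kind described would attach further fields (operating mode, alarm coverage, human factors) to each record, but the ranking logic stays the same.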
Abstract:
Ethyl glucuronide (EtG) is a minor and direct metabolite of ethanol. EtG is incorporated into the growing hair, allowing retrospective investigation of chronic alcohol abuse. In this study, we report the development and validation of a method using gas chromatography-negative chemical ionization tandem mass spectrometry (GC-NCI-MS/MS) for the quantification of EtG in hair. EtG was extracted from about 30 mg of hair by aqueous incubation and purified by solid-phase extraction (SPE) using mixed-mode extraction cartridges, followed by derivatization with perfluoropentanoic anhydride (PFPA). The analysis was performed in the selected reaction monitoring (SRM) mode using the transitions m/z 347-->163 (for quantification) and m/z 347-->119 (for identification) for EtG, and m/z 352-->163 for EtG-d(5) used as internal standard. For validation, we prepared quality controls (QC) using hair samples taken post mortem from 2 subjects with a known history of alcoholism. These samples were confirmed by a proficiency test with 7 participating laboratories. The assay linearity of EtG was confirmed over the range from 8.4 to 259.4 pg/mg hair, with a coefficient of determination (r(2)) above 0.999. The limit of detection (LOD) was estimated at 3.0 pg/mg. The lower limit of quantification (LLOQ) of the method was fixed at 8.4 pg/mg. Repeatability and intermediate precision (relative standard deviation, RSD%), tested at 4 QC levels, were less than 13.2%. The analytical method was applied to several hair samples obtained from autopsy cases with a history of alcoholism and/or lesions caused by alcohol. EtG concentrations in hair ranged from 60 to 820 pg/mg hair.
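The linearity check reported above (a calibration line with r² > 0.999) amounts to an ordinary least-squares fit; a minimal sketch, using invented calibration points spanning the validated range rather than the study's data:

```python
def fit_calibration(conc, response):
    """Least-squares line response = slope*conc + intercept, with r^2."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - slope * x - intercept) ** 2 for x, y in zip(conc, response))
    ss_tot = sum((y - my) ** 2 for y in response)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical calibrators (pg/mg hair) and instrument response ratios.
conc = [8.4, 50.0, 120.0, 259.4]
resp = [0.086, 0.502, 1.203, 2.591]
slope, intercept, r2 = fit_calibration(conc, resp)
```

The lowest calibrator doubles as the LLOQ; points below it (down to the LOD) can be detected but not reliably quantified.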
Abstract:
Q-sort is a research method that allows the definition of profiles of attitudes toward a set of statements ordered in relation to each other. Within Q Methodology, the qualitative analysis of Q-sorts is based on quantitative techniques. This method is of particular interest for research in the health professions, a field in which the attitudes of patients and professionals are very important. The method is presented in this article, along with an example of its application in nursing in old-age psychiatry.
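The quantitative step in Q Methodology typically begins by correlating pairs of participants' Q-sorts before factor analysis; a minimal sketch with invented forced-distribution rankings (the statements and participants are hypothetical):

```python
def pearson(xs, ys):
    """Pearson correlation between two Q-sorts (lists of statement ranks)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical Q-sorts: ranks of 9 statements on a -2..+2 forced distribution.
nurse_a = [-2, -1, -1, 0, 0, 0, 1, 1, 2]
nurse_b = [-2, -1, 0, -1, 0, 1, 0, 1, 2]
similarity = pearson(nurse_a, nurse_b)
```

Participants whose sorts correlate strongly load onto the same factor, which is then interpreted qualitatively as a shared attitude profile.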
Abstract:
The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tools for this purpose are geostatistics, which are widely used in Earth science. Geostatistics, however, are based upon several hypotheses that are not always verified in practice. Artificial Neural Networks (ANNs), on the other hand, can a priori be used without special assumptions and are known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
Abstract:
Estimating the time since the last discharge of firearms and/or spent cartridges may be a useful piece of information in forensic firearm-related cases. The current approach consists of studying the diffusion of selected volatile organic compounds (such as naphthalene) released during the shooting using solid phase micro-extraction (SPME). However, this technique works poorly on handgun cartridges because the extracted quantities quickly fall below the limit of detection. In order to find more effective solutions and further investigate the aging of organic gunshot residue after the discharge of handgun cartridges, an extensive study was carried out in this work using a novel approach based on high-capacity headspace sorptive extraction (HSSE). By adopting this technique, 51 gunshot residue (GSR) volatile organic compounds could for the first time be simultaneously detected from fired handgun cartridge cases. Application to aged specimens showed that many of those compounds presented significant and complementary aging profiles. Compound-to-compound ratios were also tested and proved to be beneficial both in reducing the variability of the aging curves and in enlarging the useful time window from a forensic casework perspective. The obtained results were thus particularly promising for the development of a new complete forensic dating methodology.
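The benefit of compound-to-compound ratios can be shown numerically: a recovery factor shared by both compounds in a given replicate cancels in the ratio, shrinking the spread of the aging curve. The abundances below are invented, not measured GSR data:

```python
def rsd(values):
    """Relative standard deviation (sample SD over mean)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return sd / mean

# Hypothetical replicates: each shot shares an overall recovery factor,
# so compound A and compound B vary together from shot to shot.
recovery = [0.8, 1.0, 1.2]
compound_a = [f * 5.0 for f in recovery]
compound_b = [f * 2.0 for f in recovery]
ratios = [a / b for a, b in zip(compound_a, compound_b)]
```

The individual abundances scatter with the recovery factor, while the ratio is constant, which is exactly the variability reduction the study reports.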
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach and including entropic terms provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for the TCR-p-MHC binding are analyzed and some possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of interesting side chains to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes.
The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
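The LDVE idea of distributing vibrational entropy according to normal-mode amplitudes can be sketched as follows; the exact weighting used in the paper may differ, and the mode entropies and amplitudes below are invented:

```python
def ldve_atomic_entropy(mode_entropies, mode_amplitudes):
    """Distribute each normal mode's entropy over atoms in proportion to the
    atom's squared amplitude in that mode (a sketch of the LDVE idea)."""
    n_atoms = len(mode_amplitudes[0])
    contributions = [0.0] * n_atoms
    for s_mode, amps in zip(mode_entropies, mode_amplitudes):
        norm = sum(a * a for a in amps)
        for i, a in enumerate(amps):
            contributions[i] += s_mode * (a * a) / norm
    return contributions

# Hypothetical system: 2 normal modes, 3 atoms.
mode_entropies = [1.2, 0.8]                       # entropy per mode (arbitrary units)
mode_amplitudes = [[0.9, 0.3, 0.1], [0.1, 0.5, 0.8]]
atomic_s = ldve_atomic_entropy(mode_entropies, mode_amplitudes)
```

Because the per-mode weights sum to one, the atomic contributions add back up to the total vibrational entropy, so per-residue sums can be compared directly against the full normal-mode result.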
Abstract:
RATIONALE AND OBJECTIVE: The information assessment method (IAM) permits health professionals to systematically document the relevance, cognitive impact, use and health outcomes of information objects delivered by or retrieved from electronic knowledge resources. The companion review paper (Part 1) critically examined the literature and proposed a 'Push-Pull-Acquisition-Cognition-Application' evaluation framework, which is operationalized by IAM. The purpose of the present paper (Part 2) is to examine the content validity of the IAM cognitive checklist when linked to email alerts. METHODS: A qualitative component of a mixed-methods study was conducted with 46 doctors reading and rating research-based synopses sent by email. The unit of analysis was a doctor's explanation of a rating of one item regarding one synopsis. Interviews with participants provided 253 units that were analysed to assess concordance with item definitions. RESULTS AND CONCLUSION: The content relevance of seven items was supported. For three items, revisions were needed. Interviews suggested one new item. This study has yielded a 2008 version of IAM.
Abstract:
Purpose: To develop and evaluate a practical method for the quantification of signal-to-noise ratio (SNR) on coronary MR angiograms (MRA) acquired with parallel imaging. Materials and Methods: To quantify the spatially varying noise due to parallel imaging reconstruction, a new method has been implemented incorporating image data acquisition followed by a fast noise scan during which radio-frequency pulses, cardiac triggering and navigator gating are disabled. The performance of this method was evaluated in a phantom study where SNR measurements were compared with those of a reference standard (multiple repetitions). Subsequently, SNR of myocardium and posterior skeletal muscle was determined on in vivo human coronary MRA. Results: In a phantom, the SNR measured using the proposed method deviated less than 10.1% from the reference method for small geometry factors (<= 2). In vivo, the noise scan for a 10 min coronary MRA acquisition was acquired in 30 s. Higher signal and lower SNR, due to spatially varying noise, were found in myocardium compared with posterior skeletal muscle. Conclusion: SNR quantification based on a fast noise scan is a validated and easy-to-use method when applied to three-dimensional coronary MRA obtained with parallel imaging, as long as the geometry factor remains low.
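At its simplest, an SNR measurement of this kind divides the mean signal in a region of interest by the standard deviation of the separately acquired noise scan; a sketch with invented sample values (corrections for magnitude-image noise statistics and the local geometry factor are omitted):

```python
def snr(signal_roi, noise_samples):
    """Mean ROI signal over the standard deviation of the noise-scan samples."""
    mean_signal = sum(signal_roi) / len(signal_roi)
    n = len(noise_samples)
    mu = sum(noise_samples) / n
    noise_sd = (sum((x - mu) ** 2 for x in noise_samples) / (n - 1)) ** 0.5
    return mean_signal / noise_sd

# Hypothetical pixel intensities from an ROI and the matching noise scan.
myocardium_roi = [118.0, 122.0, 120.0]
noise_scan = [1.0, -1.0, 2.0, -2.0]
roi_snr = snr(myocardium_roi, noise_scan)
```

Because parallel-imaging noise varies spatially, the noise samples must be taken from the same image region as the signal ROI, which is what the fast noise scan enables.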
Abstract:
PURPOSE: We describe a retinal endovascular fibrinolysis technique to directly reperfuse experimentally occluded retinal veins using a simple micropipette. METHODS: Retinal vein occlusion was photochemically induced in 12 eyes of 12 minipigs: after intravenous injection of 10% fluorescein (1-mL bolus), the targeted retinal vein segment was exposed to thrombin (50 units) and to Argon laser (100-200 mW) through a pars plana approach. A beveled micropipette with a 30-μm-diameter sharp edge was used for micropuncture of the occluded vein and endovascular microinjection of tissue plasminogen activator (50 μg/mL) in 11 eyes. In one control eye, balanced salt solution was injected. The lesion site was examined histologically. RESULTS: Retinal vein occlusion was achieved in all cases. Endovascular microinjection of tissue plasminogen activator or balanced salt solution led to reperfusion of the occluded retinal vein in all cases. Indicative of successful reperfusion were the following: continuous endovascular flow, unaffected collateral circulation, no optic disk ischemia, and no venous wall bleeding. However, balanced salt solution injection was accompanied by thrombus formation at the punctured site, whereas no thrombus was observed with tissue plasminogen activator injection. CONCLUSION: Retinal endovascular fibrinolysis constitutes an efficient method of micropuncture and reperfusion of an experimentally occluded retinal vein. Thrombus formation at the punctured site can be prevented by injection of tissue plasminogen activator.
Abstract:
The 2009-2010 Data Fusion Contest organized by the Data Fusion Technical Committee of the IEEE Geoscience and Remote Sensing Society was focused on the detection of flooded areas using multi-temporal and multi-modal images. Both high spatial resolution optical and synthetic aperture radar data were provided. The goal was not only to identify the best algorithms (in terms of accuracy), but also to investigate the further improvement derived from decision fusion. This paper presents the four awarded algorithms and the conclusions of the contest, investigating both supervised and unsupervised methods and the use of multi-modal data for flood detection. Interestingly, a simple unsupervised change detection method provided accuracy similar to that of supervised approaches, and a digital elevation model-based predictive method yielded a comparable projected change detection map without using post-event data.
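A simple unsupervised change-detection scheme of the kind the contest rewarded can be sketched as image differencing with an automatic threshold; the pixel values below are invented, not contest data:

```python
def change_map(pre, post, k=2.0):
    """Flag pixels whose absolute difference exceeds mean + k*std of the differences."""
    diffs = [abs(a - b) for a, b in zip(pre, post)]
    n = len(diffs)
    mu = sum(diffs) / n
    sd = (sum((d - mu) ** 2 for d in diffs) / n) ** 0.5
    threshold = mu + k * sd
    return [d > threshold for d in diffs]

# Hypothetical pre-/post-event intensities for 10 pixels: one flooded pixel.
pre_event = [10.0] * 10
post_event = [10.0] * 9 + [60.0]
flags = change_map(pre_event, post_event)
```

No training labels are required: the threshold is derived from the statistics of the difference image itself, which is what makes the approach unsupervised.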
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that automatically detects epileptiform events and discriminates them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity fell to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
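The sensitivity and specificity figures reported above come from the standard confusion-matrix counts; a sketch with invented event labels, not the study's EEG data:

```python
def sensitivity_specificity(truth, predicted):
    """truth/predicted are booleans per candidate event (True = epileptiform)."""
    tp = sum(t and p for t, p in zip(truth, predicted))
    fn = sum(t and not p for t, p in zip(truth, predicted))
    tn = sum((not t) and (not p) for t, p in zip(truth, predicted))
    fp = sum((not t) and p for t, p in zip(truth, predicted))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical detector output on 10 candidate events.
truth     = [True, True, True, True, False, False, False, False, False, False]
predicted = [True, True, True, False, False, False, False, False, True, False]
sens, spec = sensitivity_specificity(truth, predicted)
```

Sensitivity counts detected epileptiform events; specificity counts correctly rejected non-events (such as eye blinks), so the trade-off between the two is what the classifier threshold tunes.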