187 results for agile method
Abstract:
BACKGROUND: Straylight gives the appearance of a veil of light thrown over a person's retinal image when a strong light source is present. We examined the reproducibility of measurements by C-Quant and assessed their correlation with characteristics of the eye and subjects' age. PARTICIPANTS AND METHODS: Five repeated straylight measurements were taken using the dominant eye of 45 healthy subjects (age 21-59) with a BCVA of 20/20: 14 emmetropic, 16 myopic, eight hyperopic and seven with astigmatism. We assessed the reproducibility of straylight measures using the intraclass correlation coefficient. RESULTS: The mean straylight value of all measurements was 1.01 (SD 0.23, median 0.97, interquartile range 0.85-1.1). Per 10 years of age, straylight increased on average by 0.10 (95%CI 0.04 to 0.16, p < 0.01). We found no independent association between refraction (range -5.25 dpt to +2 dpt) and straylight values (0.001; 95%CI -0.022 to 0.024, p = 0.92). Compared to emmetropic subjects, myopia reduced straylight (-0.011; 95%CI -0.024 to 0.02, p = 0.11), whereas higher straylight values (0.09; 95%CI -0.01 to 0.20, p = 0.09) were observed in subjects with blue irises compared to dark-colored irises when correcting for age. The intraclass correlation coefficient (ICC) of repeated measurements was 0.83 (95%CI 0.76 to 0.90). CONCLUSIONS: Our study showed that straylight measurements with the C-Quant had high reproducibility, i.e. a lack of large intra-observer variability, making it appropriate for long-term follow-up studies assessing the long-term effect of surgical procedures on the quality of vision.
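The reproducibility statistic used above, the intraclass correlation coefficient for repeated measurements, can be sketched as follows. This is a minimal one-way random-effects ICC(1,1) on a subjects × repeats matrix; the function name `icc_oneway` and the use of numpy are illustrative assumptions, not the study's actual code.

```python
import numpy as np

def icc_oneway(ratings):
    """ICC(1,1): one-way random-effects intraclass correlation.

    ratings: (n_subjects, k_repeats) array of repeated measurements.
    Returns a value near 1 when between-subject variance dominates
    within-subject (repeat-to-repeat) variance.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    # Between-subject mean square
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)
    # Within-subject mean square (variation across repeats)
    msw = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

With five repeats per subject, as in the study, an ICC near 0.83 would indicate that most of the observed variance reflects true subject-to-subject differences rather than measurement noise.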
Abstract:
RATIONALE: The aim of the work was to develop and validate a method for the quantification of vitamin D metabolites in serum using ultra-high-pressure liquid chromatography coupled to mass spectrometry (LC/MS), and to validate a high-resolution mass spectrometry (LC/HRMS) approach against a tandem mass spectrometry (LC/MS/MS) approach using a large clinical sample set. METHODS: A fast, accurate and reliable method for the quantification of the vitamin D metabolites 25-hydroxyvitamin D2 (25OH-D2) and 25-hydroxyvitamin D3 (25OH-D3) in human serum was developed and validated. The C3 epimer of 25OH-D3 (3-epi-25OH-D3) was also separated from 25OH-D3. The samples were rapidly prepared via a protein precipitation step followed by solid-phase extraction (SPE) using an HLB μelution plate. Quantification was performed using both LC/MS/MS and LC/HRMS systems. RESULTS: Recovery, matrix effect, and inter- and intra-day reproducibility were assessed. Lower limits of quantification (LLOQs) were determined for both 25OH-D2 and 25OH-D3 for the LC/MS/MS approach (6.2 and 3.4 µg/L, respectively) and the LC/HRMS approach (2.1 and 1.7 µg/L, respectively). A Passing & Bablok fit between the two approaches was determined for 25OH-D3 on 662 clinical samples (1.11 + 1.06x). It was also shown that results can be affected by the inclusion of the isomer 3-epi-25OH-D3. CONCLUSIONS: A method for the quantification of the relevant vitamin D metabolites was successfully developed and validated. It was shown that LC/HRMS is an accurate, powerful and easy-to-use approach for quantification within clinical laboratories. Finally, the results suggest that it is important to separate 3-epi-25OH-D3 from 25OH-D3. Copyright © 2012 John Wiley & Sons, Ltd.
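The Passing & Bablok fit reported above (1.11 + 1.06x) is a regression robust to outliers in both methods. A simplified sketch in the same spirit is shown below: the slope is the median of all pairwise slopes and the intercept the median residual. This omits the full Passing-Bablok offset correction for negative slopes, and the function name `median_slope_fit` is an illustrative assumption, not the authors' code.

```python
import numpy as np

def median_slope_fit(x, y):
    """Simplified Passing-Bablok-style method comparison fit.

    slope b  = median of all pairwise slopes (y_j - y_i) / (x_j - x_i)
    intercept a = median of y - b * x
    (The full Passing-Bablok procedure additionally shifts the median
    index to correct for slopes <= -1; that step is omitted here.)
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dx = x[j] - x[i]
            if dx != 0:  # skip ties in x, which have undefined slope
                slopes.append((y[j] - y[i]) / dx)
    b = np.median(slopes)
    a = np.median(y - b * x)
    return a, b
```

Applied to paired LC/MS/MS and LC/HRMS results, an intercept near 0 and slope near 1 would indicate the two approaches agree across the measuring range.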
Abstract:
Objectives. The goal of this study was to evaluate a T2-mapping sequence by: (1) measuring intra- and inter-observer variability (reproducibility) in healthy volunteers in two separate scanning sessions with a T2 reference phantom; (2) measuring the mean T2 relaxation times by T2-mapping in infarcted myocardium in patients with subacute MI and comparing them with the patients' gold standard, X-ray coronary angiography, and with results in healthy volunteers. Background. Myocardial edema is a consequence of tissue inflammation, as seen in myocardial infarction (MI). It can be visualized by cardiovascular magnetic resonance (CMR) imaging using the T2 relaxation time. T2-mapping is a quantitative methodology that has the potential to address the limitations of conventional T2-weighted (T2W) imaging. Methods. The T2-mapping protocol used for all MRI scans consisted of a radial gradient echo acquisition with a lung-liver navigator for free-breathing acquisition and affine image registration. Mid-basal short axis slices were acquired. T2-map analyses: two observers semi-automatically segmented the left ventricle into 6 segments according to the AHA standards. 8 healthy volunteers (age: 27 ± 4 years; 62.5% male) were scanned in 2 separate sessions. 17 patients (age: 61.9 ± 13.9 years; 82.4% male) with subacute STEMI (70.6%) or NSTEMI underwent a T2-mapping scanning session. Results. In healthy volunteers, the mean inter- and intra-observer variability over the entire short axis slice (segments 1 to 6) was 0.1 ms (95% confidence interval (CI): -0.4 to 0.5, p = 0.62) and 0.2 ms (95% CI: -2.8 to 3.2, p = 0.94), respectively. T2 relaxation time measurements with and without the phantom correction yielded an average difference of 3.0 ± 1.1% and 3.1 ± 2.1% (p = 0.828), respectively. In patients, the inter-observer variability over the entire short axis slice (S1-S6) was 0.3 ms (95% CI: -1.8 to 2.4, p = 0.85).
Edema location as determined by T2-mapping and the coronary artery occlusion as determined on X-ray coronary angiography agreed in 78.6% of cases, but in only 60% of apical infarcts. All but one of the maximal T2 values in infarct patients were greater than the upper limit of the 95% confidence interval for normal myocardium. Conclusions. The T2-mapping methodology is accurate in detecting infarcted, i.e. edematous, tissue in patients with subacute infarcts. This study further demonstrated that this T2-mapping technique is reproducible and robust enough to be used on a segmental basis for edema detection without the need for a phantom to yield a T2 correction factor. This new quantitative T2-mapping technique is promising and is likely to allow serial follow-up studies in patients to improve our knowledge of infarct pathophysiology and infarct healing, and to assess novel treatment strategies for acute infarctions.
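The observer-variability figures quoted above (a mean difference in ms with a 95% CI) can be computed as follows. This is a minimal sketch of a paired mean-difference estimate with a normal-approximation interval; the function name `mean_diff_ci` is an illustrative assumption, not the study's analysis code.

```python
import numpy as np

def mean_diff_ci(obs1, obs2, z=1.96):
    """Mean paired difference between two observers' T2 measurements,
    with an approximate 95% confidence interval (normal approximation).

    obs1, obs2: per-segment T2 values (ms) from observer 1 and 2.
    Returns (mean_difference, (ci_low, ci_high)).
    """
    d = np.asarray(obs1, float) - np.asarray(obs2, float)
    m = d.mean()
    se = d.std(ddof=1) / np.sqrt(len(d))  # standard error of the mean difference
    return m, (m - z * se, m + z * se)
```

A mean difference whose CI straddles zero, as in the volunteer data (0.1 ms, 95% CI -0.4 to 0.5), indicates no systematic bias between observers.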
Abstract:
Synthesis report: Objective: The aim of this work was to study multidetector CT angiography (CTA) in the evaluation of peripheral arterial occlusive disease (PAOD) of the abdominal aorta and lower limbs, using an adaptive acquisition method to optimize arterial enhancement, in particular for the distal arterial bed and the arteries of the feet. Materials and methods: Thirty-four patients with PAOD underwent both trans-catheter angiography (TCA) and CTA within 15 days of each other. CTA was performed from the celiac trunk to the arteries of the feet in a single acquisition at high spatial resolution (16x0.625 mm). Table speed and rotation time for each examination were chosen according to the contrast transit time obtained after a test bolus. A total of 130 ml of contrast at 4 ml/s was used. CTA images were analyzed by two observers, and the TCA data were interpreted independently by two other observers. The analysis included image quality and the detection of stenosis of 50% or greater, per patient and per arterial segment. The sensitivity and specificity of CTA were calculated with TCA as the reference standard. Interobserver variability was measured using a kappa statistic. Results: TCA was non-conclusive in 0.7% of segments, whereas CTA was conclusive in all segments. On per-patient analysis, the overall sensitivity and specificity for detecting a significant stenosis of 50% or greater were both 100%. Per-segment analysis showed sensitivities and specificities ranging from 91 to 100% and from 81 to 100%, respectively. Analysis of the distal arteries of the feet showed a sensitivity of 100% and a specificity of 90%.
Conclusion: Multidetector CT angiography using this adaptive acquisition method improves image quality and provides a non-invasive and reliable technique for evaluating PAOD, including the distal arteries of the feet.
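The evaluation metrics used above (sensitivity and specificity against a reference standard, and Cohen's kappa for interobserver agreement) can be sketched for binary per-segment readings (1 = stenosis >= 50%). The function names and numpy usage are illustrative assumptions, not the study's actual code.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two binary raters: agreement beyond chance.

    r1, r2: arrays of 0/1 readings over the same segments.
    """
    r1 = np.asarray(r1)
    r2 = np.asarray(r2)
    po = np.mean(r1 == r2)                        # observed agreement
    pe = (r1.mean() * r2.mean()                   # chance: both rate 1
          + (1 - r1.mean()) * (1 - r2.mean()))    # chance: both rate 0
    return (po - pe) / (1 - pe)

def sens_spec(test, ref):
    """Sensitivity and specificity of `test` readings against the
    reference-standard readings `ref` (both 0/1 arrays)."""
    test = np.asarray(test)
    ref = np.asarray(ref)
    sens = np.mean(test[ref == 1] == 1)  # true-positive rate
    spec = np.mean(test[ref == 0] == 0)  # true-negative rate
    return sens, spec
```

Here CTA readings would play the role of `test` and the TCA readings that of `ref`; kappa near 1 indicates near-perfect interobserver agreement.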
Abstract:
BACKGROUND: The use of the family history method is recommended in family studies as a type of proxy interview of non-participating relatives. However, using different sources of information can result in bias, as direct interviews may provide a higher likelihood of assigning diagnoses than family history reports. The aims of the present study were to: 1) compare diagnoses for threshold and subthreshold mood syndromes from interviews with those relying on information from relatives; 2) test the appropriateness of lowering the diagnostic threshold and combining multiple reports from the family history method to obtain prevalence estimates comparable to the interviews; 3) identify factors that influence the likelihood of agreement and of reporting of disorders by informants. METHODS: Within a family study, 1621 informant-index subject pairs were identified. DSM-5 diagnoses from direct interviews of index subjects were compared to those derived from family history information provided by their first-degree relatives. RESULTS: 1) Inter-informant agreement was acceptable for mania but low for all other mood syndromes. 2) Except for mania and subthreshold depression, the family history method provided significantly lower prevalence estimates. The gap narrowed for all other syndromes after lowering the threshold of the family history method. 3) Individuals with a history of depression themselves were more likely to report depression in their relatives. LIMITATIONS: Low proportion of affected individuals for manic syndromes and lack of independence of the data. CONCLUSIONS: The higher likelihood of reporting disorders by affected informants entails the risk of overestimating the size of the familial aggregation of depression.
Abstract:
Chromatin immunoprecipitation followed by deep sequencing (ChIP-seq) experiments are widely used to determine, within entire genomes, the occupancy sites of any protein of interest, including, for example, transcription factors, RNA polymerases, or histones with or without various modifications. In addition to allowing the determination of occupancy sites within one cell type and under one condition, this method allows, in principle, the establishment and comparison of occupancy maps in various cell types, tissues, and conditions. Such comparisons require, however, that samples be normalized. Widely used normalization methods that include a quantile normalization step perform well when factor occupancy varies at a subset of sites, but may miss uniform genome-wide increases or decreases in site occupancy. We describe a spike adjustment procedure (SAP) that, unlike commonly used normalization methods intervening at the analysis stage, entails an experimental step prior to immunoprecipitation. A constant, low amount from a single batch of chromatin of a foreign genome is added to the experimental chromatin. This "spike" chromatin then serves as an internal control to which the experimental signals can be adjusted. We show that the method improves similarity between replicates and reveals biological differences including global and largely uniform changes.
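The core of the spike adjustment idea can be sketched numerically: since every sample receives the same amount of foreign chromatin, the number of reads mapping to the spike genome measures each sample's overall recovery, and dividing by it puts all samples on a common scale. This is a minimal sketch under that assumption; the function names `spike_scale_factors` and `adjust` are illustrative, not the published SAP implementation.

```python
import numpy as np

def spike_scale_factors(spike_counts):
    """Per-sample scale factors from reads mapped to the spike-in genome.

    A sample with twice the spike reads of another recovered twice as
    much material overall, so its signal is scaled down by half.
    Factors are normalized so the smallest spike count gets factor 1.
    """
    spike_counts = np.asarray(spike_counts, float)
    return spike_counts.min() / spike_counts

def adjust(signals, factors):
    """Apply per-sample scale factors to a (samples x bins) ChIP signal
    matrix, making genome-wide levels comparable across samples."""
    return np.asarray(signals, float) * np.asarray(factors, float)[:, None]
```

Unlike quantile normalization, this scaling preserves a genuine uniform genome-wide increase or decrease in occupancy, since the correction depends only on the spike chromatin, not on the experimental signal distribution.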
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
Abstract:
Platelet-rich plasma (PRP) is a volume of plasma fraction of autologous blood having platelet concentrations above baseline whole-blood values due to processing and concentration. PRP is used in various surgical fields to enhance soft-tissue and bone healing by delivering supra-physiological concentrations of autologous platelets at the site of tissue damage. These preparations may provide a good cellular source of various growth factors and cytokines, and modulate tissue response to injury. Common clinically available materials for blood preparations, combined with a two-step centrifugation protocol at 280 g each to ensure cellular component integrity, provided platelet preparations that were concentrated 2-3-fold over whole-blood values. Costs were shown to be lower than those of other methods, which require specific equipment and high-cost disposables, while safety and traceability can be increased. PRP can be used for the treatment of wounds of all types, including burns, and of split-thickness skin graft donor sites, which are frequently used in burn management. The procedure can be standardized and is easy to adapt in clinical settings with minimal infrastructure, thus enabling large numbers of patients to benefit from a form of cellular therapy.
Abstract:
An adaptation technique based on the synoptic atmospheric circulation to forecast local precipitation, namely the analogue method, has been implemented for the western Swiss Alps. During the calibration procedure, relevance maps were established for the geopotential height data. These maps highlight the locations where the synoptic circulation was found to be of interest for precipitation forecasting at two rain gauge stations (Binn and Les Marécottes), both located in the alpine Rhône catchment at a distance of about 100 km from each other. These two stations are sensitive to different atmospheric circulations. We have observed that the most relevant data for the analogue method can be found where specific atmospheric circulation patterns appear concomitantly with heavy precipitation events. These skilled regions are coherent with the atmospheric flows illustrated, for example, by the back trajectories of air masses. Indeed, the circulation recurrently diverges from the climatology during days with strong precipitation on the southern part of the alpine Rhône catchment. Of the 152 days with a precipitation amount above 50 mm at the Binn station, only 3 did not show a southerly-flow trajectory, meaning that such a circulation was present for 98% of the events. The time evolution of the relevance maps confirms that the atmospheric circulation variables have significantly better forecasting skill close to the precipitation period, and that it seems pointless for the analogue method to consider circulation information days before a precipitation event as a primary predictor. Even though the occurrence of some critical circulation patterns leading to heavy precipitation events can be detected by precursors at remote locations and 1 week ahead (Grazzini, 2007; Martius et al., 2008), time extrapolation by the analogue method seems to be rather poor.
This would suggest, in accordance with previous studies (Obled et al., 2002; Bontron and Obled, 2005), that time extrapolation should be done by the Global Circulation Model, which can process atmospheric variables that can be used by the adaptation method.
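The core of the analogue method described above can be sketched as a nearest-neighbour search: find the archived days whose synoptic fields most resemble the target day's field over the relevant grid points, and use their observed precipitation as the forecast ensemble. This is a minimal sketch using RMSE as the similarity criterion (operational implementations often use other scores and several predictor levels); the function name `analogue_forecast` is an illustrative assumption.

```python
import numpy as np

def analogue_forecast(target_field, past_fields, past_precip, n_analogues=3):
    """Analogue-method sketch for local precipitation forecasting.

    target_field: geopotential height field (any shape) for the target day.
    past_fields:  archive of fields, one per past day.
    past_precip:  observed precipitation (mm) for each archived day.
    Returns the precipitation of the n most similar past days (lowest
    RMSE between fields), to be read as an empirical forecast distribution.
    """
    target = np.asarray(target_field, float).ravel()
    past = np.asarray(past_fields, float).reshape(len(past_precip), -1)
    # Field similarity: root-mean-square difference over all grid points
    rmse = np.sqrt(np.mean((past - target) ** 2, axis=1))
    idx = np.argsort(rmse)[:n_analogues]
    return np.asarray(past_precip, float)[idx]
```

Restricting the grid points to the skilled regions identified by the relevance maps is what the calibration step above amounts to: it weights the similarity search toward the locations where the circulation actually discriminates heavy-precipitation days.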
Abstract:
BACKGROUND: Laparoscopic techniques have been proposed as an alternative to open surgery for the treatment of peptic ulcer perforation. They provide better postoperative comfort and the absence of parietal complications, but leakage occurs in 5% of cases. We describe a new method combining laparoscopy and endoluminal endoscopy, designed to ensure complete closure of the perforation. METHODS: Six patients with anterior ulcer perforations (4 duodenal, 2 gastric) underwent concomitant laparoscopy and endoluminal endoscopy with closure of the orifice by an omental plug drawn into the digestive tract. RESULTS: All perforations were sealed. The mean operating time was 72 minutes. The mean hospital stay was 5.5 days. There was no morbidity and no mortality. At the 30-day evaluation, all ulcers but one (due to Helicobacter pylori persistence) were healed. CONCLUSIONS: This method is safe and effective. Its advantages compared with open surgery or laparoscopic patching, as well as its cost-effectiveness, should be studied in prospective randomized trials.
Abstract:
Saffaj et al. recently criticized our method for monitoring carbon dioxide in human postmortem cardiac gas samples using headspace gas chromatography-mass spectrometry. According to the authors, their demonstration, based on the latest SFSTP guidelines (established after 2007 [1,2]) and suited to the validation of bioanalytical drug-monitoring methods, highlighted potential errors. However, our validation approach was built on SFSTP guidelines established before 2007 [3-6]. We justify the use of these guidelines by the post-mortem context of the study (rather than clinical) and by the gaseous state of the sample (rather than solid or liquid). Under these guidelines, our validation remains correct.