910 results for Automated quantification
Abstract:
Therapeutic drug monitoring (TDM) may contribute to optimizing the efficacy and safety of antifungal therapy because of the large variability in drug pharmacokinetics. Rapid, sensitive, and selective laboratory methods are needed for efficient TDM. Quantification of several antifungals in a single analytical run may best fulfill these requirements. We therefore developed a multiplex ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method requiring 100 μl of plasma for simultaneous quantification within 7 min of fluconazole, itraconazole, hydroxyitraconazole, posaconazole, voriconazole, voriconazole-N-oxide, caspofungin, and anidulafungin. Protein precipitation with acetonitrile was used in a single extraction procedure for eight analytes. After reverse-phase chromatographic separation, antifungals were quantified by electrospray ionization-triple-quadrupole mass spectrometry by selected reaction monitoring detection using the positive mode. Deuterated isotopic compounds of azole antifungals were used as internal standards. The method was validated based on FDA recommendations, including assessment of extraction yields, matrix effect variability (<9.2%), and analytical recovery (80.1 to 107%). The method is sensitive (lower limits of azole quantification, 0.01 to 0.1 μg/ml; those of echinocandin quantification, 0.06 to 0.1 μg/ml), accurate (intra- and interassay biases of -9.9 to +5% and -4.0 to +8.8%, respectively), and precise (intra- and interassay coefficients of variation of 1.2 to 11.1% and 1.2 to 8.9%, respectively) over clinical concentration ranges (upper limits of quantification, 5 to 50 μg/ml). Thus, we developed a simple, rapid, and robust multiplex UPLC-MS/MS assay for simultaneous quantification of plasma concentrations of six antifungals and two metabolites. 
By making optimized and cost-effective use of laboratory resources, this assay offers an efficient tool for daily routine TDM aimed at maximizing the real-time efficacy and safety of the different recommended single-drug antifungal regimens and combination salvage therapies, as well as a tool for clinical research.
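Quantification against deuterated internal standards, as used above, compensates for matrix effects because analyte and standard co-elute and ionize alike. The general calculation — a linear calibration of analyte/internal-standard peak-area ratio against concentration, then back-calculation of unknowns — can be sketched as follows (the concentrations and the linear response below are hypothetical, not values from the validated assay):

```python
import numpy as np

def fit_calibration(conc, area_ratio):
    """Least-squares line through calibration points (area ratio vs. concentration)."""
    slope, intercept = np.polyfit(conc, area_ratio, 1)
    return slope, intercept

def quantify(area_analyte, area_is, slope, intercept):
    """Back-calculate concentration from the analyte/internal-standard area ratio."""
    ratio = area_analyte / area_is
    return (ratio - intercept) / slope

# hypothetical calibration for one azole (concentrations in ug/ml)
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
ratio = 0.2 * conc + 0.01          # idealized linear detector response
slope, intercept = fit_calibration(conc, ratio)

# unknown sample: analyte peak area 0.41, internal-standard area 1.0
print(round(quantify(0.41, 1.0, slope, intercept), 2))  # → 2.0
```

In practice each analyte gets its own calibration line and the fit is weighted (e.g. 1/x) to keep accuracy at the lower limit of quantification.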
Abstract:
Limiting dilution analysis was used to quantify Trypanosoma cruzi in the lymph nodes, liver and heart of Swiss and C57BL/10 mice. The results showed that, in Swiss and C57BL/10 mice infected with the T. cruzi Y strain, the number of parasites/mg of tissue increased during the course of the infection, although a greater number of parasites was observed in heart tissue from Swiss mice than from C57BL/10 mice. With regard to liver tissue, the parasite load in the initial phase of infection was higher than in the heart. In experiments using the T. cruzi Colombian strain, the parasite load in the heart of Swiss and C57BL/10 mice increased relatively slowly, although high levels of parasitization were nonetheless observed by the end of the infection. As for the liver and lymph nodes, the concentration of parasites was lower over the entire course of infection than in the heart. Both strains thus maintained their characteristic tissue tropisms. The limiting dilution assay (LDA) proved to be an appropriate method for more precise quantification of T. cruzi, comparing favorably with direct microscopic methods that give only approximate scores.
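Limiting dilution analysis rests on a single-hit Poisson model: if each culture well receives a tissue dose d and parasites are randomly distributed, the fraction of wells that stay negative is exp(-f·d), so the frequency f can be recovered from the negative fraction. A minimal sketch (the dose and well counts are hypothetical, not data from the study):

```python
import math

def lda_frequency(dose, n_wells, n_negative):
    """Single-hit Poisson model: fraction of negative wells = exp(-f * dose),
    hence f = -ln(negative fraction) / dose (parasites per unit of tissue)."""
    f0 = n_negative / n_wells
    return -math.log(f0) / dose

# hypothetical dilution: 10 mg-equivalents of tissue per well, 24 wells, 9 negative
print(round(lda_frequency(10.0, 24, 9), 4))  # → 0.0981
```

With several dilutions, the same model is usually fitted by regressing ln(negative fraction) on dose, which also provides a goodness-of-fit check of the single-hit assumption.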
Abstract:
The broad resonances underlying the entire ¹H NMR spectrum of the brain, ascribed to macromolecules, can influence metabolite quantification. At the intermediate field strength of 3 T, distinct approaches for the determination of the macromolecule signal, previously used at either 1.5 T or at 7 T and higher, may become equivalent. The aim of this study was to evaluate, at 3 T in healthy subjects using LCModel, the impact on metabolite quantification of two different macromolecule approaches: (i) experimentally measured macromolecules; and (ii) mathematically estimated macromolecules. Although small but significant differences in metabolite quantification (up to 23% for glutamate) were noted for some metabolites, 10 metabolites were quantified reproducibly with both approaches with a Cramér-Rao lower bound below 20%, and the neurochemical profiles were therefore similar. We conclude that the mathematical approximation can provide a sufficiently accurate and reproducible estimation of the macromolecule contribution to the ¹H spectrum at 3 T. Copyright © 2013 John Wiley & Sons, Ltd.
Abstract:
In hyperdiploid acute lymphoblastic leukaemia (ALL), the simultaneous occurrence of specific aneuploidies confers a more favourable outcome than hyperdiploidy alone. Interphase (I-) FISH complements conventional cytogenetics (CC) through its sensitivity and its ability to detect chromosome aberrations in non-dividing cells. To overcome the limits of manual I-FISH, we developed an automated four-colour I-FISH approach and assessed its ability to detect concurrent aneuploidies in ALL. I-FISH was performed using centromeric probes for chromosomes 4, 6, 10 and 17. Parameters established for automatic nucleus selection and signal detection were evaluated (3 controls). Cut-off values were determined (10 controls, 1000 nuclei/case). Combinations of aneuploidies were considered relevant when each aneuploidy was individually significant. Results obtained in 10 ALL patients (1500 nuclei/patient) were compared with those obtained by CC. Various combinations of aneuploidies were identified. All clones detected by CC were observed by I-FISH. I-FISH revealed numerous additional abnormal clones, ranging from 0.1% to 31.6%, based on the large number of nuclei evaluated. Four-colour automated I-FISH permits the identification of concurrent aneuploidies of prognostic significance in hyperdiploid ALL. Large numbers of cells can be analysed rapidly by this method. Owing to its high sensitivity, the method provides a powerful tool for the detection of small abnormal clones at diagnosis and during follow-up. Compared with CC, it generates a more detailed cytogenetic picture, the biological and clinical significance of which merits further evaluation. Once optimised for a given set of probes, the system can be easily adapted to other probe combinations.
Abstract:
Metabolic labeling techniques have recently become popular tools for the quantitative profiling of proteomes. Classical stable isotope labeling with amino acids in cell culture (SILAC) uses pairs of heavy/light isotopic forms of amino acids to introduce predictable mass differences in the protein samples to be compared. After proteolysis, pairs of cognate precursor peptides can be correlated, and their intensities can be used for mass spectrometry-based relative protein quantification. We present an alternative SILAC approach in which two cell cultures are grown in media containing isobaric forms of amino acids, labeled either with 13C on the carbonyl (C-1) carbon or 15N on the backbone nitrogen. Labeled peptides from both samples have the same nominal mass and nearly identical MS/MS spectra but generate, upon fragmentation, distinct immonium ions separated by 1 amu. When labeled protein samples are mixed, the intensities of these immonium ions can be used for the relative quantification of the parent proteins. We validated the labeling of cellular proteins with valine, isoleucine, and leucine, with coverage of 97% of all tryptic peptides. We improved the sensitivity for the detection of the quantification ions on a pulsing instrument by using a specific fast scan event. The analysis of a protein mixture with a known heavy/light ratio showed reliable quantification. Finally, the application of the technique to the analysis of two melanoma cell lines yielded quantitative data consistent with those obtained by a classical two-dimensional DIGE analysis of the same samples. Our method combines the features of the SILAC technique with the advantages of isobaric labeling schemes such as iTRAQ. We discuss the advantages and disadvantages of isobaric SILAC with immonium ion splitting, as well as possible ways to improve it.
Abstract:
BACKGROUND: Enterovirus (EV) is the most frequent cause of aseptic meningitis (AM). Lack of microbiological documentation results in unnecessary antimicrobial therapy and hospitalization. OBJECTIVES: To assess the impact of rapid EV detection in cerebrospinal fluid (CSF) by a fully automated PCR (GeneXpert EV assay, GXEA) on the management of AM. STUDY DESIGN: Observational study in adult patients with AM. Three groups were analyzed according to EV documentation in CSF: group A = no PCR or negative PCR (n=17), group B = positive real-time PCR (n=20), and group C = positive GXEA (n=22). Clinical, laboratory, and health-care cost data were compared. RESULTS: Clinical characteristics were similar in the 3 groups. Median turn-around time of EV PCR decreased from 60 h (IQR (interquartile range) 44-87) in group B to 5 h (IQR 4-11) in group C (p<0.0001). Median duration of antibiotics was 1 (IQR 0-6), 1 (0-1.9), and 0.5 days (single dose) in groups A, B, and C, respectively (p<0.001). Median length of hospitalization was 4 days (2.5-7.5), 2 (1-3.7), and 0.5 (0.3-0.7), respectively (p<0.001). Median hospitalization costs were $5458 (2676-6274) in group A, $2796 (2062-5726) in group B, and $921 (765-1230) in group C (p<0.0001). CONCLUSIONS: Rapid EV detection in CSF by a fully automated PCR improves the management of AM by significantly reducing antibiotic use, hospitalization length, and costs.
Abstract:
Coronary artery calcification (CAC) is quantified from a computed tomography (CT) scan image. A calcified region is identified. Modified expectation maximization (MEM) of a statistical model for the calcified and background material is used to estimate the partial calcium content of the voxels. The algorithm limits the region over which MEM is performed. With MEM, the statistical properties of the model are iteratively updated based on the calcium distribution calculated in the previous iteration. The estimated statistical properties are used to generate a map of the partial calcium content in the calcified region, and the volume of calcium in the region is determined from this map. Experimental results on a cardiac phantom, scanned 90 times using 15 different protocols, demonstrate that the proposed method is less sensitive to the partial volume effect and noise, with an average error of 9.5% (standard deviation (SD) of 5-7 mm³) compared with 67% (SD of 3-20 mm³) for conventional techniques. The high reproducibility of the proposed method for 35 patients, scanned twice using the same protocol at a minimum interval of 10 min, shows that the method provides 2-3 times lower interscan variation than conventional techniques.
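The abstract does not spell out the MEM update equations; as a rough illustration of the underlying idea only, standard expectation maximization for a two-component Gaussian mixture (background vs. calcified intensities) yields per-voxel posterior weights that play the role of a partial-content map. A self-contained sketch on synthetic intensities:

```python
import numpy as np

def em_two_gaussians(x, iters=50):
    """Standard EM for a 2-component 1-D Gaussian mixture.
    Returns each sample's posterior probability of the high-intensity
    component, loosely analogous to a partial calcium content map."""
    mu = np.array([x.min(), x.max()], dtype=float)   # crude initialization
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of the two components for every sample
        pdf = (pi / (sigma * np.sqrt(2 * np.pi))
               * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture weights, means, and spreads
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return r[:, 1]  # weight of the brighter ("calcified") component

# synthetic voxel intensities: 500 background, 50 calcified (arbitrary units)
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(100, 10, 500), rng.normal(400, 30, 50)])
w = em_two_gaussians(x)
print(w[:500].mean() < 0.1, w[500:].mean() > 0.9)  # → True True
```

The paper's modification (restricting the region over which EM runs and modeling partial-volume voxels explicitly) goes beyond this sketch, but the iterative update of the statistical model is of the same family.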
Abstract:
OBJECTIVE: To assess the impact of nonuniform dose distribution within lesions and tumor-involved organs of patients receiving Zevalin, and to discuss possible implications of equivalent uniform biological effective doses (EU-BED) on treatment efficacy and toxicity. MATLAB®-based software for voxel-based dosimetry was adopted for this purpose. METHODS: Eleven lesions from seven patients with either indolent or aggressive non-Hodgkin lymphoma were analyzed, along with four organs with disease. Absorbed doses were estimated by a direct integration of single-voxel kinetic data from serial tomographic images. After proper corrections, differential BED distributions and surviving cell fractions were estimated, allowing for the calculation of EU-BED. To quantify dose uniformity in each target area, a heterogeneity index was defined. RESULTS: Average doses were below those prescribed by conventional radiotherapy to eradicate lymphoma lesions. Dose heterogeneity and effect on tumor control varied among lesions, with no apparent relation to tumor mass. Although radiation doses to involved organs were safe, unexpected liver toxicity occurred in one patient who presented with a pattern of diffuse infiltration. CONCLUSION: Voxel-based dosimetry and radiobiologic modeling can be successfully applied to lesions and tumor-involved organs, representing a methodological advance over estimation of mean absorbed doses. However, effects on tumor control and organ toxicity still cannot be easily predicted.
Abstract:
An enzyme-linked immunosorbent assay was standardized for the detection of cryptococcal antigen in serum and cerebrospinal fluid. The system was evaluated in clinical samples from patients infected with human immunodeficiency virus, with and without a previous diagnosis of cryptococcosis. The evaluated system is highly sensitive and specific, and showed no significant differences when compared with latex agglutination. A standard curve with purified Cryptococcus neoformans antigen was established for antigen quantification in positive samples.
Abstract:
Evidence has emerged that the initiation and growth of gliomas are sustained by a subpopulation of cancer-initiating cells (CICs). Because of the difficulty of using markers to tag CICs in gliomas, we have previously exploited more robust phenotypic characteristics, including a specific morphology and intrinsic autofluorescence, to identify and isolate a subpopulation of glioma CICs, called FL1(+). The objective of this study was to further validate our method in a large cohort of human gliomas and a mouse model of glioma. Seventy-four human gliomas of all grades and the GFAP-V(12)HA-ras B8 mouse model were analyzed for in vitro self-renewal capacity and their content of FL1(+) cells. Nonneoplastic brain tissue and embryonic mouse brain were used as controls. Genetic traceability along passages was assessed with microsatellite analysis. We found that FL1(+) cells from low-grade gliomas and from control nonneoplastic brain tissue show a lower level of autofluorescence and undergo a restricted number of cell divisions before dying in culture. In contrast, FL1(+) cells derived from many, but not all, high-grade gliomas acquire high levels of autofluorescence and can be propagated in long-term cultures. Moreover, FL1(+) cells show remarkable traceability over time in vitro and in vivo. Our results show that FL1(+) cells can be found in all specimens of a large cohort of human gliomas of different grades and in a genetically induced mouse glioma model, as well as in nonneoplastic brain. However, their self-renewal capacity is variable and seems to depend on tumor grade.
Abstract:
In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, among other things, a fingermark to be compared against a fingerprint database and therefore a link to be established between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing international collaboration between police services, the size of these databases keeps increasing. The current challenge for the field of fingerprint identification lies in this growth, which makes it possible to find impressions that are very similar yet come from distinct fingers. At the same time, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be used in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. These data thus allow an evaluation that may be more detailed than one obtained by applying rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true. Moreover, they should support this hypothesis more and more strongly with the addition of information in the form of additional minutiae.
For the modeling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors that influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of the minutiae. The results show that likelihood ratios derived from the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that two impressions were left by the same finger when they in fact came from different fingers is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1,000, and 10,000 for 6, 7, and 8 minutiae when the two impressions come from the same finger. These likelihood ratios can therefore be an important aid for decision making. Both positive effects linked to the addition of minutiae (a drop in the rate of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study.
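The score-based likelihood ratio described above can be illustrated with a minimal sketch: estimate the within-finger (same-source) and between-finger (different-source) score densities, then evaluate their ratio at the score observed in the case. The score values and distributions below are purely hypothetical, not the thesis's data:

```python
import numpy as np
from scipy.stats import gaussian_kde

# hypothetical AFIS comparison scores
rng = np.random.default_rng(1)
within = rng.normal(800, 100, 300)    # same-finger (within-variability) scores
between = rng.normal(300, 80, 3000)   # different-finger (between-variability) scores

# kernel density estimates of the two score distributions
f_within = gaussian_kde(within)
f_between = gaussian_kde(between)

def likelihood_ratio(score):
    """LR = p(score | same finger) / p(score | different fingers)."""
    return float(f_within(score) / f_between(score))

print(likelihood_ratio(750) > 1, likelihood_ratio(320) < 1)  # → True True
```

A high case score falls in the bulk of the within-finger distribution and in the far tail of the between-finger one, so the LR strongly supports the same-source hypothesis; rates of misleading LRs are then measured by evaluating this ratio over comparisons of known ground truth.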
Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found, and showed satisfactory results.
Abstract:
Objective: To compare pressure–volume (P–V) curves obtained with the Galileo ventilator with those obtained with the CPAP method in patients with ALI or ARDS receiving mechanical ventilation. P–V curves were fitted to a sigmoidal equation with a mean R² of 0.994 ± 0.003. Lower inflection (LIP), upper inflection (UIP), and deflation maximum-curvature (PMC) points calculated from the fitted variables showed good correlation between methods, with high intraclass correlation coefficients. Bias and limits of agreement for LIP, UIP and PMC obtained with the two methods in the same patient were clinically acceptable.
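The sigmoidal equation is not written out in the abstract; a common choice for P–V curves (the four-parameter Venegas sigmoid, assumed here) is V = a + b/(1 + exp(-(P - c)/d)), where c is the true inflection pressure and LIP/UIP are typically derived from c and d. A fitting sketch on synthetic inflation data (all numbers hypothetical):

```python
import numpy as np
from scipy.optimize import curve_fit

def venegas(P, a, b, c, d):
    """Sigmoidal P-V model: V = a + b / (1 + exp(-(P - c) / d))."""
    return a + b / (1.0 + np.exp(-(P - c) / d))

# synthetic inflation limb: pressures in cmH2O, volumes in ml
P = np.linspace(0, 40, 21)
rng = np.random.default_rng(2)
V = venegas(P, 200.0, 1400.0, 18.0, 5.0) + rng.normal(0, 10, P.size)

# nonlinear least-squares fit with a rough initial guess
popt, _ = curve_fit(venegas, P, V, p0=[V.min(), V.max() - V.min(), 20.0, 5.0])
a, b, c, d = popt
print(round(c, 1))  # inflection pressure recovered by the fit (true value 18.0)
```

From the fitted c and d, characteristic points such as c - 2d and c + 2d are often reported as the lower and upper inflection points, though the exact definition varies between studies.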
Abstract:
Introduction: Lesion detection in multiple sclerosis (MS) is an essential part of its clinical diagnosis. In addition, radiological characterisation of MS lesions is an important research field that aims at distinguishing different MS types, monitoring drug response, and prognosis. To date, various MR protocols have been proposed to obtain optimal lesion contrast for early and comprehensive diagnosis of MS. In this study, we compare the sensitivity of five different MR contrasts for lesion detection: (i) the DIR sequence (Double Inversion Recovery, [4]); (ii) the dark-fluid SPACE acquisition scheme, a 3D variant of the 2D FLAIR sequence [1]; (iii) the MP2RAGE [2], an MP-RAGE variant that provides homogeneous T1 contrast and quantitative T1 values; and the sequences currently used for clinical MS diagnosis (2D FLAIR, MP-RAGE). Furthermore, we investigate the T1 relaxation times of cortical and sub-cortical regions in the brain hemispheres and the cerebellum at 3T. Methods: 10 early-stage female MS patients (age: 31.6±4.7 y; disease duration: 3.8±1.9 y; disability score, EDSS: 1.8±0.4) and 10 healthy controls (age- and gender-matched: 31.2±5.8 y) were included in the study after obtaining informed written consent according to the local ethics protocol. All experiments were performed at 3T (Magnetom Trio a Tim System, Siemens, Germany) using a 32-channel head coil [5]. The imaging protocol included the following sequences (all except the axial 2D FLAIR with 1x1x1.2 mm3 voxels and a 256x256x160 matrix): DIR (TI1/TI2/TR XX/3652/10000 ms, iPAT=2, TA 12:02 min); MP-RAGE (TI/TR 900/2300 ms, iPAT=3, TA 3:47 min); MP2RAGE (TI1/TI2/TR 700/2500/5000 ms, iPAT=3, TA 8:22 min, cf. [2]); 3D FLAIR SPACE (only for patients 4-6, TI/TR 1800/5000 ms, iPAT=2, TA 5:52 min, cf. [1]); axial FLAIR (0.9x0.9x2.5 mm3, 256x256x44 matrix, TI/TR 2500/9000 ms, iPAT=2, TA 4:05 min).
Lesions were identified by two experienced readers, a neurologist and a radiologist, manually contoured, and assigned to regional locations (see Table 1). Regional lesion masks (RLM) from each contrast were compared for number and volume of lesions. In addition, the RLMs were merged into a single "master" mask representing the sum of the lesions from all contrasts. T1 values were derived for each location from this mask for patients 5-10 (the 3D FLAIR contrast was missing for patients 1-4). Results & Discussion: The DIR sequence appears the most sensitive for total lesion count, followed by the MP2RAGE (Table 1). The 3D FLAIR SPACE sequence turns out to be more sensitive than the 2D FLAIR, presumably due to reduced partial volume effects. For sub-cortical hemispheric lesions, the DIR contrast appears equally sensitive to the MP2RAGE and SPACE, but it is the most sensitive for cerebellar MS plaques. The DIR sequence is also the one that best reveals cortical hemispheric lesions. T1 relaxation times at 3T in the WM and GM of the hemispheres and the cerebellum, as obtained with the MP2RAGE sequence, are shown in Table 2. Extending previous studies, we confirm overall longer T1 values and higher standard deviations in lesion tissue compared with non-lesion tissue and with control tissue in healthy controls. We hypothesize a biological origin (different degrees of axonal loss and demyelination) rather than a technical one. Conclusion: In this study, we applied 5 MR contrasts, including two novel sequences, to investigate which contrast offers the highest sensitivity for early MS diagnosis. In addition, we characterized for the first time the T1 relaxation time in cortical and sub-cortical regions of the hemispheres and the cerebellum. The results are in agreement with previous publications and allow a meaningful biological interpretation of the data.
Compressed Sensing Single-Breath-Hold CMR for Fast Quantification of LV Function, Volumes, and Mass.
Abstract:
OBJECTIVES: The purpose of this study was to compare a novel compressed sensing (CS)-based single-breath-hold multislice magnetic resonance cine technique with the standard multi-breath-hold technique for the assessment of left ventricular (LV) volumes and function. BACKGROUND: Cardiac magnetic resonance is generally accepted as the gold standard for LV volume and function assessment. LV function is 1 of the most important cardiac parameters for diagnosis and the monitoring of treatment effects. Recently, CS techniques have emerged as a means to accelerate data acquisition. METHODS: The prototype CS cine sequence acquires 3 long-axis and 4 short-axis cine loops in 1 single breath-hold (temporal/spatial resolution: 30 ms/1.5 × 1.5 mm(2); acceleration factor 11.0) to measure left ventricular ejection fraction (LVEFCS) as well as LV volumes and LV mass using LV model-based 4D software. For comparison, a conventional stack of multi-breath-hold cine images was acquired (temporal/spatial resolution 40 ms/1.2 × 1.6 mm(2)). As a reference for the left ventricular stroke volume (LVSV), aortic flow was measured by phase-contrast acquisition. RESULTS: In 94% of the 33 participants (12 volunteers: mean age 33 ± 7 years; 21 patients: mean age 63 ± 13 years with different LV pathologies), the image quality of the CS acquisitions was excellent. LVEFCS and LVEFstandard were similar (48.5 ± 15.9% vs. 49.8 ± 15.8%; p = 0.11; r = 0.96; slope 0.97; p < 0.00001). Agreement of LVSVCS with aortic flow was superior to that of LVSVstandard (overestimation vs. aortic flow: 5.6 ± 6.5 ml vs. 16.2 ± 11.7 ml, respectively; p = 0.012) with less variability (r = 0.91; p < 0.00001 for the CS technique vs. r = 0.71; p < 0.01 for the standard technique). The intraobserver and interobserver agreement for all CS parameters was good (slopes 0.93 to 1.06; r = 0.90 to 0.99). 
CONCLUSIONS: The results demonstrated the feasibility of applying the CS strategy to evaluate LV function and volumes with high accuracy in patients. The single-breath-hold CS strategy has the potential to replace the multi-breath-hold standard cardiac magnetic resonance technique.
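The prototype's reconstruction is not described in detail in the abstract; as a generic illustration of the compressed-sensing principle it relies on — recovering an undersampled signal by exploiting sparsity — the following sketch runs iterative soft-thresholding (ISTA) on a small synthetic l1-regularized problem. All dimensions and parameters are hypothetical and unrelated to the cine sequence itself:

```python
import numpy as np

def ista(A, y, lam=0.01, iters=500):
    """Iterative soft-thresholding for min_x (1/2)||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - y)                  # gradient of the data-fidelity term
        z = x - g / L                          # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x

# synthetic problem: sparse signal, random undersampled measurements
rng = np.random.default_rng(3)
n, m, k = 200, 80, 5                           # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
A = rng.normal(0, 1, (m, n)) / np.sqrt(m)      # random sensing matrix
y = A @ x_true
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true) < 0.2)
```

In MRI the sensing operator is an undersampled Fourier transform and the sparsifying transform is typically a wavelet or temporal transform, but the reconstruction follows the same regularized-inversion pattern.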
Abstract:
Background: Urinary human chorionic gonadotropin (hCG) concentration is routinely measured in all anti-doping laboratories to exclude the misuse of recombinant or urinary hCG preparations. In this study, an extended validation of two commercial immunoassays for hCG measurement in urine was performed. Both tests were initially designed for hCG determination in human serum/plasma. Methods: Access® and Elecsys® 1010 are two automated immunoanalysers for central laboratories. The limits of detection and quantification, as well as intra-laboratory and inter-technique correlation, precision, and accuracy, were determined. Stability studies of hCG in urine following freeze/thaw cycles (n = 3), as well as storage at room temperature, 4 °C and -20 °C, were performed. Results: Statistical evaluation of hCG concentrations in male urine samples (n = 2429) measured with the Elecsys® 1010 system enabled us to draw a skewed frequency histogram and establish a far-outside value equal to 2.3 IU/L. This decision limit corresponds to the concentration at which a sportsman will be considered positive for hCG. Intra-assay precision for the Access® analyser was less than 4.0%, whereas the inter-assay precision was closer to 4.5% (concentrations of the official external controls ranging between 5.5 and 195.0 IU/L). Intra- and inter-assay precision for the Elecsys® 1010 analyser was slightly better. A good inter-technique correlation was obtained when measuring various urine samples (male and female). No urinary hCG loss was observed after two freeze/thaw cycles. On the other hand, time and inappropriate storage conditions, such as temperatures above 10 °C for more than 5 days, can degrade urinary hCG. Conclusions: Both analysers showed acceptable performance and are suitable for screening urine in anti-doping analyses.
Each laboratory should validate the assay and establish its own reference values, because hCG concentrations measured in urine can differ from one immunoassay to another. The delay between urine collection and analysis should be kept as short as possible, and urine samples should be transported under optimal conditions to avoid a loss of hCG immunoreactivity.