Abstract:
To evaluate the correlation between intraocular pressure (IOP) rise, ocular pulse amplitude (OPA), and choroidal thickness (ChT) during the water drinking test (WDT). Primary open-angle glaucoma (POAG) patients underwent the WDT followed by serial IOP measurements using dynamic contour tonometry (DCT) and Goldmann applanation tonometry (GAT), and ChT measurements using A- and B-scan ultrasonography (USG). A control group that did not undergo the test was also evaluated using DCT, GAT, and USG. The intraclass correlation coefficient (ICC) was calculated in the control group to assess the reproducibility of measurements. Spearman's coefficient (rho) was used to assess the correlation between the variables. Thirty eyes were included in the study. There was a significant IOP rise during the WDT using both GAT and DCT (p < 0.001). The OPA and ChT measurements also increased significantly (p < 0.001). Spearman's correlation between the OPA values and ChT measurements was significant and moderate (rho = 0.40, p = 0.005). The average increase in OPA and ChT measurements occurred 15 min before the IOP rise. There was a significant increase in OPA and ChT measurements followed by an IOP rise during the WDT. Increased choroidal volume due to hemodynamic forces may be involved in the mechanism of IOP elevation during this stress test.
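Editorial note: Spearman's rho, used above, is a rank-based correlation suited to measurements that need not be normally distributed. A minimal sketch of how it could be computed with SciPy, using invented placeholder values rather than the study's data:

    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical paired measurements (placeholder values, not study data)
    opa = np.array([2.1, 2.8, 3.0, 2.5, 3.4, 2.2, 2.9, 3.1])  # ocular pulse amplitude, mmHg
    cht = np.array([310, 355, 362, 330, 390, 305, 350, 370])  # choroidal thickness, micrometers

    rho, p_value = spearmanr(opa, cht)  # rank-based, robust to outliers and non-normality
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")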
Abstract:
To test the ability of frequency-doubling technology (FDT) perimetry to detect dysthyroid optic neuropathy (DON). Fifteen eyes of 15 patients with DON and 15 healthy control eyes were studied. Eligible eyes had a diagnosis of DON based on visual field abnormalities on standard automated perimetry and had visual acuity better than 20/30. FDT testing was performed using both the C-20-5 screening test and the C-20 full-threshold test. Normal and DON eyes were compared with regard to FDT mean sensitivity. Sensitivity ranges were 40.0%-86.7% for the screening test, and 53.3%-100.0% (total deviation) and 20.0%-93.3% (pattern deviation) for the C-20 threshold test. The corresponding specificity ranges were 86.7%-100.0%, 33.3%-93.3%, and 26.7%-100.0%, respectively. The best sensitivity/specificity ratios were obtained for one abnormal point depressed at <5% in the screening test (86.7%/86.7%), one point depressed at <1% in the total deviation analysis (80.0%/86.7%), and one point depressed at <2% in the pattern deviation analysis (80.0%/86.7%). DON eyes showed significantly lower average sensitivity than normal eyes in the central, pericentral, and peripheral areas. FDT perimetry is a useful screening tool for DON in eyes with normal or only slightly reduced visual acuity.
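Editorial note: sensitivity/specificity pairs like those above follow directly from a 2x2 table of test result versus diagnosis. A minimal sketch for one abnormality criterion (e.g., at least one point depressed at <5%), with placeholder labels:

    import numpy as np

    # 1 = DON eye, 0 = control; hypothetical diagnoses and FDT screening outcomes
    disease  = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
    abnormal = np.array([1, 1, 1, 0, 1, 0, 0, 1, 0, 0])

    tp = np.sum((disease == 1) & (abnormal == 1))  # true positives
    fn = np.sum((disease == 1) & (abnormal == 0))  # false negatives
    tn = np.sum((disease == 0) & (abnormal == 0))  # true negatives
    fp = np.sum((disease == 0) & (abnormal == 1))  # false positives

    sensitivity = tp / (tp + fn)  # proportion of DON eyes flagged abnormal
    specificity = tn / (tn + fp)  # proportion of healthy eyes flagged normal
    print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")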
Abstract:
Objective: To identify the CAMCOG sub-items that best contribute to the identification of patients with mild cognitive impairment (MCI) and incipient Alzheimer's disease (AD) in clinical practice. Methods: Cross-sectional assessment of 272 older adults (98 MCI, 82 AD, and 92 controls) with a standardized neuropsychological battery and the CAMCOG schedule. Backward logistic regression analysis with diagnosis (MCI or control) as the dependent variable and the sub-items of the CAMCOG as independent variables was carried out to determine the CAMCOG sub-items that predicted the diagnosis of MCI. Results: Lower scores on the Language, Memory, Praxis, and Calculation CAMCOG sub-items were significantly associated with the diagnosis of MCI. A composite score obtained by the sum of these scores significantly discriminated MCI patients from comparison groups. This reduced version of the CAMCOG showed diagnostic accuracy similar to that of the original schedule for the identification of patients with MCI as compared to controls (AUC = 0.80 +/- 0.03 for the reduced CAMCOG; AUC = 0.79 +/- 0.03 for the original CAMCOG). Conclusion: The reduced version of the CAMCOG had diagnostic properties similar to those of the original CAMCOG and was faster and easier to administer, rendering it more suitable for the screening of subtle cognitive deficits in general clinical practice.
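Editorial note: a simplified sketch of this kind of analysis (logistic regression on sub-item scores, then AUC for a composite score). The data, column ordering, and use of scikit-learn are illustrative assumptions, and the backward-elimination step is omitted for brevity:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # Hypothetical sub-item scores (columns: language, memory, praxis, calculation)
    X = rng.normal(size=(100, 4))
    y = np.repeat([1, 0], 50)  # 1 = MCI, 0 = control
    X[y == 1] -= 0.8           # MCI group scores lower on average

    model = LogisticRegression().fit(X, y)  # sub-items predicting diagnosis
    print("coefficients:", model.coef_.round(2))

    composite = X.sum(axis=1)           # composite = sum of selected sub-item scores
    auc = roc_auc_score(y, -composite)  # lower composite indicates MCI, hence the sign flip
    print(f"AUC of composite score: {auc:.2f}")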
Abstract:
Background: The Rivermead Behavioural Memory Test (RBMT) assesses everyday memory by means of tasks that mimic daily challenges. The objective was to examine the validity of the Brazilian version of the RBMT to detect cognitive decline. Methods: 195 older adults were diagnosed as normal controls (NC) or with mild cognitive impairment (MCI) or Alzheimer's disease (AD) by a multidisciplinary team, after participants completed clinical and neuropsychological protocols. Results: Cronbach's alpha was high for the total sample for the RBMT profile score (PS) and screening score (SS) (PS = 0.91, SS = 0.87) and for the AD group (PS = 0.84, SS = 0.85), and moderate for the MCI (PS = 0.62, SS = 0.55) and NC (PS = 0.62, SS = 0.60) groups. RBMT total scores and the Appointment, Pictures, Immediate and Delayed Story, Immediate and Delayed Route, Delayed Message, and Date sub-tests contributed to differentiating NC from MCI. ROC curve analyses indicated high accuracy in differentiating NC from AD patients and moderate accuracy in differentiating NC from MCI. Conclusions: The Brazilian version of the RBMT seems to be an appropriate instrument to identify memory decline in Brazilian older adults.
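Editorial note: Cronbach's alpha, the internal-consistency measure reported above, is computed from per-item variances and the variance of subjects' total scores. A minimal sketch with placeholder data:

    import numpy as np

    def cronbach_alpha(items):
        # items: 2-D array, rows = subjects, columns = test items
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    scores = rng.integers(0, 5, size=(30, 10))  # hypothetical 30 subjects x 10 items
    print(f"alpha = {cronbach_alpha(scores):.2f}")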
Abstract:
Toxoplasma gondii causes severe disease in both humans and livestock, and its detection in meat after slaughter requires PCR or biological tests. Meat packages contain retained exudate that could be used for serology owing to its blood content, although similar previous studies have reported false-negative results in such tests. We standardized an anti-T. gondii IgG ELISA in muscle juice from experimentally infected rabbits, with blood content determined by cyanhemoglobin spectrophotometry. IgG titers and immunoblotting profiles were similar in blood, serum, and meat juice after correction for blood content. The assays remained adequate regardless of storage time up to 120 days or freeze-thaw cycles, without false-negative results. Using this assay, we also found one positive sample (1/74; 1.35%) among commercial Brazilian rabbit meat cuts. With blood content determination, the meat-juice ELISA may be useful for quality control in toxoplasmosis monitoring.
Abstract:
Background: The Cambridge Cognitive Examination (CAMCOG) is a useful test in screening for Alzheimer's disease (AD). However, the interpretation of CAMCOG cut-off scores is problematic, and reference values are needed for different educational strata. Given the importance of earlier diagnosis of mild dementia, new cut-off values are required that take into account patients with low levels of education. This study aims to evaluate whether the CAMCOG can be used as an accurate screening test among AD patients and normal controls with different educational levels. Methods: Cross-sectional assessment was undertaken of 113 AD patients and 208 elderly controls with heterogeneous educational levels (group 1: 1-4 years; group 2: 5-8 years; and group 3: >= 9 years) from a geriatric clinic, all of whom underwent a thorough diagnostic evaluation for AD including the Cambridge Examination for Mental Disorders of the Elderly (CAMDEX). Controls had no cognitive or mood complaints. Sensitivity (SE) and specificity (SP) for the CAMCOG in each educational group were assessed with receiver operating characteristic (ROC) curves. Results: Mean CAMCOG scores decreased with lower educational level in both diagnostic groups (controls - group 1: 87; group 2: 91; group 3: 96; AD - group 1: 63; group 2: 62; group 3: 77). Cut-off scores for the three education groups were 79, 80, and 90, respectively. SE and SP varied among the groups (group 1: 88.1% and 83.5%; group 2: 84.6% and 96%; group 3: 70.8% and 90%). Conclusion: The CAMCOG can be used with good accuracy as a cognitive test for patients with a low educational level. Patients with higher education showed lower scores than previously reported.
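Editorial note: cut-off scores such as those reported per educational stratum are typically read off a ROC curve, often at the point maximizing the Youden index (sensitivity + specificity - 1). A minimal sketch with invented scores; the use of scikit-learn is an assumption about tooling, not the paper's software:

    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(2)
    # Hypothetical CAMCOG totals: AD patients score lower than controls
    scores = np.concatenate([rng.normal(63, 8, 60), rng.normal(87, 6, 60)])
    labels = np.repeat([1, 0], 60)  # 1 = AD, 0 = control

    # roc_curve expects higher values for the positive class, so negate the scores
    fpr, tpr, thresholds = roc_curve(labels, -scores)
    best = np.argmax(tpr - fpr)  # Youden index J = sensitivity + specificity - 1
    print(f"cut-off = {-thresholds[best]:.0f}, SE = {tpr[best]:.1%}, SP = {1 - fpr[best]:.1%}")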
Abstract:
Purpose: To evaluate rates of visual field progression in eyes with optic disc hemorrhages and the effect of intraocular pressure (IOP) reduction on these rates. Design: Observational cohort study. Participants: The study included 510 eyes of 348 patients with glaucoma who were recruited from the Diagnostic Innovations in Glaucoma Study (DIGS) and followed for an average of 8.2 years. Methods: Eyes were followed annually with clinical examination, standard automated perimetry visual fields, and optic disc stereophotographs. The presence of optic disc hemorrhages was determined on the basis of masked evaluation of optic disc stereophotographs. Rates of visual field change during follow-up were evaluated using the visual field index (VFI). Main Outcome Measures: The effect of optic disc hemorrhages on rates of visual field progression was evaluated using random coefficient models. Estimates of rates of change for individual eyes were obtained by best linear unbiased prediction (BLUP). Results: During follow-up, 97 (19%) of the eyes had at least 1 episode of disc hemorrhage. The overall rate of VFI change in eyes with hemorrhages was significantly faster than in eyes without hemorrhages (-0.88%/year vs. -0.38%/year, respectively; P < 0.001). The difference in rates of visual field loss pre- and post-hemorrhage was significantly related to the reduction of IOP in the post-hemorrhage period compared with the pre-hemorrhage period (r = -0.61; P < 0.001). Each 1 mmHg of IOP reduction was associated with a difference of 0.31%/year in the rate of VFI change. Conclusions: Treatment had a beneficial effect in slowing rates of progressive visual field loss in eyes with optic disc hemorrhage. Further research should elucidate why some patients with hemorrhages respond well to IOP reduction while others continue to progress despite a significant reduction in IOP levels. Ophthalmology 2010; 117: 2061-2066.
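Editorial note: random coefficient models of the kind used here fit a mean rate of change plus eye-specific random slopes, with BLUPs giving each eye's estimated deviation from that mean. A simplified sketch using statsmodels on simulated longitudinal VFI data (all names and values are placeholders):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n_eyes, n_visits = 40, 8
    eye = np.repeat(np.arange(n_eyes), n_visits)
    years = np.tile(np.arange(n_visits, dtype=float), n_eyes)
    true_slope = rng.normal(-0.6, 0.4, n_eyes)  # eye-specific VFI rates, %/year
    vfi = 95 + true_slope[eye] * years + rng.normal(0, 1.0, n_eyes * n_visits)
    df = pd.DataFrame({"eye": eye, "years": years, "vfi": vfi})

    # Random intercept and slope per eye; the fixed effect of time is the mean rate
    fit = smf.mixedlm("vfi ~ years", df, groups=df["eye"], re_formula="~years").fit()
    print(f"mean VFI slope: {fit.params['years']:.2f} %/year")
    blups = fit.random_effects  # per-eye deviations (BLUPs) from the mean slope
    print(blups[0])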
Abstract:
Purpose: To evaluate the ability of the GDx Variable Corneal Compensation (VCC) Guided Progression Analysis (GPA) software to detect glaucomatous progression. Design: Observational cohort study. Participants: The study included 453 eyes from 252 individuals followed for an average of 46 +/- 14 months as part of the Diagnostic Innovations in Glaucoma Study. At baseline, 29% of the eyes were classified as glaucomatous, 67% as suspects, and 5% as healthy. Methods: Images were obtained annually with the GDx VCC and analyzed for progression using the Fast Mode of the GDx GPA software. Progression by conventional methods was determined by the GPA software for standard automated achromatic perimetry (SAP) and by masked assessment of optic disc stereophotographs by expert graders. Main Outcome Measures: Sensitivity, specificity, and likelihood ratios (LRs) for detection of glaucoma progression using the GDx GPA were calculated with SAP and optic disc stereophotographs as reference standards. Agreement among the different methods was reported using the AC1 coefficient. Results: Thirty-four of the 431 glaucoma and glaucoma-suspect eyes (8%) showed progression by SAP or optic disc stereophotographs. The GDx GPA detected 17 of these eyes, for a sensitivity of 50%. Fourteen eyes showed progression only by the GDx GPA, for a specificity of 96%. Positive and negative LRs were 12.5 and 0.5, respectively. None of the healthy eyes showed progression by the GDx GPA, corresponding to a specificity of 100% in this group. Inter-method agreement (AC1 coefficient and 95% confidence intervals) for non-progressing and progressing eyes was 0.96 (0.94-0.97) and 0.44 (0.28-0.61), respectively. Conclusions: The GDx GPA detected glaucoma progression in a significant number of cases showing progression by conventional methods, with high specificity and a high positive LR. Estimates of the accuracy for detecting progression suggest that the GDx GPA could complement clinical evaluation in the detection of longitudinal change in glaucoma. Ophthalmology 2010; 117: 462-470.
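Editorial note: the likelihood ratios reported above follow directly from sensitivity and specificity: LR+ = sensitivity / (1 - specificity) and LR- = (1 - sensitivity) / specificity. A quick check of the reported figures:

    sensitivity, specificity = 0.50, 0.96

    lr_positive = sensitivity / (1 - specificity)  # 0.50 / 0.04 = 12.5
    lr_negative = (1 - sensitivity) / specificity  # 0.50 / 0.96 ~ 0.52, reported rounded to 0.5
    print(lr_positive, round(lr_negative, 2))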
Abstract:
Aims: We conducted a meta-analysis to evaluate the accuracy of quantitative stress myocardial contrast echocardiography (MCE) in coronary artery disease (CAD). Methods and results: A database search was performed through January 2008. We included studies evaluating the accuracy of quantitative stress MCE for detection of CAD compared with coronary angiography or single-photon emission computed tomography (SPECT) and measuring the reserve parameters A, β, and Aβ. Data from the studies were verified and supplemented by the authors of each study. Using random-effects meta-analysis, we estimated weighted mean differences (WMDs), likelihood ratios (LRs), diagnostic odds ratios (DORs), and summary area under the curve (AUC), all with 95% confidence intervals (CIs). Of 1443 studies, 13 including 627 patients (age range, 38-75 years) and comparing MCE with angiography (n = 10), SPECT (n = 1), or both (n = 2) were eligible. WMDs (95% CI) were significantly lower in the CAD group than in the no-CAD group: 0.12 (0.06-0.18) (P < 0.001), 1.38 (1.28-1.52) (P < 0.001), and 1.47 (1.18-1.76) (P < 0.001) for the A, β, and Aβ reserves, respectively. Pooled LRs for a positive test were 1.33 (1.13-1.57), 3.76 (2.43-5.80), and 3.64 (2.87-4.78), and LRs for a negative test were 0.68 (0.55-0.83), 0.30 (0.24-0.38), and 0.27 (0.22-0.34) for the A, β, and Aβ reserves, respectively. Pooled DORs were 2.09 (1.42-3.07), 15.11 (7.90-28.91), and 14.73 (9.61-22.57), and AUCs were 0.637 (0.594-0.677), 0.851 (0.828-0.872), and 0.859 (0.842-0.750) for the A, β, and Aβ reserves, respectively. Conclusion: The evidence supports the use of quantitative MCE as a non-invasive test for the detection of CAD. Standardizing MCE quantification analysis and adherence to reporting standards for diagnostic tests could enhance the quality of evidence in this field.
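Editorial note: a minimal sketch of random-effects pooling as used in such meta-analyses, here the DerSimonian-Laird estimator applied to a weighted mean difference (the per-study effects and variances are invented placeholders):

    import numpy as np

    effects = np.array([0.10, 0.15, 0.08, 0.14, 0.12])  # hypothetical study WMDs
    variances = np.array([0.002, 0.004, 0.003, 0.005, 0.002])

    w = 1 / variances                             # fixed-effect weights
    mean_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mean_fe) ** 2)      # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c) # between-study variance (DL estimator)

    w_re = 1 / (variances + tau2)                 # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    print(f"WMD = {pooled:.3f} (95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")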
Abstract:
PURPOSE. To evaluate the relationship between pattern electroretinogram (PERG) amplitude, macular and retinal nerve fiber layer (RNFL) thickness measured by optical coherence tomography (OCT), and visual field (VF) loss on standard automated perimetry (SAP) in eyes with temporal hemianopia from chiasmal compression. METHODS. Forty-one eyes of 41 patients with permanent temporal VF defects from chiasmal compression and 41 healthy subjects underwent transient full-field and hemifield (temporal or nasal) stimulation PERG, SAP, and time-domain OCT macular and RNFL thickness measurements. Comparisons were made using Student's t-test. Deviation from normal VF sensitivity for the central 18° of the VF was expressed in 1/Lambert units. Correlations between measurements were assessed by linear regression analysis. RESULTS. PERG and OCT measurements were significantly lower in eyes with temporal hemianopia than in normal eyes. A significant correlation was found between VF sensitivity loss and full-field or nasal, but not temporal, hemifield PERG amplitude. Likewise, a significant correlation was found between VF sensitivity loss and most OCT parameters. No significant correlation was observed between OCT and PERG parameters, except for nasal hemifield amplitude. A significant correlation was observed between several macular and RNFL thickness parameters. CONCLUSIONS. In patients with chiasmal compression, PERG amplitude and OCT thickness measurements were significantly related to VF loss, but not to each other. OCT and PERG quantify neuronal loss differently, but both technologies are useful in understanding the structure-function relationship in patients with chiasmal compression. (ClinicalTrials.gov number, NCT00553761.) (Invest Ophthalmol Vis Sci. 2009; 50: 3535-3541) DOI:10.1167/iovs.08-3093
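Editorial note: the 1/Lambert scale mentioned above is the linear (anti-log) form of perimetric sensitivity, related to decibels by dB = 10·log10(1/Lambert). A one-line conversion sketch (the helper name is ours):

    def db_to_inverse_lambert(sensitivity_db):
        # Convert perimetric sensitivity from dB to linear 1/Lambert units
        return 10 ** (sensitivity_db / 10)

    print(db_to_inverse_lambert(30.0))  # a 30 dB threshold equals 1000 1/Lambert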
Abstract:
The Wisconsin Card Sorting Test (WCST) is the gold standard in the evaluation of executive dysfunction (ED) in patients with temporal lobe epilepsy (TLE). We evaluated 35 children with TLE and 25 healthy controls with the WCST and with a more comprehensive battery. Among the children with TLE, 77.14% showed impairment on the WCST. On the other tests (Wechsler Intelligence Scale for Children-Digit Forward, Matching Familiar Figures Test, Trail Making Test, Word Fluency, Finger Windows, and Number-Letter Memory), impairment was demonstrated in 94.29%. The authors concluded that the WCST is a good paradigm to measure executive impairment in children with TLE; however, it may not be enough. Evaluation performed only with the WCST not only underestimated the number of patients with ED, but also missed relevant information regarding the type of ED.
Abstract:
The aim of this paper was to study the correlation between the intraocular pressure peaks and fluctuation detected during the water drinking test and the same parameters observed during long-term follow-up. This prospective cohort study enrolled 22 eyes of 22 newly diagnosed primary open-angle glaucoma patients. After an initial complete ophthalmological examination, patients were started on antiglaucoma medication and returned 4 weeks later to perform the water drinking test. Thereafter, patients were evaluated at least eight times within a period of 6-12 months. The intraocular pressure peaks and fluctuation detected during the water drinking stress test were compared with those observed during regular office visits. Spearman's correlation coefficient and Bland-Altman plots were used for statistical analysis. The mean age of participants was 54.3 +/- 8.2 years (+/- SD), 59% were women, and the average mean deviation was -10.2 +/- 4.5 dB. The mean follow-up period was 8.2 +/- 2.0 months. The average intraocular pressure peaks and fluctuation were 20.0 +/- 2.9 mmHg and 40 +/- 10% during the water drinking test, respectively, and 18.1 +/- 2.8 mmHg and 30 +/- 10% during follow-up. Spearman's correlation coefficients between the intraocular pressure peaks and fluctuation detected during the water drinking test and during the follow-up period were significant and strong (P < 0.001; rho = 0.76 and 0.82, respectively). There was good agreement between the variables. The intraocular pressure peaks and fluctuation detected during the water drinking test showed significant correlation and agreement with the pressures observed during follow-up visits. Stress tests could be used to estimate long-term intraocular pressure variation.
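Editorial note: Bland-Altman agreement, used above alongside Spearman's correlation, summarizes two measurement methods by the mean difference (bias) and 95% limits of agreement. A minimal sketch with placeholder IOP values (the plot itself is omitted):

    import numpy as np

    # Hypothetical paired IOP peaks (mmHg): water drinking test vs office visits
    wdt    = np.array([20.1, 19.5, 22.0, 18.7, 21.3, 20.8, 19.0, 23.2])
    office = np.array([18.3, 18.0, 20.5, 17.2, 19.8, 19.1, 17.5, 21.0])

    diffs = wdt - office
    bias = diffs.mean()                    # systematic difference between methods
    half_width = 1.96 * diffs.std(ddof=1)  # 95% limits of agreement half-width
    print(f"bias = {bias:.1f} mmHg, LoA = {bias - half_width:.1f} to {bias + half_width:.1f} mmHg")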
Abstract:
Background: Impairment in pulmonary capacity due to pleural effusion compromises daily activity. Removal of fluid improves symptoms, but its impact, especially on exercise capacity, has not been determined. Methods: Twenty-five patients with unilateral pleural effusion documented by chest radiograph were included. The 6-min walk test, the modified Borg dyspnea score, FVC, and FEV1 were analyzed before and 48 h after the removal of large pleural effusions. Results: The mean volume of fluid removed was 1,564 +/- 695 mL. After the procedure, values of FVC, FEV1, and 6-min walk distance increased (P < .001), whereas dyspnea decreased (P < .001). Statistical correlations (P < .001) between 6-min walk distance and FVC (r = 0.725) and between 6-min walk distance and FEV1 (r = 0.661) were observed. Correlations were also observed between the deltas (pre- vs post-thoracentesis) of the 6-min walk test and the percentages of FVC (r = 0.450) and FEV1 (r = 0.472) divided by the volume of fluid removed (P < .05). Conclusion: In addition to the improvement in lung function after thoracentesis, the benefits of fluid removal are more evident during exertion, allowing better readaptation of patients to routine activities. CHEST 2011; 139(6):1424-1429
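Editorial note: a sketch of the delta analysis described above: the post-minus-pre change in walk distance correlated against the percentage change in FVC normalized by the volume of fluid removed (all values are invented placeholders):

    import numpy as np
    from scipy.stats import pearsonr

    walk_delta = np.array([45, 80, 30, 95, 60, 25, 70, 55])  # 6-min walk distance gain, m
    fvc_delta_pct = np.array([8, 14, 5, 18, 11, 4, 13, 9])   # FVC gain, %
    fluid_ml = np.array([1200, 2100, 900, 2600, 1500, 800, 1900, 1400])

    r, p = pearsonr(walk_delta, fvc_delta_pct / fluid_ml)  # normalize by volume removed
    print(f"r = {r:.2f}, p = {p:.3f}")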
Abstract:
Epidemiological studies have provided evidence that high consumption of tomatoes effectively reduces the risk of reactive oxygen species (ROS)-mediated diseases such as cancer. Tomatoes are rich sources of lycopene, a potent singlet-oxygen-quenching carotenoid. In addition to its antioxidant properties, lycopene shows an array of biological effects including antimutagenic and anticarcinogenic activities. In the present study, the chemopreventive action of lycopene was examined against the DNA damage and clastogenic or aneugenic effects of H2O2 and N-nitrosodiethylamine (DEN) in the metabolically competent human hepatoma cell line HepG2. Lycopene, at concentrations of 10, 25, and 50 μM, was tested under three protocols: before, simultaneously with, and after treatment with the mutagen, using the comet and micronucleus assays. Lycopene significantly reduced the genotoxicity and mutagenicity of H2O2 under all of the conditions tested. For DEN, significant reductions in primary DNA damage (comet assay) were detected when the carotenoid (at all doses) was added to the cell culture medium before or simultaneously with the mutagen. In the micronucleus test, the protective effect of lycopene was observed only when it was added prior to DEN treatment. In conclusion, our results suggest that lycopene is a suitable agent for preventing chemically induced DNA and chromosome damage.