969 results for Malignant underlying disease
Abstract:
There is no specific test to diagnose Alzheimer's disease (AD). Its diagnosis should be based upon clinical history, neuropsychological and laboratory tests, neuroimaging and electroencephalography (EEG). Therefore, new approaches are necessary to enable earlier and more accurate diagnosis and to follow treatment results. In this study we used a Machine Learning (ML) technique, named Support Vector Machine (SVM), to search for patterns in EEG epochs to differentiate AD patients from controls. As a result, we developed a quantitative EEG (qEEG) processing method for automatic differentiation of patients with AD from normal individuals, as a complement to the diagnosis of probable dementia. We studied EEGs from 19 normal subjects (14 females/5 males, mean age 71.6 years) and 16 patients with probable mild to moderate AD (14 females/2 males, mean age 73.4 years). The results obtained from the analysis of individual EEG epochs were 79.9% accuracy and 83.2% sensitivity. The analysis considering the diagnosis of each individual patient reached 87.0% accuracy and 91.7% sensitivity.
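As an illustration of this type of epoch-level classification, the sketch below is not the authors' actual pipeline: the features, data sizes and SVM settings are all placeholder assumptions. It trains an SVM on per-epoch feature vectors and derives a per-patient decision by majority vote over that patient's epochs, evaluated leave-one-patient-out so that a test subject's epochs never enter the training set.

```python
# Illustrative sketch only: SVM classification of EEG epochs with a
# per-patient majority vote. Synthetic features stand in for real qEEG data.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_patients, n_epochs, n_features = 35, 40, 20          # hypothetical sizes
labels = rng.integers(0, 2, size=n_patients)           # 0 = control, 1 = AD
# One feature matrix per patient; "AD" patients get a small mean shift.
X = np.stack([rng.normal(loc=0.3 * y, size=(n_epochs, n_features))
              for y in labels])

# Leave-one-patient-out evaluation: epochs of the held-out patient never
# appear in the training set.
epoch_hits, patient_hits = [], []
for i in range(n_patients):
    train_idx = [j for j in range(n_patients) if j != i]
    X_train = np.concatenate([X[j] for j in train_idx])
    y_train = np.concatenate([np.full(n_epochs, labels[j]) for j in train_idx])

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_train, y_train)

    epoch_pred = clf.predict(X[i])                      # per-epoch decisions
    epoch_hits.append(np.mean(epoch_pred == labels[i]))
    patient_pred = int(epoch_pred.mean() >= 0.5)        # majority vote
    patient_hits.append(patient_pred == labels[i])

print(f"epoch-level accuracy:   {np.mean(epoch_hits):.3f}")
print(f"patient-level accuracy: {np.mean(patient_hits):.3f}")
```

Aggregating per-epoch decisions into a per-patient vote is one simple way the epoch-level and patient-level accuracies reported above could differ.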
Abstract:
Mutations in the Grb10-interacting GYF protein 2 (GIGYF2) gene, within the PARK11 locus, have been nominated as a cause of Parkinson's disease in Italian and French populations. By sequencing the whole GIGYF2 coding region in forty-six probands (thirty-seven Italians) with familial Parkinson's disease compatible with an autosomal dominant inheritance, we identified no mutations. Our data add to a growing body of evidence suggesting that GIGYF2 mutations are not a frequent cause of PD. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
The BOLD contrast signal history determined by lagged linear correlation has a significant contribution to functional connectivity in activation data sets. It has been demonstrated that in resting state fMRI data, the major contribution to synchronous correlation between functionally connected areas arises from low frequency contributions (
Abstract:
OBJECTIVE. The purposes of this study were to use the myocardial delayed enhancement technique of cardiac MRI to investigate the frequency of unrecognized myocardial infarction (MI) in patients with end-stage renal disease, to compare the findings with those of ECG and SPECT, and to examine factors that may influence the utility of these methods in the detection of MI. SUBJECTS AND METHODS. We prospectively performed cardiac MRI, ECG, and SPECT to detect unrecognized MI in 72 patients with end-stage renal disease at high risk of coronary artery disease but without a clinical history of MI. RESULTS. Fifty-six patients (78%) were men (mean age, 56.2 ± 9.4 years) and 16 (22%) were women (mean age, 55.8 ± 11.4 years). The mean left ventricular mass index was 103.4 ± 27.3 g/m², and the mean ejection fraction was 60.6% ± 15.5%. Myocardial delayed enhancement imaging depicted unrecognized MI in 18 patients (25%). ECG findings were abnormal in five patients (7%), and SPECT findings were abnormal in 19 patients (26%). ECG findings were false-negative in 14 cases and false-positive in one case. The accuracy, sensitivity, and specificity of ECG were 79.2%, 22.2%, and 98.1% (p = 0.002). SPECT findings were false-negative in six cases and false-positive in seven cases. The accuracy, sensitivity, and specificity of SPECT were 81.9%, 66.7%, and 87.0% (not significant). During a period of 4.9-77.9 months, 19 cardiac deaths were documented, but no statistical significance was found in survival analysis. CONCLUSION. Cardiac MRI with myocardial delayed enhancement can depict unrecognized MI in patients with end-stage renal disease. ECG and SPECT had low sensitivity in detection of MI. Infarct size and left ventricular mass can influence the utility of these methods in the detection of MI.
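The reported ECG and SPECT figures follow directly from the stated counts once delayed-enhancement MRI is taken as the reference standard (18 of 72 patients MI-positive); the short check below reproduces them.

```python
# Recomputing the reported ECG and SPECT performance figures, taking
# delayed-enhancement MRI as the reference standard (18 of 72 MI-positive).
def metrics(tp, fn, fp, tn):
    acc = (tp + tn) / (tp + fn + fp + tn)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return acc, sens, spec

# ECG: 5 abnormal (1 false-positive), 14 false-negatives -> TP=4,  TN=53
# SPECT: 19 abnormal (7 false-positive), 6 false-negatives -> TP=12, TN=47
for name, counts in {"ECG": (4, 14, 1, 53), "SPECT": (12, 6, 7, 47)}.items():
    acc, sens, spec = metrics(*counts)
    print(f"{name}: accuracy {acc:.1%}, sensitivity {sens:.1%}, specificity {spec:.1%}")
# ECG:   accuracy 79.2%, sensitivity 22.2%, specificity 98.1%
# SPECT: accuracy 81.9%, sensitivity 66.7%, specificity 87.0%
```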
Abstract:
Objective. Arrhythmogenic right ventricular dysplasia (ARVD) is a myocardial disease of familial origin in which the myocardium is replaced by fibrofatty tissue, predominantly in the right ventricle. Herein we present the clinical courses of 4 patients with ARVD who underwent orthotopic heart transplantation. Patients and Methods. Among 358 adult patients undergoing heart transplantation, 4 (1.1%) displayed ARVD. The main indication for transplantation was progression to heart failure associated with arrhythmias. All 4 patients displayed rapid, severe courses leading to heart failure with left ventricular involvement and uncontrolled arrhythmias. Results. In all cases the transplantation was performed using a bicaval technique with prophylactic tricuspid valve annuloplasty. One patient developed hyperacute rejection and infection, leading to death on the 7th day after surgery. The other 3 cases showed a good evolution with clinical remission of the symptoms. Pathological study of the explanted hearts confirmed the presence of the disease. Conclusions. ARVD is a serious cardiomyopathy that can lead to malignant arrhythmias, severe ventricular dysfunction with right ventricular predominance, and sudden cardiac death. Orthotopic heart transplantation must always be considered in advanced cases of ARVD with malignant arrhythmias or refractory congestive heart failure with or without uncontrolled arrhythmias, because it is the only way to achieve remission of the symptoms and the disease.
Abstract:
Two longitudinal experiments involving Angora goats challenged with either bovine or ovine strains of Mycobacterium avium subspecies paratuberculosis (Map) have been conducted over periods of 54 and 35 months, respectively. Blood samples for the interferon-gamma (IFN-gamma) test and the absorbed ELISA and faecal samples for bacteriological culture were taken pre-challenge and monthly post-challenge. Persistent shedding, IFN-gamma production, seroconversion and clinical disease occurred earlier with the bovine Map gut mucosal tissue challenge inoculum than with cultured bacteria. The IFN-gamma responses of the gut mucosal tissue and bacterial challenge groups were substantially and consistently higher than those of the control group. The in vivo and cultured cattle strains were much more pathogenic for goats than the sheep strains, with persistent faecal shedding, seroconversion and clinical disease occurring in the majority of bovine Map-challenged goats. With the ovine Map, 3 goats developed persistent antibody responses, but only one of these goats developed persistent faecal shedding and clinical disease. However, there was no significant difference between the IFN-gamma responses of the tissue-challenged, bacterial-challenged and control groups. Compared with sheep, the ELISA appeared to have higher sensitivity and the IFN-gamma test lower specificity. (C) 2005 Elsevier B.V. All rights reserved.
Abstract:
Objective: To examine the quality of diabetes care and prevention of cardiovascular disease (CVD) in Australian general practice patients with type 2 diabetes and to investigate its relationship with coronary heart disease absolute risk (CHDAR). Methods: A total of 3286 patient records were extracted from registers of patients with type 2 diabetes held by 16 divisions of general practice (250 practices) across Australia for the year 2002. CHDAR was estimated using the United Kingdom Prospective Diabetes Study (UKPDS) algorithm, with higher CHDAR defined as a 10-year risk of >15%. Multivariate multilevel logistic regression investigated the association between CHDAR and diabetes care. Results: 47.9% of diabetic patient records had glycosylated haemoglobin (HbA1c) >7%, 87.6% had total cholesterol ≥ 4.0 mmol/l, and 73.8% had blood pressure (BP) ≥ 130/85 mm Hg. 57.6% of patients were at higher CHDAR, 76.8% of whom were not on lipid-modifying medication and 66.2% of whom were not on antihypertensive medication. After adjusting for clustering at the general practice level and for age, lipid-modifying medication was negatively related to CHDAR (odds ratio (OR) 0.84) and to total cholesterol. Antihypertensive medication was positively related to systolic BP but negatively related to CHDAR (OR 0.88). Referral to ophthalmologists/optometrists and attendance at other health professionals were not related to CHDAR. Conclusions: At the time of the study, diabetes and CVD preventive care in Australian general practice was suboptimal, even after a number of national initiatives. The Australian Pharmaceutical Benefits Scheme (PBS) guidelines need to be modified to improve CVD preventive care in patients with type 2 diabetes.
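A minimal sketch of the record-level flagging described above follows. It assumes the UKPDS 10-year risk has already been computed (the risk engine's coefficients are not quoted in the abstract), and all column names and the toy values are hypothetical, not study data.

```python
# Sketch of the record-level flagging described in the abstract. The UKPDS
# risk engine itself is not reproduced here; the 10-year CHD risk is assumed
# to already exist in a 'chd_10yr_risk' column. All column names are
# hypothetical placeholders, not the study's actual variables.
import pandas as pd

def summarise_higher_chdar(records: pd.DataFrame) -> dict:
    """Flag records with 10-year CHD risk > 15% and report treatment gaps."""
    high = records[records["chd_10yr_risk"] > 0.15]
    return {
        "higher_chdar_share": len(high) / len(records),
        "untreated_lipids_share": (~high["on_lipid_modifier"]).mean(),
        "untreated_bp_share": (~high["on_antihypertensive"]).mean(),
    }

# Toy example with made-up values (not study data):
toy = pd.DataFrame({
    "chd_10yr_risk": [0.22, 0.08, 0.31, 0.12],
    "on_lipid_modifier": [False, True, True, False],
    "on_antihypertensive": [False, False, True, True],
})
print(summarise_higher_chdar(toy))
```

The study's adjustment for clustering of patients within practices (the multilevel logistic regression) is a separate modelling step not shown in this sketch.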
Abstract:
Background and objectives Low bone mineral density and coronary artery calcification (CAC) are highly prevalent among chronic kidney disease (CKD) patients, and both conditions are strongly associated with higher mortality. The study presented here aimed to investigate whether reduced vertebral bone density (VBD) was associated with the presence of CAC in the earlier stages of CKD. Design, setting, participants, & measurements Seventy-two nondialyzed CKD patients (age 52 ± 11.7 years, 70% male, 42% diabetics, creatinine clearance 40.4 ± 18.2 ml/min per 1.73 m²) were studied. VBD and CAC were quantified by computed tomography. Results CAC > 10 Agatston units (AU) was observed in 50% of the patients (median 120 AU [interquartile range 32 to 584 AU]), and a calcification score ≥ 400 AU was found in 19% (736 [527 to 1012] AU). VBD (190 ± 52 Hounsfield units) correlated inversely with age (r = -0.41, P < 0.001) and calcium score (r = -0.31, P = 0.01), and no correlation was found with gender, creatinine clearance, proteinuria, lipid profile, mineral parameters, body mass index, or diabetes. Patients in the lowest tertile of VBD had a markedly higher calcium score than the middle and highest tertile groups. In the multiple logistic regression analysis adjusting for confounding variables, low VBD was independently associated with the presence of CAC. Conclusions Low VBD was associated with CAC in nondialyzed CKD patients. The authors suggest that low VBD might constitute another nontraditional risk factor for cardiovascular disease in CKD. Clin J Am Soc Nephrol 6: 1456-1462, 2011. doi: 10.2215/CJN.10061110
Abstract:
Background and Purpose. There has been considerable debate about the use of predicted oxygen consumption to calculate pulmonary vascular resistance using the Fick principle. We therefore comparatively analyzed predicted oxygen consumption in infants and children in specific age groups, using different methods (formulas), in an attempt to better understand the usefulness and limitations of predictions. Methods and Results. Four models (LaFarge & Miettinen, Bergstra et al., Lindahl, and Lundell et al.) were used to predict oxygen consumption in 200 acyanotic patients with congenital cardiac defects aged 0-2.0, > 2.0-4.0, > 4.0-6.0, and > 6.0-8.75 years (median 2.04 years). Significant differences were observed between the age groups (P < .001) and between the methods (P < .001), and were not related to diagnoses. Differences between methods were more pronounced in the first age group (P < .01). In patients aged 0-2.0 years, the lowest values of oxygen consumption (corresponding to the highest estimates of pulmonary vascular resistance) were obtained with the method of Lindahl; above this age, the lowest values were obtained with any method except that of Lundell et al. Conclusions. Although measuring oxygen consumption is always preferable, rational use of predictions, using different methods, may be of help in situations where measurements are definitely not possible.
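For reference, the dependence of the resistance estimate on the assumed oxygen consumption follows from the standard Fick relations; these are the textbook forms, not the specific prediction formulas of the four models, whose coefficients are not quoted in the abstract.

```latex
% Textbook Fick relations: \dot{V}O_2 is the (measured or predicted) oxygen
% consumption, C_{pv}O_2 and C_{pa}O_2 the pulmonary venous and arterial
% oxygen contents, and \bar{P}_{pa}, \bar{P}_{la} the mean pulmonary arterial
% and left atrial pressures.
\begin{align}
  \dot{Q}_p &= \frac{\dot{V}\mathrm{O}_2}{C_{pv}\mathrm{O}_2 - C_{pa}\mathrm{O}_2}
  && \text{(pulmonary blood flow)}\\
  \mathrm{PVR} &= \frac{\bar{P}_{pa} - \bar{P}_{la}}{\dot{Q}_p}
  && \text{(pulmonary vascular resistance)}
\end{align}
```

A lower assumed oxygen consumption gives a lower computed pulmonary blood flow and therefore a higher resistance estimate, which is why the method yielding the lowest predicted oxygen consumption corresponds to the highest estimates of pulmonary vascular resistance.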
Abstract:
Smell identification tests may be of routine clinical value in the differential diagnosis of PD but are subject to cultural variation and have not been systematically evaluated in the Brazilian population. We applied culturally adapted translations of the University of Pennsylvania 40-item smell identification test (UPSIT-40) and the 16-item identification test from Sniffin' Sticks (SS-16) to nondemented Brazilian PD patients and controls. Pearson's correlation coefficient between the test scores was 0.76 (95% CI 0.70-0.81, n = 204, P < 0.001). To calculate reliability measures for each test, we used the diagnosis (either PD or control) as the outcome variable for separate logistic regression analyses using the score on the UPSIT-40 or the SS-16 as a covariate. The SS-16 specificity was 89.0% with a sensitivity of 81.1% (106 PD and 118 controls). The UPSIT-40 specificity was 83.5% and its sensitivity 82.1% (95 PD and 109 controls). Regression curves were used to associate an individual's smell test score with the probability of belonging to the PD group, as opposed to the control group. Our data provide support for the use of the UPSIT-40 and SS-16 to help distinguish early PD from controls. (c) 2008 Movement Disorder Society
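A rough sketch of this kind of analysis is shown below: a logistic regression with diagnosis as outcome and the SS-16 score as the single covariate, read off as a probability-of-PD curve. The scores are synthetic placeholders, not the study data, so the fitted coefficients carry no clinical meaning.

```python
# Sketch of the analysis described above: logistic regression with diagnosis
# (PD vs. control) as outcome and the smell identification score as the only
# covariate, then the fitted curve mapping a score to a probability of PD.
# Scores below are synthetic placeholders, not the study data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
pd_scores = rng.normal(6, 2, size=106).clip(0, 16)         # hyposmic PD patients
control_scores = rng.normal(12, 2, size=118).clip(0, 16)    # controls

scores = np.concatenate([pd_scores, control_scores]).reshape(-1, 1)
diagnosis = np.concatenate([np.ones(106), np.zeros(118)])   # 1 = PD, 0 = control

model = LogisticRegression().fit(scores, diagnosis)

# Probability of PD as a function of an individual's SS-16 score (0-16).
grid = np.arange(0, 17).reshape(-1, 1)
for s, p in zip(grid.ravel(), model.predict_proba(grid)[:, 1]):
    print(f"score {s:2d}: P(PD) = {p:.2f}")
```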
Abstract:
Although a new protocol of dobutamine stress echocardiography with the early injection of atropine (EA-DSE) has been demonstrated to be useful in reducing adverse effects and increasing the number of effective tests, and to have accuracy similar to that of conventional protocols for detecting coronary artery disease (CAD), no data exist regarding its ability to predict long-term events. The aim of this study was to determine the prognostic value of EA-DSE and the effects of the long-term use of beta blockers on it. A retrospective evaluation of 844 patients who underwent EA-DSE for known or suspected CAD was performed; 309 (37%) were receiving beta blockers. During a median follow-up period of 24 months, 102 events (12%) occurred. On univariate analysis, predictors of events were the ejection fraction (p <0.001), male gender (p <0.001), previous myocardial infarction (p <0.001), angiotensin-converting enzyme inhibitor therapy (p = 0.021), calcium channel blocker therapy (p = 0.034), and abnormal results on EA-DSE (p <0.001). On multivariate analysis, the independent predictors of events were male gender (relative risk [RR] 1.78, 95% confidence interval [CI] 1.13 to 2.81, p = 0.013) and abnormal results on EA-DSE (RR 4.45, 95% CI 2.84 to 7.01, p <0.0001). Normal results on EA-DSE with beta blockers were associated with a nonsignificantly higher incidence of events than normal results on EA-DSE without beta blockers (RR 1.29, 95% CI 0.58 to 2.87, p = 0.54). Abnormal results on EA-DSE with beta blockers had an RR of 4.97 (95% CI 2.79 to 8.87, p <0.001) compared with normal results, while abnormal results on EA-DSE without beta blockers had an RR of 5.96 (95% CI 3.41 to 10.44, p <0.001) for events, with no difference between groups (p = 0.36). In conclusion, the detection of fixed or inducible wall motion abnormalities during EA-DSE was an independent predictor of long-term events in patients with known or suspected CAD. The prognostic value of EA-DSE was not affected by the long-term use of beta blockers. (C) 2008 Elsevier Inc. All rights reserved. (Am J Cardiol 2008;102:1291-1295)
Abstract:
Chronic beryllium disease (CBD) is clinically similar to other granulomatous diseases such as sarcoidosis. It is often misdiagnosed if a thorough occupational history is not taken. When appropriate, a beryllium lymphocyte proliferation test (BeLPT) needs to be performed. We aimed to search for CBD among patients currently diagnosed with pulmonary sarcoidosis and to identify the occupations and exposures in Ontario leading to CBD. Questionnaire items included work history and details of possible exposure to beryllium. Participants who provided a history of previous work with metals underwent BeLPTs and an ELISPOT, on the basis of having a higher pretest probability of CBD. Among 121 sarcoid patients enrolled, 87 (72%) reported no known previous metal dust or fume exposure, while 34 (28%) had metal exposure, including 17 (14%) with beryllium exposure at work or home. However, none of the 34 who underwent testing had positive test results. Self-reported exposure to beryllium or metals was relatively common in these patients with clinical sarcoidosis, but CBD was not confirmed using blood assays in this population.
Abstract:
Background-Randomized trials that studied clinical outcomes after percutaneous coronary intervention (PCI) with bare metal stenting versus coronary artery bypass grafting (CABG) are underpowered to properly assess safety end points such as death, stroke, and myocardial infarction. Pooling data from randomized controlled trials increases the statistical power and allows better assessment of the treatment effect in high-risk subgroups. Methods and Results-We performed a pooled analysis of 3051 patients in 4 randomized trials evaluating the relative safety and efficacy of PCI with stenting and CABG at 5 years for the treatment of multivessel coronary artery disease. The primary end point was the composite of death, stroke, or myocardial infarction. The secondary end points were the occurrence of major adverse cardiac and cerebrovascular events, death, stroke, myocardial infarction, and repeat revascularization. We tested for heterogeneities in treatment effect in patient subgroups. At 5 years, the cumulative incidence of death, myocardial infarction, and stroke was similar in patients randomized to PCI with stenting versus CABG (16.7% versus 16.9%, respectively; hazard ratio, 1.04; 95% confidence interval, 0.86 to 1.27; P = 0.69). Repeat revascularization, however, occurred significantly more frequently after PCI than CABG (29.0% versus 7.9%, respectively; hazard ratio, 0.23; 95% confidence interval, 0.18 to 0.29; P<0.001). Major adverse cardiac and cerebrovascular events were significantly more frequent in the PCI than in the CABG group (39.2% versus 23.0%, respectively; hazard ratio, 0.53; 95% confidence interval, 0.45 to 0.61; P<0.001). No heterogeneity of treatment effect was found in the subgroups, including diabetic patients and those presenting with 3-vessel disease. Conclusions-In this pooled analysis of 4 randomized trials, PCI with stenting was associated with a long-term safety profile similar to that of CABG. However, as a result of persistently lower repeat revascularization rates in the CABG patients, overall major adverse cardiac and cerebrovascular event rates were significantly lower in the CABG group at 5 years.
Abstract:
Suppression of the renin-angiotensin system during lactation causes irreversible renal structural changes. In this study we investigated 1) the time course and the mechanisms underlying the chronic kidney disease caused by administration of the AT1 receptor blocker losartan during lactation, and 2) whether this untoward effect can be used to engender a new model of chronic kidney disease. Male Munich-Wistar pups were divided into two groups: C, whose mothers were untreated, and L(Lact), whose mothers received oral losartan (250 mg/kg per day) during the first 20 days after delivery. At 3 mo of life, both nephron number and the glomerular filtration rate were reduced in L(Lact) rats, whereas glomerular pressure was elevated. Unselective proteinuria and decreased expression of the zonula occludens-1 protein were also observed, along with modest glomerulosclerosis, significant interstitial expansion and inflammation, and wide glomerular volume variation, with a stable subpopulation of exceedingly small glomeruli. In addition, the urine osmolality was persistently lower in L(Lact) rats. At 10 mo of age, L(Lact) rats exhibited systemic hypertension, heavy albuminuria, substantial glomerulosclerosis, severe renal interstitial expansion and inflammation, and creatinine retention. Conclusions are that 1) oral losartan during lactation can be used as a simple and easily reproducible model of chronic kidney disease in adult life, associated with low mortality and no arterial hypertension until advanced stages; and 2) the mechanisms involved in the progression of renal injury in this model include glomerular hypertension, glomerular hypertrophy, podocyte injury, and interstitial inflammation.