978 results for Logistic Regressions
Abstract:
1. Cluster analysis of reference sites with similar biota is the initial step in creating River Invertebrate Prediction and Classification System (RIVPACS) and similar river bioassessment models such as Australian River Assessment System (AUSRIVAS). This paper describes and tests an alternative prediction method, Assessment by Nearest Neighbour Analysis (ANNA), based on the same philosophy as RIVPACS and AUSRIVAS but without the grouping step that some people view as artificial. 2. The steps in creating ANNA models are: (i) weighting the predictor variables using a multivariate approach analogous to principal axis correlations, (ii) calculating the weighted Euclidean distance from a test site to the reference sites based on the environmental predictors, (iii) predicting the faunal composition based on the nearest reference sites and (iv) calculating an observed/expected (O/E) ratio analogous to RIVPACS/AUSRIVAS. 3. The paper compares AUSRIVAS and ANNA models on 17 datasets representing a variety of habitats and seasons. First, it examines each model's regressions for Observed versus Expected number of taxa, including the r², intercept and slope. Second, the two models' assessments of 79 test sites in New Zealand are compared. Third, the models are compared on test and presumed reference sites along a known trace metal gradient. Fourth, ANNA models are evaluated for Western Australia, a geographically distinct region of Australia. The comparisons demonstrate that ANNA and AUSRIVAS are generally equivalent in performance, although ANNA turns out to be potentially more robust for the O versus E regressions and is potentially more accurate on the trace metal gradient sites. 4. The ANNA method is recommended for use in bioassessment of rivers, at least for corroborating the results of the well-established AUSRIVAS- and RIVPACS-type models, if not to replace them.
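Steps (ii)-(iv) of the ANNA procedure can be sketched in a few lines of Python. This is an illustrative outline only: the predictor weighting of step (i) is taken as given, and the nearest-neighbour count `k` is a hypothetical parameter, not a value from the paper.

```python
import math

def weighted_distance(test_site, ref_site, weights):
    # Step (ii): weighted Euclidean distance over the environmental predictors
    return math.sqrt(sum(w * (t - r) ** 2
                         for w, t, r in zip(weights, test_site, ref_site)))

def expected_taxa(test_site, ref_sites, ref_taxa_counts, weights, k=3):
    # Step (iii): expected number of taxa as the mean over the k nearest
    # reference sites (k is an assumed tuning parameter)
    order = sorted(range(len(ref_sites)),
                   key=lambda i: weighted_distance(test_site, ref_sites[i], weights))
    return sum(ref_taxa_counts[i] for i in order[:k]) / k

def o_over_e(observed, expected):
    # Step (iv): observed/expected (O/E) index, analogous to RIVPACS/AUSRIVAS
    return observed / expected
```

An O/E near 1 indicates a test site whose fauna matches that of environmentally similar reference sites; values well below 1 suggest impairment.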
Abstract:
WO3/chitosan and WO3/chitosan/poly(ethylene oxide) (PEO) films were prepared by the layer-by-layer method. The presence of chitosan enabled PEO to be carried into the self-assembled structure, contributing to an increase in the Li+ diffusion rate. On the basis of the galvanostatic intermittent titration technique (GITT) and the quadratic logistic equation (QLE), a spectroelectrochemical method was used for determination of the "optical" diffusion coefficient (D_op), enabling analysis of the Li+ diffusion rate and, consequently, the coloration front rate in these host matrices. The D_op values within the WO3/chitosan/PEO film were significantly higher than those within the WO3/chitosan film, mainly for higher values of injected charge. The presence of PEO also ensured larger accessibility to the electroactive sites, in accordance with the method employed here. Hence, this spectroelectrochemical method allowed us to separate the contribution of the diffusion process from the number of accessible electroactive sites in the materials, thereby aiding a better understanding of the useful electrochemical and electrochromic properties of these films for use in electrochromic devices. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Objective: The goal of this study was to assess the independent and collective associations of hepatic steatosis, obesity, and the metabolic syndrome with elevated high-sensitivity C-reactive protein (hs-CRP) levels. Methods and Results: We evaluated 2388 individuals without clinical cardiovascular disease between December 2004 and December 2006. Hepatic steatosis was diagnosed by ultrasound, and the metabolic syndrome was defined using National Heart, Lung, and Blood Institute criteria. The cut point of >= 3 mg/L was used to define high hs-CRP. Multivariate logistic regression was used to assess the independent and collective associations of hepatic steatosis, obesity, and the metabolic syndrome with high hs-CRP. Steatosis was detected in 32% of participants, 23% met criteria for metabolic syndrome, and 17% were obese. After multivariate regression, hepatic steatosis (odds ratio [OR] 2.07; 95% CI 1.68 to 2.56), obesity (OR 3.00; 95% CI 2.39 to 3.80), and the metabolic syndrome (OR 2.39; 95% CI 1.88 to 3.04) were all independently associated with high hs-CRP. Combinations of these factors were associated with an additive increase in the odds of high hs-CRP, with individuals with 1, 2, and 3 factors having ORs for high hs-CRP of 1.92 (1.49 to 2.48), 3.38 (2.50 to 4.57), and 4.53 (3.23 to 6.35), respectively. Conclusion: Hepatic steatosis, obesity, and the metabolic syndrome are independently and additively associated with increased odds of high hs-CRP levels. (Arterioscler Thromb Vasc Biol. 2011; 31: 1927-1932.)
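As a reminder of how such odds ratios combine: a logistic regression coefficient beta corresponds to an odds ratio of exp(beta), so in an additive logistic model the presence of several independent factors multiplies their odds ratios. A minimal sketch (the coefficient values used in the test are illustrative, not taken from the study):

```python
import math

def odds_ratio(beta):
    # A logistic coefficient beta translates into an odds ratio of exp(beta)
    return math.exp(beta)

def combined_or(betas):
    # Additive coefficients on the logit scale multiply on the odds scale:
    # exp(b1 + b2 + ...) == exp(b1) * exp(b2) * ...
    return math.exp(sum(betas))
```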
Abstract:
Vascular calcification is a strong prognostic marker of mortality in hemodialysis patients and has been associated with bone metabolism disorders in this population. In earlier stages of chronic kidney disease (CKD), vascular calcification also has been documented. This study evaluated the association between coronary artery calcification (CAC) and bone histomorphometric parameters in CKD predialysis patients assessed by multislice coronary tomography and by undecalcified bone biopsy. CAC was detected in 33 (66%) patients, and their median calcium score was 89.7 (0.4-2299.3 AU). The most frequent bone histologic alterations observed included low trabecular bone volume, increased eroded and osteoclast surfaces, and low bone-formation rate (BFR/BS). Multiple logistic regression analysis, adjusted for age, sex, and diabetes, showed that BFR/BS was independently associated with the presence of coronary calcification [p = .009; odds ratio (OR) = 0.15; 95% confidence interval (CI) 0.036-0.619]. This study showed a high prevalence of CAC in asymptomatic predialysis CKD patients. Also, there was an independent association of low bone formation and CAC in this population. In conclusion, our results provide evidence that low bone-formation rate constitutes another nontraditional risk factor for cardiovascular disease in CKD patients. (C) 2010 American Society for Bone and Mineral Research.
Abstract:
Objectives: The aim of this study was to determine the correlation between ductus venosus (DV) Doppler velocimetry and fetal cardiac troponin T (cTnT). Study design: Between March 2007 and March 2008, 89 high-risk pregnancies were prospectively studied. All patients delivered by cesarean section and the Doppler exams were performed on the same day. Multiple regression included the following variables: maternal age, parity, hypertension, diabetes, gestational age at delivery, umbilical artery (UA) S/D ratio, diagnosis of absent or reversed end-diastolic flow velocity (AREDV) in the UA, middle cerebral artery (MCA) pulsatility index (PI), and DV pulsatility index for veins (PIV). Immediately after delivery, UA blood samples were obtained for the measurement of pH and cTnT levels. Statistical analysis included the Kruskal-Wallis test and multiple regressions. Results: The results showed a cTnT concentration at birth >0.05 ng/ml in nine (81.8%) of the AREDV cases, a proportion significantly higher than that observed in cases with a normal UA S/D ratio or a UA S/D ratio >p95 with positive diastolic blood flow (7.7 and 23.1%, respectively, p < 0.001). A positive correlation was found between abnormal DV-PIV and elevated cTnT levels in the UA. Multiple regression identified DV-PIV and a diagnosis of AREDV as independent factors associated with abnormal fetal cTnT levels (p < 0.0001, F(2.86) = 63.5, R = 0.7722). Conclusion: DV-PIV was significantly correlated with fetal cTnT concentrations at delivery. AREDV and abnormal DV flow represent severe cardiac compromise, with increased systemic venous pressure and a rise in right ventricular afterload, demonstrated by myocardial damage and elevated fetal cTnT. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinue continuous renal replacement therapy may affect patient outcomes. However, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days and were classified as the "success" group; the rest (216 patients) were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) compared with patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per mu mol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine.
The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of stopping continuous renal replacement therapy was the most important predictor of successful discontinuation, especially if occurring without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
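The area-under-the-curve figures quoted for urine output and creatinine can be computed without fitting any model: for a single scalar predictor, the ROC area equals the probability that a randomly chosen "success" patient outranks a randomly chosen "repeat-RRT" patient (ties counting one half). A minimal sketch, with invented data in the test rather than the study's measurements:

```python
def roc_auc(scores, labels):
    # Mann-Whitney formulation of the ROC area: the fraction of
    # positive/negative pairs in which the positive scores higher
    # (ties contribute 0.5). labels are 1 (event) or 0 (no event).
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```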
Abstract:
Objective: To examine the quality of diabetes care and prevention of cardiovascular disease (CVD) in Australian general practice patients with type 2 diabetes and to investigate its relationship with coronary heart disease absolute risk (CHDAR). Methods: A total of 3286 patient records were extracted from registers of patients with type 2 diabetes held by 16 divisions of general practice (250 practices) across Australia for the year 2002. CHDAR was estimated using the United Kingdom Prospective Diabetes Study algorithm, with higher CHDAR set at a 10 year risk of >15%. Multivariate multilevel logistic regression investigated the association between CHDAR and diabetes care. Results: 47.9% of diabetic patient records had glycosylated haemoglobin (HbA1c) >7%, 87.6% had total cholesterol >= 4.0 mmol/l, and 73.8% had blood pressure (BP) >= 130/85 mm Hg. 57.6% of patients were at a higher CHDAR, 76.8% of whom were not on lipid modifying medication and 66.2% were not on antihypertensive medication. After adjusting for clustering at the general practice level and age, lipid modifying medication was negatively related to CHDAR (odds ratio (OR) 0.84) and total cholesterol. Antihypertensive medication was positively related to systolic BP but negatively related to CHDAR (OR 0.88). Referral to ophthalmologists/optometrists and attendance at other health professionals were not related to CHDAR. Conclusions: At the time of the study, diabetes and CVD preventive care in Australian general practice was suboptimal, even after a number of national initiatives. The Australian Pharmaceutical Benefits Scheme (PBS) guidelines need to be modified to improve CVD preventive care in patients with type 2 diabetes.
Abstract:
Background and objectives Low bone mineral density and coronary artery calcification (CAC) are highly prevalent among chronic kidney disease (CKD) patients, and both conditions are strongly associated with higher mortality. The study presented here aimed to investigate whether reduced vertebral bone density (VBD) was associated with the presence of CAC in the earlier stages of CKD. Design, setting, participants, & measurements Seventy-two nondialyzed CKD patients (age 52 +/- 11.7 years, 70% male, 42% diabetics, creatinine clearance 40.4 +/- 18.2 ml/min per 1.73 m(2)) were studied. VBD and CAC were quantified by computed tomography. Results CAC > 10 Agatston units (AU) was observed in 50% of the patients (median 120 AU [interquartile range 32 to 584 AU]), and a calcification score >= 400 AU was found in 19% (736 [527 to 1012] AU). VBD (190 +/- 52 Hounsfield units) correlated inversely with age (r = -0.41, P < 0.001) and calcium score (r = -0.31, P = 0.01), and no correlation was found with gender, creatinine clearance, proteinuria, lipid profile, mineral parameters, body mass index, or diabetes. Patients in the lowest tertile of VBD had a markedly increased calcium score in comparison to the middle and highest tertile groups. In the multiple logistic regression analysis adjusting for confounding variables, low VBD was independently associated with the presence of CAC. Conclusions Low VBD was associated with CAC in nondialyzed CKD patients. The authors suggest that low VBD might constitute another nontraditional risk factor for cardiovascular disease in CKD. Clin J Am Soc Nephrol 6: 1456-1462, 2011. doi: 10.2215/CJN.10061110
Abstract:
Background: Current relevance of T-wave alternans (TWA) is based on its association with electrical disorder and elevated cardiac risk. Quantitative reports would improve understanding of TWA augmentation mechanisms during mental stress or prior to tachyarrhythmias. However, little information is available about quantitative TWA values in clinical populations. This study aims to create and compare TWA profiles of healthy subjects and ICD patients, evaluated on treadmill stress protocols. Methods: Apparently healthy subjects, not in use of any medication, were recruited. All eligible ICD patients were capable of performing an attenuated stress test. TWA analysis was performed during a 15-lead treadmill test. The derived comparative profile consisted of TWA amplitude and its associated heart rate, at rest (baseline) and at peak TWA value. Chi-square or Mann-Whitney tests were used, with significance set at p <= 0.05. Discriminatory performance was evaluated by a binary logistic regression model. Results: 31 healthy subjects (8F, 23M) and 32 ICD patients (10F, 22M) differed on baseline TWA (1 +/- 2 mu V; 8 +/- 9 mu V; p < 0.001) and peak TWA values (26 +/- 13 mu V; 37 +/- 20 mu V; p = 0.009) as well as on baseline TWA heart rate (79 +/- 10 bpm; 67 +/- 15 bpm; p < 0.001) and peak TWA heart rate (118 +/- 8 bpm; 90 +/- 17 bpm; p < 0.001). The logistic model yielded sensitivity and specificity values of 88.9% and 92.9%, respectively. Conclusions: Healthy subjects and ICD patients have distinct TWA profiles. The new TWA profile representation (in amplitude-heart rate pairs) may help comparison among different research protocols. Ann Noninvasive Electrocardiol 2009;14(2):108-118.
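Sensitivity and specificity figures like the 88.9%/92.9% quoted above come from thresholding the logistic model's predicted probabilities against the true group labels. A minimal sketch of that computation (the probabilities and labels in the test are invented, and 0.5 is an assumed cut-off, not the study's):

```python
def sens_spec(probs, labels, threshold=0.5):
    # Sensitivity: fraction of true positives correctly flagged.
    # Specificity: fraction of true negatives correctly cleared.
    tp = sum(1 for p, y in zip(probs, labels) if p >= threshold and y == 1)
    fn = sum(1 for p, y in zip(probs, labels) if p < threshold and y == 1)
    tn = sum(1 for p, y in zip(probs, labels) if p < threshold and y == 0)
    fp = sum(1 for p, y in zip(probs, labels) if p >= threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)
```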
Abstract:
We evaluated the associations between glycemic therapies and prevalence of diabetic peripheral neuropathy (DPN) at baseline among participants in the Bypass Angioplasty Revascularization Investigation 2 Diabetes (BARI 2D) trial on medical and revascularization therapies for coronary artery disease (CAD) and on insulin-sensitizing vs. insulin-providing treatments for diabetes. A total of 2,368 patients with type 2 diabetes and CAD were evaluated. DPN was defined as a clinical examination score > 2 using the Michigan Neuropathy Screening Instrument (MNSI). DPN odds ratios across different groups of glycemic therapy were evaluated by multiple logistic regression adjusted for multiple covariates including age, sex, hemoglobin A1c (HbA1c), and diabetes duration. Fifty-one percent of BARI 2D subjects with valid baseline characteristics and MNSI scores had DPN. After adjusting for all variables, use of insulin was significantly associated with DPN (OR = 1.57, 95% CI: 1.15-2.13). Patients on sulfonylurea (SU) or a combination of SU/metformin (Met)/thiazolidinediones (TZD) had marginally higher rates of DPN than the Met/TZD group. This cross-sectional study in a cohort of patients with type 2 diabetes and CAD showed an association of insulin use with higher DPN prevalence, independent of disease duration, glycemic control, and other characteristics. The causality between a glycemic control strategy and DPN cannot be evaluated in this cross-sectional study, but continued assessment of DPN and randomized therapies in the BARI 2D trial may provide further explanations on the development of DPN.
Abstract:
Smell identification tests may be of routine clinical value in the differential diagnosis of PD but are subject to cultural variation and have not been systematically evaluated in the Brazilian population. We have applied culturally adapted translations of the University of Pennsylvania 40-item smell identification test (UPSIT-40) and the 16-item identification test from Sniffin' Sticks (SS-16) to nondemented Brazilian PD patients and controls. Pearson's correlation coefficient between the test scores was 0.76 (95% CI 0.70-0.81, n = 204, P < 0.001). To calculate reliability measures for each test we used the diagnosis (either PD or control) as outcome variable for separate logistic regression analyses using the score in the UPSIT-40 or the SS-16 as a covariate. The SS-16 specificity was 89.0% with a sensitivity of 81.1% (106 PD and 118 controls). The UPSIT-40 specificity was 83.5% and its sensitivity 82.1% (95 PD and 109 controls). Regression curves were used to associate an individual's smell test score with the probability of belonging to the PD group, as opposed to the control group. Our data provide support for the use of the UPSIT-40 and SS-16 to help distinguish early PD from controls. (c) 2008 Movement Disorder Society
Abstract:
We sought to evaluate this "response-to-injury" hypothesis of atherosclerosis by studying the interaction between systolic blood pressure (SBP) and LDL-cholesterol (LDL-C) in predicting the presence of coronary artery calcification (CAC) in asymptomatic men. We studied 526 men (46 +/- 7 years of age) referred for electron-beam tomography (EBT) exam. The prevalence of CAC was determined across LDL-C tertiles (low: <115 mg/dl; middle: 115-139 mg/dl; high: >= 140 mg/dl) within tertiles of SBP (low: <121 mmHg; middle: 121-130 mmHg; high: >= 131 mmHg). CAC was found in 220 (42%) men. There was no linear trend in the presence of CAC across LDL-C tertiles in the low (p = 0.6 for trend) and middle (p = 0.3 for trend) SBP tertile groups, respectively. In contrast, there was a significant trend for increasing CAC with increasing LDL-C (1st: 44%; 2nd: 49%; 3rd: 83%; p < 0.0001 for trend) in the high SBP tertile group. In multivariate logistic analyses (adjusting for age, smoking, triglyceride levels, HDL-cholesterol levels, body mass index, and fasting glucose levels), the odds ratio for any CAC associated with increasing LDL-C was significantly higher in those with the highest SBP levels, whereas no such relationship was observed among men with SBP in the lower two tertiles. An interaction term (LDL-C x SBP) incorporated in the multivariate analyses was statistically significant (p = 0.038). The finding of an interaction between SBP and LDL-C in relation to CAC in asymptomatic men supports the response-to-injury model of atherogenesis. (C) 2007 Elsevier Ireland Ltd. All rights reserved.
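The interaction term used in the multivariate analysis can be written out explicitly: the LDL-C x SBP product term lets the effect of LDL-C on the log-odds of CAC grow with blood pressure. A minimal sketch with entirely hypothetical coefficients (none taken from the study):

```python
import math

def predict_cac_prob(ldl, sbp, b0, b_ldl, b_sbp, b_inter):
    # Logistic model with an interaction term: the effective LDL-C
    # coefficient is (b_ldl + b_inter * sbp), so it varies with SBP.
    logit = b0 + b_ldl * ldl + b_sbp * sbp + b_inter * ldl * sbp
    return 1.0 / (1.0 + math.exp(-logit))
```

With a positive `b_inter`, the same LDL-C increase raises the predicted probability more in the high-SBP tertile than in the low one, which is the pattern the abstract reports.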
Abstract:
Recently, mild AKI has been considered as a risk factor for mortality in different scenarios. We conducted a retrospective analysis of the risk factors for two distinct definitions of AKI after elective repair of aortic aneurysms. Logistic regression was carried out to identify independent risk factors for AKI (defined as >= 25% or >= 50% increase in baseline SCr within 48 h after surgery, AKI 25% and AKI 50%, respectively) and for mortality. Of 77 patients studied (mean age 68 +/- 10, 83% male), 57% developed AKI 25% and 33.7% AKI 50%. There were no differences between AKI and control groups regarding comorbidities and diameter of aneurysms. However, AKI patients needed supra-renal aortic cross-clamping more frequently and were more severely ill. Overall in-hospital mortality was 27.3%, which was markedly higher in those requiring supra-renal aortic cross-clamping. The risk factors for AKI 25% were supra-renal aortic cross-clamping (odds ratio 5.51, 95% CI 1.05-36.12, p = 0.04) and duration of operation (OR 6.67, 95% CI 2.23-19.9, p < 0.001). For AKI 50%, in addition to those factors, post-operative use of vasoactive drugs remained as an independent factor (OR 6.13, 95% CI 1.64-22.8, p = 0.005). The risk factors associated with mortality were need of supra-renal aortic cross-clamping (OR 9.6, 95% CI 1.37-67.88, p = 0.02), development of AKI 50% (OR 8.84, 95% CI 1.31-59.39, p = 0.02), baseline GFR lower than 49 mL/min (OR 17.07, 95% CI 2.00-145.23, p = 0.009), and serum glucose > 118 mg/dL in the post-operative period (OR 19.99, 95% CI 2.32-172.28, p = 0.006). An increase of at least 50% in baseline SCr is a common event after surgical repair of aortic aneurysms, particularly when supra-renal aortic cross-clamping is needed. Along with baseline moderate chronic renal failure, AKI is an independent factor contributing to the high mortality found in this scenario.
Abstract:
Purpose: To evaluate the influence of cross-sectional arc calcification on the diagnostic accuracy of computed tomography (CT) angiography compared with conventional coronary angiography for the detection of obstructive coronary artery disease (CAD). Materials and Methods: Institutional Review Board approval and written informed consent were obtained from all centers and participants for this HIPAA-compliant study. Overall, 4511 segments from 371 symptomatic patients (279 men, 92 women; median age, 61 years [interquartile range, 53-67 years]) with clinical suspicion of CAD from the CORE-64 multi-center study were included in the analysis. Two independent blinded observers evaluated the percentage of diameter stenosis and the circumferential extent of calcium (arc calcium). The accuracy of quantitative multidetector CT angiography (CTA) to depict substantial (>50%) stenoses was assessed by using quantitative coronary angiography (QCA). Cross-sectional arc calcium was rated on a segment level as follows: noncalcified or mild (<90 degrees), moderate (90 degrees-180 degrees), or severe (>180 degrees) calcification. Univariable and multivariable logistic regression, receiver operating characteristic curve, and clustering methods were used for statistical analyses. Results: A total of 1099 segments had mild calcification, 503 had moderate calcification, 338 had severe calcification, and 2571 segments were noncalcified. Calcified segments were highly associated (P < .001) with disagreement between CTA and QCA in multivariable analysis after controlling for sex, age, heart rate, and image quality. The prevalence of CAD was 5.4% in noncalcified segments, 15.0% in mildly calcified segments, 27.0% in moderately calcified segments, and 43.0% in severely calcified segments. A significant difference was found in areas under the receiver operating characteristic curves (noncalcified: 0.86, mildly calcified: 0.85, moderately calcified: 0.82, severely calcified: 0.81; P < .05).
Conclusion: In a symptomatic patient population, segment-based coronary artery calcification significantly decreased agreement between multidetector CT angiography and QCA to detect a coronary stenosis of at least 50%.
Abstract:
To describe incidence rates and risk factors associated with external ventricular drain (EVD)-related infections at a tertiary Brazilian teaching hospital. The patient cohort consisted of all patients at a major teaching hospital in Brazil with an EVD during the period 1 April 2007 to 30 June 2008 (15 months). Patients were followed up for 30 days after catheter removal. According to the Centers for Disease Control and Prevention criteria for meningitis/ventriculitis, all of the central nervous system (CNS) infections that occurred during this period could be considered to be meningitis or ventriculitis related to EVD placement. Infection rates were calculated using different denominators: (1) per patient (incidence), (2) per procedure, and (3) per 1,000 catheter-days (drain-associated infection rate). Patient demographic data, medical history of underlying diseases, antibiotic prophylaxis usage, American Society of Anesthesiologists Score classification, duration of surgery and hospitalization, length of time the EVD was in place, and overall mortality were evaluated during the study period. A logistic regression model was developed to identify factors associated with infection. A total of 119 patients, 130 EVD procedures, and 839 catheter-days were evaluated. The incidence of infection was 18.3%, the infection rate was 16.9% per procedure, and the drain-associated infection rate was 22.4 per 1,000 catheter-days; 77% of the infections were caused by Gram-negative micro-organisms. Only 75% of patients received antibiotic prophylaxis. The infection rate increased with the length of the hospital stay. The length of time the catheter was in place was the only independent risk factor associated with infection (p = 0.0369). The incidence of EVD-related infections is high in our hospital; Gram-negative micro-organisms were the most frequent causal agents identified, and the length of time the catheter was in place contributed to the infection rate.
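The three denominators described above yield three different rates from the same infection count. A minimal sketch of the arithmetic (the helper name and the example figures in the test are illustrative round numbers, not the study's exact counts):

```python
def infection_rates(n_infections, n_patients, n_procedures, catheter_days):
    # (1) incidence: infections per patient, as a percentage
    incidence = 100.0 * n_infections / n_patients
    # (2) infections per procedure, as a percentage
    per_procedure = 100.0 * n_infections / n_procedures
    # (3) drain-associated rate: infections per 1,000 catheter-days
    per_1000_days = 1000.0 * n_infections / catheter_days
    return incidence, per_procedure, per_1000_days
```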