Abstract:
Introduction: An association between ADAMTS13 levels and cardiovascular events has been described recently. However, no genetic study of ADAMTS13 in coronary patients has been reported. Materials and Methods: Based on related population frequencies and functional studies, we tested three ADAMTS13 polymorphisms: C1342G (Q448E), C1852G (P618A) and C2699T (A900V) in a group of 560 patients enrolled in the Medical, Angioplasty, or Surgery Study II (MASS II), a randomized trial comparing treatments for patients with coronary artery disease (CAD) and preserved left ventricular function. The incidence of the 5-year end-points of death, death from cardiac causes, myocardial infarction, refractory angina requiring revascularization and cerebrovascular accident was determined for each polymorphism's allele, genotype and haplotype. Risk was assessed using logistic regression and Cox proportional-hazards models, with multivariable adjustment for possible confounders. Results: Clinical characteristics and received treatment were similar across genotype groups at baseline. In a model adjusted for cardiovascular risk variables, we observed a significant association between the ADAMTS13 900V variant and an increased risk of death (OR: 1.92, CI: 1.14-3.23, p = 0.015) or death from cardiac causes (OR: 2.67, CI: 1.59-4.49, p = 0.0009). No association between events and ADAMTS13 Q448E or P618A was observed. Conclusions: This first report on the association between ADAMTS13 genotypes and cardiovascular events provides evidence that the ADAMTS13 900V variant is associated with an increased risk of death in a population with multi-vessel CAD. (C) 2009 Elsevier Ltd. All rights reserved.
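A hedged illustration of the adjusted analysis described above (not the authors' code: the column names, covariates, and the simulated cohort are all hypothetical), showing how an adjusted odds ratio for a variant allele can be obtained by logistic regression in Python:

```python
# Sketch only: adjusted odds ratio for carrying a variant allele, assuming a
# per-patient DataFrame with a binary event column and risk-factor covariates.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def adjusted_odds_ratio(df, event, variant, covariates):
    """Fit a logistic regression; return OR and 95% CI for `variant`."""
    X = sm.add_constant(df[[variant] + covariates])
    fit = sm.Logit(df[event], X).fit(disp=False)
    lo, hi = fit.conf_int().loc[variant]
    return np.exp(fit.params[variant]), (np.exp(lo), np.exp(hi))

# Simulated data; names like "carries_900V" are illustrative, not from MASS II.
rng = np.random.default_rng(0)
n = 560
df = pd.DataFrame({"carries_900V": rng.integers(0, 2, n),
                   "age": rng.normal(60, 10, n),
                   "diabetes": rng.integers(0, 2, n)})
logit = 0.65 * df["carries_900V"] + 0.02 * (df["age"] - 60) - 1.5
df["death"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

or_, (lo, hi) = adjusted_odds_ratio(df, "death", "carries_900V",
                                    ["age", "diabetes"])
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```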
Abstract:
Background: Previous studies have associated neurohumoral excitation, as estimated by plasma norepinephrine levels, with increased mortality in heart failure. However, the prognostic value of the neurovascular interplay in heart failure (HF) is unknown. We tested the hypothesis that muscle sympathetic nerve activity (MSNA) and forearm blood flow would predict mortality in chronic heart failure patients. Methods: One hundred and twenty-two heart failure patients, NYHA class II-IV, age 50 +/- 1 years, LVEF 33 +/- 1%, and LVDD 7.1 +/- 0.2 mm, were followed up for one year. MSNA was measured directly from the peroneal nerve by microneurography. Forearm blood flow was obtained by venous occlusion plethysmography. The variables were analyzed using univariate, stepwise multivariate Cox proportional hazards, and Kaplan-Meier analyses. Results: After one year, 34 patients had died from cardiac causes. Univariate analysis showed that MSNA, forearm blood flow, LVDD, LVEF, and heart rate were significant predictors of mortality. Multivariate analysis showed that only MSNA (P = 0.001) and forearm blood flow (P = 0.003) were significant independent predictors of mortality. On the basis of median MSNA levels, survival was significantly lower in patients with >49 bursts/min. Similarly, survival was significantly lower in patients with forearm blood flow <1.87 ml/min/100 ml (P = 0.002). Conclusion: MSNA and forearm blood flow predict mortality in patients with heart failure. It remains unknown whether therapies that specifically target these abnormalities will improve survival in heart failure. (C) 2008 Elsevier Ireland Ltd. All rights reserved.
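A minimal sketch of the median-split survival comparison reported above, assuming a simple per-patient table; the data are simulated and the lifelines library supplies the Kaplan-Meier estimator and log-rank test:

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 122
msna = rng.normal(49, 12, n)                       # bursts/min (simulated)
scale = np.where(msna > 49, 250, 600)              # worse outlook if high MSNA
time = rng.exponential(scale).clip(max=365)        # days of follow-up
df = pd.DataFrame({"msna": msna, "time": time,
                   "event": (time < 365).astype(int)})  # 1 = death observed

high = df["msna"] > df["msna"].median()
kmf = KaplanMeierFitter()
for label, grp in [("MSNA > median", df[high]), ("MSNA <= median", df[~high])]:
    kmf.fit(grp["time"], grp["event"], label=label)
    print(label, "1-year survival ~", round(float(kmf.predict(365)), 2))

res = logrank_test(df.loc[high, "time"], df.loc[~high, "time"],
                   df.loc[high, "event"], df.loc[~high, "event"])
print("log-rank p =", res.p_value)
```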
Abstract:
In this study, we evaluated the biodistribution and elimination kinetics of a biocompatible magnetic fluid, Endorem(TM), based on dextran-coated Fe3O4 nanoparticles injected intravenously into Wistar rats. The iron content in blood and liver samples was recorded using electron paramagnetic resonance (EPR) and X-ray fluorescence (XRF) techniques. The EPR line intensity at g = 2.1 was found to be proportional to the concentration of magnetic nanoparticles, and the best temperature for spectra acquisition was 298 K. Both EPR and XRF analyses indicated that the maximum concentration of iron in the liver occurred 95 min after ferrofluid administration. The half-life of the magnetic nanoparticles (MNP) in the blood was (11.6 +/- 0.6) min as measured by EPR and (12.6 +/- 0.6) min as determined by XRF. These results indicate that both EPR and XRF are very useful and appropriate techniques for studying the kinetics of ferrofluid elimination and biodistribution after administration into the organism. (c) 2007 Elsevier B.V. All rights reserved.
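The reported half-lives follow from fitting a mono-exponential decay C(t) = C0*exp(-kt) to the blood signal and taking t1/2 = ln(2)/k. A sketch with illustrative data points (not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, c0, k):
    """Mono-exponential blood clearance."""
    return c0 * np.exp(-k * t)

t = np.array([2, 5, 10, 20, 30, 45, 60])           # min after injection
signal = np.array([89, 75, 56, 31, 18, 7.5, 3.2])  # EPR intensity (a.u.)

(c0, k), cov = curve_fit(decay, t, signal, p0=(100, 0.05))
half_life = np.log(2) / k
sd = np.log(2) / k**2 * np.sqrt(cov[1, 1])         # propagate sd(k) to t1/2
print(f"t1/2 = {half_life:.1f} +/- {sd:.1f} min")
```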
Abstract:
To develop a longitudinal model predicting hand-and-foot syndrome (HFS) dynamics in patients receiving capecitabine, data from two large phase III studies were used. Of 595 patients in the capecitabine arms, 400 were randomly selected to build the model, and the remaining 195 were assigned to model validation. A score for the risk of developing HFS was modeled using a proportional odds model, a sigmoidal maximum-effect model driven by capecitabine accumulation (estimated through a kinetic-pharmacodynamic model), and a Markov process. The lower the calculated creatinine clearance at inclusion, the higher the risk of HFS. Model validation was performed by visual and statistical predictive checks. The predictive dynamic model of HFS in patients receiving capecitabine allows toxicity risk to be predicted from the cumulative capecitabine dose and the previous HFS grade. This dose-toxicity model will be useful in developing Bayesian individual treatment adaptations and may be of use in the clinic.
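As a simplified illustration of the proportional-odds component alone (the published model additionally used a kinetic-pharmacodynamic exposure term and a Markov dependence on the previous grade, both omitted here; data and column names are simulated):

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({"cum_dose_g": rng.uniform(0, 300, n),   # cumulative dose
                   "crcl_ml_min": rng.normal(90, 25, n)})  # creatinine clearance

# Simulate ordinal HFS grades 0-3: risk rises with dose, falls with clearance.
lin = 0.01 * df["cum_dose_g"] - 0.02 * df["crcl_ml_min"]
df["hfs_grade"] = pd.cut(lin + rng.logistic(size=n), 4, labels=False)

model = OrderedModel(df["hfs_grade"], df[["cum_dose_g", "crcl_ml_min"]],
                     distr="logit")                        # proportional-odds link
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```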
Abstract:
Aims: The aims of the present study were to characterize fatal traffic accident victims in a major urban center in Brazil and the association with alcohol consumption. Methods: Cross-sectional study of 907 fatal traffic accident victims in Sao Paulo in 2005. Results: Adult males between the ages of 25 and 54 represented the majority of cases with positive blood alcohol concentrations (BAC). Overall, males had a higher proportion of positive BACs and a higher mean BAC than females. Pedestrians, particularly those with no detectable BAC, were typically older than other victims. Most accidents (total and BAC-positive) happened on weekends between midnight and 6 a.m. Considering all victims, 39.4% were positive (BAC over 0.1 g/l). When only drivers (automobile, motorcycle and bicycle) were evaluated, 42.3% had a BAC over the legal limit (0.6 g/l). Conclusions: Alcohol is associated with nearly half of all traffic accident deaths in the city of Sao Paulo, especially on days and at times associated with parties and bars (weekends between 12 a.m. and 6 a.m.). (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Background. A consistent association between paternal age and the offspring's risk of schizophrenia has been observed, with no independent association with maternal age. The relationship of paternal and maternal age with the risk of bipolar affective disorder (BPAD) in the offspring is less clear. The present study aimed to test the hypothesis that paternal age is associated with the offspring's risk of BPAD, whereas maternal age is not. Method. This population-based cohort study was conducted with individuals born in Sweden during 1973-1980 and still resident there at age 16 years. The outcome was first hospital admission with a diagnosis of BPAD. Hazard ratios (HRs) were calculated using Cox proportional hazards regression. Results. After adjustment for all potential confounding variables except maternal age, the HR for the risk of BPAD for each 10-year increase in paternal age was 1.28 [95% confidence interval (CI) 1.11-1.48], but this fell to 1.20 (95% CI 0.97-1.48) after adjusting for maternal age. A similar result was found for maternal age and risk of BPAD [HR 1.30 (95% CI 1.08-1.56) before adjustment for paternal age, HR 1.12 (95% CI 0.86-1.45) after adjustment]. The HR associated with having either parent aged 30 years or over was 1.26 (95% CI 1.01-1.57), and it was 1.45 (95% CI 1.16-1.81) if both parents were >30 years. Conclusions. Unlike schizophrenia, the risk of BPAD seems to be associated with both paternal and maternal age.
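For context, a per-decade hazard ratio is obtained by scaling the per-year Cox coefficient; a short worked relation using the unadjusted paternal-age estimate quoted above:

```latex
\[
\mathrm{HR}_{10\,\text{yr}} = e^{10\hat\beta}
\;\Longrightarrow\;
\hat\beta = \frac{\ln 1.28}{10} \approx 0.0247,
\qquad
\mathrm{HR}_{1\,\text{yr}} = e^{\hat\beta} \approx 1.025 .
\]
```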
Abstract:
Methods. We studied participants with acute and/or early HIV infection and transmitted drug resistance (TDR) in 2 cohorts (San Francisco, California, and Sao Paulo, Brazil). We followed baseline mutations longitudinally and compared replacement rates between mutation classes using a parametric proportional hazards model. Results. Among 75 individuals with 195 TDR mutations, M184V/I became undetectable markedly faster than did nonnucleoside reverse-transcriptase inhibitor (NNRTI) mutations (hazard ratio, 77.5; 95% confidence interval [CI], 14.7-408.2; P < .0001), while protease inhibitor and NNRTI replacement rates were similar. A higher plasma HIV-1 RNA level predicted faster mutation replacement, but this was not statistically significant (hazard ratio, 1.71 per log10 copies/mL; 95% CI, 0.90-3.25; P = .11). We found substantial person-to-person variability in mutation replacement rates not accounted for by viral load or mutation class (P < .0001). Conclusions. The rapid replacement of M184V/I mutations is consistent with known fitness costs. The long-term persistence of NNRTI and protease inhibitor mutations suggests a risk for person-to-person propagation. Host and/or viral factors not accounted for by viral load or mutation class are likely influencing mutation replacement and warrant further study.
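A sketch of the parametric rate comparison under the simplifying assumption of a constant (exponential) replacement hazard per mutation class; this is not the authors' exact model, and the times and censoring below are simulated:

```python
import numpy as np
from lifelines import ExponentialFitter

rng = np.random.default_rng(3)
# Months until a transmitted mutation becomes undetectable (simulated).
t_m184v = rng.exponential(2.0, 40)                 # fast replacement
t_nnrti = rng.exponential(40.0, 60)                # slow replacement
obs_m184v = np.ones(t_m184v.size, dtype=int)       # all replacements observed
obs_nnrti = (t_nnrti < 60).astype(int)             # administrative censoring
t_nnrti = np.minimum(t_nnrti, 60)                  # at 60 months

f1 = ExponentialFitter().fit(t_m184v, obs_m184v)
f2 = ExponentialFitter().fit(t_nnrti, obs_nnrti)
# lifelines parameterizes S(t) = exp(-t / lambda_), so hazard = 1 / lambda_.
print(f"replacement-rate ratio (M184V/I vs NNRTI): {f2.lambda_ / f1.lambda_:.1f}")
```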
Abstract:
An increasing number of studies have shown altered expression of secreted protein acidic and rich in cysteine (SPARC) and N-myc downstream-regulated gene 1 (NDRG1) in several malignancies, including breast carcinoma; however, the role of these potential biomarkers in tumor development and progression is controversial. In this study, NDRG1 and SPARC protein expression was evaluated by immunohistochemistry on tissue microarrays containing breast tumor specimens from patients with 10 years of follow-up. NDRG1 and SPARC protein expression was determined in 596 patients along with other prognostic markers, such as ER, PR, and HER2. The status of NDRG1 and SPARC protein expression was correlated with prognostic variables and patient clinical outcome. Immunostaining revealed that 272 of the 596 cases (45.6%) were positive for NDRG1 and 431 (72.3%) were positive for SPARC. Statistically significant differences were found between the presence of SPARC and NDRG1 protein expression and standard clinicopathological variables. Kaplan-Meier analysis showed that NDRG1 positivity was directly associated with shorter disease-free survival (DFS, P < 0.001) and overall survival (OS, P < 0.001). In contrast, patients expressing low levels of SPARC protein had worse DFS (P = 0.001) and OS (P = 0.001) compared to those expressing high levels. Combined analysis of the two markers indicated that DFS (P < 0.001) and OS rates (P < 0.001) were lowest for patients with NDRG1-positive and SPARC-negative tumors. Furthermore, NDRG1 over-expression and SPARC down-regulation correlated with poor prognosis in patients with luminal A or triple-negative subtype breast cancer. On multivariate analysis using a Cox proportional hazards model, NDRG1 and SPARC protein expression were independent prognostic factors for both DFS and OS of breast cancer patients. These data indicate that NDRG1 over-expression and SPARC down-regulation could play important roles in breast cancer progression and serve as useful biomarkers to better define breast cancer prognosis.
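A hedged sketch of the multivariate Cox step described above, with hypothetical marker indicators and simulated follow-up (lifelines); it is meant only to show the shape of such an analysis:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 596
df = pd.DataFrame({"ndrg1_pos": rng.integers(0, 2, n),
                   "sparc_low": rng.integers(0, 2, n),
                   "er_pos": rng.integers(0, 2, n),
                   "her2_pos": rng.integers(0, 2, n)})
# Simulated follow-up: worse outcome with NDRG1 positivity and low SPARC.
risk = 0.6 * df["ndrg1_pos"] + 0.5 * df["sparc_low"]
df["months"] = rng.exponential(120 / np.exp(risk)).clip(max=120)
df["relapse"] = (df["months"] < 120).astype(int)

cph = CoxPHFitter().fit(df, duration_col="months", event_col="relapse")
print(cph.summary[["exp(coef)", "p"]])             # hazard ratios per marker
```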
Abstract:
OBJECTIVES We sought to assess the prognostic value of, and the improvement in risk classification from, contemporary single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI) for predicting all-cause mortality. BACKGROUND Myocardial perfusion is a strong estimator of prognosis. Evidence published to date has neither established the added prognostic value of SPECT-MPI nor defined an approach to improve risk classification in women from a developing nation. METHODS A total of 2,225 women referred for SPECT-MPI were followed for a mean period of 3.7 +/- 1.4 years. SPECT-MPI results were classified as abnormal in the presence of any perfusion defect. Abnormal scans were further classified as showing mild/moderate reversible, severe reversible, partially reversible, or fixed perfusion defects. Risk estimates for incident mortality were categorized as <1%/year, 1% to 2%/year, and >2%/year using Cox proportional hazards models. Risk-adjusted models incorporated clinical risk factors, left ventricular ejection fraction (LVEF), and perfusion variables. RESULTS All-cause death occurred in 139 patients. SPECT-MPI significantly risk-stratified the population; patients with abnormal scans had significantly higher death rates than patients with normal scans (13.1% versus 4.0%, respectively; p < 0.001). Cox analysis demonstrated that, after adjusting for clinical risk factors and LVEF, SPECT-MPI improved model discrimination (integrated discrimination index = 0.009; p = 0.02), added significant incremental prognostic information (global chi-square increased from 87.7 to 127.1; p < 0.0001), and improved risk prediction (net reclassification improvement = 0.12; p = 0.005). CONCLUSIONS SPECT-MPI added significant incremental prognostic information to clinical and left ventricular functional variables while enhancing the ability to classify this Brazilian female population into low- and high-risk categories of all-cause mortality. (J Am Coll Cardiol Img 2011;4:880-8) (C) 2011 by the American College of Cardiology Foundation
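The categorical net reclassification improvement (NRI) quoted above can be computed from the risk-band assignments of the two nested models; a minimal sketch using the abstract's three bands and toy inputs:

```python
import numpy as np

def nri(cat_old, cat_new, event):
    """Categorical NRI: upward moves help events, downward moves help non-events."""
    cat_old, cat_new, event = map(np.asarray, (cat_old, cat_new, event))
    up, down = cat_new > cat_old, cat_new < cat_old
    ev, ne = event == 1, event == 0
    return (up[ev].mean() - down[ev].mean()) + (down[ne].mean() - up[ne].mean())

# Risk bands coded 0 (<1%/y), 1 (1-2%/y), 2 (>2%/y); toy data, not the study's.
base_model = [0, 1, 1, 2, 0, 1, 2, 0]
with_mpi = [1, 2, 1, 2, 0, 0, 2, 0]
died = [1, 1, 0, 1, 0, 0, 1, 0]
print(f"NRI = {nri(base_model, with_mpi, died):.2f}")
```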
Abstract:
Objective: To evaluate the impact of antiretroviral therapy (ART) and the prognostic factors for intensive care unit (ICU) and 6-month mortality in human immunodeficiency virus (HIV)-infected patients. Design: A retrospective cohort study was conducted in patients admitted to the ICU from 1996 through 2006. The follow-up period extended for 6 months after ICU admission. Setting: The ICU of a tertiary-care teaching hospital at the Universidade de Sao Paulo, Brazil. Participants: A total of 278 HIV-infected patients admitted to the ICU were selected. We excluded ICU readmissions (37), ICU admissions lasting less than 24 hours (44), and patients with unavailable medical charts (36). Outcome Measure: In-ICU and 6-month mortality. Main Results: Multivariate logistic regression analysis and Cox proportional hazards models demonstrated that the variables associated with in-ICU and 6-month mortality were sepsis as the cause of admission (odds ratio [OR] = 3.16 [95% confidence interval (CI) 1.65-6.06]; hazard ratio [HR] = 1.37 [95% CI 1.01-1.88]), an Acute Physiology and Chronic Health Evaluation II score >19 (OR = 2.81 [95% CI 1.57-5.04]; HR = 2.18 [95% CI 1.62-2.94]), mechanical ventilation during the first 24 hours (OR = 3.92 [95% CI 2.20-6.96]; HR = 2.25 [95% CI 1.65-3.07]), and year of ICU admission (OR = 0.90 [95% CI 0.81-0.99]; HR = 0.92 [95% CI 0.87-0.97]). A CD4 T-cell count <50 cells/mm3 was only associated with in-ICU mortality (OR = 2.10 [95% CI 1.17-3.76]). The use of ART in the ICU was negatively predictive of 6-month mortality in the Cox model (HR = 0.50 [95% CI 0.35-0.71]), especially if this therapy was introduced during the first 4 days of ICU admission (HR = 0.58 [95% CI 0.41-0.83]). Among HIV-infected patients admitted to the ICU without ART, those who started this treatment during the ICU stay had a better prognosis after adjustment for time and potential confounding factors (HR = 0.55 [95% CI 0.31-0.98]). Conclusions: The ICU outcome of HIV-infected patients seems to depend not only on acute illness severity but also on the administration of antiretroviral treatment. (Crit Care Med 2009; 37: 1605-1611)
Abstract:
Background: Around 15% of patients die or become dependent after cerebral vein and dural sinus thrombosis (CVT). Method: We used the International Study on Cerebral Vein and Dural Sinus Thrombosis (ISCVT) sample (624 patients, with a median follow-up time of 478 days) to develop a Cox proportional hazards regression model to predict outcome, dichotomized by a modified Rankin Scale score >2. From the model's hazard ratios, a risk score was derived and a cut-off point selected. The model and the score were tested in 2 validation samples: (1) the prospective Cerebral Venous Thrombosis Portuguese Collaborative Study Group (VENOPORT) sample with 91 patients; (2) a sample of 169 consecutive CVT patients admitted to 5 ISCVT centres after the end of the ISCVT recruitment period. Sensitivity, specificity, c statistics and overall efficiency to predict outcome at 6 months were calculated. Results: The model (hazard ratios: malignancy 4.53; coma 4.19; thrombosis of the deep venous system 3.03; mental status disturbance 2.18; male gender 1.60; intracranial haemorrhage 1.42) had overall efficiencies of 85.1, 84.4 and 90.0% in the derivation sample and validation samples 1 and 2, respectively. Using the risk score (range 0 to 9) with a cut-off of >=3 points, overall efficiency was 85.4, 84.4 and 90.1% in the derivation sample and validation samples 1 and 2, respectively. Sensitivity and specificity in the combined samples were 96.1 and 13.6%, respectively. Conclusions: The CVT risk score has a good estimated overall rate of correct classifications in both validation samples, but its specificity is low. It can be used to avoid unnecessary or dangerous interventions in low-risk patients and may help to identify high-risk CVT patients. Copyright (C) 2009 S. Karger AG, Basel
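A minimal sketch of how such a points-based score and cut-off can be evaluated; the integer weights below are illustrative stand-ins loosely inspired by rounding hazard ratios, not the published CVT score:

```python
import numpy as np
import pandas as pd

POINTS = {"malignancy": 2, "coma": 2, "deep_system_thrombosis": 2,
          "mental_status_disturbance": 1, "male": 1,
          "intracranial_haemorrhage": 1}           # hypothetical weights

def cutoff_performance(df, cutoff):
    """Sensitivity, specificity, and overall accuracy of score >= cutoff."""
    score = sum(df[k] * w for k, w in POINTS.items())
    pred = score >= cutoff
    poor = df["poor_outcome"].astype(bool)
    sens = (pred & poor).sum() / poor.sum()
    spec = (~pred & ~poor).sum() / (~poor).sum()
    return sens, spec, (pred == poor).mean()

rng = np.random.default_rng(5)
df = pd.DataFrame(rng.integers(0, 2, (200, 7)),    # simulated patients
                  columns=list(POINTS) + ["poor_outcome"])
print("sens / spec / accuracy at >=3:", cutoff_performance(df, 3))
```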
Abstract:
Objective: To evaluate the frequency of overweight and obesity in health professionals before and after a single specialized dietary recommendation. Methods: Anthropometric measures were taken of 579 workers of a general hospital in the city of Sao Paulo, Brazil. Weight, height and waist circumference (WC) were interpreted according to the WHO and NCEP ATP III guidelines. A nutrition specialist provided dietary and behavioral recommendations. The entire sample underwent a new evaluation one year later. Results: At the first evaluation, 79 employees presented WC >= 102 cm (male) or WC >= 88 cm (female). The combination of WC >= 102 cm (men) or WC >= 88 cm (women) and BMI >= 30 kg/m2 was found in 12.8% (69 subjects). The BMI distribution per age group indicated that the increase in overweight and obesity was directly proportional to the increase in age. Physical activity was not practiced by 75% of the subjects studied. One year later, the evaluation indicated no statistically significant differences in the BMI and waist circumference of the sample, and only 2.8% had started to practice physical activity. Conclusion: Dietary recommendation alone failed to promote changes in the eating habits of health professionals working at a general hospital or to encourage them to exercise.
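A small sketch of the underlying anthropometric classification (the WHO BMI bands and the NCEP ATP III waist-circumference cut-offs quoted above):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def who_bmi_class(b: float) -> str:
    """WHO bands: <18.5 underweight, <25 normal, <30 overweight, else obese."""
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

def abdominal_obesity(wc_cm: float, male: bool) -> bool:
    """NCEP ATP III: WC >= 102 cm (men) or >= 88 cm (women)."""
    return wc_cm >= (102 if male else 88)

print(who_bmi_class(bmi(95, 1.70)))           # obese (BMI ~32.9)
print(abdominal_obesity(90, male=False))      # True
```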
Abstract:
Introduction. Nowadays, lung transplantation (LTx) allocation in Brazil is based mainly on waiting time. There is a need to evaluate the equity of the current lung allocation system. Objectives. We sought to (1) determine the characteristics of patients registered on the waiting list and (2) identify predictors of death on the list. Materials and Methods. We analyzed the medical records as well as clinical and laboratory data of 164 patients registered on the waiting list from 2001 to June 2008. Predictors of mortality were obtained using Cox proportional hazards analysis. Results. Patients registered on the waiting list showed a mean age of 36.1 +/- 15.0 versus 42.2 +/- 15.7 years for those who did versus did not die on the list, respectively (P = .054). Emphysema was the most prevalent underlying disease among the patients who did not die on the list (28.8%); its prevalence was low among the patients who died on the list (6.5%; P = .009). The following variables correlated with the probability of death on the waiting list: emphysema or bronchiectasis diagnosis (hazard ratio [HR] = 0.15; P = .002); activated partial thromboplastin time >30 seconds (HR = 3.28; P = .002); serum albumin >3.5 g/dL (HR = 0.41; P = .033); and hemoglobin saturation >85% (HR = 0.44; P = .031). Conclusions. Some variables seemed to predict death on the LTx waiting list; these characteristics should be used to improve the LTx allocation criteria in Brazil.
Abstract:
The present study investigated, in conscious rats, the influence of the duration of physical training sessions on cardiac autonomic adaptations by using different approaches: 1) double blockade with methylatropine and propranolol; 2) baroreflex sensitivity, evaluated by alternating bolus injections of phenylephrine and sodium nitroprusside; and 3) autonomic modulation of heart rate variability (HRV) in the frequency domain by means of spectral analysis. The animals were divided into four groups: one sedentary group and three training groups submitted to physical exercise (swimming) for 15, 30, and 60 min a day for 10 weeks. All training groups showed a similar reduction in intrinsic heart rate (IHR) after double blockade with methylatropine and propranolol. However, only the 30-min and 60-min training groups presented an increase in the vagal autonomic component in the determination of basal heart rate (HR) relative to the sedentary group. Spectral analysis of HR showed that the 30-min and 60-min training groups presented a reduction in low-frequency oscillations (LF = 0.20-0.75 Hz) and an increase in high-frequency oscillations (HF = 0.75-2.5 Hz) in normalized units. Only these two groups showed increased baroreflex sensitivity of the tachycardic responses relative to the sedentary group; when compared with each other, the 30-min training group exhibited a greater gain. In conclusion, cardiac autonomic adaptations, characterized by an increased predominance of the vagal autonomic component, were not proportional to the duration of the daily physical training sessions. In fact, 30-minute training sessions provided cardiac autonomic adaptations similar to, or even greater than (as in the case of baroreflex sensitivity), those provided by 60-minute training sessions. (C) 2010 Elsevier B.V. All rights reserved.
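A hedged sketch of the frequency-domain step, integrating a Welch power spectrum of an evenly resampled interval series over the rat-specific LF and HF bands quoted above (the signal is simulated; only the band edges come from the abstract):

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

rng = np.random.default_rng(6)
fs = 10.0                                    # Hz, resampled tachogram
t = np.arange(0, 300, 1 / fs)                # 5 min of simulated data
rr = (0.17                                   # ~0.17 s baseline RR in the rat
      + 0.005 * np.sin(2 * np.pi * 0.5 * t)  # LF component at 0.5 Hz
      + 0.008 * np.sin(2 * np.pi * 1.5 * t)  # HF component at 1.5 Hz
      + 0.002 * rng.standard_normal(t.size))

f, psd = welch(rr - rr.mean(), fs=fs, nperseg=1024)

def band_power(lo, hi):
    m = (f >= lo) & (f < hi)
    return trapezoid(psd[m], f[m])

lf, hf = band_power(0.20, 0.75), band_power(0.75, 2.5)
print(f"LF = {lf / (lf + hf):.2f} n.u., HF = {hf / (lf + hf):.2f} n.u.")
```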
Abstract:
Experimental models of infection are good tools for establishing the immunological parameters that affect the host-pathogen relationship and for designing new vaccines and immune therapies. In this work, we evaluated the evolution of experimental tuberculosis in mice infected with increasing bacterial doses or via distinct routes. We showed that mice infected with low bacterial doses by the intratracheal route developed a progressive infection that was proportional to the inoculum size. In the initial phase of disease, mice developed a specific Th1-driven immune response independent of inoculum concentration. However, in the late phase, mice infected with higher concentrations exhibited a mixed Th1/Th2 response, while mice infected with lower concentrations sustained the Th1 pattern. Significant IL-10 concentrations and more pronounced regulatory T cell recruitment were also detected at 70 days post-infection with high bacterial doses. These results suggest that mice infected with higher concentrations of bacilli developed an immune response similar to the pattern described for human tuberculosis, wherein patients with progressive tuberculosis exhibit down-modulation of IFN-gamma production accompanied by increased levels of IL-4. Thus, these data indicate that the experimental model is important in evaluating the protective efficacy of new vaccines and therapies against tuberculosis. (C) 2010 Elsevier Ltd. All rights reserved.