987 results for "Modèle de Cox pondéré" (weighted Cox model)
Abstract:
An increasing number of studies have shown altered expression of secreted protein acidic and rich in cysteine (SPARC) and N-myc downstream-regulated gene 1 (NDRG1) in several malignancies, including breast carcinoma; however, the role of these potential biomarkers in tumor development and progression is controversial. In this study, NDRG1 and SPARC protein expression was evaluated by immunohistochemistry on tissue microarrays containing breast tumor specimens from patients with 10 years of follow-up. NDRG1 and SPARC protein expression was determined in 596 patients along with other prognostic markers, such as ER, PR, and HER2. The status of NDRG1 and SPARC protein expression was correlated with prognostic variables and patient clinical outcome. Immunostaining revealed that 272 of the 596 cases (45.6%) were positive for NDRG1 and 431 (72.3%) were positive for SPARC. Statistically significant associations were found between SPARC and NDRG1 protein expression and standard clinicopathological variables. Kaplan-Meier analysis showed that NDRG1 positivity was directly associated with shorter disease-free survival (DFS, P < 0.001) and overall survival (OS, P < 0.001). In contrast, patients expressing low levels of SPARC protein had worse DFS (P = 0.001) and OS (P = 0.001) compared to those expressing high levels. Combined analysis of the two markers indicated that DFS (P < 0.001) and OS rates (P < 0.001) were lowest for patients with NDRG1-positive and SPARC-negative tumors. Furthermore, NDRG1 over-expression and SPARC down-regulation correlated with poor prognosis in patients with luminal A or triple-negative subtype breast cancer. On multivariate analysis using a Cox proportional hazards model, NDRG1 and SPARC protein expression were independent prognostic factors for both DFS and OS of breast cancer patients. These data indicate that NDRG1 over-expression and SPARC down-regulation could play important roles in breast cancer progression and serve as useful biomarkers to better define breast cancer prognosis.
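For readers who want to reproduce this kind of marker-stratified survival comparison, the sketch below shows a Kaplan-Meier estimate and a log-rank test in Python with the lifelines library. The column names and synthetic data are hypothetical stand-ins, not the study's data, and lifelines is simply one tool that could be used.

```python
# Sketch of a Kaplan-Meier comparison of disease-free survival by marker status,
# in the spirit of the analysis described above. Column names and synthetic data
# are hypothetical.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 596
df = pd.DataFrame({
    "months": rng.exponential(60, n).clip(max=120),   # follow-up time in months
    "event": rng.integers(0, 2, n),                   # 1 = recurrence/death observed
    "ndrg1_positive": rng.integers(0, 2, n),          # 1 = NDRG1-positive tumour
})

pos = df[df.ndrg1_positive == 1]
neg = df[df.ndrg1_positive == 0]

kmf = KaplanMeierFitter()
kmf.fit(pos["months"], event_observed=pos["event"], label="NDRG1+")
print(kmf.survival_function_at_times([60, 120]))      # 5- and 10-year survival estimates

result = logrank_test(pos["months"], neg["months"],
                      event_observed_A=pos["event"], event_observed_B=neg["event"])
print("log-rank p-value:", result.p_value)
```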
Abstract:
OBJECTIVES We sought to assess the prognostic value and risk classification improvement of contemporary single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI) for predicting all-cause mortality. BACKGROUND Myocardial perfusion is a strong estimator of prognosis. Evidence published to date has not established the added prognostic value of SPECT-MPI nor defined an approach to detect improved classification of risk in women from a developing nation. METHODS A total of 2,225 women referred for SPECT-MPI were followed for a mean period of 3.7 +/- 1.4 years. SPECT-MPI results were classified as abnormal based on the presence of any perfusion defect. Abnormal scans were further classified as having mild/moderate reversible, severe reversible, partially reversible, or fixed perfusion defects. Risk estimates for incident mortality were categorized as <1%/year, 1% to 2%/year, and >2%/year using Cox proportional hazards models. Risk-adjusted models incorporated clinical risk factors, left ventricular ejection fraction (LVEF), and perfusion variables. RESULTS All-cause death occurred in 139 patients. SPECT-MPI significantly risk stratified the population; patients with abnormal scans had significantly higher death rates than patients with normal scans (13.1% versus 4.0%, respectively; p < 0.001). Cox analysis demonstrated that, after adjusting for clinical risk factors and LVEF, SPECT-MPI improved model discrimination (integrated discrimination index = 0.009; p = 0.02), added significant incremental prognostic information (global chi-square increased from 87.7 to 127.1; p < 0.0001), and improved risk prediction (net reclassification improvement = 0.12; p = 0.005). CONCLUSIONS SPECT-MPI added significant incremental prognostic information to clinical and left ventricular functional variables while enhancing the ability to classify this Brazilian female population into low- and high-risk categories of all-cause mortality. (J Am Coll Cardiol Img 2011;4:880-8) (C) 2011 by the American College of Cardiology Foundation
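The risk-categorization step described here (binning predicted mortality into <1%/year, 1% to 2%/year, and >2%/year strata from a Cox model) can be sketched as follows. This is an illustration under assumed variable names and synthetic data, using the lifelines library rather than whatever software the authors used, and the annualization of the predicted survival probability is a simple approximation.

```python
# Sketch: fit a Cox model, then bin predicted annualized mortality risk into the
# <1%/yr, 1-2%/yr, and >2%/yr categories mentioned in the abstract.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2225
df = pd.DataFrame({
    "years": rng.exponential(3.7, n).clip(max=8.0),
    "death": rng.integers(0, 2, n),
    "age": rng.normal(60, 10, n),
    "lvef": rng.normal(60, 10, n),
    "abnormal_mpi": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="death")

horizon = 3.7                                   # prediction horizon in years
surv = cph.predict_survival_function(df[["age", "lvef", "abnormal_mpi"]],
                                     times=[horizon]).loc[horizon]
annual_risk = 1.0 - surv ** (1.0 / horizon)     # crude annualized risk from S(t)
category = pd.cut(annual_risk, bins=[0, 0.01, 0.02, 1.0],
                  labels=["<1%/yr", "1-2%/yr", ">2%/yr"])
print(category.value_counts())
```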
Abstract:
OBJECTIVE. The purpose of the study was to investigate patient characteristics associated with image quality and their impact on the diagnostic accuracy of MDCT for the detection of coronary artery stenosis. MATERIALS AND METHODS. Two hundred ninety-one patients with a coronary artery calcification (CAC) score of <= 600 Agatston units (214 men and 77 women; mean age, 59.3 +/- 10.0 years [SD]) were analyzed. An overall image quality score was derived using an ordinal scale. The accuracy of quantitative MDCT to detect significant (>= 50%) stenoses was assessed against quantitative coronary angiography (QCA) per patient and per vessel using a modified 19-segment model. The effects of CAC, obesity, heart rate, and heart rate variability on image quality and accuracy were evaluated by multiple logistic regression. Image quality and accuracy were further analyzed in subgroups of significant predictor variables. Diagnostic accuracy was determined for image quality strata using receiver operating characteristic (ROC) curves. RESULTS. Increasing body mass index (BMI) (odds ratio [OR] = 0.89, p < 0.001), increasing heart rate (OR = 0.90, p < 0.001), and the presence of breathing artifact (OR = 4.97, p = 0.001) were associated with poorer image quality, whereas sex, CAC score, and heart rate variability were not. Compared with examinations of white patients, studies of black patients had significantly poorer image quality (OR = 0.58, p = 0.04). At a vessel level, CAC score (per 10 Agatston units) (OR = 1.03, p = 0.012) and patient age (OR = 1.02, p = 0.04) were significantly associated with the diagnostic accuracy of quantitative MDCT compared with QCA. A trend was observed in differences in the areas under the ROC curves across image quality strata at the vessel level (p = 0.08). CONCLUSION. Image quality is significantly associated with patient ethnicity, BMI, mean scan heart rate, and the presence of breathing artifact but not with CAC score at a patient level. At a vessel level, CAC score and age were associated with reduced diagnostic accuracy.
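A minimal sketch of the two analytic ingredients mentioned here (multiple logistic regression reported as odds ratios, and ROC-based diagnostic accuracy) is shown below, using statsmodels and scikit-learn with hypothetical column names and synthetic data; it is not the authors' pipeline.

```python
# Sketch: multiple logistic regression for "good image quality" with odds ratios,
# plus an AUC for a (hypothetical) quantitative stenosis score against QCA truth.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 291
X = pd.DataFrame({
    "bmi": rng.normal(28, 5, n),
    "heart_rate": rng.normal(62, 8, n),
    "breathing_artifact": rng.integers(0, 2, n),
})
good_quality = (rng.random(n) < 0.7).astype(int)

model = sm.Logit(good_quality, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(model.params)                    # OR per unit change in each predictor
print(pd.concat([odds_ratios, np.exp(model.conf_int())], axis=1))

stenosis_score = rng.random(n)                        # hypothetical continuous score
qca_significant = rng.integers(0, 2, n)               # hypothetical reference standard
print("AUC:", roc_auc_score(qca_significant, stenosis_score))
```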
Abstract:
Background and Objectives: Some authors state that the removal of lymph nodes would only contribute towards assessing the lymph node status and regional disease control, without any benefit for the patients' survival. The aim of this paper was to assess the influence of the number of surgically dissected pelvic lymph nodes (PLN) on disease-free survival. Methods: Retrospective cohort study of 42 women presenting with squamous cell carcinoma (SCC) of the uterine cervix with metastases in PLN, treated by radical surgery. The Cox model was used to identify risk factors for recurrence. The model variables were adjusted for treatment-related factors (year of treatment, surgical margins and postoperative radiotherapy). The cutoff value for classifying the lymphadenectomy as comprehensive (15 PLN or more) or non-comprehensive (<15 PLN) was determined from analysis of the ROC curve. Results: Fourteen recurrences (32.6%) were recorded: three pelvic, eight distant, two both pelvic and distant, and one at an unknown location. The following risk factors for recurrence were identified: invasion of the deep third of the cervix and number of dissected lymph nodes <15. Conclusions: Deep invasion and non-comprehensive pelvic lymphadenectomy are possible risk factors for recurrence of SCC of the uterine cervix with metastases in PLN. J. Surg. Oncol. 2009;100:252-257. (C) 2009 Wiley-Liss, Inc.
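The ROC-derived cutoff for the lymph-node count and the subsequent Cox model can be illustrated roughly as below. The Youden index is used here as one common way to pick a cutoff, which may or may not match the authors' criterion, and all column names and data are hypothetical.

```python
# Sketch: choose a cut-off for the number of dissected pelvic lymph nodes from an
# ROC curve (Youden's J), then use the dichotomized variable in a Cox model.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 42
n_pln = rng.integers(5, 40, n)                       # number of dissected pelvic lymph nodes
recurrence = (rng.random(n) < np.where(n_pln < 15, 0.6, 0.25)).astype(int)
df = pd.DataFrame({
    "months": rng.exponential(48, n).clip(max=120),
    "recurrence": recurrence,
    "n_pln": n_pln,
    "deep_invasion": rng.integers(0, 2, n),
})

# ROC of node count against recurrence; fewer nodes is taken to mean higher risk,
# so the score fed to roc_curve is the negative count.
fpr, tpr, thresholds = roc_curve(df["recurrence"], -df["n_pln"])
best = np.argmax(tpr - fpr)                          # Youden's J statistic
cutoff = -thresholds[best]
print("selected cut-off:", cutoff)

df["non_comprehensive"] = (df["n_pln"] < cutoff).astype(int)
cph = CoxPHFitter()
cph.fit(df[["months", "recurrence", "non_comprehensive", "deep_invasion"]],
        duration_col="months", event_col="recurrence")
cph.print_summary()
```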
Abstract:
Objective: To evaluate the impact of antiretroviral therapy (ART) and the prognostic factors for in-intensive care unit (in-ICU) and 6-month mortality in human immunodeficiency virus (HIV)-infected patients. Design: A retrospective cohort study was conducted in patients admitted to the ICU from 1996 through 2006. The follow-up period extended for 6 months after ICU admission. Setting: The ICU of a tertiary-care teaching hospital at the Universidade de Sao Paulo, Brazil. Participants: A total of 278 HIV-infected patients admitted to the ICU were selected. We excluded ICU readmissions (37), ICU admissions that stayed less than 24 hours (44), and patients with unavailable medical charts (36). Outcome Measure: In-ICU and 6-month mortality. Main Results: Multivariate logistic regression analysis and Cox proportional hazards models demonstrated that the variables associated with in-ICU and 6-month mortality were sepsis as the cause of admission (odds ratio [OR] = 3.16 [95% confidence interval (CI) 1.65-6.06]; hazard ratio [HR] = 1.37 [95% CI 1.01-1.88]), an Acute Physiology and Chronic Health Evaluation II score >19 [OR = 2.81 (95% CI 1.57-5.04); HR = 2.18 (95% CI 1.62-2.94)], mechanical ventilation during the first 24 hours [OR = 3.92 (95% CI 2.20-6.96); HR = 2.25 (95% CI 1.65-3.07)], and year of ICU admission [OR = 0.90 (95% CI 0.81-0.99); HR = 0.92 (95% CI 0.87-0.97)]. CD4 T-cell count <50 cells/mm(3) was only associated with in-ICU mortality [OR = 2.10 (95% CI 1.17-3.76)]. The use of ART in the ICU was negatively predictive of 6-month mortality in the Cox model [HR = 0.50 (95% CI 0.35-0.71)], especially if this therapy was introduced during the first 4 days of admission to the ICU [HR = 0.58 (95% CI 0.41-0.83)]. Among HIV-infected patients admitted to the ICU without ART, those who started this treatment during the ICU stay presented a better prognosis when time and potential confounding factors were adjusted for [HR = 0.55 (95% CI 0.31-0.98)]. Conclusions: The ICU outcome of HIV-infected patients seems to depend not only on acute illness severity, but also on the administration of antiretroviral treatment. (Crit Care Med 2009; 37: 1605-1611)
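Because ART started during the ICU stay is an exposure that begins part-way through follow-up, one standard way to adjust for time (as the abstract describes) is a Cox model with a time-varying covariate. The sketch below uses lifelines' CoxTimeVaryingFitter on a hypothetical long-format table; it is an illustration of the technique, not the authors' model.

```python
# Sketch: ART started during the ICU stay treated as a time-varying exposure, so
# patients do not count as "exposed" before therapy begins. Data are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(4)
rows = []
for pid in range(200):
    stay = int(rng.integers(5, 60))                  # total follow-up in days
    death = int(rng.random() < 0.4)
    art_day = int(rng.integers(1, 10)) if rng.random() < 0.6 else None
    if art_day is None or art_day >= stay:
        rows.append((pid, 0, stay, 0, death))        # never exposed during follow-up
    else:
        rows.append((pid, 0, art_day, 0, 0))         # pre-ART interval
        rows.append((pid, art_day, stay, 1, death))  # on-ART interval

long_df = pd.DataFrame(rows, columns=["id", "start", "stop", "art", "death"])

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="death", start_col="start", stop_col="stop")
ctv.print_summary()   # the coefficient on `art` is the log-HR for ART once it is started
```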
Abstract:
Background: Around 15% of patients die or become dependent after cerebral vein and dural sinus thrombosis (CVT). Method: We used the International Study on Cerebral Vein and Dural Sinus Thrombosis (ISCVT) sample (624 patients, with a median follow-up time of 478 days) to develop a Cox proportional hazards regression model to predict outcome, dichotomised by a modified Rankin Scale score > 2. From the model hazard ratios, a risk score was derived and a cut-off point selected. The model and the score were tested in 2 validation samples: (1) the prospective Cerebral Venous Thrombosis Portuguese Collaborative Study Group (VENOPORT) sample with 91 patients; (2) a sample of 169 consecutive CVT patients admitted to 5 ISCVT centres after the end of the ISCVT recruitment period. Sensitivity, specificity, c statistics and overall efficiency to predict outcome at 6 months were calculated. Results: The model (hazard ratios: malignancy 4.53; coma 4.19; thrombosis of the deep venous system 3.03; mental status disturbance 2.18; male gender 1.60; intracranial haemorrhage 1.42) had overall efficiencies of 85.1, 84.4 and 90.0% in the derivation sample and validation samples 1 and 2, respectively. Using the risk score (range from 0 to 9) with a cut-off of >= 3 points, overall efficiency was 85.4, 84.4 and 90.1% in the derivation sample and validation samples 1 and 2, respectively. Sensitivity and specificity in the combined samples were 96.1 and 13.6%, respectively. Conclusions: The CVT risk score has a good estimated overall rate of correct classifications in both validation samples, but its specificity is low. It can be used to avoid unnecessary or dangerous interventions in low-risk patients, and may help to identify high-risk CVT patients. Copyright (C) 2009 S. Karger AG, Basel
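A rough sketch of how such a points-based score and its sensitivity/specificity at a >= 3-point cut-off might be computed is shown below. The integer weights are illustrative guesses loosely keyed to the reported hazard ratios, not the published scoring rule, and the data are synthetic.

```python
# Sketch: a points-based risk score assembled from binary predictors, then
# sensitivity and specificity at a >= 3-point cut-off. All values are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 624
df = pd.DataFrame({
    "malignancy": rng.random(n) < 0.1,
    "coma": rng.random(n) < 0.1,
    "deep_venous_thrombosis": rng.random(n) < 0.1,
    "mental_disturbance": rng.random(n) < 0.2,
    "male": rng.random(n) < 0.3,
    "intracranial_haemorrhage": rng.random(n) < 0.3,
    "poor_outcome": rng.random(n) < 0.15,            # mRS > 2 at follow-up (hypothetical)
})

# Illustrative integer weights, larger for predictors with larger hazard ratios;
# the total ranges from 0 to 9, matching the score range quoted in the abstract.
points = (2 * df["malignancy"] + 2 * df["coma"] + 2 * df["deep_venous_thrombosis"]
          + 1 * df["mental_disturbance"] + 1 * df["male"]
          + 1 * df["intracranial_haemorrhage"])

predicted_high_risk = points >= 3
tp = (predicted_high_risk & df["poor_outcome"]).sum()
fn = (~predicted_high_risk & df["poor_outcome"]).sum()
tn = (~predicted_high_risk & ~df["poor_outcome"]).sum()
fp = (predicted_high_risk & ~df["poor_outcome"]).sum()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```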
Abstract:
Purpose: To evaluate overall and relapse-free survival (RFS) in patients with nonmycosis fungoides (non-MF) primary cutaneous lymphoma (PCL). Methods: Thirty-eight patients with PCL, excluding cases of MF, treated between 1993 and 2006 were analyzed retrospectively. Survival statistics were estimated by the methods of Kaplan and Meier, and univariate and multivariate significance testing was performed by Cox regression analysis. Results: The median follow-up was 34.6 months (range, 2-138.3 months). The overall survival for the entire study population, at 5 and 10 years, was 97% and 78%, respectively. The RFS for the entire study population, at 5 and 10 years, was 30% and 22%, respectively. For those who received radiotherapy (RT) as a component of their initial therapy, the RFS at 5 and 10 years was 48% and 36%, respectively. Among those receiving RT who relapsed, the site of relapse was out-of-field in 82% of the cases. In our multivariate analysis, only RT as a component of the initial therapy and the absence of bulky disease were associated with a statistically significant improvement in RFS (P = 0.01 and < 0.01, respectively). Conclusion: RT improves the local control and RFS of patients with non-MF PCL.
Abstract:
Background/Aims: Statistical analysis of age-at-onset involving family data is particularly complicated because there is a correlation pattern that needs to be modeled and also because there are measurements that are censored. In this paper, our main purpose was to evaluate the effect of genetic and shared family environmental factors on the age-at-onset of three cardiovascular risk factors: hypertension, diabetes and high cholesterol. Methods: The mixed-effects Cox model proposed by Pankratz et al. [2005] was used to analyze the data from 81 families, involving 1,675 individuals from the village of Baependi, in the state of Minas Gerais, Brazil. Results: The analyses performed showed that the polygenic effect plays a greater role than the shared family environmental effect in explaining the variability of the age-at-onset of hypertension, diabetes and high cholesterol. The model which simultaneously evaluated both effects indicated that there are individuals whose risk of hypertension due to polygenic effects may be 130% higher than the overall average risk for the entire sample. For diabetes and high cholesterol, the risks of some individuals were 115% and 45% higher, respectively, than the overall average risk for the entire population. Conclusions: Results showed evidence of significant polygenic effects, indicating that age-at-onset is a useful trait for gene mapping of the common complex diseases analyzed. In addition, we found that the polygenic random component might absorb the effects of some covariates usually considered in the risk evaluation, such as gender, age and BMI. Copyright (C) 2008 S. Karger AG, Basel
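For reference, one common way to write a mixed-effects (frailty) Cox model of this kind is shown below. This is a generic formulation, not necessarily the exact specification of Pankratz et al., and the reading of the 130% figure in the closing comment is only one plausible interpretation.

```latex
% Generic mixed-effects Cox model for family data (illustrative formulation).
% h_{ij}(t): hazard for individual j in family i; x_{ij}: fixed covariates;
% g_{ij}: polygenic random effect, correlated within families through the
% kinship matrix \Phi_i; c_i: shared family-environment random effect.
\begin{align}
  h_{ij}(t) &= h_0(t)\,\exp\!\bigl(x_{ij}^{\top}\beta + g_{ij} + c_i\bigr), \\
  (g_{i1},\dots,g_{in_i})^{\top} &\sim \mathcal{N}\!\bigl(0,\; 2\sigma_g^{2}\,\Phi_i\bigr),
  \qquad c_i \sim \mathcal{N}\!\bigl(0,\; \sigma_c^{2}\bigr).
\end{align}
% Relative to the population-average hazard, an individual's multiplier is
% \exp(g_{ij} + c_i); a polygenic effect with \exp(g_{ij}) \approx 2.3 would
% correspond to a risk about 130% above the overall average, which is one way
% to read the figure quoted in the abstract.
```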
Abstract:
PURPOSE. To assess whether baseline Glaucoma Probability Score (GPS; HRT-3; Heidelberg Engineering, Dossenheim, Germany) results are predictive of progression in patients with suspected glaucoma. The GPS is a new feature of the confocal scanning laser ophthalmoscope that generates an operator-independent, three-dimensional model of the optic nerve head and gives a score for the probability that this model is consistent with glaucomatous damage. METHODS. The study included 223 patients with suspected glaucoma during an average follow-up of 63.3 months. Included subjects had a suspect optic disc appearance and/or elevated intraocular pressure, but normal visual fields. Conversion was defined as development of either repeatable abnormal visual fields or glaucomatous deterioration in the appearance of the optic disc during the study period. The association between baseline GPS and conversion was investigated by Cox regression models. RESULTS. Fifty-four (24.2%) eyes converted. In multivariate models, both higher values of global GPS and subjective stereophotograph assessment (larger cup-disc ratio and glaucomatous grading) were predictive of conversion: adjusted hazard ratios (95% CI) were 1.31 (1.15-1.50) per 0.1 higher global GPS, 1.34 (1.12-1.62) per 0.1 higher CDR, and 2.34 (1.22-4.47) for abnormal grading, respectively. No significant differences (P > 0.05 for all comparisons) were found between the c-index values (equivalent to the area under the ROC curve) for the multivariate models (0.732, 0.705, and 0.699, respectively). CONCLUSIONS. GPS values were predictive of conversion in our population of patients with suspected glaucoma. Further, they performed as well as subjective assessment of the optic disc. These results suggest that GPS could potentially replace stereophotographs as a tool for estimating the likelihood of conversion to glaucoma.
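The "per 0.1 higher" hazard ratios and the c-index comparison can be illustrated with the short sketch below, which rescales a fitted Cox coefficient and reports Harrell's concordance index; the lifelines library, the column names and the synthetic data are all assumptions, not details from the study.

```python
# Sketch: express a Cox hazard ratio "per 0.1 higher GPS" and compute a c-index.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 223
df = pd.DataFrame({
    "months": rng.exponential(63, n).clip(max=120),
    "converted": rng.integers(0, 2, n),
    "gps_global": rng.random(n),                 # GPS on a 0-1 scale (hypothetical)
    "age": rng.normal(60, 10, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="converted")

beta = cph.params_["gps_global"]                 # log-HR per 1.0 increase in GPS
hr_per_0_1 = np.exp(0.1 * beta)                  # rescaled to "per 0.1 higher GPS"
print("HR per 0.1 higher GPS:", hr_per_0_1)
print("Harrell's c-index:", cph.concordance_index_)
```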
Abstract:
Objective To investigate whether the COX-2 inhibitor celecoxib has antidepressant effects in bipolar disorder (BD) patients during depressive or mixed phases. Methods We studied 28 DSM-IV BD patients who were experiencing a depressive or mixed episode and were on a stable dose of a mood stabilizer or atypical antipsychotic medication. Subjects were randomized to receive 6 weeks of double-blind placebo or celecoxib (400 mg/day) treatment. Current mood stabilizer or antipsychotic medication remained at the same doses during the trial. Results Intention-to-treat analysis showed that the patients receiving celecoxib had lower Hamilton Depression Rating Scale (HamD) scores after 1 week of treatment compared to the patients receiving placebo, but this difference was not statistically significant (p=0.09). The improvement in the first week of treatment was statistically significant when the analysis included only the subjects who completed the full 6-week trial (p=0.03). The two groups did not differ significantly on depressive or manic symptoms from the second week until the end of the trial. Celecoxib was well tolerated with the exception of two subjects who dropped out of the study due to rash. Conclusions Our findings suggest that adjunctive treatment with celecoxib may produce a rapid-onset antidepressant effect in BD patients experiencing depressive or mixed episodes. Copyright (C) 2008 John Wiley & Sons, Ltd.
Abstract:
Introduction. Nowadays, lung transplantation (LTx) allocation in Brazil is based mainly on waiting time. There is a need to evaluate the equity of the current lung allocation system. Objectives. We sought to (1) determine the characteristics of registered patients on the waiting list and (2) identify predictors of death on the list. Materials and Methods. We analyzed the medical records as well as clinical and laboratory data of 164 patients registered on the waiting list from 2001 to June 2008. Predictors of mortality were obtained using Cox proportional hazards analysis. Results. Registered patients showed a mean age of 36.1 +/- 15.0 vs 42.2 +/- 15.7 years for those who did versus did not die on the list, respectively (P = .054). Emphysema was the most prevalent underlying disease among the patients who did not die on the list (28.8%); its prevalence was low among the patients who died on the list (6.5%; P = .009). The following variables correlated with the probability of death on the waiting list: emphysema or bronchiectasis diagnosis (hazard ratio [HR] = 0.15; P = .002); activated partial thromboplastin time > 30 seconds (HR = 3.28; P = .002); serum albumin > 3.5 g/dL (HR = 0.41; P = .033); and hemoglobin saturation > 85% (HR = 0.44; P = .031). Conclusions. Some variables seemed to predict death on the LTx waiting list; these characteristics should be used to improve the LTx allocation criteria in Brazil.
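A sketch of a Cox proportional hazards analysis of waiting-list mortality, reporting hazard ratios with 95% confidence intervals, is shown below; the predictor names follow the abstract, but the data and the lifelines-based implementation are hypothetical.

```python
# Sketch: Cox model for death on the waiting list, with hazard ratios and 95% CIs.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 164
df = pd.DataFrame({
    "days_on_list": rng.exponential(300, n).clip(max=1500),
    "died_on_list": rng.integers(0, 2, n),
    "emphysema_or_bronchiectasis": rng.integers(0, 2, n),
    "aptt_gt_30s": rng.integers(0, 2, n),
    "albumin_gt_3_5": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_on_list", event_col="died_on_list")
summary = pd.concat([cph.hazard_ratios_, np.exp(cph.confidence_intervals_)], axis=1)
summary.columns = ["HR", "95% lower", "95% upper"]
print(summary)
```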
Abstract:
Epileptic seizures are hypersynchronous, paroxysmal and abnormal neuronal discharges. Epilepsies are characterized by diverse mechanisms involving alteration of excitatory and inhibitory neurotransmission that results in hyperexcitability of the central nervous system (CNS). Enhanced neuronal excitability can also be achieved by inflammatory processes, including the participation of cytokines, prostaglandins or kinins, molecules known to be involved either in triggering or in the establishment of inflammation. Multiple inductions of audiogenic seizures in the Wistar audiogenic rat (WAR) strain are a model of temporal lobe epilepsy (TLE), due to the recruitment of limbic areas such as the hippocampus and amygdala. In this study we investigated the modulation of the expression levels of the B-1 and B-2 kinin receptors in neonatal WARs as well as in adult WARs subjected to the TLE model. The expression levels of pro-inflammatory (IL-1 beta) and anti-inflammatory (IL-10) cytokines were also evaluated, as well as that of cyclooxygenase-2 (COX-2). Our results showed that the B-1 and B-2 kinin receptor mRNAs were up-regulated about 7- and 4-fold, respectively, in the hippocampus of kindled WARs. On the other hand, the expression of IL-1 beta, IL-10 and COX-2 was not related to the observed increase in expression of the kinin receptors. Based on those results we believe that the B-1 and B-2 kinin receptors have a pivotal role in this model of TLE, although their participation is not related to an inflammatory process. We believe that kinin receptors in the CNS may act in seizure mechanisms by participating in a specific kininergic neurochemical pathway. (c) 2007 Elsevier B.V. All rights reserved.
Abstract:
Heart failure (HF) incidence in diabetes, in both the presence and the absence of CHD, is rising. Prospective population-based studies can help describe the relationship between HbA(1c), a measure of glycaemic control, and HF risk. We studied the incidence of HF hospitalisation or death among 1,827 participants in the Atherosclerosis Risk in Communities (ARIC) study with diabetes and no evidence of HF at baseline. Cox proportional hazards models included age, sex, race, education, health insurance status, alcohol consumption, BMI and WHR, and major CHD risk factors (BP level and medications, LDL- and HDL-cholesterol levels, and smoking). In this population of persons with diabetes, crude HF incidence rates per 1,000 person-years were lower in the absence of CHD (incidence rate 15.5 for CHD-negative vs 56.4 for CHD-positive, p < 0.001). The adjusted HR of HF for each 1% higher HbA(1c) was 1.17 (95% CI 1.11-1.25) for the non-CHD group and 1.20 (95% CI 1.04-1.40) for the CHD group. When the analysis was limited to HF cases that occurred in the absence of prevalent or incident CHD (during follow-up), the adjusted HR remained 1.20 (95% CI 1.11-1.29). These data suggest HbA(1c) is an independent risk factor for incident HF in persons with diabetes with and without CHD. Long-term clinical trials of tight glycaemic control should quantify the impact of different treatment regimens on HF risk reduction.
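The two quantities reported here (crude incidence per 1,000 person-years by CHD status, and a hazard ratio per 1% higher HbA1c) can be computed as in the sketch below, again with hypothetical column names and synthetic data rather than ARIC data.

```python
# Sketch: crude incidence rates per 1,000 person-years by CHD status, and a Cox
# hazard ratio per 1% higher HbA1c.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 1827
df = pd.DataFrame({
    "years": rng.exponential(9, n).clip(max=14),
    "incident_hf": rng.integers(0, 2, n),
    "chd_at_baseline": rng.integers(0, 2, n),
    "hba1c": rng.normal(7.5, 1.5, n),
})

# Crude incidence rate = events / person-years, scaled to 1,000 person-years.
rates = (df.groupby("chd_at_baseline")
           .apply(lambda g: 1000 * g["incident_hf"].sum() / g["years"].sum()))
print(rates)

cph = CoxPHFitter()
cph.fit(df[["years", "incident_hf", "hba1c", "chd_at_baseline"]],
        duration_col="years", event_col="incident_hf")
print("HR per 1% higher HbA1c:", np.exp(cph.params_["hba1c"]))
```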