911 results for Cox regression
Abstract:
Master's dissertation in Statistics
Abstract:
OBJECTIVE: To identify factors associated with cardiovascular mortality in the elderly of Botucatu. METHODS: We evaluated 29 variables of interest in a cohort of patients aged ≥ 60 years, using data from a survey conducted in 1983/84. The cohort was reassessed in 1992 to detect the occurrence of cardiovascular deaths. Survival analysis was performed using the Kaplan-Meier method, the log-rank test, and Cox regression. Three models were fitted, one for each group of variables, and a final model was built from the variables selected in each group. RESULTS: Age-adjusted predictors of cardiovascular death in elderly males were not supporting the family, not owning a vehicle, and previous cardiovascular disease. In elderly females, the predictor variables were previous cardiovascular disease and diabetes mellitus. CONCLUSION: Socioeconomic indicators (family heading and vehicle ownership) may be added to well-established medical factors (diabetes mellitus and hypertension) to select target groups for programs intended to reduce deaths due to cardiovascular diseases in elderly people.
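The survival workflow named above (Kaplan-Meier estimation, then log-rank and Cox regression) starts from the product-limit estimator. A minimal stdlib-only sketch on hypothetical follow-up data, not the study's analysis:

```python
from collections import defaultdict

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times:  observed follow-up times
    events: 1 = death observed, 0 = censored at that time
    Returns (time, survival probability) at each death time.
    """
    deaths = defaultdict(int)   # deaths at each distinct time
    leaving = defaultdict(int)  # everyone leaving the risk set at each time
    for t, e in zip(times, events):
        deaths[t] += e
        leaving[t] += 1
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(leaving):
        if deaths[t]:
            surv *= 1 - deaths[t] / at_risk  # S(t) = prod(1 - d_i / n_i)
            curve.append((t, surv))
        at_risk -= leaving[t]
    return curve

# hypothetical data: deaths at t = 2, 3, 5; one censoring at t = 4
print(kaplan_meier([2, 3, 4, 5], [1, 1, 0, 1]))  # approx. [(2, 0.75), (3, 0.5), (5, 0.0)]
```

The censored observation at t = 4 shrinks the risk set without producing a step, which is exactly how censoring enters the estimator.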
Abstract:
OBJECTIVE: To investigate preoperative predictors of severe perioperative intercurrent events and in-hospital mortality in coronary artery bypass graft (CABG) surgery, and to develop specific risk-prediction models for these events, mainly those that can be modified in the preoperative period. METHODS: We prospectively studied 453 patients who underwent CABG. Factors independently associated with the events of interest were determined with multiple logistic regression and the Cox proportional hazards regression model. RESULTS: The mortality rate was 11.3% (51/453), and 21.2% of the patients had 1 or more perioperative intercurrent events. In the final model, the following variables remained associated with the risk of intercurrent events: age ≥ 70 years, female sex, hospitalization via SUS (Sistema Único de Saúde, the Brazilian public health system), cardiogenic shock, ischemia, and dependence on dialysis. In the multiple logistic regression for in-hospital mortality, the following variables entered the risk-prediction model: age ≥ 70 years, female sex, hospitalization via SUS, diabetes, renal dysfunction, and cardiogenic shock. In the Cox regression model for death within 7 days of surgery, the following variables remained associated with mortality: age ≥ 70 years, female sex, cardiogenic shock, and hospitalization via SUS. CONCLUSION: The great impact of aspects linked to the structure of the Brazilian health system on the results indicates that the events investigated also depend on factors unrelated to the patient's intrinsic condition.
Abstract:
Background: According to some international studies, patients with acute coronary syndrome (ACS) and an increased left atrial volume index (LAVI) have a worse long-term prognosis. However, Brazilian studies confirming this prediction are still lacking. Objective: To evaluate LAVI as a predictor of major cardiovascular events (MCE) in patients with ACS during a 365-day follow-up. Methods: Prospective cohort of 171 patients diagnosed with ACS whose LAVI was calculated within 48 hours of hospital admission. According to LAVI, two groups were defined: normal LAVI (≤ 32 mL/m²) and increased LAVI (> 32 mL/m²). The groups were compared regarding clinical and echocardiographic characteristics, in- and out-of-hospital outcomes, and occurrence of MCE within 365 days. Results: Increased LAVI was observed in 78 patients (45%) and was associated with older age, higher body mass index, hypertension, history of myocardial infarction and previous angioplasty, and lower creatinine clearance and ejection fraction. During hospitalization, acute pulmonary edema was more frequent in patients with increased LAVI (14.1% vs. 4.3%, p = 0.024). After discharge, the occurrence of the combined outcome for MCE was higher (p = 0.001) in the increased-LAVI group (26%) than in the normal-LAVI group (7%) [RR (95% CI) = 3.46 (1.54-7.73) vs. 0.80 (0.69-0.92)]. In Cox regression, increased LAVI increased the probability of MCE (HR = 3.08, 95% CI = 1.28-7.40, p = 0.012). Conclusion: Increased LAVI is an important predictor of MCE at one-year follow-up.
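A risk ratio like the one reported above is computed from the two groups' event proportions, with a confidence interval taken on the log scale. A sketch using the standard Katz log-scale CI on hypothetical counts (not reconstructed from the study):

```python
import math

def risk_ratio(events_exposed, n_exposed, events_control, n_control, z=1.96):
    """Risk ratio with a 95% CI on the log scale (Katz method)."""
    r1 = events_exposed / n_exposed
    r0 = events_control / n_control
    rr = r1 / r0
    # standard error of log(RR)
    se = math.sqrt(
        1 / events_exposed - 1 / n_exposed + 1 / events_control - 1 / n_control
    )
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# hypothetical counts: 10/100 events in the exposed group vs. 5/100 in controls
rr, lo, hi = risk_ratio(10, 100, 5, 100)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # RR = 2.0 with its 95% CI
```

An interval excluding 1.0, as in the abstract's 1.54-7.73, is what licenses calling the association significant at the 5% level.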
Abstract:
Background: Recent studies have suggested that B-type natriuretic peptide (BNP) is an important predictor of ischemia and death in patients with suspected acute coronary syndrome. Increased levels of BNP are seen after episodes of myocardial ischemia and may be related to future adverse events. Objectives: To determine the prognostic value of BNP for major cardiac events and to evaluate its association with ischemic findings on myocardial perfusion scintigraphy (MPS). Methods: This study retrospectively included 125 patients admitted to the chest pain unit between 2002 and 2006 who had their BNP levels measured on admission and underwent MPS for risk stratification. BNP values were compared with the results of the MPS. The chi-square test was used for qualitative variables and the Student t test for quantitative variables. Survival curves were estimated using the Kaplan-Meier method and analyzed using Cox regression. The significance level was 5%. Results: The mean age was 63.9 ± 13.8 years, and males represented 51.2% of the sample. Ischemia was found in 44% of the MPS studies. The mean BNP level was higher in patients with ischemic than with non-ischemic MPS (188.3 ± 208.7 versus 131.8 ± 88.6; p = 0.003). A BNP level greater than 80 pg/mL was the strongest predictor of ischemia on MPS (sensitivity = 60%, specificity = 70%, accuracy = 66%, PPV = 61%, NPV = 70%) and predicted medium-term mortality (RR = 7.29, 95% CI: 0.90-58.6; p = 0.045) independently of the presence of ischemia. Conclusions: BNP levels are associated with ischemic MPS findings and adverse prognosis in patients presenting with acute chest pain to the emergency room, thus providing important prognostic information for an unfavorable clinical outcome.
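The diagnostic performance figures quoted for the 80 pg/mL cutoff (sensitivity, specificity, PPV, NPV, accuracy) all derive from a single 2×2 confusion table. A minimal sketch; the counts below are an assumption chosen to be roughly consistent with the reported percentages, since the paper's table is not shown here:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance measures from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# hypothetical counts for n = 125 with ~44% prevalence of ischemia
m = diagnostic_metrics(tp=33, fp=21, fn=22, tn=49)
print({k: round(v, 2) for k, v in m.items()})
```

Note that PPV and NPV shift with prevalence, which is why they differ from sensitivity and specificity even for the same cutoff.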
Abstract:
Background: Information about post-acute coronary syndrome (ACS) survival has come mostly from short-term studies or from specialized cardiology referral centers. Objectives: To describe one-year case-fatality rates in the Strategy of Registry of Acute Coronary Syndrome (ERICO) cohort and to study baseline characteristics as predictors. Methods: We analyzed data from 964 ERICO participants enrolled from February 2009 to December 2012. We assessed vital status by telephone contact and official death certificate searches. The cause of death was determined according to the official death certificates. We used log-rank tests to compare the probabilities of survival across subgroups. We built crude and adjusted (for age, sex, and ACS subtype) Cox regression models to study whether the ACS subtype or baseline characteristics were independent predictors of all-cause or cardiovascular mortality. Results: We identified 110 deaths in the cohort (case-fatality rate, 12.0%). Age [hazard ratio (HR) = 2.04 per 10-year increase; 95% confidence interval (95% CI) = 1.75-2.38], non-ST elevation myocardial infarction (HR = 3.82; 95% CI = 2.21-6.60) or ST elevation myocardial infarction (HR = 2.59; 95% CI = 1.38-4.89) diagnoses, and diabetes (HR = 1.78; 95% CI = 1.20-2.63) were significant risk factors for all-cause mortality in the adjusted models. We found similar results for cardiovascular mortality. A previous coronary artery disease diagnosis was also an independent predictor of all-cause mortality (HR = 1.61; 95% CI = 1.04-2.50), but not of cardiovascular mortality. Conclusion: We found an overall one-year mortality rate of 12.0% in a sample of post-ACS patients in a community, non-specialized hospital in São Paulo, Brazil. Age, ACS subtype, and diabetes were independent predictors of poor one-year survival for overall and cardiovascular-related causes.
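A hazard ratio reported "per 10-year increase", as above, rescales to any other covariate gap through the log-linear form of the Cox model, HR(Δ) = exp(β·Δ). A small sketch under that log-linearity assumption:

```python
import math

def rescale_hr(hr, delta_reported, delta_new):
    """Rescale a Cox hazard ratio from one covariate increment to another.

    Valid under the Cox model's log-linearity assumption:
    HR(delta) = exp(beta * delta), so beta = log(hr) / delta_reported.
    """
    beta = math.log(hr) / delta_reported  # per-unit change in log hazard
    return math.exp(beta * delta_new)

# HR = 2.04 per 10 years of age -> implied HR for a 20-year age difference
print(round(rescale_hr(2.04, 10, 20), 2))  # 2.04 ** 2 = 4.16
```

The same algebra explains why confidence limits are also rescaled on the log scale rather than directly.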
Abstract:
Background: 30-40% of cardiac resynchronization therapy cases do not achieve favorable outcomes. Objective: This study aimed to develop predictive models for the combined endpoint of cardiac death and transplantation (Tx) at different stages of cardiac resynchronization therapy (CRT). Methods: Prospective observational study of 116 patients aged 64.8 ± 11.1 years, 68.1% of whom had functional class (FC) III and 31.9% ambulatory class IV. Clinical, electrocardiographic, and echocardiographic variables were assessed using Cox regression and Kaplan-Meier curves. Results: The cardiac mortality/Tx rate was 16.3% during the follow-up period of 34.0 ± 17.9 months. Prior to implantation, right ventricular dysfunction (RVD), ejection fraction < 25%, and use of high doses of diuretics (HDD) increased the risk of cardiac death and Tx by 3.9-, 4.8-, and 5.9-fold, respectively. In the first year after CRT, RVD, HDD, and hospitalization due to congestive heart failure increased the risk of death, with hazard ratios of 3.5, 5.3, and 12.5, respectively. In the second year after CRT, RVD and FC III/IV were significant risk factors for mortality in the multivariate Cox model. The accuracy rates of the models were 84.6% at preimplantation, 93% in the first year after CRT, and 90.5% in the second year after CRT. The models were validated by bootstrapping. Conclusion: We developed predictive models of cardiac death and Tx at different stages of CRT based on simple and easily obtainable clinical and echocardiographic variables. The models showed good accuracy and fit, were validated internally, and are useful in the selection, monitoring, and counseling of patients indicated for CRT.
Abstract:
Background: Pulmonary hypertension is associated with poor prognosis in heart failure. However, non-invasive diagnosis is still challenging in clinical practice. Objective: We sought to assess the prognostic utility of non-invasive estimation of pulmonary vascular resistance (PVR) by cardiovascular magnetic resonance to predict adverse cardiovascular outcomes in heart failure with reduced ejection fraction (HFrEF). Methods: Prospective registry of patients with left ventricular ejection fraction (LVEF) < 40% recently admitted for decompensated heart failure, followed over three years. PVR was calculated from the right ventricular ejection fraction and the average velocity of the pulmonary artery estimated during cardiac magnetic resonance. Readmission for heart failure and all-cause mortality were considered adverse events at follow-up. Results: 105 patients (mean LVEF 26.0 ± 7.7%, ischemic etiology 43%) were included. Patients with adverse events at long-term follow-up had higher PVR values (6.93 ± 1.9 vs. 4.6 ± 1.7 estimated Wood units (eWu), p < 0.001). In multivariate Cox regression analysis, PVR ≥ 5 eWu (cutoff value according to the ROC curve) was independently associated with an increased risk of adverse events at 9-month follow-up (HR 2.98; 95% CI 1.12-7.88; p < 0.03). Conclusions: In patients with HFrEF, a PVR ≥ 5.0 eWu is associated with significantly worse clinical outcome at follow-up. Non-invasive estimation of PVR by cardiac magnetic resonance might be useful for risk stratification in HFrEF, irrespective of etiology, presence of late gadolinium enhancement, or LVEF.
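A cutoff chosen "according to the ROC curve", as above, is commonly the threshold maximizing Youden's J (sensitivity + specificity − 1). The abstract does not state which criterion was used, so treat the sketch below, on hypothetical scores, as one plausible choice rather than the study's method:

```python
def best_cutoff(scores, labels):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    scores: continuous test values; labels: 1 = event, 0 = no event.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    best_j, best_c = -1.0, None
    for c in sorted(set(scores)):          # candidate thresholds
        tp = sum(1 for s, y in zip(scores, labels) if s >= c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < c and y == 0)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# toy scores where a threshold of 4 separates the classes perfectly
print(best_cutoff([1, 2, 3, 4, 5, 6], [0, 0, 0, 1, 1, 1]))  # (4, 1.0)
```

Dichotomizing a continuous predictor this way trades information for clinical usability, which is why the continuous PVR values are also reported.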
Abstract:
Background: There are sparse data on the performance of different types of drug-eluting stents (DES) in the acute, real-life setting. Objective: The aim of the study was to compare the safety and efficacy of first- versus second-generation DES in patients with acute coronary syndromes (ACS). Methods: This all-comer registry enrolled consecutive patients diagnosed with ACS and treated with percutaneous coronary intervention with implantation of first- or second-generation DES, with one-year follow-up. The primary efficacy endpoint was the major adverse cardiac and cerebrovascular event (MACCE), a composite of all-cause death, nonfatal myocardial infarction, target-vessel revascularization, and stroke. The primary safety outcome was definite stent thrombosis (ST) at one year. Results: Of the 1916 patients enrolled in the registry, 1328 were diagnosed with ACS. Of them, 426 were treated with first- and 902 with second-generation DES. There was no significant difference in the incidence of MACCE between the two types of DES at one year. The rates of acute and subacute ST were higher with first- vs. second-generation DES (1.6% vs. 0.1%, p < 0.001, and 1.2% vs. 0.2%, p = 0.025, respectively), but there was no difference in late ST (0.7% vs. 0.2%, p = 0.18) or gastrointestinal bleeding (2.1% vs. 1.1%, p = 0.21). In Cox regression, first-generation DES was an independent predictor of cumulative ST (HR 3.29 [1.30-8.31], p = 0.01). Conclusions: In an all-comer registry of ACS, the one-year rate of MACCE was comparable between groups treated with first- and second-generation DES. The use of first-generation DES was associated with higher rates of acute and subacute ST and was an independent predictor of cumulative ST.
Abstract:
BACKGROUND: Patterns of morbidity and mortality among human immunodeficiency virus (HIV)-infected individuals taking antiretroviral therapy are changing as a result of immune reconstitution and improved survival. We studied the influence of aging on the epidemiology of non-AIDS diseases in the Swiss HIV Cohort Study. METHODS: The Swiss HIV Cohort Study is a prospective observational cohort established in 1988 with continuous enrollment. We determined the incidence of clinical events (per 1000 person-years) from January 2008 (when a new questionnaire on non-AIDS-related morbidity was introduced) through December 2010. Differences across age groups were analyzed using Cox regression, adjusted for CD4 cell count, viral load, sex, injection drug use, smoking, and years of HIV infection. RESULTS: Overall, 8444 (96%) of 8848 participants contributed data from 40,720 semiannual visits; 2233 individuals (26.4%) were aged 50-64 years, and 450 (5.3%) were aged ≥65 years. The median duration of HIV infection was 15.4 years (95% confidence interval [CI], 9.59-22.0 years); 23.2% had prior clinical AIDS. We observed 994 incident non-AIDS events in the reference period: 201 cases of bacterial pneumonia, 55 myocardial infarctions, 39 strokes, 70 cases of diabetes mellitus, 123 trauma-associated fractures, 37 fractures without adequate trauma, and 115 non-AIDS malignancies. Multivariable hazard ratios for stroke (17.7; CI, 7.06-44.5), myocardial infarction (5.89; 95% CI, 2.17-16.0), diabetes mellitus (3.75; 95% CI, 1.80-7.85), bone fractures without adequate trauma (10.5; 95% CI, 3.58-30.5), osteoporosis (9.13; 95% CI, 4.10-20.3), and non-AIDS-defining malignancies (6.88; 95% CI, 3.89-12.2) were elevated for persons aged ≥65 years. 
CONCLUSIONS: Comorbidity and multimorbidity due to non-AIDS diseases, particularly diabetes mellitus, cardiovascular disease, non-AIDS-defining malignancies, and osteoporosis, are becoming more important in the care of HIV-infected persons and increase with older age.
Abstract:
OBJECTIVE: The ability to work and live independently is of particular concern for patients with Parkinson's disease (PD). We studied a series of PD patients able to work or live independently at baseline and evaluated potential risk factors for two separate outcomes: loss of ability to work and loss of ability to live independently. METHODS: The series comprised 495 PD patients followed prospectively. Ability to work and ability to live independently were assessed by clinical interview and examination. Cox regression models adjusted for age and disease duration were used to evaluate associations of baseline characteristics with loss of ability to work and loss of ability to live independently. RESULTS: Higher UPDRS dyskinesia score, UPDRS instability score, UPDRS total score, Hoehn and Yahr stage, and presence of intellectual impairment at baseline were all associated with increased risk of future loss of ability to work and loss of ability to live independently (P ≤ 0.0033). Five years after the initial visit, among patients ≤ 70 years of age with a disease duration ≤ 4 years at the initial visit, 88% were still able to work and 90% to live independently. These estimates worsened as age and disease duration at the initial visit increased; for patients > 70 years of age with a disease duration > 4 years, the estimates at 5 years were 43% able to work and 57% able to live independently. CONCLUSIONS: These estimates can help PD patients prepare for their future ability to perform activities of daily living.
Abstract:
BACKGROUND: Antiretroviral compounds have been studied predominantly in human immunodeficiency virus type 1 (HIV-1) subtype B, but only ~10% of infections worldwide are caused by this subtype. Analyzing the impact of different HIV subtypes on treatment outcome is therefore important. METHODS: We analyzed the effect of HIV-1 subtype B and non-B subtypes on the time to virological failure while taking combination antiretroviral therapy (cART). Other studies that have addressed this question were limited by the strong correlation between subtype and ethnicity. Our analysis was restricted to white patients from the Swiss HIV Cohort Study who started cART between 1996 and 2009. Cox regression models were fitted, adjusted for age, sex, transmission category, first cART, baseline CD4 cell counts, and HIV RNA levels, and stratified for previous mono/dual nucleoside reverse-transcriptase inhibitor treatment. RESULTS: Included in our study were 4729 patients infected with subtype B and 539 with non-B subtypes. The most prevalent non-B subtypes were CRF02_AG (23.8%), A (23.4%), C (12.8%), and CRF01_AE (12.6%). The incidence of virological failure was higher in patients with subtype B (4.3 failures/100 person-years; 95% confidence interval [CI], 4.0-4.5) than with non-B subtypes (1.8 failures/100 person-years; 95% CI, 1.4-2.4). Cox regression models confirmed that patients infected with non-B subtypes had a lower risk of virological failure than those infected with subtype B (univariable hazard ratio [HR], 0.39 [95% CI, .30-.52; P < .001]; multivariable HR, 0.68 [95% CI, .51-.91; P = .009]). In particular, subtypes A and CRF02_AG showed improved outcomes (multivariable HR, 0.54 [95% CI, .29-.98] and 0.39 [95% CI, .19-.79], respectively). CONCLUSIONS: The improved virological outcomes among patients infected with non-B subtypes invalidate concerns that these individuals are at a disadvantage because drugs have been designed primarily for subtype B infections.
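The failure incidences above are expressed per 100 person-years; the rate and a log-scale rate-ratio confidence interval can be sketched as follows. The event counts are hypothetical, not the study's:

```python
import math

def rate_per_100py(events, person_years):
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100.0 * events / person_years

def rate_ratio(e1, py1, e2, py2, z=1.96):
    """Rate ratio with a 95% CI on the log scale (large-sample approximation)."""
    rr = (e1 / py1) / (e2 / py2)
    se = math.sqrt(1 / e1 + 1 / e2)  # SE of log rate ratio
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# hypothetical: 43 failures over 1000 person-years in one group
print(rate_per_100py(43, 1000))          # 4.3 per 100 person-years
print(rate_ratio(40, 1000, 10, 1000))    # ratio of two hypothetical rates
```

Person-time denominators are what let cohorts with unequal follow-up be compared directly, unlike simple proportions.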
Abstract:
Purpose: In this prospective randomized study, the efficacy and safety of two immunosuppressive regimens (Tac, MMF, steroids vs. CsA, MMF, steroids) after lung transplantation were compared. The primary objective was the incidence of bronchiolitis obliterans syndrome (BOS). Secondary objectives were the incidence of acute rejection and infection, survival, and adverse events. 248 patients with a complete 3-year follow-up were included in the analysis. Methods and Materials: Patients were randomized to treatment group A: Tac (0.01-0.03 mg/kg/d iv, 0.05-0.3 mg/kg/d po) or B: CsA (1-3 mg/kg/d iv, 2-8 mg/kg/d po). The MMF dose was 1-4 mg/d in both groups. No induction therapy was given. Patients were stratified for cystic fibrosis. Intention-to-treat analysis was performed for patients who were switched to a different immunosuppressive regimen. Results: 3 of 123 Tac patients and 41 of 125 CsA patients were switched to another immunosuppressive regimen and were analyzed as intention to treat. Three-year follow-up data of the complete patient cohort were included in this final analysis. The groups showed no difference in demographic data. Kaplan-Meier analysis revealed significantly less BOS in Tac-treated patients (p = 0.033, log-rank test, pooled over strata). Cox regression showed a twice as high risk for BOS in the CsA group (factor 2.003). The incidence of acute rejection was 67.5% (Tac) and 75.2% (CsA) (p = 0.583). One- and 3-year survival rates were not different (85.4% Tac vs. 88.8% CsA, and 80.5% Tac vs. 83.2% CsA, p = n.s.). The incidence of infections and renal failure was similar (p = n.s.). Conclusions: Tac significantly reduced the risk for BOS after 3 years in this intention-to-treat analysis. Both regimens have good immunosuppressive potential and offer a similar safety profile, with excellent one- and three-year survival rates. Acute rejection rates were similar in both groups. The incidence of infections and renal failure showed no difference.
Abstract:
Lung transplantation has evolved from an experimental procedure to a viable therapeutic option in many countries. In Switzerland, the first lung transplant was performed in November 1992, more than ten years after the first successful procedure worldwide. Thereafter, a prospective national lung transplant registry was established, principally to enable quality control. The data of all patients transplanted in the two Swiss lung transplant centres, Zurich University Hospital and Centre de Romandie (Geneva-Lausanne), were analysed. In 10 years, 242 lung transplants were performed. The underlying lung diseases were cystic fibrosis including bronchiectasis (32%), emphysema (32%), parenchymal disorders (19%), pulmonary hypertension (11%), and lymphangioleiomyomatosis (3%). There were only 3% redo procedures. The 1-, 5-, and 9-year survival rates were 77% (95% CI 72-82), 64% (95% CI 57-71), and 56% (95% CI 45-67), respectively. The 5-year survival rate of patients transplanted since 1998 was 72% (95% CI 64-80). Multivariate Cox regression analysis revealed that survival was significantly better in this group than in those transplanted before 1998 (HR 0.44, 95% CI 0.26-0.75). Patients aged 60 years and older (HR 5.67, 95% CI 2.50-12.89) and those with pulmonary hypertension (HR 2.01, 95% CI 1.10-3.65) had a significantly worse prognosis. The most frequent causes of death were infections (29%), bronchiolitis obliterans syndrome (25%), and multiple organ failure (14%). The 10-year Swiss experience of lung transplantation compares favourably with the international data. The best results are obtained in cystic fibrosis, pulmonary emphysema, and parenchymal disorders.
Abstract:
From January 1995 to August 1997, we prospectively evaluated the clinical presentation, laboratory findings, and short-term survival of smear-positive pulmonary tuberculosis (TB) patients who sought care at our hospital. After providing written informed consent, the patients were interviewed and laboratory tests were performed. Information about survivorship and death was collected through September 1998. Eighty-six smear-positive pulmonary TB patients were enrolled; 26.7% were HIV-seropositive. Seventeen HIV-seronegative pulmonary TB patients (19.8%) presented chronic diseases in addition to TB. In the multiple logistic regression analysis, a CD4+ cell count ≤ 200 cells/mm³ was independently associated with HIV seropositivity. In the Cox regression model, fitted to all patients, HIV seropositivity and age ≥ 50 years were independently associated with decreased survival. Among HIV-seronegative persons, the presence of an additional disease increased the risk of death almost six-fold. Use of antiretroviral drugs was associated with a lower risk of death among HIV-seropositive smear-positive pulmonary TB patients (RH = 0.32, 95% CI 0.10-0.92). In our study, smear-positive pulmonary TB patients had a low short-term survival rate that was strongly associated with HIV infection, age, and co-morbidities. Therapy with antiretroviral drugs reduced the short-term risk of death among HIV-seropositive patients after TB diagnosis.