190 results for Proportional counters.
Abstract:
BACKGROUND & AIMS: The beneficial effect of nonselective beta-blockers (NSBB) has recently been questioned in patients with end-stage cirrhosis. We analysed the impact of NSBB on outcomes in severe alcoholic hepatitis (AH). METHODS: This study was based on a prospective database of patients with severe, biopsy-proven AH. Patients admitted from July 2006 to July 2014 were retrospectively studied. Patients were divided into two groups (with and without NSBB) and assessed for the occurrence of acute kidney injury (AKI) and transplant-free mortality during a 168-day follow-up period. RESULTS: One hundred thirty-nine patients were included; the mean Maddrey score was 71 ± 34, and 86 patients (61.9%) developed AKI. Forty-eight patients (34.5%) received NSBB. The overall 168-day transplant-free mortality was 50.5% (95% CI 41.3-60.0%). The overall 168-day cumulative incidence of AKI was 61.9% (95% CI 53.2-69.4%). Patients with NSBB had a lower heart rate (65 ± 13 vs 92 ± 12, P < 0.0001) and a lower mean arterial pressure (MAP, 78 ± 3 vs 87 ± 5, P < 0.0001), but comparable MELD scores, Maddrey scores, and medical histories. The 168-day transplant-free mortality was 56.8% (95% CI 41.3-69.7%) in patients with NSBB and 46.7% (95% CI 35.0-57.6%) in those without (P = 0.25). The 168-day cumulative incidence of AKI was 89.6% (95% CI 74.9-95.9%) with NSBB compared to 50.4% (95% CI 39.0-60.7%) without (P = 0.0001). The independent factors predicting AKI were a higher MELD score and the presence of NSBB. CONCLUSIONS: The use of NSBB in patients with severe AH is independently associated with a higher cumulative incidence of AKI.
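The transplant-free mortality and cumulative-incidence figures quoted above are survival-analysis estimates over right-censored follow-up. As an illustration only (not the study's code or data), a minimal Kaplan-Meier survival estimator over (time, event) pairs looks like this:

```python
# Minimal Kaplan-Meier estimator (illustrative sketch, toy data).
# Each subject is (time, event): event=1 for death, 0 for censoring.

def kaplan_meier(subjects):
    """Return [(time, survival_probability)] at each distinct event time."""
    event_times = sorted({t for t, e in subjects if e == 1})
    s = 1.0
    curve = []
    for t in event_times:
        at_risk = sum(1 for ti, _ in subjects if ti >= t)   # still under follow-up at t
        deaths = sum(1 for ti, e in subjects if ti == t and e == 1)
        s *= 1.0 - deaths / at_risk                         # multiply conditional survival
        curve.append((t, s))
    return curve

toy = [(5, 1), (8, 0), (12, 1), (20, 1), (25, 0)]
print(kaplan_meier(toy))
```

Cumulative mortality at day t is then simply 1 minus the survival estimate at t; the study's cumulative incidence of AKI additionally has to handle death and transplantation as competing events, which a plain Kaplan-Meier curve does not.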
Abstract:
BACKGROUND: The impact of early treatment with immunomodulators (IM) and/or TNF antagonists on bowel damage in Crohn's disease (CD) patients is unknown. AIM: To assess whether 'early treatment' with IM and/or TNF antagonists, defined as treatment within 2 years of the date of CD diagnosis, was associated with the development of fewer disease complications than 'late treatment', defined as treatment initiation >2 years after CD diagnosis. METHODS: Data from the Swiss IBD Cohort Study were analysed. The following outcomes were assessed using Cox proportional hazard modelling: bowel strictures, perianal fistulas, internal fistulas, intestinal surgery, perianal surgery and any of the aforementioned complications. RESULTS: The 'early treatment' group of 292 CD patients was compared to the 'late treatment' group of 248 CD patients. We found that 'early treatment' with IM or TNF antagonists alone was associated with a reduced risk of bowel strictures [hazard ratio (HR) 0.496, P = 0.004 for IM; HR 0.276, P = 0.018 for TNF antagonists]. Furthermore, 'early treatment' with IM was associated with a reduced risk of undergoing intestinal surgery (HR 0.322, P = 0.005) and perianal surgery (HR 0.361, P = 0.042), as well as of developing any complication (HR 0.567, P = 0.006). CONCLUSIONS: Treatment with immunomodulators or TNF antagonists within the first 2 years of CD diagnosis was associated with a reduced risk of developing bowel strictures compared with initiating these drugs >2 years after diagnosis. Furthermore, early immunomodulator treatment was associated with a reduced risk of intestinal surgery, perianal surgery and any complication.
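A Cox hazard ratio below 1 reads as a percent reduction in instantaneous risk, since the model's coefficient is the log of the HR. A small illustrative sketch of that conversion (the HR value below is taken from the abstract; the interpretation is generic, not study-specific):

```python
import math

def hr_to_pct_change(hr):
    """Percent change in hazard implied by a hazard ratio (negative = reduction)."""
    return (hr - 1.0) * 100.0

def hr_to_coefficient(hr):
    """Cox regression coefficient (log hazard ratio) corresponding to an HR."""
    return math.log(hr)

# HR 0.496 for bowel strictures with early IM treatment (from the abstract):
print(hr_to_pct_change(0.496))   # ~ -50.4, i.e. roughly half the stricture hazard
print(hr_to_coefficient(0.496))  # the underlying (negative) log-hazard coefficient
```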
Abstract:
BACKGROUND: The impact of early valve surgery (EVS) on the outcome of Staphylococcus aureus (SA) prosthetic valve infective endocarditis (PVIE) is unresolved. The objective of this study was to evaluate the association between EVS, performed within the first 60 days of hospitalization, and outcome of SA PVIE within the International Collaboration on Endocarditis-Prospective Cohort Study. METHODS: Participants were enrolled between June 2000 and December 2006. Cox proportional hazards modeling, with surgery included as a time-dependent covariate and propensity adjustment for the likelihood of receiving cardiac surgery, was used to evaluate the association between EVS and 1-year all-cause mortality in patients with definite left-sided S. aureus PVIE and no history of injection drug use. RESULTS: EVS was performed in 74 of the 168 (44.3%) patients. One-year mortality was significantly higher among patients with S. aureus PVIE than in patients with non-S. aureus PVIE (48.2% vs 32.9%; P = .003). Staphylococcus aureus PVIE patients who underwent EVS had a significantly lower 1-year mortality rate (33.8% vs 59.1%; P = .001). In multivariate, propensity-adjusted models, however, EVS was not associated with 1-year mortality (risk ratio, 0.67 [95% confidence interval, .39-1.15]; P = .15). CONCLUSIONS: In this prospective, multinational cohort of patients with S. aureus PVIE, EVS was not associated with reduced 1-year mortality. The decision to pursue EVS should be individualized for each patient, based upon infection-specific characteristics rather than solely upon the microbiology of the infection causing PVIE.
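Treating surgery as a time-dependent covariate typically means recoding each patient's follow-up into counting-process (start, stop] intervals, so that the surgery indicator is 0 before the operation and 1 afterwards. A hypothetical sketch of that data restructuring (function and field names are illustrative, not the study's):

```python
def split_at_surgery(follow_up_days, surgery_day=None):
    """Return counting-process rows (start, stop, surgery_status) for one patient.

    Before surgery the time-dependent covariate is 0; from the surgery day
    onward it is 1. Patients without surgery during follow-up get one row.
    """
    if surgery_day is None or surgery_day >= follow_up_days:
        return [(0, follow_up_days, 0)]
    return [(0, surgery_day, 0), (surgery_day, follow_up_days, 1)]

print(split_at_surgery(365, surgery_day=30))  # [(0, 30, 0), (30, 365, 1)]
print(split_at_surgery(200))                  # [(0, 200, 0)]
```

Encoding surgery this way avoids immortal-time bias: a patient only contributes person-time to the "operated" group after the operation actually happens.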
Abstract:
INTRODUCTION: Time to fitness for work (TFW) was measured as the number of days that were paid as compensation for work disability during the 4 years after discharge from the rehabilitation clinic in a population of patients hospitalised for rehabilitation after orthopaedic trauma. The aim of this study was to test whether some psychological variables can be used as potential early prognostic factors of TFW. MATERIAL AND METHODS: A Cox proportional hazards model was used to estimate the associations between predictive variables and TFW. Predictors were global health, pain at hospitalisation and pain decrease during the stay (all continuous and standardised by subtracting the mean and dividing by two standard deviations), perceived severity of the trauma and expectation of a positive evolution (both binary variables). RESULTS: Full data were available for 807 inpatients (660 men, 147 women). TFW was positively associated with better perceived health (hazard ratio [HR] 1.16, 95% confidence interval [CI] 1.13-1.19), pain decrease (HR 1.46, 95% CI 1.30-1.64) and expectation of a positive evolution (HR 1.50, 95% CI 1.32-1.70) and negatively associated with pain at hospitalisation (HR 0.67, 95% CI 0.59-0.76) and high perceived severity (HR 0.72, 95% CI 0.61-0.85). DISCUSSION: The present results provide some evidence that work disability during a four-year period after rehabilitation may be predicted by prerehabilitation perceptions of general health, pain, injury severity, as well as positive expectation of evolution.
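The standardisation described for the continuous predictors (subtract the mean, divide by two standard deviations, a convention proposed by Gelman so that continuous and binary coefficients are roughly comparable) can be sketched as follows; the data are invented for illustration:

```python
import math

def standardize_two_sd(values):
    """Center on the mean and scale by two sample standard deviations."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return [(v - mean) / (2.0 * sd) for v in values]

# Hypothetical pain-at-hospitalisation scores:
pain = [2.0, 4.0, 6.0, 8.0]
print(standardize_two_sd(pain))
```

After this transformation, a hazard ratio is interpreted per two-SD change in the predictor, which is what makes HRs such as 0.67 for pain directly comparable to those of the binary variables.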
Abstract:
BACKGROUND: An inverse correlation between expression of aldehyde dehydrogenase 1 subfamily A2 (ALDH1A2) and gene promoter methylation has been identified as a common feature of oropharyngeal squamous cell carcinoma (OPSCC). Moreover, low ALDH1A2 expression was associated with an unfavorable prognosis of OPSCC patients; however, the causal link between reduced ALDH1A2 function and treatment failure has not been addressed so far. METHODS: Serial sections from tissue microarrays of patients with primary OPSCC (n = 101) were stained by immunohistochemistry for key regulators of retinoic acid (RA) signaling, including ALDH1A2. Survival with respect to these regulators was investigated by univariate Kaplan-Meier analysis and multivariate Cox proportional hazards regression models. The impact of ALDH1A2-RAR signaling on tumor-relevant processes was addressed in established tumor cell lines and in an orthotopic mouse xenograft model. RESULTS: Immunohistochemical analysis showed an improved prognosis of ALDH1A2(high) OPSCC only in the presence of CRABP2, an intracellular RA transporter. Moreover, an ALDH1A2(high)CRABP2(high) staining pattern served as an independent predictor for progression-free (HR: 0.395, p = 0.007) and overall survival (HR: 0.303, p = 0.002), suggesting a critical impact of RA metabolism and signaling on clinical outcome. Functionally, ALDH1A2 expression and activity in tumor cell lines were related to RA levels. While administration of retinoids inhibited clonogenic growth and proliferation, pharmacological inhibition of ALDH1A2-RAR signaling resulted in loss of cell-cell adhesion and a mesenchymal-like phenotype. Xenograft tumors derived from FaDu cells with stable silencing of ALDH1A2, and primary tumors from OPSCC patients with low ALDH1A2 expression, exhibited a mesenchymal-like phenotype characterized by vimentin expression.
CONCLUSIONS: This study has unraveled a critical role of ALDH1A2-RAR signaling in the pathogenesis of head and neck cancer, and our data indicate that patients with ALDH1A2(low) tumors might benefit from adjuvant treatment with retinoids.
Abstract:
BACKGROUND: Postoperative hemithoracic radiotherapy has been used to treat malignant pleural mesothelioma, but it has not been assessed in a randomised trial. We assessed high-dose hemithoracic radiotherapy after neoadjuvant chemotherapy and extrapleural pneumonectomy in patients with malignant pleural mesothelioma. METHODS: We did this phase 2 trial in two parts at 14 hospitals in Switzerland, Belgium, and Germany. We enrolled patients with pathologically confirmed malignant pleural mesothelioma; resectable TNM stages T1-3 N0-2, M0; WHO performance status 0-1; age 18-70 years. In part 1, patients were given three cycles of neoadjuvant chemotherapy (cisplatin 75 mg/m(2) and pemetrexed 500 mg/m(2) on day 1 given every 3 weeks) and extrapleural pneumonectomy; the primary endpoint was complete macroscopic resection (R0-1). In part 2, participants with complete macroscopic resection were randomly assigned (1:1) to receive high-dose radiotherapy or not. The target volume for radiotherapy encompassed the entire hemithorax, the thoracotomy channel, and mediastinal nodal stations if affected by the disease or violated surgically. A boost was given to areas at high risk for locoregional relapse. The allocation was stratified by centre, histology (sarcomatoid vs epithelioid or mixed), mediastinal lymph node involvement (N0-1 vs N2), and T stage (T1-2 vs T3). The primary endpoint of part 1 was the proportion of patients achieving complete macroscopic resection (R0 and R1). The primary endpoint in part 2 was locoregional relapse-free survival, analysed by intention to treat. The trial is registered with ClinicalTrials.gov, number NCT00334594. FINDINGS: We enrolled patients between Dec 7, 2005, and Oct 17, 2012. Overall, we analysed 151 patients receiving neoadjuvant chemotherapy, of whom 113 (75%) had extrapleural pneumonectomy. Median follow-up was 54·2 months (IQR 32-66). 52 (34%) of 151 patients achieved an objective response. 
The most common grade 3 or 4 toxic effects were neutropenia (21 [14%] of 151 patients), anaemia (11 [7%]), and nausea or vomiting (eight [5%]). 113 patients had extrapleural pneumonectomy, with complete macroscopic resection achieved in 96 (64%) of 151 patients. We enrolled 54 patients in part 2; 27 in each group. The main reasons for exclusion were patient refusal (n=20) and ineligibility (n=10). 25 of 27 patients completed radiotherapy. Median total radiotherapy dose was 55·9 Gy (IQR 46·8-56·0). Median locoregional relapse-free survival from surgery was 7·6 months (95% CI 4·5-10·7) in the no-radiotherapy group and 9·4 months (6·5-11·9) in the radiotherapy group. The most common grade 3 or higher toxic effects related to radiotherapy were nausea or vomiting (three [11%] of 27 patients), oesophagitis (two [7%]), and pneumonitis (two [7%]). One patient died of pneumonitis. We recorded no toxic-effects data for the control group. INTERPRETATION: Our findings do not support the routine use of hemithoracic radiotherapy for malignant pleural mesothelioma after neoadjuvant chemotherapy and extrapleural pneumonectomy. FUNDING: Swiss Group for Clinical Cancer Research, Swiss State Secretariat for Education, Research and Innovation, Eli Lilly.
Abstract:
BACKGROUND: While the association between smoking and arterial cardiovascular events has been well established, the association between smoking and venous thromboembolism (VTE) remains controversial. OBJECTIVES: To assess the association between smoking and the risk of recurrent VTE and bleeding in patients who have experienced acute VTE. PATIENTS/METHODS: This study is part of a prospective Swiss multicenter cohort that included patients aged ≥65 years with acute VTE. Three groups were defined according to smoking status: never, former and current smokers. The primary outcome was the time to a first symptomatic, objectively confirmed VTE recurrence. Secondary outcomes were the time to a first major and clinically relevant non-major bleeding. Associations between smoking status and outcomes were analysed using subdistribution hazard models accounting for the competing risk of death. RESULTS: Among 988 analysed patients, 509 (52%) had never smoked, 403 (41%) were former smokers, and 76 (8%) were current smokers. After a median follow-up of 29.6 months, we observed a VTE recurrence rate of 4.9 (95% confidence interval [CI] 3.7-6.4) per 100 patient-years for never smokers, 6.6 (95% CI 5.1-8.6) for former smokers, and 5.2 (95% CI 2.6-10.5) for current smokers. Compared to never smokers, we found no association between current smoking and VTE recurrence (adjusted sub-hazard ratio [SHR] 1.05, 95% CI 0.49-2.28), major bleeding (adjusted SHR 0.59, 95% CI 0.25-1.39), or clinically relevant non-major bleeding (adjusted SHR 1.21, 95% CI 0.73-2.02). CONCLUSIONS: In this multicentre prospective cohort study, we found no association between smoking status and VTE recurrence or bleeding in elderly patients with VTE.
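The recurrence rates quoted per 100 patient-years are simply events divided by accumulated follow-up time, scaled by 100. An illustrative sketch with made-up numbers (not the study's counts):

```python
def rate_per_100_py(events, patient_years):
    """Incidence rate per 100 patient-years of follow-up."""
    return 100.0 * events / patient_years

# Hypothetical group: 12 recurrences over 245 patient-years of follow-up.
print(round(rate_per_100_py(12, 245), 1))  # 4.9
```

Patient-years accumulate across subjects with unequal follow-up, which is why such rates are preferred to raw percentages when follow-up time varies.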
Abstract:
BACKGROUND: The most recommended NRTI combinations as first-line antiretroviral treatment for HIV-1 infection in resource-rich settings are tenofovir/emtricitabine, abacavir/lamivudine, tenofovir/lamivudine and zidovudine/lamivudine. Efficacy studies of these combinations also considering pill numbers, dosing frequencies and ethnicities are rare. METHODS: We included patients starting first-line combination ART (cART) with or switching from first-line cART without treatment failure to tenofovir/emtricitabine, abacavir/lamivudine, tenofovir/lamivudine and zidovudine/lamivudine plus efavirenz or nevirapine. Cox proportional hazards regression was used to investigate the effect of the different NRTI combinations on two primary outcomes: virological failure (VF) and emergence of NRTI resistance. Additionally, we performed a pill burden analysis and adjusted the model for pill number and dosing frequency. RESULTS: Failure events per treated patient for the four NRTI combinations were as follows: 19/1858 (tenofovir/emtricitabine), 9/387 (abacavir/lamivudine), 11/344 (tenofovir/lamivudine) and 45/1244 (zidovudine/lamivudine). Compared with tenofovir/emtricitabine, abacavir/lamivudine had an adjusted HR for having VF of 2.01 (95% CI 0.86-4.55), tenofovir/lamivudine 2.89 (1.22-6.88) and zidovudine/lamivudine 2.28 (1.01-5.14), whereas for the emergence of NRTI resistance abacavir/lamivudine had an HR of 1.17 (0.11-12.2), tenofovir/lamivudine 11.3 (2.34-55.3) and zidovudine/lamivudine 4.02 (0.78-20.7). Differences among regimens disappeared when models were additionally adjusted for pill burden. However, non-white patients compared with white patients and higher pill number per day were associated with increased risks of VF and emergence of NRTI resistance: HR of non-white ethnicity for VF was 2.85 (1.64-4.96) and for NRTI resistance 3.54 (1.20-10.4); HR of pill burden for VF was 1.41 (1.01-1.96) and for NRTI resistance 1.72 (0.97-3.02). 
CONCLUSIONS: Although VF and the emergence of resistance were very low in the population studied, tenofovir/emtricitabine appears to be superior to abacavir/lamivudine, tenofovir/lamivudine and zidovudine/lamivudine. However, it is unclear whether these differences are due to the substances as such or to an association of tenofovir/emtricitabine regimens with lower pill burden.
Abstract:
BACKGROUND: Biliary tract cancer is an uncommon cancer with a poor outcome. We assembled data from the National Cancer Research Institute (UK) ABC-02 study and 10 international studies to determine prognostic outcome characteristics for patients with advanced disease. METHODS: Multivariable analyses of the final dataset from the ABC-02 study were carried out. All variables were simultaneously included in a Cox proportional hazards model, and backward elimination was used to produce the final model (using a significance level of 10%), in which the selected variables were associated independently with outcome. The resulting model was validated externally by receiver operating characteristic (ROC) curve analysis using the independent international dataset. RESULTS: A total of 410 patients were included from the ABC-02 study and 753 from the international dataset. An overall survival (OS) and progression-free survival (PFS) Cox model was derived from the ABC-02 study. White blood cells, haemoglobin, disease status, bilirubin, neutrophils, gender, and performance status were considered prognostic for survival (all with P < 0.10). Patients with metastatic disease had worse survival {hazard ratio (HR) 1.56 [95% confidence interval (CI) 1.20-2.02]}, as did those with Eastern Cooperative Oncology Group performance status (ECOG PS) 2 [HR 2.24 (95% CI 1.53-3.28)]. In a dataset restricted to patients who received cisplatin and gemcitabine with ECOG PS 0 and 1, only haemoglobin, disease status, bilirubin, and neutrophils were associated with PFS and OS. ROC analysis suggested the models generated from the ABC-02 study had limited prognostic value [6-month PFS: area under the curve (AUC) 62% (95% CI 57-68); 1-year OS: AUC 64% (95% CI 58-69)]. CONCLUSION: These data propose a set of prognostic criteria for outcome in advanced biliary tract cancer derived from the ABC-02 study and validated in an international dataset.
Although these findings establish the benchmark for the prognostic evaluation of patients with ABC and confirm the value of long-held clinical observations, the ability of the model to correctly predict prognosis is limited and needs to be improved through the identification of additional clinical and molecular markers.
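The ROC AUC used for validation equals the probability that a randomly chosen case receives a higher model score than a randomly chosen non-case (the Mann-Whitney statistic), so an AUC of 62-64% is only modestly better than the 50% of a coin flip. A minimal sketch on invented scores, unrelated to the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: P(score_pos > score_neg), ties count half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy prognostic scores: higher = predicted worse outcome.
print(roc_auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.3]))
```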
Abstract:
UNLABELLED: It is uncertain whether bone mineral density (BMD) can accurately predict fracture in kidney transplant recipients. Trabecular bone score (TBS) provides information independent of BMD. Kidney transplant recipients had abnormal bone texture as measured by lumbar spine TBS, and a lower TBS was associated with incident fractures in recipients. INTRODUCTION: Trabecular bone score (TBS) is a texture measure derived from dual-energy X-ray absorptiometry (DXA) lumbar spine images, providing information independent of bone mineral density. We assessed characteristics associated with TBS and fracture outcomes in kidney transplant recipients. METHODS: We included 327 kidney transplant recipients from Manitoba, Canada, who received a post-transplant DXA (median 106 days post-transplant). We matched each kidney transplant recipient (mean age 45 years, 39% men) to three controls from the general population (matched on age, sex, and DXA date). Lumbar spine (L1-L4) DXA images were used to derive TBS. Non-traumatic incident fracture (excluding hand, foot, and craniofacial) (n = 31) was assessed during a mean follow-up of 6.6 years. We used multivariable linear regression models to test predictors of TBS, and multivariable Cox proportional hazards regression was used to estimate hazard ratios (HRs) per standard deviation decrease in TBS to express the gradient of risk. RESULTS: Compared to the general population, kidney transplant recipients had a significantly lower lumbar spine TBS (1.365 ± 0.129 versus 1.406 ± 0.125, P < 0.001). Multivariable linear regression revealed that receipt of a kidney transplant was associated with a significantly lower mean TBS compared to controls (-0.0369, 95% confidence interval [CI] -0.0537 to -0.0202). TBS was associated with fractures independent of the Fracture Risk Assessment score including BMD (adjusted HR per standard deviation decrease in TBS 1.64, 95% CI 1.15-2.36).
CONCLUSION: Kidney transplant recipients had abnormal bone texture as assessed by TBS, and a lower lumbar spine TBS was associated with fractures in recipients.
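Reporting an HR "per standard deviation decrease" means negating the per-unit Cox coefficient and scaling it by the predictor's SD before exponentiating. A hedged sketch of that conversion; the coefficient and SD below are hypothetical, chosen only to reproduce a value of the same order as the abstract's:

```python
import math

def hr_per_sd_decrease(beta_per_unit, sd):
    """Hazard ratio for a one-SD *decrease* in the predictor,
    given the Cox coefficient per one-unit increase."""
    return math.exp(-beta_per_unit * sd)

# Hypothetical inputs: beta = -3.83 per unit of TBS, SD of TBS = 0.129.
print(round(hr_per_sd_decrease(-3.83, 0.129), 2))  # 1.64
```

Flipping the sign this way makes a protective predictor (higher TBS, lower risk) yield an HR above 1 for its decrease, which is the direction clinicians usually want to read.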