68 results for mean-risk
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
OBJECTIVES: To estimate changes in coronary risk factors and their implications for coronary heart disease (CHD) rates in men starting highly active antiretroviral therapy (HAART). METHODS: Men participating in the Swiss HIV Cohort Study with measurements of coronary risk factors both before and up to 3 years after starting HAART were identified. Fractional polynomial regression was used to graph associations between risk factors and time on HAART. Mean risk factor changes associated with starting HAART were estimated using multilevel models. A prognostic model was used to predict corresponding CHD rate ratios. RESULTS: Of 556 eligible men, 259 (47%) started a nonnucleoside reverse transcriptase inhibitor (NNRTI) and 297 a protease inhibitor (PI) based regimen. Levels of most risk factors increased sharply during the first 3 months on HAART, then more slowly. Increases were greater with PI- than NNRTI-based HAART for total cholesterol (1.18 vs. 0.98 mmol L(-1)), systolic blood pressure (3.6 vs. 0 mmHg) and BMI (1.04 vs. 0.55 kg m(-2)) but not HDL cholesterol (0.24 vs. 0.32 mmol L(-1)) or glucose (1.02 vs. 1.03 mmol L(-1)). Predicted CHD rate ratios were 1.40 (95% CI 1.13-1.75) and 1.17 (0.95-1.47) for PI- and NNRTI-based HAART respectively. CONCLUSIONS: Coronary heart disease rates will increase in the majority of patients starting HAART; however, the increases corresponding to typical changes in risk factors are relatively modest and could be offset by lifestyle changes.
Abstract:
Background Current knowledge about risk factors promoting hypertensive crisis originates from retrospective data. Therefore, potential risk factors of hypertensive crisis were assessed in a prospective longitudinal study. Methods Eighty-nine patients of the medical outpatient unit at the University Hospital of Bern (Bern, Switzerland) with previously diagnosed hypertension participated in this study. At baseline, 33 potential risk factors were assessed. All patients were followed up for the outcome of hypertensive crisis. Cox regression models were used to detect relationships between risk factors and hypertensive crisis (defined as an acute rise of systolic blood pressure (BP) ≥200 mmHg and/or diastolic BP ≥120 mmHg). Results The mean duration of follow-up was 1.6 ± 0.3 years (range 1.0–2.4 years). Four patients (4.5%) were lost to follow-up. Thirteen patients (15.3%) experienced hypertensive crisis during follow-up. Several potential risk factors were significantly associated with hypertensive crisis: female sex, higher grades of obesity, the presence of hypertensive or coronary heart disease, the presence of a somatoform disorder, a higher number of antihypertensive drugs, and nonadherence to medication. As measured by the hazard ratio, nonadherence was the most important factor associated with hypertensive crisis (hazard ratio 5.88, 95% confidence interval 1.59–21.77, P < 0.01). Conclusions This study identified several potential risk factors of hypertensive crisis. Results of this study are consistent with the hypothesis that improvement of medication adherence in antihypertensive therapy would help to prevent hypertensive crises. However, larger studies are needed to assess potential confounding, other risk factors and the possibility of interaction between predictors.
Abstract:
Background Surgical risk scores, such as the logistic EuroSCORE (LES) and Society of Thoracic Surgeons Predicted Risk of Mortality (STS) score, are commonly used to identify high-risk or “inoperable” patients for transcatheter aortic valve implantation (TAVI). In Europe, the LES plays an important role in selecting patients for implantation with the Medtronic CoreValve System. What is less clear, however, is the role of the STS score in these patients and the relationship between the LES and STS. Objective The purpose of this study is to examine the correlation between LES and STS scores and their performance characteristics in high-risk surgical patients implanted with the Medtronic CoreValve System. Methods All consecutive patients (n = 168) in whom a CoreValve bioprosthesis was implanted between November 2005 and June 2009 at 2 centers (Bern University Hospital, Bern, Switzerland, and Erasmus Medical Center, Rotterdam, The Netherlands) were included for analysis. Patient demographics were recorded in a prospective database. Logistic EuroSCORE and STS scores were calculated on a prospective and retrospective basis, respectively. Results Observed mortality was 11.1%. The mean LES was 3 times higher than the mean STS score (LES 20.2% ± 13.9% vs STS 6.7% ± 5.8%). Based on the various LES and STS cutoff values used in previous and ongoing TAVI trials, 53% of patients had an LES ≥15%, 16% had an STS ≥10%, and 40% had an LES ≥20% or STS ≥10%. The Pearson correlation coefficient revealed a moderate linear relationship between the LES and STS scores, r = 0.58, P < .001. Although the STS score outperformed the LES, both models had suboptimal discriminatory power (c-statistic, 0.49 for LES and 0.69 for STS) and calibration. Conclusions Clinical judgment and the Heart Team concept should play a key role in selecting patients for TAVI, whereas currently available surgical risk score algorithms should be used to guide clinical decision making.
Abstract:
Objectives The aim of this study was to assess the role of transcatheter aortic valve implantation (TAVI) compared with medical treatment (MT) and surgical aortic valve replacement (SAVR) in patients with severe aortic stenosis (AS) at increased surgical risk. Background Elderly patients with comorbidities are at considerable risk for SAVR. Methods Since July 2007, 442 patients with severe AS (age: 81.7 ± 6.0 years, mean logistic European System for Cardiac Operative Risk Evaluation: 22.3 ± 14.6%) underwent treatment allocation to MT (n = 78), SAVR (n = 107), or TAVI (n = 257) on the basis of a comprehensive evaluation protocol as part of a prospective registry. Results Baseline clinical characteristics were similar among patients allocated to MT and TAVI, whereas patients allocated to SAVR were younger (p < 0.001) and had a lower predicted peri-operative risk (p < 0.001). Unadjusted rates of all-cause mortality at 30 months were lower for SAVR (22.4%) and TAVI (22.6%) compared with MT (61.5%, p < 0.001). Adjusted hazard ratios for death were 0.51 (95% confidence interval: 0.30 to 0.87) for SAVR compared with MT and 0.38 (95% confidence interval: 0.25 to 0.58) for TAVI compared with MT. Medical treatment (p < 0.001), older age (>80 years, p = 0.01), peripheral vascular disease (p < 0.001), and atrial fibrillation (p = 0.04) were significantly associated with all-cause mortality at 30 months in the multivariate analysis. At 1 year, more patients undergoing SAVR (92.3%) or TAVI (93.2%) had New York Heart Association functional class I/II as compared with patients with MT (70.8%, p = 0.003). Conclusions Among patients with severe AS at increased surgical risk, SAVR and TAVI improve survival and symptoms compared with MT. Clinical outcomes of TAVI and SAVR seem similar among carefully selected patients with severe symptomatic AS at increased risk.
Abstract:
Introduction Reduced left ventricular function in patients with severe symptomatic valvular aortic stenosis is associated with impaired clinical outcome in patients undergoing surgical aortic valve replacement (SAVR). Transcatheter aortic valve implantation (TAVI) has been shown to be non-inferior to SAVR in high-risk patients with respect to mortality and may result in faster left ventricular recovery. Methods We investigated clinical outcomes of high-risk patients with severe aortic stenosis undergoing medical treatment (n = 71) or TAVI (n = 256) stratified by left ventricular ejection fraction (LVEF) in a prospective single-center registry. Results Twenty-five patients (35%) among the medical cohort were found to have an LVEF ≤30% (mean 26.7±4.1%) and 37 patients (14%) among the TAVI patients (mean 25.2±4.4%). Estimated peri-interventional risk as assessed by logistic EuroSCORE was significantly higher in patients with severely impaired LVEF as compared to patients with LVEF >30% (medical/TAVI 38.5±13.8%/40.6±16.4% versus medical/TAVI 22.5±10.8%/22.1±12.8%, p < 0.001). In patients undergoing TAVI, there was no significant difference in the combined endpoint of death, myocardial infarction, major stroke, life-threatening bleeding, major access-site complications, valvular re-intervention, or renal failure at 30 days between the two groups (21.0% versus 27.0%, p = 0.40). After TAVI, patients with LVEF ≤30% experienced a rapid improvement in LVEF (from 25±4% to 34±10% at discharge, p = 0.002) associated with improved NYHA functional class at 30 days (decrease ≥1 NYHA class in 95%). During long-term follow-up no difference in survival was observed in patients undergoing TAVI irrespective of baseline LVEF (p = 0.29), whereas there was a significantly higher mortality in medically treated patients with severely reduced LVEF (log rank p = 0.001).
Conclusion TAVI in patients with severely reduced left ventricular function may be performed safely and is associated with rapid recovery of systolic left ventricular function and heart failure symptoms.
Abstract:
Knowledge of the relative importance of alternative sources of human campylobacteriosis is important in order to implement effective disease prevention measures. The objective of this study was to assess the relative importance of three key exposure pathways (travelling abroad, poultry meat, pet contact) for different patient age groups in Switzerland. Using a stochastic exposure model, data on Campylobacter incidence for the years 2002-2007 were linked with data for the three exposure pathways and the results of a case-control study. Mean values for the population attributable fractions (PAF) over all age groups and years were 27% (95% CI 17-39) for poultry consumption, 27% (95% CI 22-32) for travelling abroad, 8% (95% CI 6-9) for pet contact and 39% (95% CI 25-50) for other risk factors. This model provided robust results when using data available for Switzerland, but the uncertainties remained high. The output of the model could be improved if more accurate input data were available to estimate the infection rate per exposure. In particular, the relatively high proportion of cases attributed to 'other risk factors' requires further attention.
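The population attributable fractions above come from the study's stochastic exposure model; as a minimal illustration of the underlying concept, Levin's classical single-exposure formula can be sketched as follows. The prevalence and relative-risk inputs below are hypothetical and not taken from the study:

```python
def levin_paf(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction for a single exposure:
    PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical illustration: 60% of the population exposed, RR = 2.0
print(round(levin_paf(0.6, 2.0), 3))  # → 0.375

# The mean PAFs reported above (27% poultry, 27% travel, 8% pets,
# 39% other) sum to roughly 100%, as fractions attributed across
# mutually exclusive pathways should (up to rounding):
print(27 + 27 + 8 + 39)  # → 101
```

Note that the study's multi-pathway model is more involved than this single-exposure formula; the sketch only shows how an attributable fraction relates prevalence and relative risk.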
Abstract:
Background Guidelines for the prevention of coronary heart disease (CHD) recommend use of Framingham-based risk scores that were developed in white middle-aged populations. It remains unclear whether and how CHD risk prediction might be improved among older adults. We aimed to compare the prognostic performance of the Framingham risk score (FRS), directly and after recalibration, with refit functions derived from the present cohort, as well as to assess the utility of adding other routinely available risk parameters to the FRS. Methods Among 2193 black and white older adults (mean age, 73.5 years) without pre-existing cardiovascular disease from the Health ABC cohort, we examined adjudicated CHD events, defined as incident myocardial infarction, CHD death, and hospitalization for angina or coronary revascularization. Results During 8-year follow-up, 351 participants experienced CHD events. The FRS discriminated poorly between persons who did and did not experience CHD events (C-index: 0.577 in women; 0.583 in men) and underestimated absolute risk by 51% in women and 8% in men. Recalibration of the FRS improved absolute risk prediction, particularly for women. For both genders, refitting these functions substantially improved absolute risk prediction, with similar discrimination to the FRS. Results did not differ between whites and blacks. The addition of lifestyle variables, waist circumference and creatinine did not improve risk prediction beyond the risk factors of the FRS. Conclusions The FRS underestimates CHD risk in older adults, particularly in women, although traditional risk factors remain the best predictors of CHD. Re-estimated risk functions using these factors improve accurate estimation of absolute risk.
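Recalibration of a Framingham-type score, as described above, commonly means keeping the published coefficients but replacing the baseline survival and mean linear predictor with the new cohort's values (a D'Agostino-style approach). The sketch below is generic and not necessarily the exact procedure used in this study; all numbers are hypothetical:

```python
import math

def cox_risk(lp: float, mean_lp: float, baseline_survival: float) -> float:
    """Event risk from a Cox-type score: 1 - S0 ^ exp(lp - mean_lp)."""
    return 1 - baseline_survival ** math.exp(lp - mean_lp)

def recalibrate(lp: float, local_mean_lp: float, local_s0: float) -> float:
    """Keep the published coefficients (folded into lp) but swap in the
    target cohort's mean linear predictor and baseline survival."""
    return cox_risk(lp, local_mean_lp, local_s0)

# Hypothetical check: an average person (lp equal to the cohort mean)
# takes on exactly the cohort's baseline event risk.
print(round(recalibrate(2.0, 2.0, 0.90), 2))  # → 0.1
```

Refitting, by contrast, re-estimates the coefficients themselves on the new cohort, which is why it improved absolute risk prediction further in the study.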
Abstract:
Over the last couple of decades, the treatment of psychoses has advanced considerably; yet, despite all progress, the individual and societal burden associated with psychosis, and particularly schizophrenia, has largely remained unchanged. Therefore, much hope is currently placed on indicated prevention as a means of fighting these burdens before they set in. Though the number of studies investigating pharmacological interventions is still limited, encouraging results have been reported from the pioneering trials, despite several methodological limitations. Furthermore, it has become clear that persons characterized by the at-risk criteria are already ill and do not only need preventive intervention, but also treatment. In consequence, outcome criteria have to be broadened to cover the current needs of the patients. As is indicated by a recent study successfully using omega-3 fatty acids for both purposes, it may be promising to develop and investigate interventions specifically for the at-risk state, independent of their effectiveness in manifest disease states. Treatment studies may be promoted by the proposed introduction of a new disorder category into DSM-V. Future prevention studies, however, need to solve the challenge of changing immediate transition rates, demanding new risk-enrichment strategies as a prerequisite for feasible trial designs.
Abstract:
Ultrasound detection of sub-clinical atherosclerosis (ATS) may help identify individuals at high cardiovascular risk. Most studies evaluated intima-media thickness (IMT) at the carotid level. We compared the relationships between main cardiovascular risk factors (CVRF) and five indicators of ATS (IMT, mean and maximal plaque thickness, mean and maximal plaque area) at both carotid and femoral levels. Ultrasound was performed on 496 participants aged 45-64 years randomly selected from the general population of the Republic of Seychelles. 73.4 % of participants had ≥1 plaque (IMT thickening ≥1.2 mm) at the carotid level and 67.5 % at the femoral level. Variance (adjusted R2) contributed by age, sex and CVRF (smoking, LDL-cholesterol, HDL-cholesterol, blood pressure, diabetes) in predicting any of the ATS markers was larger at the femoral than the carotid level. At both carotid and femoral levels, the association between CVRF and ATS was stronger for plaque-based markers than for IMT. Our findings show that the associations between CVRF and ATS markers were stronger at the femoral than the carotid level, and with plaque-based markers rather than IMT. Pending comparison of these markers using harder cardiovascular endpoints, our findings suggest that markers based on plaque morphology assessed at the femoral artery level might be useful cardiovascular risk predictors.
Abstract:
OBJECTIVE: To examine the duration of methicillin-resistant Staphylococcus aureus (MRSA) carriage and its determinants and the influence of eradication regimens. DESIGN: Retrospective cohort study. SETTING: A 1,033-bed tertiary care university hospital in Bern, Switzerland, in which the prevalence of methicillin resistance among S. aureus isolates is less than 5%. PATIENTS: A total of 116 patients with first-time MRSA detection identified at University Hospital Bern between January 1, 2000, and December 31, 2003, were followed up for a mean duration of 16.2 months. RESULTS: Sixty-eight patients (58.6%) cleared colonization, with a median time to clearance of 7.4 months. Independent determinants of shorter carriage duration were the absence of any modifiable risk factor (receipt of antibiotics, use of an indwelling device, or presence of a skin lesion) (hazard ratio [HR], 0.20 [95% confidence interval {CI}, 0.09-0.42]), absence of immunosuppressive therapy (HR, 0.49 [95% CI, 0.23-1.02]), and hemodialysis (HR, 0.08 [95% CI, 0.01-0.66]) at the time MRSA was first detected, and the administration of a decolonization regimen in the absence of a modifiable risk factor (HR, 2.22 [95% CI, 1.36-3.64]). Failure of decolonization treatment was associated with the presence of risk factors at the time of treatment (P=.01). Intermittent screenings that were negative for MRSA were frequent (26% of patients), occurred early after first detection of MRSA (median, 31.5 days), and were associated with a lower probability of clearing colonization (HR, 0.34 [95% CI, 0.17-0.67]) and an increased risk of MRSA infection during follow-up. CONCLUSIONS: Risk factors for MRSA acquisition should be carefully assessed in all MRSA carriers and should be incorporated into infection control policies, such as the timing of decolonization treatment, the definition of MRSA clearance, and the decision of when to suspend isolation measures.
Abstract:
BACKGROUND AND PURPOSE: Sleep-disordered breathing (SDB) is frequent in stroke patients. Risk factors, treatment response, and short-term and long-term outcome of SDB in stroke patients are poorly known. METHODS: We prospectively studied 152 patients (mean age 56±13 years) with acute ischemic stroke. Cardiovascular risk factors, Epworth sleepiness score (ESS), stroke severity/etiology, and time of stroke onset were assessed. The apnea-hypopnea index (AHI) was determined 3±2 days after stroke onset and 6 months later (subacute phase). Continuous positive airway pressure (CPAP) treatment was started acutely in patients with SDB (AHI ≥15, or AHI ≥10 plus ESS >10). CPAP compliance, incidence of vascular events, and stroke outcome were assessed 60±16 months later (chronic phase). RESULTS: Initial AHI was 18±16 (≥10 in 58%, ≥30 in 17% of patients) and decreased in the subacute phase (P<0.001). Age, diabetes, and nighttime stroke onset were independent predictors of AHI (r2=0.34). In patients with AHI ≥30, age, male gender, body mass index, diabetes, hypertension, coronary heart disease, ESS, and macroangiopathic etiology of stroke were significantly higher/more common than in patients with AHI <10. Long-term incidence of vascular events and stroke outcome were similar in both groups. CPAP was started in 51% and continued chronically in 15% of SDB patients. Long-term stroke mortality was associated with initial AHI, age, hypertension, diabetes, and coronary heart disease. CONCLUSIONS: SDB is common, particularly in elderly male stroke patients with diabetes, nighttime stroke onset, and macroangiopathy as cause of stroke; it improves after the acute phase, is associated with increased poststroke mortality, and can be treated with CPAP in a small percentage of patients.
Abstract:
Introduction Low central venous oxygen saturation (ScvO2) has been associated with increased risk of postoperative complications in high-risk surgery. Whether this association is centre-specific or more generalisable is not known. The aim of this study was to assess the association between peri- and postoperative ScvO2 and outcome in high-risk surgical patients in a multicentre setting. Methods Three large European university hospitals (two in Finland, one in Switzerland) participated. In 60 patients undergoing intra-abdominal surgery lasting more than 90 minutes who met at least two of Shoemaker's criteria and had an ASA (American Society of Anesthesiologists) class greater than 2, ScvO2 was determined preoperatively and at two-hour intervals during the operation until 12 hours postoperatively. Hospital length of stay (LOS), mortality, and predefined postoperative complications were recorded. Results The age of the patients was 72 ± 10 years (mean ± standard deviation), and the simplified acute physiology score (SAPS II) was 32 ± 12. Hospital LOS was 10.5 (8 to 14) days, and 28-day hospital mortality was 10.0%. Preoperative ScvO2 decreased from 77% ± 10% to 70% ± 11% (p < 0.001) immediately after surgery and remained unchanged 12 hours later. A total of 67 postoperative complications were recorded in 32 patients. After multivariate analysis, mean ScvO2 value (odds ratio [OR] 1.23 [95% confidence interval (CI) 1.01 to 1.50], p = 0.037), hospital LOS (OR 0.75 [95% CI 0.59 to 0.94], p = 0.012), and SAPS II (OR 0.90 [95% CI 0.82 to 0.99], p = 0.029) were independently associated with postoperative complications. The optimal value of mean ScvO2 to discriminate between patients who did or did not develop complications was 73% (sensitivity 72%, specificity 61%). Conclusion Low perioperative ScvO2 is related to increased risk of postoperative complications in high-risk surgery. This warrants trials with goal-directed therapy using ScvO2 as a target in high-risk surgery patients.
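An "optimal" discriminating cutoff like the 73% ScvO2 reported above is typically chosen by maximizing Youden's J (sensitivity + specificity - 1) along the ROC curve. The sketch below illustrates that procedure on made-up data; the study's patient-level values are not available here:

```python
def youden_cutoff(values, labels):
    """Return the threshold maximizing sensitivity + specificity - 1,
    assuming values at or below the cutoff flag the outcome (here:
    low ScvO2 flagging a postoperative complication)."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(v <= cut and y for v, y in zip(values, labels))
        fn = sum(v > cut and y for v, y in zip(values, labels))
        tn = sum(v > cut and not y for v, y in zip(values, labels))
        fp = sum(v <= cut and not y for v, y in zip(values, labels))
        sens, spec = tp / (tp + fn), tn / (tn + fp)
        if sens + spec - 1 > best_j:
            best_j, best_cut = sens + spec - 1, cut
    return best_cut

# Made-up mean ScvO2 values (%) and complication labels (1 = yes):
scvo2 = [62, 66, 70, 74, 78, 82]
complication = [1, 1, 1, 0, 0, 0]
print(youden_cutoff(scvo2, complication))  # → 70
```

With real data the reported sensitivity (72%) and specificity (61%) would fall out of the same 2x2 counts at the chosen cutoff.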
Abstract:
The coronary artery calcium (CAC) score is a readily and widely available tool for the noninvasive diagnosis of atherosclerotic coronary artery disease (CAD). The aim of this study was to investigate the added value of the CAC score as an adjunct to gated SPECT for the assessment of CAD in an intermediate-risk population. METHODS: Seventy-seven prospectively recruited patients with intermediate risk (as determined by the Framingham Heart Study 10-y CAD risk score) and referred for coronary angiography because of suspected CAD underwent stress (99m)Tc-tetrofosmin SPECT myocardial perfusion imaging (MPI) and CT CAC scoring within 2 wk before coronary angiography. The sensitivity and specificity of SPECT alone and of the combination of the 2 methods (SPECT plus CAC score) in demonstrating significant CAD (≥50% stenosis on coronary angiography) were compared. RESULTS: Forty-two (55%) of the 77 patients had CAD on coronary angiography, and 35 (45%) had abnormal SPECT results. The CAC score was significantly higher in subjects with perfusion abnormalities than in those who had normal SPECT results (889 ± 836 [mean ± SD] vs. 286 ± 335; P < 0.0001). Similarly, with rising CAC scores, a larger percentage of patients had CAD. Receiver-operating-characteristic analysis showed that a CAC score ≥709 was the optimal cutoff for detecting CAD missed by SPECT. SPECT alone had a sensitivity and a specificity for the detection of significant CAD of 76% and 91%, respectively. Combining SPECT with the CAC score (at a cutoff of 709) improved the sensitivity of SPECT (from 76% to 86%) for the detection of CAD, in association with a nonsignificant decrease in specificity (from 91% to 86%). CONCLUSION: The CAC score may offer incremental diagnostic information over SPECT data for identifying patients with significant CAD and negative MPI results.
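The sensitivity and specificity figures above follow from ordinary 2x2 confusion-table counts. As an illustration, the counts below are back-calculated approximately from the abstract (42 patients with CAD, 35 without; SPECT alone ~76% sensitive, ~91% specific) rather than taken from the paper's tables:

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity and specificity from a 2x2 confusion table."""
    return tp / (tp + fn), tn / (tn + fp)

# Approximate counts implied by the abstract for SPECT alone:
# 42 CAD patients (~76% detected -> ~32 true positives, 10 missed),
# 35 non-CAD patients (~91% correctly negative -> ~32, 3 false alarms).
sens, spec = sens_spec(tp=32, fn=10, tn=32, fp=3)
print(round(sens, 2), round(spec, 2))  # → 0.76 0.91
```

The combined SPECT-plus-CAC figures in the study arise the same way, with a patient counted positive when either test is positive at the chosen CAC cutoff.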
Abstract:
OBJECTIVES: In patients with a clinically isolated syndrome (CIS), the time interval to conversion to clinically definite multiple sclerosis (CDMS) is highly variable. Individual and geographical prognostic factors remain to be determined. Whether anti-myelin antibodies may predict the risk of conversion to CDMS in Swiss CIS patients of the canton of Berne was the subject of this study. METHODS: Anti-myelin oligodendrocyte glycoprotein and anti-myelin basic protein antibodies were determined prospectively in patients admitted to our department. RESULTS: After a mean follow-up of 12 months, none of nine antibody-negative patients, but 22 of 30 antibody-positive patients, had progressed to CDMS. Beta-interferon treatment delayed the time to conversion from a mean of 7.4 to 10.9 months. CONCLUSIONS: In a Swiss cohort, antibody-negative CIS patients have a favorable short-term prognosis, and antibody-positive patients benefit from early treatment.
Abstract:
Risk factors for invasive aspergillosis (IA) are incompletely identified and may undergo changes due to differences in medical practice. A cohort of 189 consecutive adult patients with neutropenia hospitalized in the hemato-oncology ward of the University Hospital Berne between 1995 and 1999 were included in a retrospective study to assess risk factors for IA. In total, 45 IA cases (nine proven, three probable, 33 possible), 11 patients with refractory fever and 133 controls were analyzed. IA cases more often had acute leukemia or myelodysplastic syndrome (MDS) (88 vs 38%, P < 0.001) and a longer duration of neutropenia (mean 20.6 vs 9.9 days, P < 0.001). They also had fewer neutropenic episodes during the preceding 6 months (mean 0.42 vs 1.03, P < 0.001); that is, confirmed (82%) and probable (73%) IA occurred most often during the induction cycle. A short time interval (≤14 days) between neutropenic episodes increased the risk of IA four-fold (P = 0.06). Bacteremia, however, was not related to the number of preceding neutropenic episodes. Therefore, neutropenic patients with leukemia or MDS have the highest risk of IA. The risk is highest during the first induction cycle of treatment and increases with short time intervals between treatment cycles.