180 results for Risk Impact
Abstract:
BACKGROUND: Because of their underlying diseases and the need for immunosuppression, patients are particularly at risk for gastrointestinal (GI) complications after lung transplantation, which may negatively influence long-term outcome. The present study assessed the incidence and impact of GI complications after lung transplantation and aimed to identify risk factors. METHODS: We retrospectively analysed all 227 consecutive single- and double-lung transplantations performed at the University Hospitals of Lausanne and Geneva between January 1993 and December 2010. Logistic regression was used to test the effect of potentially influential variables on the binary outcomes of overall, severe, and surgery-requiring complications, followed by a multiple logistic regression model. RESULTS: The final analysis included 205 patients; 22 patients were excluded because of re-transplantation, multiorgan transplantation, or incomplete datasets. GI complications were observed in 127 patients (62%). Gastro-esophageal reflux disease was the most commonly observed complication (22.9%), followed by inflammatory or infectious colitis (20.5%) and gastroparesis (10.7%). Major GI complications (Dindo/Clavien III-V) were observed in 83 patients (40.5%) and were fatal in 4 patients (2.0%). Multivariate analysis identified double-lung transplantation (p = 0.012) and an early (1993-1998) transplantation period (p = 0.008) as independent risk factors for developing major GI complications. Forty-three patients (21%) required surgery, most commonly colectomy, cholecystectomy, and fundoplication (in 6.8%, 6.3%, and 3.9% of patients, respectively). Multivariate analysis identified a Charlson comorbidity index of ≥3 as an independent risk factor for developing GI complications requiring surgery (p = 0.015). CONCLUSION: GI complications after lung transplantation are common, but outcomes at our transplant center were encouraging.
Abstract:
Among the various strategies to reduce the incidence of non-communicable diseases, reduction of sodium intake in the general population has been recognized as one of the most cost-effective, because of its potential impact on the development of hypertension and cardiovascular diseases. Yet this strategic health recommendation of the WHO and many other international organizations is far from universally accepted, and several unresolved scientific and epidemiological questions sustain an ongoing debate: what low level of sodium intake should be recommended to the general population, and should national strategies target the overall population or only higher-risk fractions of it, such as salt-sensitive patients? In this paper, we review recent results from the literature on salt, blood pressure, and cardiovascular risk, and we present the recommendations recently proposed by a group of Swiss experts. The participating medical societies propose that national health authorities continue their discussions with the food industry in order to reduce the sodium content of food products, with a target mean salt intake of 5-6 grams per day in the population. Moreover, they support all initiatives to improve information on the effect of salt on health and on the salt content of food.
Abstract:
BACKGROUND: The impact of early treatment with immunomodulators (IM) and/or TNF antagonists on bowel damage in Crohn's disease (CD) patients is unknown. AIM: To assess whether 'early treatment' with IM and/or TNF antagonists, defined as treatment initiated within 2 years of CD diagnosis, was associated with fewer disease complications than 'late treatment', defined as treatment initiated more than 2 years after CD diagnosis. METHODS: Data from the Swiss IBD Cohort Study were analysed. The following outcomes were assessed using Cox proportional hazards modelling: bowel strictures, perianal fistulas, internal fistulas, intestinal surgery, perianal surgery, and any of the aforementioned complications. RESULTS: The 'early treatment' group of 292 CD patients was compared with the 'late treatment' group of 248 CD patients. 'Early treatment' with IM or TNF antagonists alone was associated with a reduced risk of bowel strictures [hazard ratio (HR) 0.496, P = 0.004 for IM; HR 0.276, P = 0.018 for TNF antagonists]. Furthermore, 'early treatment' with IM was associated with a reduced risk of intestinal surgery (HR 0.322, P = 0.005), perianal surgery (HR 0.361, P = 0.042), and any complication (HR 0.567, P = 0.006). CONCLUSIONS: Treatment with immunomodulators or TNF antagonists within the first 2 years after CD diagnosis was associated with a reduced risk of developing bowel strictures compared with initiating these drugs more than 2 years after diagnosis. Early immunomodulator treatment was also associated with a reduced risk of intestinal surgery, perianal surgery, and any complication.
Abstract:
BACKGROUND: Lymphedema is an underdiagnosed pathology which, in industrialized countries, mainly affects cancer patients who have undergone lymph node dissection and/or radiation. No effective therapy is currently available, so patients' quality of life is compromised by swelling of the affected body region. This condition is associated with body imbalance, subsequent osteochondral deformations and impaired function, as well as an increased risk of potentially life-threatening soft tissue infections. METHODS: The effects of platelet-rich plasma (PRP) and adipose-derived stem cells (ASC) on angiogenesis (anti-CD31 staining), microcirculation (laser Doppler imaging), lymphangiogenesis (anti-LYVE1 staining), microvascular architecture (corrosion casting), and wound healing (digital planimetry) were studied in a murine tail lymphedema model. RESULTS: Wounds treated with PRP and ASC healed faster and showed significantly increased epithelialization, mainly from the proximal wound margin. The application of PRP induced significantly increased lymphangiogenesis, whereas the application of ASC induced no significant change in this regard. CONCLUSIONS: PRP and ASC affect lymphangiogenesis and lymphedema development and might represent a promising approach to improve regeneration of lymphatic vessels, restore disrupted lymphatic circulation, and treat or prevent lymphedema, alone or in combination with currently available lymphedema therapies.
Abstract:
Major route additional cytogenetic aberrations (ACA) at diagnosis of chronic myeloid leukaemia (CML) indicate an increased risk of progression and shorter survival. Since major route ACA are almost always unbalanced, it is unclear whether other unbalanced ACA at diagnosis also confer an unfavourable prognosis. On the basis of 1348 Philadelphia chromosome-positive chronic phase patients of the randomized CML study IV, we examined the impact of unbalanced minor route ACA at diagnosis versus major route ACA on prognosis. At diagnosis, 1175 patients (87.2%) had a translocation t(9;22)(q34;q11) and 74 (5.5%) a variant translocation t(v;22) only, while 44 (3.3%) additionally showed loss of the Y chromosome (-Y), 17 each (1.3%) had balanced or unbalanced minor route ACA, and 21 (1.6%) had major route ACA. Patients with unbalanced minor route ACA had no significantly different cumulative incidences of complete cytogenetic remission or major molecular remission and no significantly different progression-free survival (PFS) or overall survival (OS) than patients with t(9;22), t(v;22), -Y and balanced minor route karyotypes. In contrast, patients with major route ACA had a shorter OS and PFS than all other groups (all pairwise comparisons to each of the other groups: p ≤ 0.015). Five-year survival probabilities were 91.4% (95% CI 89.5-93.1) for t(9;22), 87% (77.2-94.3) for t(v;22), 89.0% (76.7-97.0) for -Y, 100% for balanced minor route, 92.3% (72.4-100) for unbalanced minor route, and 52.2% (28.2-75.5) for major route ACA. We conclude that only major route ACA, but not balanced or unbalanced minor route ACA at diagnosis, have a negative impact on the prognosis of CML.
Abstract:
BACKGROUND: The impact of early valve surgery (EVS) on the outcome of Staphylococcus aureus (SA) prosthetic valve infective endocarditis (PVIE) is unresolved. The objective of this study was to evaluate the association between EVS, performed within the first 60 days of hospitalization, and outcome of SA PVIE within the International Collaboration on Endocarditis-Prospective Cohort Study. METHODS: Participants were enrolled between June 2000 and December 2006. Cox proportional hazards modeling that included surgery as a time-dependent covariate and propensity adjustment for likelihood to receive cardiac surgery was used to evaluate the impact of EVS on 1-year all-cause mortality in patients with definite left-sided S. aureus PVIE and no history of injection drug use. RESULTS: EVS was performed in 74 of the 168 (44.3%) patients. One-year mortality was significantly higher among patients with S. aureus PVIE than in patients with non-S. aureus PVIE (48.2% vs 32.9%; P = .003). Staphylococcus aureus PVIE patients who underwent EVS had a significantly lower 1-year mortality rate (33.8% vs 59.1%; P = .001). In multivariate, propensity-adjusted models, EVS was not associated with 1-year mortality (risk ratio, 0.67 [95% confidence interval, .39-1.15]; P = .15). CONCLUSIONS: In this prospective, multinational cohort of patients with S. aureus PVIE, EVS was not associated with reduced 1-year mortality. The decision to pursue EVS should be individualized for each patient, based upon infection-specific characteristics rather than solely upon the microbiology of the infection causing PVIE.
Abstract:
BACKGROUND: Compared with usual care, noninvasive ventilation (NIV) lowers the risk of intubation and death for subjects with respiratory failure secondary to COPD exacerbations, but whether administration of NIV by a specialized, dedicated team improves its efficiency remains uncertain. Our aim was to test whether a dedicated team of respiratory therapists applying all acute NIV treatments would reduce the risk of intubation or death for subjects with COPD admitted for respiratory failure. METHODS: We carried out a retrospective study comparing subjects with COPD admitted to the ICU before (2001-2003) and after (2010-2012) the creation of a dedicated NIV team in a regional acute care hospital. The primary outcome was the risk of intubation or death. The secondary outcomes were the individual components of the primary outcome and ICU/hospital stay. RESULTS: A total of 126 subjects were included: 53 in the first cohort and 73 in the second. There was no significant difference in the demographic characteristics and severity of respiratory failure. Fifteen subjects (28.3%) died or had to undergo tracheal intubation in the first cohort, and only 10 subjects (13.7%) in the second cohort (odds ratio 0.40, 95% CI 0.16-0.99, P = .04). In-hospital mortality (15.1% vs 4.1%, P = .03) and median stay (ICU: 3.1 vs 1.9 d, P = .04; hospital: 11.5 vs 9.6 d, P = .04) were significantly lower in the second cohort, and a trend toward a lower intubation risk was observed (20.8% vs 11%, P = .13). CONCLUSIONS: The delivery of NIV by a dedicated team was associated with a lower risk of death or intubation in subjects with respiratory failure secondary to COPD exacerbations. Therefore, the implementation of a team administering all NIV treatments on a 24-h basis should be considered in institutions admitting subjects with COPD exacerbations.
Abstract:
ABSTRACT: A workshop was held at the National Institute for Diabetes and Digestive and Kidney Diseases with a focus on the impact of sleep and circadian disruption on energy balance and diabetes. The workshop identified a number of key principles for research in this area and a number of specific opportunities. Studies in this area would be facilitated by active collaboration between investigators in sleep/circadian research and investigators in metabolism/diabetes. There is a need to translate the elegant findings from basic research into improving the metabolic health of the American public. There is also a need for investigators studying the impact of sleep/circadian disruption in humans to move beyond measurements of insulin and glucose and conduct more in-depth phenotyping. There is also a need for assessments of sleep and circadian rhythms, as well as assessments for sleep-disordered breathing, to be incorporated into all ongoing cohort studies related to diabetes risk. Studies in humans need to complement the elegant short-term laboratory-based studies of simulated short sleep, shift work, and related exposures with studies of subjects in the general population who have these disorders. It is conceivable that chronic adaptations occur, and if so, the mechanisms by which they occur need to be identified and understood. Particular areas of opportunity that are ready for translation are studies addressing whether CPAP treatment of patients with pre-diabetes and obstructive sleep apnea (OSA) prevents or delays the onset of diabetes, and whether time-restricted feeding has the same impact on obesity in humans as it does in mice.
Abstract:
Two enoxaparin dosage regimens are used as comparators to evaluate new anticoagulants for thromboprophylaxis in patients undergoing major orthopaedic surgery, but so far no satisfactory direct comparison between them has been published. Our objective was to compare the efficacy and safety of enoxaparin 3,000 anti-Xa IU twice daily and enoxaparin 4,000 anti-Xa IU once daily in this clinical setting by indirect comparison meta-analysis, using Bucher's method. We selected randomised controlled trials comparing another anticoagulant, placebo (or no treatment) with either enoxaparin regimen for venous thromboembolism prophylaxis after hip or knee replacement or hip fracture surgery, provided that the second regimen was assessed elsewhere versus the same comparator. Two authors independently evaluated study eligibility, extracted the data, and assessed the risk of bias. The primary efficacy outcome was the incidence of venous thromboembolism. The main safety outcome was the incidence of major bleeding. Overall, 44 randomised comparisons in 56,423 patients were selected, 35 being double-blind (54,117 patients). Compared with enoxaparin 4,000 anti-Xa IU once daily, enoxaparin 3,000 anti-Xa IU twice daily was associated with a reduced risk of venous thromboembolism (relative risk [RR]: 0.53, 95% confidence interval [CI]: 0.40 to 0.69), but an increased risk of major bleeding (RR: 2.01, 95% CI: 1.23 to 3.29). In conclusion, when interpreting the benefit-risk ratio of new anticoagulant drugs versus enoxaparin for thromboprophylaxis after major orthopaedic surgery, the apparently greater efficacy but higher bleeding risk of the twice-daily 3,000 anti-Xa IU enoxaparin regimen compared to the once-daily 4,000 anti-Xa IU regimen should be taken into account.
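Bucher's adjusted indirect comparison, the method named in the abstract above, contrasts two regimens through a common comparator: the log relative risks are subtracted and their variances added. A minimal sketch with hypothetical relative risks and confidence intervals (not the trial data reported here):

```python
import math

def indirect_rr(rr_ac, ci_ac, rr_bc, ci_bc, z=1.96):
    """Bucher indirect comparison of regimen A vs B via common comparator C.

    rr_ac, rr_bc: relative risks of A vs C and B vs C.
    ci_ac, ci_bc: (lower, upper) 95% confidence intervals for those RRs.
    Returns (RR, CI lower, CI upper) for A vs B.
    """
    # Standard errors on the log scale, recovered from the CI width.
    se_ac = math.log(ci_ac[1] / ci_ac[0]) / (2 * z)
    se_bc = math.log(ci_bc[1] / ci_bc[0]) / (2 * z)
    log_rr = math.log(rr_ac) - math.log(rr_bc)  # log RR of A vs B
    se = math.sqrt(se_ac**2 + se_bc**2)         # variances add under independence
    return (math.exp(log_rr),
            math.exp(log_rr - z * se),
            math.exp(log_rr + z * se))

# Hypothetical inputs: A vs C RR 0.60 (0.45-0.80); B vs C RR 0.90 (0.70-1.15).
rr, lo, hi = indirect_rr(0.60, (0.45, 0.80), 0.90, (0.70, 1.15))
print(f"A vs B: RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → A vs B: RR 0.67 (95% CI 0.46-0.97)
```

Note that the indirect CI is wider than either direct CI, which is why indirect evidence is considered weaker than a head-to-head trial.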
Abstract:
BACKGROUND: Bipolar disorder is a highly heritable polygenic disorder. Recent enrichment analyses suggest that there may be true risk variants for bipolar disorder among the expression quantitative trait loci (eQTL) in the brain. AIMS: We sought to assess the impact of eQTL variants on bipolar disorder risk by combining data from both bipolar disorder genome-wide association studies (GWAS) and brain eQTL. METHOD: To detect single nucleotide polymorphisms (SNPs) that influence expression levels of genes associated with bipolar disorder, we jointly analysed data from a bipolar disorder GWAS (7481 cases and 9250 controls) and a genome-wide brain (cortical) eQTL dataset (193 healthy controls) using a Bayesian statistical method, with independent follow-up replications. The identified risk SNP was then further tested for association with hippocampal volume (n = 5775) and cognitive performance (n = 342) among healthy individuals. RESULTS: Integrative analysis revealed a significant association between a brain eQTL rs6088662 on chromosome 20q11.22 and bipolar disorder (log Bayes factor = 5.48; bipolar disorder P = 5.85 × 10⁻⁵). Follow-up studies across multiple independent samples confirmed the association of the risk SNP (rs6088662) with gene expression and bipolar disorder susceptibility (P = 3.54 × 10⁻⁸). Further exploratory analysis revealed that rs6088662 is also associated with hippocampal volume and cognitive performance in healthy individuals. CONCLUSIONS: Our findings suggest that 20q11.22 is likely a risk region for bipolar disorder; they also highlight the informative value of integrating functional annotation of genetic variants for gene expression in advancing our understanding of the biological basis underlying complex disorders, such as bipolar disorder.
Abstract:
To evaluate whether screening for hypertension should start early in life, information on the risk of diseases associated with the level of blood pressure in childhood or adolescence is needed. The study by Leiba et al. that is reported in the current issue of Pediatric Nephrology demonstrates convincingly that hypertensive adolescents are at higher risk of cardiovascular death than normotensive adolescents. Nevertheless, it can be shown that this excess risk is not sufficient to justify a screen-and-treat strategy. Since the large majority of cardiovascular deaths occur among normotensive adolescents, measures for primordial prevention of cardiovascular diseases could have a much larger impact at the population level.
Abstract:
BACKGROUND AND OBJECTIVES: Sudden cardiac death (SCD) is a major burden in modern medicine. Aldosterone antagonists (AAs) are reported to be effective in reducing mortality in patients with heart failure (HF) or after myocardial infarction (MI). Our study aimed to assess the efficacy of AAs on mortality, including SCD, hospital admission, and several common adverse effects. METHODS: We searched Embase, PubMed, Web of Science, the Cochrane Library, and clinicaltrials.gov for randomized controlled trials (RCTs) assigning AAs to patients with HF or post MI through May 2015. The comparator included standard medication or placebo, or both. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. Event rates were compared using a random-effects model. Prospective RCTs of AAs with durations of at least 8 weeks were selected if they included at least one of the following outcomes: SCD, all-cause/cardiovascular mortality, all-cause/cardiovascular hospitalization, and common side effects (hyperkalemia, renal function degradation, and gynecomastia). RESULTS: Data from 19,333 patients enrolled in 25 trials were included. In patients with HF, this treatment significantly reduced the risk of SCD by 19% (RR 0.81; 95% CI 0.67-0.98; p = 0.03), all-cause mortality by 19% (RR 0.81; 95% CI 0.74-0.88; p < 0.00001), and cardiovascular death by 21% (RR 0.79; 95% CI 0.70-0.89; p < 0.00001). In post-MI patients, the corresponding risk reductions were 20% (RR 0.80; 95% CI 0.66-0.98; p = 0.03), 15% (RR 0.85; 95% CI 0.76-0.95; p = 0.003), and 17% (RR 0.83; 95% CI 0.74-0.94; p = 0.003), respectively. Across both subgroups, the relative risks decreased by 19% (RR 0.81; 95% CI 0.71-0.92; p = 0.002) for SCD, 18% (RR 0.82; 95% CI 0.77-0.88; p < 0.0001) for all-cause mortality, and 20% (RR 0.80; 95% CI 0.74-0.87; p < 0.0001) for cardiovascular mortality in patients treated with AAs. Hospitalizations were also significantly reduced, while common adverse effects were significantly increased. CONCLUSION: Aldosterone antagonists appear to be effective in reducing SCD and other mortality events, compared with placebo or standard medication, in patients with HF and/or after an MI.
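The random-effects pooling named in the methods above can be sketched with the standard DerSimonian-Laird estimator, which inflates each trial's variance by an estimate of between-study heterogeneity before averaging. The study-level log relative risks and variances below are hypothetical, not data from the included RCTs:

```python
import math

def pool_random_effects(log_rrs, variances, z=1.96):
    """DerSimonian-Laird random-effects pooling of log relative risks.

    log_rrs: per-study log RRs; variances: their within-study variances.
    Returns (pooled RR, CI lower, CI upper).
    """
    k = len(log_rrs)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    sw = sum(w)
    mean_fe = sum(wi * y for wi, y in zip(w, log_rrs)) / sw
    q = sum(wi * (y - mean_fe) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - z * se),
            math.exp(pooled + z * se))

# Hypothetical log RRs and variances from three trials.
rr, lo, hi = pool_random_effects([-0.5, 0.1, -0.3], [0.010, 0.020, 0.015])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → pooled RR 0.78 (95% CI 0.56-1.09)
```

With heterogeneous studies, tau² > 0 widens the pooled CI relative to a fixed-effect analysis, which is the conservative behaviour a meta-analysis like this one relies on.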
Abstract:
BACKGROUND: Uncertainty about the presence of infection results in unnecessary and prolonged empiric antibiotic treatment of newborns at risk for early-onset sepsis (EOS). This study evaluates the impact of this uncertainty on the diversity in management. METHODS: A web-based survey with questions addressing management of infection risk-adjusted scenarios was performed in Europe, North America, and Australia. Published national guidelines (n = 5) were reviewed and compared with the results of the survey. RESULTS: A total of 439 clinicians (68% neonatologists) from 16 countries completed the survey. In the low-risk scenario, 29% would start antibiotic therapy and 26% would not, both groups without laboratory investigations; 45% would start if laboratory markers were abnormal. In the high-risk scenario, 99% would start antibiotic therapy. In the low-risk scenario, 89% would discontinue antibiotic therapy before 72 hours. In the high-risk scenario, 35% would discontinue therapy before 72 hours, 56% would continue therapy for 5-7 days, and 9% for more than 7 days. Laboratory investigations were used in 31% of scenarios for the decision to start, and in 72% for the decision to discontinue antibiotic treatment. National guidelines differ considerably regarding the decision to start antibiotics in low-risk situations and the decision to continue therapy in higher-risk situations. CONCLUSIONS: There is a broad diversity of clinical practice in management of EOS and a lack of agreement between current guidelines. The results of the survey reflect the diversity of national guidelines. Prospective studies regarding management of neonates at risk of EOS with safety endpoints are needed.
Abstract:
BACKGROUND: Despite a low positive predictive value, diagnostic tests such as complete blood count (CBC) and C-reactive protein (CRP) are commonly used to evaluate whether infants with risk factors for early-onset neonatal sepsis (EOS) should be treated with antibiotics. STUDY DESIGN: We investigated the impact of implementing a protocol aiming at reducing the number of diagnostic tests in infants with risk factors for EOS in order to compare the diagnostic performance of repeated clinical examination with CBC and CRP measurement. The primary outcome was the time between birth and the first dose of antibiotics in infants treated for suspected EOS. RESULTS: Among the 11,503 infants born at 35 weeks during the study period, 222 were treated with antibiotics for suspected EOS. The proportion of infants receiving antibiotics for suspected EOS was 2.1% and 1.7% before and after the change of protocol (p = 0.09). Reduction of diagnostic tests was associated with earlier antibiotic treatment in infants treated for suspected EOS (hazard ratio 1.58; 95% confidence interval [CI] 1.20-2.07; p < 0.001), and in infants with neonatal infection (hazard ratio 2.20; 95% CI 1.19-4.06; p = 0.01). There was no difference in the duration of hospital stay or in the proportion of infants requiring respiratory or cardiovascular support before and after the change of protocol. CONCLUSION: Reduction of diagnostic tests such as CBC and CRP does not delay initiation of antibiotic treatment in infants with suspected EOS. The importance of clinical examination in infants with risk factors for EOS should be emphasised.
Abstract:
Trabecular bone score (TBS) is a gray-level textural index of bone microarchitecture derived from lumbar spine dual-energy X-ray absorptiometry (DXA) images. TBS is a bone mineral density (BMD)-independent predictor of fracture risk. The objective of this meta-analysis was to determine whether TBS predicted fracture risk independently of FRAX probability and to examine their combined performance by adjusting the FRAX probability for TBS. We utilized individual-level data from 17,809 men and women in 14 prospective population-based cohorts. Baseline evaluation included TBS and the FRAX risk variables, and outcomes during follow-up (mean 6.7 years) comprised major osteoporotic fractures. The association between TBS, FRAX probabilities, and the risk of fracture was examined using an extension of the Poisson regression model in each cohort and for each sex and expressed as the gradient of risk (GR; hazard ratio per 1 SD change in risk variable in direction of increased risk). FRAX probabilities were adjusted for TBS using an adjustment factor derived from an independent cohort (the Manitoba Bone Density Cohort). Overall, the GR of TBS for major osteoporotic fracture was 1.44 (95% confidence interval [CI] 1.35-1.53) when adjusted for age and time since baseline and was similar in men and women (p > 0.10). When additionally adjusted for FRAX 10-year probability of major osteoporotic fracture, TBS remained a significant, independent predictor for fracture (GR = 1.32, 95% CI 1.24-1.41). The adjustment of FRAX probability for TBS resulted in a small increase in the GR (1.76, 95% CI 1.65-1.87 versus 1.70, 95% CI 1.60-1.81). A smaller change in GR for hip fracture was observed (FRAX hip fracture probability GR 2.25 vs. 2.22). TBS is a significant predictor of fracture risk independently of FRAX. 
The findings support the use of TBS as a potential adjustment for FRAX probability, though the impact of the adjustment remains to be determined in the context of clinical assessment guidelines.
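The gradient of risk (GR) reported above is a hazard ratio re-expressed per 1 SD change in the risk variable: given a per-unit log-hazard coefficient beta, GR = exp(beta × SD). A minimal sketch with hypothetical values (not the cohort estimates above):

```python
import math

def gradient_of_risk(beta_per_unit, sd):
    """Hazard ratio per 1 SD change, from a per-unit log-hazard coefficient.

    beta_per_unit: log-hazard change per one unit of the risk variable
    (hypothetical here); sd: standard deviation of that variable.
    """
    return math.exp(beta_per_unit * sd)

# Hypothetical: beta = 0.05 per unit, SD = 7.3 units.
print(round(gradient_of_risk(0.05, 7.3), 2))  # → 1.44
```

Expressing effects per SD is what makes GRs comparable across risk variables measured on different scales, such as TBS and BMD.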