160 results for CRITICAL CARE
Abstract:
Background: The incidence of delirium in ventilated patients is estimated at up to 82%, and it is associated with longer intensive care and hospital stays, long-term cognitive impairment and mortality. The pathophysiology of delirium has been linked with inflammation and neuronal apoptosis. Simvastatin has pleiotropic properties; it penetrates the brain and, as well as reducing cholesterol, reduces inflammation when used at clinically relevant doses over the short term. This single-centre randomised controlled trial aims to test the hypothesis that treatment with simvastatin will modify delirium incidence and outcomes.
Methods/Design: The ongoing study will include 142 adults admitted to the Watford General Hospital Intensive Care Unit who require mechanical ventilation within the first 72 hours of admission. The primary outcome is the number of delirium- and coma-free days in the first 14 days. Secondary outcomes include incidence of delirium, delirium- and coma-free days in the first 28 days, days in delirium and in coma at 14 and 28 days, number of ventilator-free days at 28 days, length of critical care and hospital stay, mortality, cognitive decline and healthcare resource use. Informed consent will be taken from the patient's consultee before randomisation to receive either simvastatin (80 mg) or placebo once daily. Daily data will be recorded until day 28 after randomisation or until discharge from the ICU if sooner. Surviving patients will be followed up at six months after discharge. Plasma and urine samples will be taken to investigate the biological effect of simvastatin on systemic markers of inflammation, as related to the number of delirium- and coma-free days, and the potential of cholinesterase activity and beta-amyloid as predictors of the risk of delirium and long-term cognitive impairment.
Discussion: This trial will test the efficacy of simvastatin in reducing delirium in the critically ill. If patients receiving the statin spend fewer days in delirium than the placebo group, the inflammatory theory implicated in the pathogenesis of delirium will be strengthened.
Abstract:
Introduction: In this cohort study, we explored the relationship between fluid balance, intradialytic hypotension and outcomes in critically ill patients with acute kidney injury (AKI) who received renal replacement therapy (RRT).
Methods: We analysed prospectively collected registry data on patients older than 16 years who received RRT for at least two days in an intensive care unit at two university-affiliated hospitals. We used multivariable logistic regression to determine the relationships between mean daily fluid balance and intradialytic hypotension, each assessed over the seven days following RRT initiation, and the outcomes of hospital mortality and RRT dependence in survivors.
Results: In total, 492 patients were included (299 male (60.8%), mean (standard deviation (SD)) age 62.9 (16.3) years); 251 (51.0%) died in hospital. Independent risk factors for mortality were mean daily fluid balance (odds ratio (OR) 1.36 per 1000 mL positive (95% confidence interval (CI) 1.18 to 1.57)), intradialytic hypotension (OR 1.14 per 10% increase in days with intradialytic hypotension (95% CI 1.06 to 1.23)), age (OR 1.15 per five-year increase (95% CI 1.07 to 1.25)), maximum Sequential Organ Failure Assessment score on days 1 to 7 (OR 1.21 (95% CI 1.13 to 1.29)), and Charlson comorbidity index (OR 1.28 (95% CI 1.14 to 1.44)); higher baseline creatinine (OR 0.98 per 10 μmol/L (95% CI 0.97 to 0.996)) was associated with lower risk of death. Of 241 hospital survivors, 61 (25.3%) were RRT dependent at discharge. The only independent risk factor for RRT dependence was pre-existing heart failure (OR 3.13 (95% CI 1.46 to 6.74)). Neither mean daily fluid balance nor intradialytic hypotension was associated with RRT dependence in survivors. Associations between these exposures and mortality were similar in sensitivity analyses accounting for immortal time bias and dichotomising mean daily fluid balance as positive or negative. In the subgroup of patients with data on pre-RRT fluid balance, fluid overload at RRT initiation did not modify the association of mean daily fluid balance with mortality.
Conclusions: In this cohort of patients with AKI requiring RRT, a more positive mean daily fluid balance and intradialytic hypotension were associated with hospital mortality but not with RRT dependence at hospital discharge in survivors.
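For readers unfamiliar with how "per-unit" odds ratios such as those reported above are obtained, the short sketch below shows one way such a multivariable logistic regression might be coded, with exposures rescaled so each exponentiated coefficient reads per 1000 mL fluid balance, per 10% of days with hypotension, and so on. This is a minimal illustration only, not the study's analysis code; the file name, column names and choice of statsmodels are all hypothetical assumptions.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical registry extract; column names are assumptions, not the study's.
df = pd.read_csv("rrt_cohort.csv")

# Rescale exposures so each coefficient corresponds to the reported increments:
# per 1000 mL fluid balance, per 10% of days with intradialytic hypotension,
# per 5-year age increase, per 10 umol/L baseline creatinine.
X = pd.DataFrame({
    "fluid_balance_per_1000ml": df["mean_daily_fluid_balance_ml"] / 1000,
    "idh_per_10pct_days": df["pct_days_with_idh"] / 10,
    "age_per_5yr": df["age_years"] / 5,
    "max_sofa_days_1_to_7": df["max_sofa_day1_7"],
    "charlson_index": df["charlson_index"],
    "creatinine_per_10umol": df["baseline_creatinine_umol_l"] / 10,
})
y = df["died_in_hospital"]  # 1 = died in hospital, 0 = survived to discharge

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)

# Odds ratios and 95% CIs are the exponentiated coefficients and CI bounds.
ci = fit.conf_int()
summary = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_lower": np.exp(ci[0]),
    "CI_upper": np.exp(ci[1]),
}).drop("const")
print(summary.round(2))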
Abstract:
Consulting with users is considered best practice and is highly recommended when designing new trials. As part of our feasibility work, we undertook a consultation exercise with parents, ex-patients and young people prior to designing a trial of protocol-based ventilator weaning. Our aims were to (1) ascertain views on the relevance and importance of the trial; (2) determine the important parent/patient outcome measures; and (3) ascertain views on informed consent in a cluster randomized controlled trial. We conducted audio-recorded face-to-face, telephone and focus group interviews with parents and young people. Data were content analysed to address our specific consultation objectives. The setting was the north-western region of England. A total of 16 participants were interviewed: 2 parents of paediatric intensive care unit (PICU) survivors; 1 PICU survivor; and 13 young people from the former Medicines for Children Research Network. The trial objectives were deemed important and relevant, and participants considered the most important outcome measure to be the length of time on ventilation. Parents and young people did not consider written informed consent to be a necessary requirement in the context of this trial; rather, awareness of unit participation in the trial, with the opportunity to opt out of data collection, was considered important. This consultation provided useful, pragmatic insights to inform trial design. We encountered significant challenges in recruiting parents and young people for this consultation exercise, and novel recruitment methods need to be considered for future work in this field. Patient and public involvement is essential to ensure that future trials answer parent-relevant questions and have meaningful outcome measures, and to involve parents and young people in the general development of health care services.
Abstract:
Background
Among clinical trials of interventions that aim to modify time spent on mechanical ventilation for critically ill patients, there is considerable inconsistency in the outcomes chosen and how they are measured. The Core Outcomes in Ventilation Trials (COVenT) study aims to develop a set of core outcomes for use in future ventilation trials in mechanically ventilated adults and children.
Methods/design
We will use a mixed methods approach that incorporates a randomised trial nested within a Delphi study and a consensus meeting. Additionally, we will conduct an observational cohort study to evaluate uptake of the core outcome set in published studies at 5 and 10 years following core outcome set publication. The three-round online Delphi study will use a list of outcomes that have been reported previously in a review of ventilation trials. The Delphi panel will include a range of stakeholder groups including patient support groups. The panel will be randomised to one of three feedback methods to assess the impact of the feedback mechanism on subsequent ranking of outcomes. A final consensus meeting will be held with stakeholder representatives to review outcomes.
Discussion
The COVenT study aims to develop a core outcome set for ventilation trials in critical care, explore the best Delphi feedback mechanism for achieving consensus and determine if participation increases use of the core outcome set in the long term.
Abstract:
RATIONALE: Anaerobic bacteria are present in large numbers in the airways of people with cystic fibrosis (PWCF). In the gut, anaerobes produce short-chain fatty acids (SCFAs) that modulate immune/inflammatory processes.
OBJECTIVES: To investigate the capacity of anaerobes to contribute to CF airway pathogenesis via SCFAs.
METHODS: Samples from 109 PWCF were processed using anaerobic microbiological culture, with the bacteria present identified by 16S rRNA sequencing. SCFA levels in anaerobe supernatants and bronchoalveolar lavage (BAL) were determined by gas chromatography. The mRNA and/or protein expression of the SCFA receptors GPR41 and GPR43 in CF and non-CF bronchial brushings, and in 16HBE14o- and CFBE41o- cells, was evaluated using RT-PCR, western blot, laser scanning cytometry and confocal microscopy. SCFA-induced IL-8 secretion was monitored by ELISA.
MEASUREMENTS AND MAIN RESULTS: Fifty-seven of 109 (52.3%) PWCF were anaerobe-positive. Prevalence increased with age, from 33.3% in PWCF under 6 years (n=24) to 57.7% in those over 6 years (n=85). All evaluated anaerobes produced millimolar concentrations of SCFAs, including acetic, propionic and butyric acid. SCFA levels were higher in BAL samples from adults than from children. GPR41 levels were elevated in CFBE41o- versus 16HBE14o- cells; in CF versus non-CF bronchial brushings; and in 16HBE14o- cells after treatment with the CFTR inhibitor CFTR(inh)-172, CF BAL, or inducers of endoplasmic reticulum stress. SCFAs induced a dose-dependent and pertussis toxin-sensitive IL-8 response in bronchial epithelial cells, with higher production of IL-8 in CFBE41o- than in 16HBE14o- cells.
CONCLUSIONS: This study illustrates that SCFAs contribute to excessive production of IL-8 in CF airways colonized with anaerobes via upregulated GPR41.
Abstract:
Rationale: IL-17A is purported to help drive early pathogenesis in acute respiratory distress syndrome (ARDS) by enhancing neutrophil recruitment. Whilst IL-17A is the archetypal cytokine of T helper (Th)17 cells, it is produced by a number of lymphocyte populations, and its cellular source during ARDS is unknown.
Objectives: To identify the cellular source and the role of IL-17A in the early phase of lung injury.
Methods: Lung injury was induced in WT (C57BL/6) and IL-17 KO mice with aerosolised LPS (100 µg) or Pseudomonas aeruginosa infection. Detailed phenotyping of the cells expressing RORγt, the transcriptional regulator of IL-17 production, in the mouse lung at 24 hours was carried out by flow cytometry.
Measurements and Main Results: A 100-fold reduction in neutrophil infiltration was observed in the lungs of IL-17A KO mice compared with WT mice. The majority of RORγt+ cells in the mouse lung were the recently identified type 3 innate lymphoid cells (ILC3s). Detailed characterisation revealed these pulmonary ILC3s (pILC3s) to be distinct from those described in the gut. The critical role of these cells was verified by inducing injury in Rag2 KO mice, which lack T cells but retain ILCs. No amelioration of pathology was observed in the Rag2 KO mice.
Conclusions: IL-17 is rapidly produced during lung injury and contributes significantly to early immunopathogenesis. This is orchestrated largely by a distinct population of pILC3s. Modulating pILC3 activity may enable early control of the inflammatory dysregulation seen in ARDS, opening up new therapeutic targets.
Abstract:
Background: Rapid Response Systems (RRS) have been implemented nationally and internationally to improve patient safety in hospital. To date, however, the majority of RRS research has focused on measuring the effectiveness of the intervention on patient outcomes. It has been recommended that evaluation of RRS takes a multimodal approach that addresses the broad range of process and outcome measures needed to determine the effectiveness of the RRS concept.
Aim: The aim of this paper is to evaluate the official RRS programme theory, that is, the assumptions about how the programme is meant to work, against actual practice in order to determine what works.
Methods: The research design was a multiple case study of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine the RRS programme theory [1]. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews with a purposive sample of 75 nurses and doctors, observation of ward practices and documentary analysis. The findings from the case studies were analysed and compared within and across cases to identify what works, for whom and in what circumstances.
Results: The RRS programme theories were critically evaluated and compared with the study findings to develop a mid-range theory explaining what works, for whom and in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of RRS implementation.
Conclusion: These findings highlight the combination of factors that can improve the implementation of RRS, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for their success and sustainability.
References: 1. Pawson R and Tilley N (1997) Realistic Evaluation. Sage Publications, London.
Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University Belfast.
Abstract:
Realistic Evaluation of EWS and ALERT: factors enabling and constraining implementation
Background: The implementation of early warning systems (EWS) and ALERT in practice is essential to the success of Rapid Response Systems, but it depends on nurses utilising EWS protocols and applying ALERT best practice guidelines. To date there is limited evidence on the effectiveness of EWS or ALERT, as research has primarily focused on measuring patient outcomes (cardiac arrests, ICU admissions) following the implementation of a Rapid Response Team. Complex interventions in healthcare aimed at changing service delivery and the related behaviour of health professionals require a different research approach to evaluate the evidence. To understand how and why EWS and ALERT work, or might not work, research needs to consider the social, cultural and organisational influences that affect successful implementation in practice. This requires a research approach that considers both the processes and the outcomes of complex interventions, such as EWS and ALERT, implemented in practice. Realistic Evaluation is such an approach and was used to explain the factors that enable and constrain the implementation of EWS and ALERT in practice [1].
Aim: The aim of this study was to evaluate the factors that enabled and constrained the implementation and service delivery of EWS and ALERT in practice, in order to provide direction for enabling their success and sustainability.
Methods: The research design was a multiple case study of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine the RRS programme theory. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews, observation and documentary analysis in a two-stage process. A purposive sample of 75 key informants participated in individual and focus group interviews. Observation and documentary analysis of EWS compliance data and ALERT training records provided further evidence to support or refute the interview findings. Data were analysed using NVivo 8 to categorise interview findings, and SPSS was used for the ALERT documentary data. These findings were further synthesised through a within- and cross-case comparison to explain the factors enabling and constraining EWS and ALERT.
Results: A cross-case analysis highlighted similarities, differences and factors enabling or constraining successful implementation across the case study sites. Findings showed that personal (confidence; clinical judgement; personality), social (ward leadership; communication), organisational (workload and staffing issues; pressure from managers to complete EWS audits and targets), educational (constraints on training; no clinical educator on the ward) and cultural (delegation of routine tasks) influences impact on EWS and acute care training outcomes. Differences were also noted between medical and surgical wards across both case sites.
Conclusions: Realist Evaluation allows refinement and development of the RRS programme theory to explain the realities of practice. These refined RRS programme theories are capable of informing the planning of future service provision and provide direction for enabling their success and sustainability.
References: 1. McGaughey J, Blackwood B, O'Halloran P, Trinder TJ and Porter S (2010) A realistic evaluation of Track and Trigger systems and acute care training for early recognition and management of deteriorating ward-based patients. Journal of Advanced Nursing 66(4), 923-932.
Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University Belfast.