775 results for fixed-width confidence interval
Abstract:
Background: Angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor blockers (ARBs) are commonly prescribed to the growing number of cancer patients (more than two million in the UK alone), often to treat hypertension. However, an increased rate of fatal cancer among ARB users in a randomized trial, and increased breast cancer recurrence rates among ACEI users in a recent observational study, have raised concerns about the safety of these drugs in cancer patients. We investigated whether ACEI or ARB use after breast, colorectal or prostate cancer diagnosis was associated with increased risk of cancer-specific mortality.
Methods: Population-based cohorts of 9,814 breast, 4,762 colorectal and 6,339 prostate cancer patients newly diagnosed from 1998 to 2006 were identified in the UK Clinical Practice Research Datalink and confirmed by cancer registry linkage. Cancer-specific and all-cause mortality were identified from Office of National Statistics mortality data in 2011 (allowing up to 13 years of follow-up). A nested case–control analysis was conducted to compare ACEI/ARB use (from general practitioner prescription records) in cancer patients dying from cancer with up to five controls (not dying from cancer). Conditional logistic regression estimated the risk of cancer-specific, and all-cause, death in ACEI/ARB users compared with non-users.
Results: The main analysis included 1,435 breast, 1,511 colorectal and 1,184 prostate cancer-specific deaths (and 7,106 breast, 7,291 colorectal and 5,849 prostate cancer controls). There was no increase in cancer-specific mortality in patients using ARBs after diagnosis of breast (adjusted odds ratio (OR) = 1.06; 95% confidence interval (CI) 0.84, 1.35), colorectal (adjusted OR = 0.82; 95% CI 0.64, 1.07) or prostate cancer (adjusted OR = 0.79; 95% CI 0.61, 1.03). There was also no evidence of increases in cancer-specific mortality with ACEI use for breast (adjusted OR = 1.06; 95% CI 0.89, 1.27), colorectal (adjusted OR = 0.78; 95% CI 0.66, 0.92) or prostate cancer (adjusted OR = 0.78; 95% CI 0.66, 0.92).
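The odds ratios above come from a nested case-control design; for a single unadjusted 2×2 table the point estimate and Wald 95% CI are computed as below. This is a minimal sketch with hypothetical counts, not the study's data or its conditional (matched) analysis.

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = exposed/unexposed cases, c/d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_wald_ci(100, 200, 150, 600)
```

A matched analysis (conditional logistic regression, as used in the study) conditions out the matching sets rather than pooling all counts into one table.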
Conclusions: Overall, we found no evidence of increased risks of cancer-specific mortality in breast, colorectal or prostate cancer patients who used ACEI or ARBs after diagnosis. These results provide some reassurance that these medications are safe in patients diagnosed with these cancers.
Keywords: Colorectal cancer; Breast cancer; Prostate cancer; Mortality; Angiotensin-converting enzyme inhibitors and angiotensin II receptor blockers
Abstract:
OBJECTIVES: Risk stratification of Barrett's esophagus (BE) patients based on clinical and endoscopic features may help to optimize surveillance practice for esophageal adenocarcinoma (EAC) development. The aim of this study was to investigate patient symptoms and endoscopic features at index endoscopy and risk of neoplastic progression in a large population-based cohort of BE patients.
METHODS: A retrospective review of hospital records relating to incident BE diagnosis was conducted in a subset of patients with specialized intestinal metaplasia from the Northern Ireland BE register. Patients were matched to the Northern Ireland Cancer Registry to identify progressors to EAC or esophageal high-grade dysplasia (HGD). Cox proportional hazards models were applied to evaluate the association between endoscopic features, symptoms, and neoplastic progression risk.
RESULTS: During 27,997 person-years of follow-up, 128 of 3,148 BE patients progressed to develop HGD/EAC. Ulceration within the Barrett's segment, but not elsewhere in the esophagus, was associated with an increased risk of progression (hazard ratio (HR) 1.72; 95% confidence interval (CI): 1.08–2.76). Long-segment BE carried a significant sevenfold increased risk of progression compared with short-segment BE; none of the latter group developed EAC during the study period. Conversely, the absence of reflux symptoms was associated with an increased risk of cancer progression (HR 1.61; 95% CI: 1.05–2.46).
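From the reported 128 progressors over 27,997 person-years, the crude progression rate works out to roughly 4.6 per 1,000 person-years. A one-line check using only the figures in the abstract:

```python
events = 128            # progressors to HGD/EAC
person_years = 27_997   # total follow-up reported in the abstract

# Crude incidence rate of neoplastic progression per 1,000 person-years
rate_per_1000_py = events / person_years * 1000
```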
CONCLUSIONS: BE patients presenting with a long-segment BE or Barrett's ulcer have an increased risk of progressing to HGD/EAC and should be considered for more intense surveillance. The absence of reflux symptoms at BE diagnosis is not associated with a reduced risk of malignant progression, and may carry an increased risk of progression.
Abstract:
Background: To investigate the association between post-diagnostic beta-blocker usage and risk of cancer-specific mortality in a large population-based cohort of female breast cancer patients.
Methods: A nested case-control study was conducted within a cohort of breast cancer patients identified from cancer registries in England (using the National Cancer Data repository) and diagnosed between 1998 and 2007. Patients who had a breast cancer-specific death (ascertained from Office of National Statistics death registration data) were each matched to four alive controls by year and age at diagnosis. Prescription data for these patients were available through the Clinical Practice Research Datalink. Conditional logistic regression models were used to investigate the association between breast cancer-specific death and beta-blocker usage.
Results: Post-diagnostic use of beta-blockers was identified in 18.9% of 1435 breast cancer-specific deaths and 19.4% of their 5697 matched controls, indicating little evidence of association between beta-blocker use and breast cancer-specific mortality [odds ratio (OR) = 0.97, 95% confidence interval (CI) 0.83, 1.13]. There was also little evidence of an association when analyses were restricted to cardio non-selective beta-blockers (OR = 0.90, 95% CI 0.69, 1.17). Similar results were observed in analyses of drug dosage frequency and duration, and beta-blocker type.
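The reported OR of 0.97 can be recovered (to two decimals) directly from the two exposure percentages, since an exposure odds ratio is the ratio of the exposure odds in cases and controls. This crude recomputation is illustrative only; the study's estimate comes from a matched conditional model.

```python
# Exposure proportions as reported: 18.9% of cases, 19.4% of controls
p_cases, p_controls = 0.189, 0.194

odds_cases = p_cases / (1 - p_cases)
odds_controls = p_controls / (1 - p_controls)
crude_or = odds_cases / odds_controls   # reproduces the reported 0.97
```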
Conclusions: In this large UK population-based cohort of breast cancer patients,there was little evidence of an association between post-diagnostic beta-blocker usage and breast cancer progression. Further studies which include information on tumour receptor status are warranted to determine whether response to beta-blockers varies by tumour subtypes.
Abstract:
Background: Many patients and healthcare professionals believe that work-related psychosocial stress, such as job strain, can make asthma worse, but this is not corroborated by empirical evidence. We investigated the associations between job strain and the incidence of severe asthma exacerbations in working-age European men and women. Methods: We analysed individual-level data, collected between 1985 and 2010, from 102 175 working-age men and women in 11 prospective European studies. Job strain (a combination of high demands and low control at work) was self-reported at baseline. Incident severe asthma exacerbations were ascertained from national hospitalization and death registries. Associations between job strain and asthma exacerbations were modelled using Cox regression and the study-specific findings combined using random-effects meta-analyses. Results: During a median follow-up of 10 years, 1 109 individuals experienced a severe asthma exacerbation (430 with asthma as the primary diagnostic code). In the age- and sex-adjusted analyses, job strain was associated with an increased risk of severe asthma exacerbations defined using the primary diagnostic code (hazard ratio, HR: 1.27, 95% confidence interval, CI: 1.00, 1.61). This association attenuated towards the null after adjustment for potential confounders (HR: 1.22, 95% CI: 0.96, 1.55). No association was observed in the analyses with asthma defined using any diagnostic code (HR: 1.01, 95% CI: 0.86, 1.19). Conclusions: Our findings suggest that job strain is probably not an important risk factor for severe asthma exacerbations leading to hospitalization or death.
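The study-specific hazard ratios were combined by random-effects meta-analysis; the core mechanic is inverse-variance weighting on the log scale, with each study's standard error recoverable from the width of its log-scale 95% CI. Below is a simplified fixed-effect sketch (no between-study variance term) using hypothetical study estimates, not the consortium's data.

```python
import math

def pool_log_effects(estimates):
    """Inverse-variance (fixed-effect) pooling of ratio estimates.

    estimates: list of (hr, ci_lower, ci_upper) tuples; the standard
    error is recovered from the width of the log-scale 95% CI.
    """
    num = den = 0.0
    for hr, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # weight = inverse variance
        num += w * math.log(hr)
        den += w
    return math.exp(num / den)

# Hypothetical study-specific hazard ratios for illustration
pooled_hr = pool_log_effects([(1.4, 1.0, 1.96), (1.1, 0.9, 1.35)])
```

A random-effects version (as used here) would add an estimate of between-study variance (e.g. DerSimonian-Laird) to each study's variance before weighting.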
Abstract:
Background: This study assessed the association between adolescent ecstasy use and depressive symptoms in adolescence. Methods: The Belfast Youth Development Study surveyed a cohort annually from age 11 to 16 years. Gender, Strengths and Difficulties Questionnaire emotional subscale, living arrangements, parental affluence, parent and peer attachment, tobacco, alcohol, cannabis and ecstasy use were investigated as predictors of Short Mood and Feelings Questionnaire (SMFQ) outcome. Results: Of 5371 respondents, 301 (5.6%) had an SMFQ > 15, and 1620 (30.2%) had missing data for SMFQ. Around 8% of the cohort had used ecstasy by the end of follow-up. Of the non-drug users, ∼2% showed symptoms of depression, compared with 6% of those who had used alcohol, 6% of cannabis users, 6% of ecstasy users and 7% of frequent ecstasy users. Without adjustment, ecstasy users showed around a 4-fold increased odds of depressive symptoms compared with non-drug users [odds ratio (OR) = 0.26; 95% confidence interval (CI) = 0.10, 0.68]. Further adjustment for living arrangements, peer and parental attachment attenuated the association to under a 3-fold increase (OR = 0.37; 95% CI = 0.15, 0.94). There were no differences by frequency of use. Conclusions: Ecstasy use during adolescence may be associated with poorer mental health; however, this association can be explained by the confounding social influence of family dynamics. These findings could be used to aid effective evidence-based drug policies, which concentrate criminal justice and public health resources on reducing harm.
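Note that "a 4-fold increased odds" is printed alongside OR = 0.26, and "under a 3-fold increase" alongside OR = 0.37. These are mutually consistent only if the printed ORs use the reversed reference group: an odds ratio inverts when the reference category is swapped, and 1/0.26 ≈ 3.85 (≈ 4-fold) while 1/0.37 ≈ 2.70 (under 3-fold). This interpretation is an inference from the text, not a correction from the source; the quick check is:

```python
or_reported = 0.26                    # OR as printed in the abstract
fold_change = 1 / or_reported         # same association, reference group flipped

or_adjusted = 0.37
fold_change_adjusted = 1 / or_adjusted
```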
Abstract:
Objective: to explore maternal energy balance, incorporating free living physical activity and sedentary behaviour, in uncomplicated pregnancies at risk of macrosomia.
Methods: a parallel-group cross-sectional analysis was conducted in healthy pregnant women predicted to deliver infants weighing ≥4000 g (study group) or <4000 g (control group). Women were recruited in a 1:1 ratio from antenatal clinics in Northern Ireland. Women wore a SenseWear Body Media Pro3 physical activity armband and completed a food diary for four consecutive days in the third trimester. Physical activity was measured in Metabolic Equivalents of Task (METs), where 1 MET = 1 kcal per kilogram of body weight per hour. Analysis of covariance (ANCOVA) was employed using the General Linear Model to adjust for potential confounders.
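The MET definition above translates directly into an energy-expenditure calculation. A minimal sketch with illustrative values (a 70 kg woman at 1.2 METs for one hour), not study data:

```python
def activity_kcal(mets, weight_kg, hours):
    """Energy expenditure implied by the MET definition:
    1 MET = 1 kcal per kilogram of body weight per hour."""
    return mets * weight_kg * hours

energy = activity_kcal(1.2, 70, 1)   # 84 kcal for this illustrative case
```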
Findings: of the 112 women recruited, 100 complete datasets were available for analysis. There was no significant difference in energy balance between the two groups. Intensity of free living physical activity (average METs) of women predicted to deliver macrosomic infants (n = 50) was significantly lower than that of women in the control group (n = 50) (1.3 (0.2) METs (mean (standard deviation)) versus 1.2 (0.2) METs; difference in means 0.1 METs (95% confidence interval: 0.19, 0.01); p = 0.021). Women predicted to deliver macrosomic infants also spent significantly more time in sedentary behaviour (≤1 MET) than the control group (16.1 (2.8) hours versus 13.8 (4.3) hours; 2.0 hours (0.3, 3.7); p = 0.020).
Key conclusions and implications for practice: although there was no association between predicted fetal macrosomia and energy balance, those women predicted to deliver a macrosomic infant exhibited increased sedentary behaviour and reduced physical activity in the third trimester of pregnancy. Professionals caring for women during pregnancy have an important role in promoting and supporting more active lifestyles amongst women who are predicted to deliver a macrosomic infant given the known associated risks.
Abstract:
Purpose: To describe associations between reticular pseudodrusen, individual characteristics, and retinal function.
Design: Cohort study.
Participants: We recruited 105 patients (age range, 52–93 years) who had advanced neovascular age-related macular degeneration (AMD) in only 1 eye from 3 clinical centers in Europe.
Methods: Minimum follow-up was 12 months. The eye selected for study was the fellow eye without advanced disease. Clinical measures of vision were distance visual acuity, near visual acuity, and results of the Smith-Kettlewell low-luminance acuity test (SKILL). Fundus imaging included color photography, red-free imaging, blue autofluorescence imaging, fluorescein angiography, indocyanine green angiography, and optical coherence tomography using standardized protocols. These were used to detect progression to neovascular AMD in the study eye during follow-up. All imaging outputs were graded for the presence or absence of reticular pseudodrusen (RPD) using a multimodal approach. Choroidal thickness was measured at the foveal center and at 2 other equidistant locations from the fovea (1500 μm) nasally and temporally. Metrics on retinal thickness and volume were obtained from the manufacturer-supplied automated segmentation readouts.
Main Outcome Measures: Presence of RPD, distance visual acuity, near visual acuity, SKILL score, choroidal thickness, retinal thickness, and retinal volume.
Results: Reticular pseudodrusen were found in 43 participants (41%) on 1 or more imaging methods. The SKILL score was significantly worse in those with reticular drusen (mean score ± standard deviation [SD], 38±12) versus those without (mean score ± SD, 33±9) (P = 0.034). Parafoveal retinal thickness, parafoveal retinal volume, and all of the choroidal thickness parameters measured were significantly lower in those with reticular drusen than in those without. The presence of RPD was associated with development of neovascular AMD when corrected for age and sex (odds ratio, 5.5; 95% confidence interval, 1.1–28.8; P = 0.042). All participants in whom geographic atrophy developed during follow-up had visible RPD at baseline.
Conclusions: Significant differences in retinal and choroidal anatomic features, visual function, and risk factor profile exist in unilateral neovascular AMD patients with RPD compared with those without; therefore, such patients should be monitored carefully because of the risk of developing bilateral disease.
Abstract:
Objective: To determine the pooled effect of exposure to one of 11 specialist palliative care teams providing services in patients’ homes. Design: Pooled analysis of a retrospective cohort study. Setting: Ontario, Canada. Participants: 3109 patients who received care from specialist palliative care teams in 2009-11 (exposed) matched by propensity score to 3109 patients who received usual care (unexposed). Intervention: The palliative care teams studied served different geographies and varied in team composition and size but had the same core team members and role: a core group of palliative care physicians, nurses, and family physicians who provide integrated palliative care to patients in their homes. The teams’ role was to manage symptoms, provide education and care, coordinate services, and be available without interruption regardless of time or day. Main outcome measures: Patients (a) being in hospital in the last two weeks of life; (b) having an emergency department visit in the last two weeks of life; or (c) dying in hospital. Results: In both exposed and unexposed groups, about 80% had cancer and 78% received end of life homecare services for the same average duration. Across all palliative care teams, 970 (31.2%) of the exposed group were in hospital and 896 (28.9%) had an emergency department visit in the last two weeks of life, compared with 1219 (39.3%) and 1070 (34.5%) of the unexposed group respectively (P<0.001). The pooled relative risks of being in hospital and having an emergency department visit in late life, comparing exposed versus unexposed, were 0.68 (95% confidence interval 0.61 to 0.76) and 0.77 (0.69 to 0.86) respectively.
Fewer exposed than unexposed patients died in hospital (503 (16.2%) v 887 (28.6%), P<0.001), and the pooled relative risk of dying in hospital was 0.46 (0.40 to 0.52). Conclusions: Community based specialist palliative care teams, despite variation in team composition and geographies, were effective at reducing acute care use and hospital deaths at the end of life.
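The hospitalization proportions above can be turned into crude risks and a crude relative risk directly. Note the crude ratio (≈ 0.80) differs from the reported pooled relative risk of 0.68, which is estimated within propensity-matched pairs across teams; this sketch only shows how the raw proportions relate.

```python
n = 3109  # size of each matched group
hosp_exposed, hosp_unexposed = 970, 1219

risk_exposed = hosp_exposed / n        # ≈ 0.312, the 31.2% reported
risk_unexposed = hosp_unexposed / n    # ≈ 0.39
crude_rr = risk_exposed / risk_unexposed
```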
Abstract:
Objectives: This study sought to investigate the effect of a multiple micronutrient supplement on left ventricular ejection fraction (LVEF) in patients with heart failure. Background: Observational studies suggest that patients with heart failure have reduced intake and lower concentrations of a number of micronutrients. However, there have been very few intervention studies investigating the effect of micronutrient supplementation in patients with heart failure. Methods: This was a randomized, double-blind, placebo-controlled, parallel-group study involving 74 patients with chronic stable heart failure that compared multiple micronutrient supplementation taken once daily versus placebo for 12 months. The primary endpoint was LVEF assessed by cardiovascular magnetic resonance imaging or 3-dimensional echocardiography. Secondary endpoints were Minnesota Living With Heart Failure Questionnaire score, 6-min walk test distance, blood concentrations of N-terminal prohormone of brain natriuretic peptide, C-reactive protein, tumor necrosis factor alpha, interleukin-6, interleukin-10, and urinary levels of 8-iso-prostaglandin F2 alpha. Results: Blood concentrations of a number of micronutrients increased significantly in the micronutrient supplement group, indicating excellent compliance with the intervention. There was no significant difference in mean LVEF at 12 months between treatment groups after adjusting for baseline (mean difference: 1.6%, 95% confidence interval: -2.6 to 5.8, p = 0.441). There was also no significant difference in any of the secondary endpoints at 12 months between treatment groups. Conclusions: This study provides no evidence to support the routine treatment of patients with chronic stable heart failure with a multiple micronutrient supplement. (Micronutrient Supplementation in Patients With Heart Failure [MINT-HF]; NCT01005303).
Abstract:
Objective: To simultaneously evaluate 14 biomarkers from distinct biological pathways for risk prediction of ischemic stroke, including biomarkers of hemostasis, inflammation, and endothelial activation as well as chemokines and adipocytokines.
Methods and Results: The Prospective Epidemiological Study on Myocardial Infarction (PRIME) is a cohort of 9771 healthy men 50 to 59 years of age who were followed up over 10 years. In a nested case–control study, 95 ischemic stroke cases were matched with 190 controls. After multivariable adjustment for traditional risk factors, fibrinogen (odds ratio [OR], 1.53; 95% confidence interval [CI], 1.03–2.28), E-selectin (OR, 1.76; 95% CI, 1.06–2.93), interferon-γ-inducible-protein-10 (OR, 1.72; 95% CI, 1.06–2.78), resistin (OR, 2.86; 95% CI, 1.30–6.27), and total adiponectin (OR, 1.82; 95% CI, 1.04–3.19) were significantly associated with ischemic stroke. Adding E-selectin and resistin to a traditional risk factor model significantly increased the area under the receiver-operating characteristic curve from 0.679 (95% CI, 0.612–0.745) to 0.785 and 0.788, respectively, and yielded a categorical net reclassification improvement of 29.9% (P=0.001) and 28.4% (P=0.002), respectively. Their simultaneous inclusion in the traditional risk factor model increased the area under the receiver-operating characteristic curve to 0.824 (95% CI, 0.770–0.877) and resulted in a net reclassification improvement of 41.4% (P<0.001). Results were confirmed when using continuous net reclassification improvement.
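Categorical NRI, the metric quoted above, rewards events that move up a risk category under the new model and non-events that move down. A minimal sketch of the standard formula, with hypothetical reclassification counts rather than the PRIME data:

```python
def categorical_nri(up_ev, down_ev, n_ev, up_ne, down_ne, n_ne):
    """Categorical net reclassification improvement:
    events moving UP a risk category and non-events moving DOWN
    both count as correct reclassification."""
    return (up_ev - down_ev) / n_ev + (down_ne - up_ne) / n_ne

# Hypothetical counts for illustration only
nri = categorical_nri(up_ev=30, down_ev=10, n_ev=100,
                      up_ne=50, down_ne=30, n_ne=200)   # 0.1, i.e. 10%
```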
Conclusion: Among multiple biomarkers from distinct biological pathways, E-selectin and resistin provided incremental and additive value to traditional risk factors in predicting ischemic stroke.
Abstract:
Solid organ transplant recipients have elevated cancer risks, owing in part to pharmacologic immunosuppression. However, little is known about risks for hematologic malignancies of myeloid origin. We linked the US Scientific Registry of Transplant Recipients with 15 population-based cancer registries to ascertain cancer occurrence among 207 859 solid organ transplant recipients (1987–2009). Solid organ transplant recipients had a significantly elevated risk for myeloid neoplasms, with standardized incidence ratios (SIRs) of 4.6 (95% confidence interval 3.8–5.6; N=101) for myelodysplastic syndromes (MDS), 2.7 (2.2–3.2; N=125) for acute myeloid leukemia (AML), 2.3 (1.6–3.2; N=36) for chronic myeloid leukemia and 7.2 (5.4–9.3; N=57) for polycythemia vera. SIRs were highest among younger individuals and varied by time since transplantation and organ type (Poisson regression P<0.05 for all comparisons). Azathioprine for initial maintenance immunosuppression increased risk for MDS (P=0.0002) and AML (2–5 years after transplantation, P=0.0163). Overall survival following AML/MDS among transplant recipients was inferior to that of similar patients reported to US cancer registries (log-rank P<0.0001). Our novel finding of increased risks for specific myeloid neoplasms after solid organ transplantation supports a role for immune dysfunction in myeloid neoplasm etiology. The increased risks and inferior survival should heighten clinician awareness of myeloid neoplasms during follow-up of transplant recipients.
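A standardized incidence ratio is simply observed cases divided by the number expected under general-population rates. In the sketch below the observed count is the abstract's MDS figure, but the expected count is an illustrative back-calculation (101/4.6 ≈ 22), not a number from the paper:

```python
observed = 101     # MDS cases observed among recipients (from the abstract)
expected = 22.0    # hypothetical expected count under general-population rates

sir = observed / expected   # standardized incidence ratio ≈ 4.6
```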
Abstract:
The Schizophrenia Psychiatric Genome-Wide Association Study Consortium (PGC) highlighted 81 single-nucleotide polymorphisms (SNPs) with moderate evidence for association to schizophrenia. After follow-up in independent samples, seven loci attained genome-wide significance (GWS), but multi-locus tests suggested some SNPs that did not do so represented true associations. We tested 78 of the 81 SNPs in 2640 individuals with a clinical diagnosis of schizophrenia attending a clozapine clinic (CLOZUK), 2504 cases with a research diagnosis of bipolar disorder, and 2878 controls. In CLOZUK, we obtained significant replication to the PGC-associated allele for no fewer than 37 (47%) of the SNPs, including many prior GWS major histocompatibility complex (MHC) SNPs as well as 3/6 non-MHC SNPs for which we had data that were reported as GWS by the PGC. After combining the new schizophrenia data with those of the PGC, variants at three loci (ITIH3/4, CACNA1C and SDCCAG8) that had not previously been GWS in schizophrenia attained that level of support. In bipolar disorder, we also obtained significant evidence for association for 21% of the alleles that had been associated with schizophrenia in the PGC. Our study independently confirms association to three loci previously reported to be GWS in schizophrenia, and identifies the first GWS evidence in schizophrenia for a further three loci. Given the number of independent replications and the power of our sample, we estimate 98% (confidence interval (CI) 78-100%) of the original set of 78 SNPs represent true associations. We also provide strong evidence for overlap in genetic risk between schizophrenia and bipolar disorder.
Abstract:
A course of treatment with narrow-band ultraviolet B (NB-UVB) improves psoriasis and increases serum 25-hydroxyvitamin D (25(OH)D). In this study 12 patients with psoriasis who were supplemented with oral cholecalciferol, 20 µg daily, were given a course of NB-UVB and their response measured. At baseline, serum 25(OH)D was 74.14 ± 22.9 nmol/l. At the 9th exposure to NB-UVB 25(OH)D had increased by 13.2 nmol/l (95% confidence interval (95% CI) 7.2–18.4) and at the 18th exposure by 49.4 nmol/l (95% CI 35.9–64.6) above baseline. Psoriasis Area Severity Index score improved from 8.7 ± 3.5 to 4.5 ± 2.0 (p < 0.001). At baseline, psoriasis lesions showed low vitamin D metabolizing enzyme (CYP27A1, CYP27B1) and high human β-defensin-2 mRNA expression levels compared with those of the healthy subjects. In conclusion, NB-UVB treatment significantly increases serum 25(OH)D in patients with psoriasis who are taking oral vitamin D supplementation, and the concentrations remain far from the toxicity level. Healing psoriasis lesions show similar mRNA expression of vitamin D metabolizing enzymes, but higher antimicrobial peptide levels than NB-UVB-treated skin in healthy subjects.
Abstract:
Background: This is an update of a review last published in Issue 5, 2010, of The Cochrane Library. Reducing weaning time is desirable in minimizing potential complications from mechanical ventilation. Standardized weaning protocols are purported to reduce time spent on mechanical ventilation. However, evidence supporting their use in clinical practice is inconsistent. Objectives: The first objective of this review was to compare the total duration of mechanical ventilation of critically ill adults who were weaned using protocols versus usual (non-protocolized) practice. The second objective was to ascertain differences between protocolized and non-protocolized weaning in outcomes measuring weaning duration, harm (adverse events) and resource use (intensive care unit (ICU) and hospital length of stay, cost). The third objective was to explore, using subgroup analyses, variations in outcomes by type of ICU, type of protocol and approach to delivering the protocol (professional-led or computer-driven). Search methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library Issue 1, 2014), MEDLINE (1950 to January 2014), EMBASE (1988 to January 2014), CINAHL (1937 to January 2014), LILACS (1982 to January 2014), ISI Web of Science and ISI Conference Proceedings (1970 to February 2014), and reference lists of articles. We did not apply language restrictions. The original search was performed in January 2010 and updated in January 2014. Selection criteria: We included randomized controlled trials (RCTs) and quasi-RCTs of protocolized weaning versus non-protocolized weaning from mechanical ventilation in critically ill adults. Data collection and analysis: Two authors independently assessed trial quality and extracted data. We performed a priori subgroup and sensitivity analyses. We contacted study authors for additional information. Main results: We included 17 trials (with 2434 patients) in this updated review.
The original review included 11 trials. The total geometric mean duration of mechanical ventilation in the protocolized weaning group was on average reduced by 26% compared with the usual care group (N = 14 trials, 95% confidence interval (CI) 13% to 37%, P = 0.0002). Reductions were most likely to occur in medical, surgical and mixed ICUs, but not in neurosurgical ICUs. Weaning duration was reduced by 70% (N = 8 trials, 95% CI 27% to 88%, P = 0.009); and ICU length of stay by 11% (N = 9 trials, 95% CI 3% to 19%, P = 0.01). There was significant heterogeneity among studies for total duration of mechanical ventilation (I2 = 67%, P < 0.0001) and weaning duration (I2 = 97%, P < 0.00001), which could not be explained by subgroup analyses based on type of unit or type of approach. Authors' conclusions: There is evidence of reduced duration of mechanical ventilation, weaning duration and ICU length of stay with use of standardized weaning protocols. Reductions are most likely to occur in medical, surgical and mixed ICUs, but not in neurosurgical ICUs. However, significant heterogeneity among studies indicates caution in generalizing results. Some study authors suggest that organizational context may influence outcomes, however these factors were not considered in all included studies and could not be evaluated. Future trials should consider an evaluation of the process of intervention delivery to distinguish between intervention and implementation effects. There is an important need for further development and research in the neurosurgical population.
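Effects like "reduced by 26%" above are ratios of geometric means re-expressed as percentage reductions (a ratio of 0.74 means durations in the protocolized group were, on average, 74% of usual care). The conversion is:

```python
def percent_reduction(ratio_of_geometric_means):
    """Express a ratio of geometric means as a percentage reduction."""
    return (1.0 - ratio_of_geometric_means) * 100.0

# A pooled ratio of geometric means of 0.74 corresponds to a 26% reduction
reduction = percent_reduction(0.74)
```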
Abstract:
Introduction: Automated weaning systems may improve adaptation of mechanical support for a patient’s ventilatory needs and facilitate systematic and early recognition of their ability to breathe spontaneously and the potential for discontinuation of ventilation. Our objective was to compare mechanical ventilator weaning duration for critically ill adults and children when managed with automated systems versus non-automated strategies. Secondary objectives were to determine differences in duration of ventilation, intensive care unit (ICU) and hospital length of stay (LOS), mortality, and adverse events. Methods: Electronic databases were searched to 30 September 2013 without language restrictions. We also searched conference proceedings, trial registration websites, and article reference lists. Two authors independently extracted data and assessed risk of bias. We combined data using random-effects modelling. Results: We identified 21 eligible trials totalling 1,676 participants. Pooled data from 16 trials indicated that automated systems reduced the geometric mean weaning duration by 30% (95% confidence interval (CI) 13% to 45%), with substantial heterogeneity (I2 = 87%, P < 0.00001). Reduced weaning duration was found with mixed or medical ICU populations (42%, 95% CI 10% to 63%) and Smartcare/PS™ (28%, 95% CI 7% to 49%) but not with surgical populations or using other systems. Automated systems reduced ventilation duration with no heterogeneity (10%, 95% CI 3% to 16%) and ICU LOS (8%, 95% CI 0% to 15%). There was no strong evidence of effect on mortality, hospital LOS, reintubation, self-extubation and non-invasive ventilation following extubation. Automated systems reduced prolonged mechanical ventilation and tracheostomy. Overall quality of evidence was high. Conclusions: Automated systems may reduce weaning and ventilation duration and ICU stay. Due to substantial trial heterogeneity an adequately powered, high quality, multi-centre randomized controlled trial is needed.