12 results for Sprint-type interval training
Abstract:
Short interbirth interval has been associated with maternal complications and childhood autism and leukemia, possibly due to deficiencies in maternal micronutrients at conception or increased exposure to sibling infections. A possible association between interbirth interval and subsequent risk of childhood type 1 diabetes has not been investigated. A secondary analysis of 14 published observational studies of perinatal risk factors for type 1 diabetes was conducted. Risk estimates of diabetes by category of interbirth interval were calculated for each study. Random effects models were used to calculate pooled odds ratios (ORs) and investigate heterogeneity between studies. Overall, 2,787 children with type 1 diabetes were included. There was a reduction in the risk of childhood type 1 diabetes in children born to mothers after interbirth intervals <3 years compared with longer interbirth intervals (OR 0.82 [95% CI 0.72-0.93]). Adjustment for various potential confounders did little to alter this estimate. In conclusion, there was evidence of an approximately 20% reduction in the risk of childhood diabetes in children born to mothers after interbirth intervals <3 years.
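The random-effects pooling this abstract describes can be sketched as a DerSimonian-Laird estimator applied to per-study log odds ratios. The following is an illustrative implementation only; the log ORs and variances passed in at the bottom are made up for demonstration and are not the study's actual data.

```python
import math

def pool_random_effects(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of per-study log odds ratios.

    Returns (pooled OR, lower 95% CI, upper 95% CI) on the OR scale.
    """
    # Inverse-variance (fixed-effect) weights and pooled estimate.
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    # Cochran's Q measures heterogeneity around the fixed-effect estimate.
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    df = len(log_ors) - 1
    # Method-of-moments estimate of the between-study variance tau^2.
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights add tau^2 to each study's variance.
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Illustrative per-study log ORs and variances (hypothetical numbers).
or_, lo, hi = pool_random_effects(
    [math.log(0.80), math.log(0.90), math.log(0.75)],
    [0.010, 0.020, 0.015])
```

Pooling is done on the log scale because log ORs are approximately normally distributed; the result is exponentiated back to an OR with its confidence interval.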
Abstract:
The aim of this cluster randomised controlled trial was to test the impact of an infection control education and training programme on meticillin-resistant Staphylococcus aureus (MRSA) prevalence in nursing homes. Nursing homes were randomised to intervention (infection control education and training programme; N=16) or control (usual practice continued; N=16). Staff in intervention homes were educated and trained (0, 3 and 6 months) in the principles and implementation of good infection control practice, with infection control audits conducted in all sites (0, 3, 6 and 12 months) to assess compliance with good practice. Audit scores were fed back to nursing home managers in intervention homes, together with a written report indicating where practice could be improved. Nasal swabs were taken from all consenting residents and staff at 0, 3, 6 and 12 months. The primary outcome was MRSA prevalence in residents and staff, and the secondary outcome was a change in infection control audit scores. In all, 793 residents and 338 staff were recruited at baseline. MRSA prevalence did not change during the study in residents or staff. The relative risk of a resident being colonised with MRSA in an intervention home compared with a control home at 12 months was 0.99 (95% confidence interval: 0.69, 1.42) after adjustment for clustering. Mean infection control audit scores were significantly higher in the intervention homes (82%) than in the control homes (64%) at 12 months (P<0.0001). Consideration should be given to other approaches which may help to reduce MRSA in this setting.
Abstract:
Background: The incidence rates of childhood-onset type 1 diabetes are increasing almost universally across the globe, but the aetiology of the disease remains largely unknown. We investigated whether birth order is associated with the risk of childhood diabetes by performing a pooled analysis of previous studies. Methods: Relevant studies published before January 2010 were identified from MEDLINE, Web of Science and EMBASE. Authors of studies provided individual patient data or conducted pre-specified analyses. Meta-analysis techniques were used to derive combined odds ratios (ORs), before and after adjustment for confounders, and to investigate heterogeneity. Results: Data were available for 6 cohort and 25 case-control studies, including 11 955 cases of type 1 diabetes. Overall, there was no evidence of an association prior to adjustment for confounders. After adjustment for maternal age at birth and other confounders, a reduction in the risk of diabetes in second- or later-born children became apparent [fully adjusted OR = 0.90, 95% confidence interval (CI) 0.83-0.98; P = 0.02], but this association varied markedly between studies (I2 = 67%). An a priori subgroup analysis showed that the association was stronger and more consistent in children <5 years of age (n = 25 studies, maternal-age-adjusted OR = 0.84, 95% CI 0.75, 0.93; I2 = 23%). Conclusion: Although the association varied between studies, there was some evidence of a lower risk of childhood-onset type 1 diabetes with increasing birth order, particularly in children aged <5 years. This finding could reflect increased exposure to infections in early life in later-born children. Published by Oxford University Press on behalf of the International Epidemiological Association © The Author 2010; all rights reserved.
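The heterogeneity statistic quoted in this abstract (I2) is derived from Cochran's Q. A minimal sketch of the calculation, using illustrative effect sizes and variances rather than anything from the study:

```python
def i_squared(effects, variances):
    """Higgins' I^2: the percentage of total variation across studies that is
    due to between-study heterogeneity rather than chance, from Cochran's Q."""
    # Inverse-variance weights and fixed-effect pooled estimate.
    w = [1.0 / v for v in variances]
    pooled = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    # Cochran's Q and its degrees of freedom (number of studies - 1).
    q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    # I^2 is truncated at 0 when Q falls below its degrees of freedom.
    return 0.0 if q <= 0 else max(0.0, 100.0 * (q - df) / q)

# Identical effects -> no heterogeneity; discordant effects -> high I^2.
homog = i_squared([0.1, 0.1, 0.1], [0.01, 0.01, 0.01])   # -> 0.0
hetero = i_squared([0.5, -0.5], [0.01, 0.01])            # -> 98.0
```

An I2 around 67%, as reported for the fully adjusted estimate, is conventionally read as substantial heterogeneity, which is why the abstract flags the subgroup with I2 = 23% as more consistent.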
Abstract:
Evidence of high-velocity features (HVFs) such as those seen in the near-maximum spectra of some Type Ia supernovae (SNe Ia; e.g., SN 2000cx) has been searched for in the available SN Ia spectra observed earlier than 1 week before B maximum. Recent observational efforts have doubled the number of SNe Ia with very early spectra. Remarkably, all SNe Ia with early data (seven in our Research Training Network sample and 10 from other programs) show signs of such features, to a greater or lesser degree, in Ca II IR and some also in the Si II λ6355 line. HVFs may be interpreted as abundance or density enhancements. Abundance enhancements would imply an outer region dominated by Si and Ca. Density enhancements may result from the sweeping up of circumstellar material (CSM) by the highest velocity SN ejecta. In this scenario, the high incidence of HVFs suggests that a thick disk and/or a high-density companion wind surrounds the exploding white dwarf, as may be the case in single degenerate systems. Large-scale angular fluctuations in the radial density and abundance distribution may also be responsible: this could originate in the explosion and would suggest a deflagration as the more likely explosion mechanism. CSM interaction and surface fluctuations may coexist, possibly leaving different signatures on the spectrum. In some SNe, the HVFs are narrowly confined in velocity, suggesting the ejection of blobs of burned material.
Abstract:
OBJECTIVES: To determine whether skin-intrinsic fluorescence (SIF) is associated with long-term complications of type 1 diabetes (T1D) and, if so, whether it is independent of chronic glycemic exposure and previous intensive therapy.
RESEARCH DESIGN AND METHODS: We studied 1,185 (92%) of 1,289 active Diabetes Control and Complications Trial (DCCT)/Epidemiology of Diabetes Interventions and Complications (EDIC) participants from 2010 to 2011. SIF was determined using a fluorescence spectrometer and related cross-sectionally to recently determined measures of retinopathy (stereo fundus photography), cardiac autonomic neuropathy (CAN; R-R interval), confirmed clinical neuropathy, nephropathy (albumin excretion rate [AER]), and coronary artery calcification (CAC).
RESULTS: Overall, moderately strong associations were seen with all complications before adjustment for mean HbA1c over time, which rendered these associations nonsignificant with the exception of sustained AER >30 mg/24 h and CAC, which were largely unaffected by adjustment. However, when examined within the former DCCT treatment group, associations were generally weaker in the intensive group and nonsignificant after adjustment, while in the conventional group, associations remained significant for CAN, sustained AER >30 mg/24 h, and CAC even after mean HbA1c adjustment.
CONCLUSIONS: SIF is associated with T1D complications in DCCT/EDIC. Much of this association appears to be related to historical glycemic exposure, particularly in the previously intensively treated participants, in whom adjustment for HbA1c eliminates statistical significance.
Abstract:
We consider the uplink of massive multicell multiple-input multiple-output systems, where the base stations (BSs), equipped with massive arrays, serve simultaneously several terminals in the same frequency band. We assume that the BS estimates the channel from uplink training, and then uses the maximum ratio combining technique to detect the signals transmitted from all terminals in its own cell. We propose an optimal resource allocation scheme which jointly selects the training duration, training signal power, and data signal power in order to maximize the sum spectral efficiency, for a given total energy budget spent in a coherence interval. Numerical results verify the benefits of the optimal resource allocation scheme. Furthermore, we show that more training signal power should be used at low signal-to-noise ratios (SNRs), and vice versa at high SNRs. Interestingly, for the entire SNR regime, the optimal training duration is equal to the number of terminals.
Abstract:
Effectiveness of brief/minimal contact self-activation interventions that encourage participation in physical activity (PA) for chronic low back pain (CLBP >12 weeks) is unproven. The primary objective of this assessor-blinded randomized controlled trial was to investigate the difference between an individualized walking programme (WP), group exercise class (EC), and usual physiotherapy (UP, control) in mean change in functional disability at 6 months. A sample of 246 participants with CLBP aged 18 to 65 years (79 men and 167 women; mean age ± SD: 45.4 ± 11.4 years) were recruited from 5 outpatient physiotherapy departments in Dublin, Ireland. Consenting participants completed self-report measures of functional disability, pain, quality of life, psychosocial beliefs, and PA, and were randomly allocated to the WP (n = 82), EC (n = 83), or UP (n = 81) and followed up at 3 (81%; n = 200), 6 (80.1%; n = 197), and 12 months (76.4%; n = 188). Cost diaries were completed at all follow-ups. An intention-to-treat analysis using a mixed between-within repeated-measures analysis of covariance found significant improvements over time on the Oswestry Disability Index (primary outcome), the Numerical Rating Scale, Fear Avoidance-PA scale, and the EuroQol EQ-5D-3L Weighted Health Index (P < 0.05), but no significant between-group differences and small between-group effect sizes (WP: mean difference at 6 months, -6.89 Oswestry Disability Index points, 95% confidence interval [CI] -10.15 to -3.64; EC: -5.91, CI -9.15 to -2.68; UP: -5.09, CI -8.24 to -1.93). The WP had the lowest mean costs and the highest level of adherence. Supervised walking provides an effective alternative to current forms of CLBP management.
Abstract:
PURPOSE: Arteriovenous fistulae (AVFs) are the preferred option for vascular access, as they are associated with lower mortality in hemodialysis patients than arteriovenous grafts (AVGs) or central venous catheters (CVCs). We sought to assess whether vascular access outcomes for surgical trainees are comparable to those of fully trained surgeons.
METHODS: A prospectively collected database of patients was created and information recorded regarding patient demographics, past medical history, preoperative investigations, grade of operating surgeon, type of AVF formed, primary AVF function, cumulative AVF survival and functional patency.
RESULTS: One hundred and sixty-two patients were identified as having had vascular access procedures during the 6-month study period and 143 were included in the final analysis. Secondary AVF patency was established in 123 (86%) of these AVFs and 89 (62.2%) were used for dialysis. There was no significant difference in survival of AVFs according to training status of surgeon (log-rank χ² = 0.506, p = 0.477) or type of AVF (log-rank χ² = 0.341, p = 0.559). Patency rates of successful AVFs at 1 and 2 years were 60.9% and 47.9%, respectively.
CONCLUSION: We have demonstrated in this prospective study that there are no significant differences in outcomes of primary AVFs formed by fully trained surgeons versus surgical trainees. Creation of a primary AVF represents an excellent training platform for intermediate stage surgeons across general and vascular surgical specialties.
Abstract:
Statement of purpose: The purpose of this concurrent session is to present the main findings and recommendations from a five-year study evaluating the implementation of Early Warning Systems (EWS) and the Acute Life-threatening Events: Recognition and Treatment (ALERT) course in Northern Ireland. The presentation will provide delegates with an understanding of those factors that enable and constrain successful implementation of EWS and ALERT in practice, in order to provide an impetus for change. Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine rapid response system (RRS) programme theory [1]. The stages included identifying the programme theories underpinning EWS and ALERT, generating hypotheses, gathering empirical evidence and refining the programme theories. This approach used a variety of mixed methods, including individual and focus group interviews, observation, and documentary analysis of EWS compliance data and ALERT training records. A within- and across-case comparison facilitated the development of mid-range theories from the research evidence. Results: The official RRS theories developed from the realist synthesis were critically evaluated and compared with the study findings to develop a mid-range theory to explain what works, for whom, in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of EWS and ALERT implementation. Each concept is presented as ‘context, mechanism and outcome configurations’ to provide an understanding of how the context impacts on individual reasoning or behaviour to produce certain outcomes.
Conclusion These findings highlight the combination of factors that can improve the implementation and sustainability of EWS and ALERT and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for future policy development. References: 1. Pawson R and Tilley N. (1997) Realistic Evaluation. Sage Publications; London Type of submission: Concurrent session Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen’s University of Belfast
Non-pharmacological interventions for cognitive impairment due to systemic cancer treatment (Review)
Abstract:
Background
It is estimated that up to 75% of cancer survivors may experience cognitive impairment as a result of cancer treatment and given the increasing size of the cancer survivor population, the number of affected people is set to rise considerably in coming years. There is a need, therefore, to identify effective, non-pharmacological interventions for maintaining cognitive function or ameliorating cognitive impairment among people with a previous cancer diagnosis.
Objectives
To evaluate the cognitive effects, non-cognitive effects, duration and safety of non-pharmacological interventions for cancer patients that are targeted at maintaining cognitive function or ameliorating cognitive impairment resulting from cancer or receipt of systemic cancer treatment (i.e. chemotherapy or hormonal therapies, in isolation or in combination with other treatments).
Search methods
We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, Embase, PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL) and PsycINFO databases. We also searched registries of ongoing trials and grey literature including theses, dissertations and conference proceedings. Searches were conducted for articles published from 1980 to 29 September 2015.
Selection criteria
Randomised controlled trials (RCTs) of non-pharmacological interventions to improve cognitive impairment or to maintain cognitive functioning among survivors of adult-onset cancers who have completed systemic cancer therapy (in isolation or combination with other treatments) were eligible. Studies among individuals continuing to receive hormonal therapy were included. We excluded interventions targeted at cancer survivors with central nervous system (CNS) tumours or metastases, non-melanoma skin cancer, or those who had received cranial radiation, or were from nursing or care home settings. Language restrictions were not applied.
Data collection and analysis
Pairs of authors independently screened and selected studies, extracted data, and rated the risk of bias of studies. We were unable to conduct planned meta-analyses due to heterogeneity in the types of interventions and outcomes, with the exception of compensatory strategy training interventions, for which we pooled data for mental and physical well-being outcomes. We report a narrative synthesis of intervention effectiveness for other outcomes.
Main results
Five RCTs describing six interventions (comprising a total of 235 participants) met the eligibility criteria for the review. Two trials of computer-assisted cognitive training interventions (n = 100), two of compensatory strategy training interventions (n = 95), one of meditation (n = 47) and one of a physical activity intervention (n = 19) were identified. Each study focused on breast cancer survivors. All five studies were rated as having a high risk of bias. Data for our primary outcome of interest, cognitive function, were not amenable to being pooled statistically. Cognitive training demonstrated beneficial effects on objectively assessed cognitive function (including processing speed, executive functions, cognitive flexibility, language, and delayed and immediate memory), subjectively reported cognitive function and mental well-being. Compensatory strategy training demonstrated improvements on objectively assessed delayed, immediate and verbal memory, self-reported cognitive function and spiritual quality of life (QoL). The meta-analyses of two RCTs (95 participants) did not show a beneficial effect from compensatory strategy training on physical well-being immediately post-intervention (standardised mean difference (SMD) 0.12, 95% confidence interval (CI) -0.59 to 0.83; I2 = 67%) or two months post-intervention (SMD -0.21, 95% CI -0.89 to 0.47; I2 = 63%), or on mental well-being two months post-intervention (SMD -0.38, 95% CI -1.10 to 0.34; I2 = 67%). Lower mental well-being immediately post-intervention appeared to be observed in patients who received compensatory strategy training compared with wait-list controls (SMD -0.57, 95% CI -0.98 to -0.16; I2 = 0%). We assessed the assembled studies using GRADE for physical and mental health outcomes; this evidence was rated as low quality, and findings should therefore be interpreted with caution. Evidence for the effects of physical activity and meditation interventions on cognitive outcomes is unclear.
Authors' conclusions
Overall, the evidence, albeit of low quality, may be interpreted to suggest that non-pharmacological interventions may have the potential to reduce the risk of, or ameliorate, cognitive impairment following systemic cancer treatment. Larger, multi-site studies including an appropriate, active attentional control group, as well as consideration of functional outcomes (e.g. activities of daily living), are required in order to come to firmer conclusions about the benefits or otherwise of this intervention approach. There is also a need to conduct research into cognitive impairment among cancer patient groups other than women with breast cancer.
Abstract:
Although epidemiological studies suggest that type 2 diabetes mellitus (T2DM) increases the risk of late-onset Alzheimer's disease (LOAD), the biological basis of this relationship is not well understood. The aim of this study was to examine the genetic comorbidity between the 2 disorders and to investigate whether genetic liability to T2DM, estimated by genotype risk scores based on T2DM-associated loci, is associated with increased risk of LOAD. This study was performed in 2 stages. In stage 1, we combined genotypes for the top 15 T2DM-associated polymorphisms drawn from approximately 3000 individuals (1349 cases and 1351 control subjects) with extracted and/or imputed data from 6 genome-wide studies (>10,000 individuals; 4507 cases, 2183 controls, 4989 population controls) to form a genotype risk score and examined if this was associated with increased LOAD risk in a combined meta-analysis. In stage 2, we investigated the association of LOAD with an expanded T2DM score made up of 45 well-established variants drawn from the 6 genome-wide studies. Results were combined in a meta-analysis. Both stage 1 and stage 2 T2DM risk scores were not associated with LOAD risk (odds ratio = 0.988; 95% confidence interval, 0.972-1.004; p = 0.144 and odds ratio = 0.993; 95% confidence interval, 0.983-1.003; p = 0.149 per allele, respectively). Contrary to expectation, genotype risk scores based on established T2DM candidates were not associated with increased risk of LOAD. The observed epidemiological associations between T2DM and LOAD could therefore be a consequence of secondary disease processes, pleiotropic mechanisms, and/or common environmental risk factors. Future work should focus on well-characterized longitudinal cohorts with extensive phenotypic and genetic data relevant to both LOAD and T2DM.
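A genotype risk score of the kind this abstract describes is, in essence, a per-person sum of risk-allele counts across the chosen variants, optionally weighted by each variant's effect size. A minimal sketch; the allele counts and weights below are hypothetical, not the study's variants:

```python
def genotype_risk_score(allele_counts, log_odds_ratios=None):
    """Per-person genotype risk score.

    allele_counts: number of risk alleles carried at each variant (0, 1 or 2).
    log_odds_ratios: optional per-variant weights (e.g. published log ORs);
    when omitted, the score is a simple unweighted allele count.
    """
    if log_odds_ratios is None:
        log_odds_ratios = [1.0] * len(allele_counts)  # unweighted count score
    return sum(g * w for g, w in zip(allele_counts, log_odds_ratios))

# Hypothetical person carrying 0, 1 and 2 risk alleles at three variants.
unweighted = genotype_risk_score([0, 1, 2])                    # -> 3
weighted = genotype_risk_score([0, 1, 2], [0.10, 0.15, 0.20])  # -> 0.55
```

The score is then entered as a predictor in a logistic regression of case/control status, which is how a per-allele odds ratio such as those reported above is obtained.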