61 results for low risk population


Relevance: 100.00%

Abstract:

The role of antiplatelet therapy as primary prophylaxis of thrombosis in low-risk essential thrombocythemia has not been studied in randomized clinical trials. We assessed the benefit/risk of low-dose aspirin in 433 low-risk essential thrombocythemia patients (CALR-mutated n=271, JAK2V617F-mutated n=162) who were on antiplatelet therapy or observation only. After 2215 person-years of follow-up free from cytoreduction, 25 thrombotic and 17 bleeding episodes were recorded. In CALR-mutated patients, antiplatelet therapy did not affect the risk of thrombosis but was associated with a higher incidence of bleeding (12.9 vs. 1.8 per 1000 patient-years, p=0.03). In JAK2V617F-mutated patients, low-dose aspirin was associated with a reduced incidence of venous thrombosis with no effect on the risk of bleeding. Coexistence of the JAK2V617F mutation and cardiovascular risk factors increased the risk of thrombosis, even after adjusting for treatment with low-dose aspirin (incidence rate ratio: 9.8; 95% confidence interval: 2.3-42.3; p=0.02). Time free from cytoreduction, usually initiated to control extreme thrombocytosis, was significantly shorter in CALR-mutated than in JAK2V617F-mutated essential thrombocythemia (median 5 vs. 9.8 years; p=0.0002). In conclusion, in patients with low-risk, CALR-mutated essential thrombocythemia, low-dose aspirin does not reduce the risk of thrombosis and may increase the risk of bleeding.
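To make the rate arithmetic concrete, here is a minimal sketch of how an incidence per 1,000 patient-years and an incidence rate ratio (IRR) are computed. The event counts and person-year denominators below are hypothetical assumptions, chosen only to reproduce rates of the same order as those reported, not the study's raw data.

```python
# A minimal sketch of incidence-rate arithmetic per 1,000 patient-years.
# Event counts and person-year denominators are hypothetical illustrations.

def rate_per_1000_py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 1,000 patient-years."""
    return 1000.0 * events / person_years

aspirin_bleeds = rate_per_1000_py(events=14, person_years=1085)     # ~12.9
observation_bleeds = rate_per_1000_py(events=2, person_years=1130)  # ~1.8

irr = aspirin_bleeds / observation_bleeds  # incidence rate ratio
print(f"aspirin {aspirin_bleeds:.1f} vs observation {observation_bleeds:.1f} "
      f"per 1,000 patient-years; IRR = {irr:.1f}")
```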

Relevance: 100.00%

Abstract:

A prevalence study of primary biliary cirrhosis was carried out in the state of Victoria, Australia, by means of a mail survey of specialist physicians and a review of hospital records. Eighty-four cases were identified, giving a prevalence of 19.1 per million population (95% confidence limits (CI) 15.3, 23.7), which is among the lowest in published reports. The prevalence in the Australian-born, at-risk population (women over the age of 24) was 51 per million (95% CI 37.5, 67.9). Both these figures are considerably lower than those in populations of similar age distribution in the UK and northern Europe. Since most Victorians are descended from British or European settlers, the low prevalence of primary biliary cirrhosis in this study supports the hypothesis that local environmental factors may be important in the pathogenesis of this disease.
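As an aside on the interval arithmetic, a prevalence built on 84 observed cases carries Poisson uncertainty. The sketch below reproduces an exact 95% CI of roughly the published width; the population denominator is back-calculated from the reported rate, so it is an assumption rather than study data.

```python
# A minimal sketch of the exact (Poisson) 95% CI for a prevalence based on
# a small case count. The denominator is inferred from the reported rate
# (84 cases at 19.1 per million), so it is an assumption, not study data.
from scipy.stats import chi2

cases = 84
population = cases / 19.1e-6  # ~4.4 million people, back-calculated

lower = chi2.ppf(0.025, 2 * cases) / 2        # exact Poisson lower bound
upper = chi2.ppf(0.975, 2 * (cases + 1)) / 2  # exact Poisson upper bound

print(f"prevalence {1e6 * cases / population:.1f} per million "
      f"(95% CI {1e6 * lower / population:.1f}, {1e6 * upper / population:.1f})")
```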

Relevance: 100.00%

Abstract:

Background: A previously described economic model was based on average values for patients diagnosed with chronic periodontitis (CP). However, tooth loss varies among treated patients and factors for tooth loss include CP severity and risk. The model was refined to incorporate CP severity and risk to determine the cost of treating a specific level of CP severity and risk that is associated with the benefit of tooth preservation.

Methods: A population that received and another that did not receive periodontal treatment were used to determine treatment costs and tooth loss. The number of teeth preserved was the difference in the number of teeth lost between the two populations. The cost of periodontal treatment was divided by the number of teeth preserved for combinations of CP severity and risk.
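A minimal sketch of the division just described, with purely illustrative cohort averages; the per-cohort cost and tooth-loss values are assumptions, not the study's data.

```python
# Cost per tooth preserved: treatment cost divided by the difference in
# teeth lost between untreated and treated populations. The numbers used
# here are illustrative assumptions, not values from the study.

def cost_per_tooth_preserved(treatment_cost: float,
                             teeth_lost_untreated: float,
                             teeth_lost_treated: float) -> float:
    teeth_preserved = teeth_lost_untreated - teeth_lost_treated
    return treatment_cost / teeth_preserved

# Hypothetical averages for one CP severity/risk combination:
print(cost_per_tooth_preserved(4200.0, 3.0, 1.5))  # -> 2800.0 dollars per tooth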

Results: The cost of periodontal treatment divided by the number of teeth preserved ranged from (US) $1,405 to $4,895 for high or moderate risk combined with any severity of CP and was more than $8,639 for low risk combined with mild CP. The cost of a three-unit bridge was $3,416, and the cost of a single-tooth replacement was $4,787.

Conclusion: Periodontal treatment could be justified on the sole basis of tooth preservation when CP risk is moderate or high regardless of disease severity.

Relevance: 100.00%

Abstract:

BACKGROUND: The 'frequent exacerbator' is recognised as an important phenotype in COPD. Current understanding about this phenotype comes from prospective longitudinal clinical trials in secondary/tertiary care with little information reported in primary care populations.

AIMS: To characterise the frequent-exacerbator phenotype and identify associated risk factors in a large UK primary care COPD population.

METHODS: Using a large database of primary care patients from 80 UK general practices, patients were categorised using GOLD 2014 criteria into high- and low-risk groups based on exacerbation history. A multivariate logistic regression model was used to investigate covariates associated with the frequent-exacerbator phenotype and the risk of experiencing a severe exacerbation (leading to hospitalisation).

RESULTS: Of the total study population (n = 9219), 2612 (28%) fulfilled the criteria for high-risk frequent exacerbators. Independent risk factors (adjusted odds ratio [95% CI]) for ≥2 exacerbations were: most severely impaired modified Medical Research Council (mMRC) dyspnoea score (mMRC grade 4: 4.37 [2.64-7.23]), lower FEV1 percent predicted (FEV1 <30%: 2.42 [1.61-3.65]), co-morbid cardiovascular disease (1.42 [1.19-1.68]), depression (1.56 [1.22-1.99]) or osteoporosis (1.54 [1.19-2.01]), and female gender (1.20 [1.01-1.43]). Older patients (≥75 years), those with the most severe lung impairment (FEV1 <30%), those with the highest mMRC scores and those with co-morbid osteoporosis were identified as most at risk of experiencing exacerbations requiring hospitalisation.
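The sketch below illustrates the kind of multivariate logistic regression used to produce adjusted odds ratios with 95% CIs. The variable names and the simulated data are assumptions for illustration only, not the study dataset.

```python
# A minimal sketch, using statsmodels, of a multivariate logistic regression
# reporting adjusted odds ratios. All data below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "frequent_exacerbator": rng.integers(0, 2, n),  # outcome: >=2 exacerbations
    "mmrc_grade": rng.integers(0, 5, n),            # dyspnoea grade 0-4
    "fev1_pct": rng.uniform(20, 100, n),            # FEV1 % predicted
    "cvd": rng.integers(0, 2, n),                   # co-morbid CVD
    "female": rng.integers(0, 2, n),
})

model = smf.logit(
    "frequent_exacerbator ~ C(mmrc_grade) + fev1_pct + cvd + female", data=df
).fit(disp=False)

# Adjusted odds ratios with 95% confidence intervals
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```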

CONCLUSIONS: Although COPD exacerbations occur across all grades of disease severity, female patients with high dyspnoea scores, more severely impaired lung function and co-morbidities are at greatest risk. Elderly patients with severely impaired lung function, high mMRC scores and co-morbid osteoporosis are most at risk of severe exacerbations requiring hospitalisation.

Relevance: 90.00%

Abstract:

Attention-deficit hyperactivity disorder (ADHD) is a heritable childhood-onset disorder that is marked by variability at multiple levels including clinical presentation, cognitive profile, and response to stimulant medications. It has been suggested that this variability may reflect etiological differences, particularly at the level of underlying genetics. This study examined whether an attentional phenotype, spatial attentional bias, could serve as a marker of symptom severity, genetic risk, and stimulant response in ADHD. A total of 96 children and adolescents with ADHD were assessed on the Landmark Task, which is a sensitive measure of spatial attentional bias. All children were genotyped for polymorphisms (3′-untranslated region (UTR) and intron 8 variable number of tandem repeats (VNTRs)) of the dopamine transporter gene (DAT1). Spatial attentional bias correlated with ADHD symptom levels and varied according to DAT1 genotype. Children who were homozygous for the 10-repeat allele of the DAT1 3′-UTR VNTR displayed a rightward attentional bias and had higher symptom levels compared to those with the low-risk genotype. A total of 26 of these children who were medication naive performed the Landmark Task at baseline and then again after 6 weeks of stimulant medication. Left-sided inattention (rightward bias) at baseline was associated with an enhanced response to stimulants at 6 weeks. Moreover, changes in spatial bias with stimulant medications varied as a function of DAT1 genotype. This study suggests an attentional phenotype that relates to symptom severity and genetic risk for ADHD, and may have utility in predicting stimulant response in ADHD.

Relevance: 90.00%

Abstract:

BACKGROUND: Current evidence supports the use of exercise-based treatment for chronic low back pain that encourages the patient to assume an active role in their recovery. Walking has been shown to be an acceptable type of exercise with a low risk of injury. However, it is not known whether structured physical activity programmes are any more effective than giving advice to remain active.

METHODS/DESIGN: The proposed study will test the feasibility of using a pedometer-driven walking programme as an adjunct to a standard education and advice session in participants with chronic low back pain. Fifty adult participants will be recruited via a number of different sources. Baseline outcome measures, including self-reported function, objective physical activity levels, fear-avoidance beliefs and health-related quality of life, will be recorded. Eligible participants will be randomly allocated under strict, double-blind conditions to one of two treatment groups. Participants in group A will receive a single education and advice session with a physiotherapist based on the content of the 'Back Book'. Participants in group B will receive the same education and advice session and, in addition, a graded pedometer-driven walking programme prescribed by the physiotherapist. Follow-up outcomes will be recorded by the same researcher, who will remain blinded to group allocation, at eight weeks and six months post randomisation. Participants' perceptions of walking will also be explored qualitatively in focus groups at the end of the intervention. As a feasibility study, treatment effects will be represented by point estimates and confidence intervals. The assessment of participant satisfaction will be tabulated, as will adherence levels and any recorded difficulties or adverse events experienced by the participants or therapists. This information will be used to modify the planned interventions for use in a larger randomised controlled trial.

DISCUSSION: This paper describes the rationale and design of a study which will test the feasibility of using a structured, pedometer-driven walking programme in participants with chronic low back pain.

TRIAL REGISTRATION: ISRCTN67030896.

Relevance: 90.00%

Abstract:

Background: Automated closed loop systems may improve adaptation of the mechanical support to a patient's ventilatory needs and facilitate systematic and early recognition of their ability to breathe spontaneously and the potential for discontinuation of ventilation.

Objectives: To compare the duration of weaning from mechanical ventilation for critically ill ventilated adults and children when managed with automated closed loop systems versus non-automated strategies. Secondary objectives were to determine differences in duration of ventilation, intensive care unit (ICU) and hospital length of stay (LOS), mortality, and adverse events.

Search methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2011, Issue 2); MEDLINE (OvidSP) (1948 to August 2011); EMBASE (OvidSP) (1980 to August 2011); CINAHL (EBSCOhost) (1982 to August 2011); and the Latin American and Caribbean Health Sciences Literature (LILACS). In addition we received and reviewed auto-alerts for our search strategy in MEDLINE, EMBASE, and CINAHL up to August 2012. Relevant published reviews were sought using the Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessment Database (HTA Database). We also searched the Web of Science Proceedings; conference proceedings; trial registration websites; and reference lists of relevant articles.

Selection criteria: We included randomized controlled trials comparing automated closed loop ventilator applications to non-automated weaning strategies, including non-protocolized usual care and protocolized weaning, in patients over four weeks of age receiving invasive mechanical ventilation in an ICU.

Data collection and analysis: Two authors independently extracted study data and assessed risk of bias. We combined data into forest plots using random-effects modelling. Subgroup and sensitivity analyses were conducted according to a priori criteria.

Main results: Pooled data from 15 eligible trials (14 adult, one paediatric) totalling 1173 participants (1143 adults, 30 children) indicated that automated closed loop systems reduced the geometric mean duration of weaning by 32% (95% CI 19% to 46%, P = 0.002); however, heterogeneity was substantial (I2 = 89%, P < 0.00001). Reduced weaning duration was found with mixed or medical ICU populations (43%, 95% CI 8% to 65%, P = 0.02) and Smartcare/PS™ (31%, 95% CI 7% to 49%, P = 0.02) but not in surgical populations or using other systems. Automated closed loop systems reduced the duration of ventilation (17%, 95% CI 8% to 26%) and ICU length of stay (LOS) (11%, 95% CI 0% to 21%). There was no difference in mortality rates or hospital LOS. Overall the quality of evidence was high, with the majority of trials rated as low risk of bias.
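For readers unfamiliar with "geometric mean duration reduced by X%": durations are analysed on the log scale and the pooled difference in means is back-transformed. The sketch below shows that arithmetic with simulated durations; all numbers are hypothetical, not trial data.

```python
# A minimal sketch of how a percentage reduction in geometric mean weaning
# duration arises: log-transform durations, take the difference in mean
# logs, and back-transform. The data here are simulated, not trial data.
import numpy as np

rng = np.random.default_rng(1)
automated = rng.lognormal(mean=3.0, sigma=0.8, size=60)   # hours, hypothetical
usual_care = rng.lognormal(mean=3.4, sigma=0.8, size=60)

log_diff = np.log(automated).mean() - np.log(usual_care).mean()
reduction = 1.0 - np.exp(log_diff)  # fraction by which the geometric mean is lower
print(f"geometric mean weaning duration reduced by {100 * reduction:.0f}%")
```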

Authors' conclusions: Automated closed loop systems may result in reduced duration of weaning, ventilation, and ICU stay. Reductions are more likely to occur in mixed or medical ICU populations. Due to the lack of, or limited, evidence on automated systems other than Smartcare/PS™ and Adaptive Support Ventilation, no conclusions can be drawn regarding their influence on these outcomes. Due to substantial heterogeneity in trials, there is a need for an adequately powered, high quality, multi-centre randomized controlled trial in adults that excludes 'simple to wean' patients. There is a pressing need for further technological development and research in the paediatric population.

Relevance: 90.00%

Abstract:

OBJECTIVE To assess the association between circulating angiogenic and antiangiogenic factors in the second trimester and risk of preeclampsia in women with type 1 diabetes.

RESEARCH DESIGN AND METHODS Maternal plasma concentrations of placental growth factor (PlGF), soluble fms-like tyrosine kinase 1 (sFlt-1), and soluble endoglin (sEng) were available at 26 weeks of gestation in 540 women with type 1 diabetes enrolled in the Diabetes and Preeclampsia Intervention Trial.

RESULTS Preeclampsia developed in 17% of pregnancies (n = 94). At 26 weeks of gestation, women in whom preeclampsia later developed had significantly lower PlGF (median [interquartile range]: 231 pg/mL [120–423] vs. 365 pg/mL [237–582]; P < 0.001), higher sFlt-1 (1,522 pg/mL [1,108–3,393] vs. 1,193 pg/mL [844–1,630]; P < 0.001), and higher sEng (6.2 ng/mL [4.9–7.9] vs. 5.1 ng/mL [4.3–6.2]; P < 0.001) compared with women who did not have preeclampsia. In addition, the ratio of PlGF to sEng was significantly lower (40 [17–71] vs. 71 [44–114]; P < 0.001) and the ratio of sFlt-1 to PlGF was significantly higher (6.3 [3.4–15.7] vs. 3.1 [1.8–5.8]; P < 0.001) in women who later developed preeclampsia. The addition of the ratio of PlGF to sEng or the ratio of sFlt-1 to PlGF to a logistic model containing established risk factors (area under the curve [AUC] 0.813) significantly improved the predictive value (AUC 0.850 and 0.846, respectively; P < 0.01) and significantly improved reclassification according to the integrated discrimination improvement index (IDI) (IDI scores 0.086 and 0.065, respectively; P < 0.001).
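A minimal sketch of the modelling step reported here: add the (log) sFlt-1/PlGF ratio to a baseline logistic model and compare AUCs. The simulated data and variable names are assumptions for illustration, not the trial dataset.

```python
# Adding a biomarker ratio to a logistic model and comparing AUCs.
# All data below are simulated; "established" stands in for clinical
# risk factors and is a placeholder assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 540
established = rng.normal(size=(n, 3))         # hypothetical clinical predictors
sflt1_plgf = rng.lognormal(1.5, 0.8, size=n)  # hypothetical sFlt-1/PlGF ratio
risk = 1 / (1 + np.exp(-(established[:, 0] + np.log(sflt1_plgf) - 2)))
y = rng.binomial(1, risk)                     # simulated preeclampsia outcome

X_full = np.column_stack([established, np.log(sflt1_plgf)])
base = LogisticRegression().fit(established, y)
full = LogisticRegression().fit(X_full, y)

print("baseline AUC:  ", roc_auc_score(y, base.predict_proba(established)[:, 1]))
print("with ratio AUC:", roc_auc_score(y, full.predict_proba(X_full)[:, 1]))
```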

CONCLUSIONS These data suggest that angiogenic and antiangiogenic factors measured during the second trimester are predictive of preeclampsia in women with type 1 diabetes. The addition of the ratio of PlGF to sEng or the ratio of sFlt-1 to PlGF to established clinical risk factors significantly improves the prediction of preeclampsia in women with type 1 diabetes.

Preeclampsia is characterized by the development of hypertension and new-onset proteinuria during the second half of pregnancy (1,2), leading to increased maternal morbidity and mortality (3). Women with type 1 diabetes are at increased risk for development of preeclampsia during pregnancy, with rates two to four times higher than those of the background maternity population (4,5). Small advances have come from preventive measures, such as low-dose aspirin in women at high risk (6); however, delivery remains the only effective intervention, and preeclampsia is responsible for up to 15% of preterm births and a consequent increase in infant mortality and morbidity (7).

Although the etiology of preeclampsia remains unclear, abnormal placental vascular remodeling and placental ischemia, together with maternal endothelial dysfunction, hemodynamic changes, and renal pathology, contribute to its pathogenesis (8). In addition, over the past decade accumulating evidence has suggested that an imbalance between angiogenic factors, such as placental growth factor (PlGF), and antiangiogenic factors, such as soluble fms-like tyrosine kinase 1 (sFlt-1) and soluble endoglin (sEng), plays a key role in the pathogenesis of preeclampsia (8,9). In women at low risk (10–13) and women at high risk (14,15), concentrations of angiogenic and antiangiogenic factors differ significantly between women who later develop preeclampsia (lower PlGF, higher sFlt-1, and higher sEng levels) and women who do not.

Few studies have specifically focused on circulating angiogenic factors and risk of preeclampsia in women with diabetes, and the results have been conflicting. In a small study, higher sFlt-1 and lower PlGF were reported at the time of delivery in women with diabetes who developed preeclampsia (16). In a longitudinal prospective cohort of pregnant women with diabetes, Yu et al. (17) reported increased sFlt-1 and reduced PlGF in the early third trimester as potential predictors of preeclampsia in women with type 1 diabetes, but they did not show any difference in sEng levels in women with preeclampsia compared with women without preeclampsia. By contrast, Powers et al. (18) reported only increased sEng in the second trimester in women with pregestational diabetes who developed preeclampsia.

The aim of this study, which was considerably larger than the previous studies highlighted, was to assess the association between circulating angiogenic (PlGF) and antiangiogenic (sFlt-1 and sEng) factors and the risk of preeclampsia in women with type 1 diabetes. A further aim was to evaluate the added predictive ability and clinical usefulness of angiogenic factors alongside established risk factors for preeclampsia risk prediction in women with type 1 diabetes.

Relevance: 90.00%

Abstract:

Background: Automated closed loop systems may improve adaptation of mechanical support for a patient's ventilatory needs and facilitate systematic and early recognition of their ability to breathe spontaneously and the potential for discontinuation of ventilation. This review was originally published in 2013 with an update published in 2014.

Objectives: The primary objective for this review was to compare the total duration of weaning from mechanical ventilation, defined as the time from study randomization to successful extubation (as defined by study authors), for critically ill ventilated patients managed with an automated weaning system versus no automated weaning system (usual care). Secondary objectives were to determine differences in the duration of ventilation, intensive care unit (ICU) and hospital lengths of stay (LOS), mortality, and adverse events related to early or delayed extubation with the use of automated weaning systems compared to weaning in the absence of an automated weaning system.

Search methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2013, Issue 8); MEDLINE (OvidSP) (1948 to September 2013); EMBASE (OvidSP) (1980 to September 2013); CINAHL (EBSCOhost) (1982 to September 2013); and the Latin American and Caribbean Health Sciences Literature (LILACS). Relevant published reviews were sought using the Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessment Database (HTA Database). We also searched the Web of Science Proceedings; conference proceedings; trial registration websites; and reference lists of relevant articles. The original search was run in August 2011, with database auto-alerts up to August 2012.

Selection criteria: We included randomized controlled trials comparing automated closed loop ventilator applications to non-automated weaning strategies, including non-protocolized usual care and protocolized weaning, in patients over four weeks of age receiving invasive mechanical ventilation in an ICU.

Data collection and analysis: Two authors independently extracted study data and assessed risk of bias. We combined data in forest plots using random-effects modelling. Subgroup and sensitivity analyses were conducted according to a priori criteria.

Main results: We included 21 trials (19 adult, two paediatric) totalling 1676 participants (1628 adults, 48 children) in this updated review. Pooled data from 16 eligible trials reporting weaning duration indicated that automated closed loop systems reduced the geometric mean duration of weaning by 30% (95% confidence interval (CI) 13% to 45%); however, heterogeneity was substantial (I2 = 87%, P < 0.00001). Reduced weaning duration was found with mixed or medical ICU populations (42%, 95% CI 10% to 63%) and Smartcare/PS™ (28%, 95% CI 7% to 49%) but not in surgical populations or using other systems. Automated closed loop systems reduced the duration of ventilation (10%, 95% CI 3% to 16%) and ICU LOS (8%, 95% CI 0% to 15%). There was no strong evidence of an effect on mortality rates, hospital LOS, reintubation rates, self-extubation and use of non-invasive ventilation following extubation. Prolonged mechanical ventilation > 21 days and tracheostomy were reduced in favour of automated systems (relative risk (RR) 0.51, 95% CI 0.27 to 0.95 and RR 0.67, 95% CI 0.50 to 0.90, respectively). Overall the quality of the evidence was high, with the majority of trials rated as low risk of bias.

Authors' conclusions: Automated closed loop systems may result in reduced duration of weaning, ventilation and ICU stay. Reductions are more likely to occur in mixed or medical ICU populations. Due to the lack of, or limited, evidence on automated systems other than Smartcare/PS™ and Adaptive Support Ventilation, no conclusions can be drawn regarding their influence on these outcomes. Due to substantial heterogeneity in trials, there is a need for an adequately powered, high quality, multi-centre randomized controlled trial in adults that excludes 'simple to wean' patients. There is a pressing need for further technological development and research in the paediatric population.

Relevance: 90.00%

Abstract:

Aim: To evaluate and summarize the current evidence on the effectiveness of complementary and alternative medicine (CAM) for the management of low back pain and/or pelvic pain in pregnancy.

Background: International research demonstrates that 25-30% of women use complementary and alternative medicine to manage low back and pelvic pain in pregnancy without robust evidence demonstrating its effectiveness.

Design: A systematic review of randomized controlled trials to determine the effectiveness of complementary and alternative medicine for low back and/or pelvic pain in pregnancy.

Data Sources: Cochrane Library (1898-2013), PubMed (1996-2013), MEDLINE (1946-2013), AMED (1985-2013), Embase (1974-2013), CINAHL (1937-2013), Index to Theses (1716-2013) and EThOS (1914-2013).

Review Methods: Selected studies were written in English, were randomized controlled trials of a group 1 or 2 therapy, and reported pain reduction as an outcome measure. Study quality was reviewed using the Cochrane Risk of Bias tool, and the strength of the evidence was assessed with the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach.

Results: Eight studies were selected for full review. Two acupuncture studies with low risk of bias showed both clinically important changes and statistically significant results. There was evidence of effectiveness for osteopathy and chiropractic. However, osteopathy and chiropractic studies scored high for risk of bias. Strength of the evidence across studies was very low.

Conclusion: There is limited evidence supporting the use of CAM in general for managing pregnancy-related low back and/or pelvic pain. However, the restricted availability of high-quality studies, combined with the very low strength of the evidence, makes it impossible to make evidence-based recommendations for practice.

Relevance: 90.00%

Abstract:

BACKGROUND: Acute promyelocytic leukaemia is a chemotherapy-sensitive subgroup of acute myeloid leukaemia characterised by the presence of the PML-RARA fusion transcript. The present standard of care, chemotherapy and all-trans retinoic acid (ATRA), results in a high proportion of patients being cured. In this study, we compare a chemotherapy-free ATRA and arsenic trioxide treatment regimen with the standard chemotherapy-based regimen (ATRA and idarubicin) in both high-risk and low-risk patients with acute promyelocytic leukaemia.

METHODS: In the randomised, controlled, multicentre AML17 trial, eligible patients (aged ≥16 years) with acute promyelocytic leukaemia, confirmed by the presence of the PML-RARA transcript and without significant cardiac or pulmonary comorbidities or active malignancy, and who were not pregnant or breastfeeding, were enrolled from 81 UK hospitals and randomised 1:1 to receive treatment with ATRA and arsenic trioxide or ATRA and idarubicin. ATRA was given to participants in both groups in a daily divided oral dose of 45 mg/m² until remission, or until day 60, and then in a 2 weeks on-2 weeks off schedule. In the ATRA and idarubicin group, idarubicin was given intravenously at 12 mg/m² on days 2, 4, 6, and 8 of course 1, and then at 5 mg/m² on days 1-4 of course 2; mitoxantrone at 10 mg/m² on days 1-4 of course 3; and idarubicin at 12 mg/m² on day 1 of the final (fourth) course. In the ATRA and arsenic trioxide group, arsenic trioxide was given intravenously at 0.3 mg/kg on days 1-5 of each course, and at 0.25 mg/kg twice weekly in weeks 2-8 of course 1 and weeks 2-4 of courses 2-5. High-risk patients (those presenting with a white blood cell count >10 × 10⁹ cells per L) could receive an initial dose of the immunoconjugate gemtuzumab ozogamicin (6 mg/m² intravenously). Neither maintenance treatment nor CNS prophylaxis was given to patients in either group. All patients were monitored by real-time quantitative PCR. Allocation was by central computer minimisation, stratified by age, performance status, and de-novo versus secondary disease. The primary endpoint was quality of life on the European Organisation for Research and Treatment of Cancer (EORTC) QLQ-C30 global health status scale. All analyses are by intention to treat. This trial is registered with the ISRCTN registry, number ISRCTN55675535.

FINDINGS: Between May 8, 2009, and Oct 3, 2013, 235 patients were enrolled and randomly assigned to ATRA and idarubicin (n=119) or ATRA and arsenic trioxide (n=116). Participants had a median age of 47 years (range 16-77; IQR 33-58) and included 57 high-risk patients. Quality of life did not differ significantly between the treatment groups (EORTC QLQ-C30 global functioning effect size 2.17 [95% CI -2.79 to 7.12]; p=0.39). Overall, 57 patients in the ATRA and idarubicin group and 40 patients in the ATRA and arsenic trioxide group reported grade 3-4 toxicities. After course 1 of treatment, grade 3-4 alopecia was reported in 23 (23%) of 98 patients in the ATRA and idarubicin group versus 5 (5%) of 95 in the ATRA and arsenic trioxide group, raised liver alanine transaminase in 11 (10%) of 108 versus 27 (25%) of 109, and oral toxicity in 22 (19%) of 115 versus 1 (1%) of 109. After course 2 of treatment, grade 3-4 alopecia was reported in 25 (28%) of 89 patients in the ATRA and idarubicin group versus 2 (3%) of 77 in the ATRA and arsenic trioxide group; no other toxicities reached the 10% level. Patients in the ATRA and arsenic trioxide group had significantly less requirement for most aspects of supportive care than did those in the ATRA and idarubicin group.

INTERPRETATION: ATRA and arsenic trioxide is a feasible treatment in low-risk and high-risk patients with acute promyelocytic leukaemia, with a high cure rate, less relapse than ATRA and idarubicin, survival not different from that with ATRA and idarubicin, and a low incidence of liver toxicity. However, no improvement in quality of life was seen.


Relevance: 90.00%

Abstract:

Aflatoxins are fungal toxins that possess acute life-threatening toxicity, carcinogenic properties and other potential chronic adverse effects. Dietary exposure to aflatoxins is considered a major public health concern, especially for subsistence farming communities in sub-Saharan Africa and South Asia, where dietary staple food crops such as groundnuts and maize are often highly contaminated with aflatoxin due to hot and humid climates and poor storage, together with low awareness of risk and lack of enforcement of regulatory limits. Biomarkers have been developed and applied in many epidemiological studies assessing aflatoxin exposure and the associated health effects in these high-risk population groups. This review discusses the recent epidemiological evidence for aflatoxin exposure, co-exposure with other mycotoxins and associated health effects in order to provide evidence for risk assessment, and highlights areas where further research is necessary. Aflatoxin exposure can occur at any stage of life and is a major risk factor for hepatocellular carcinoma, especially when hepatitis B infection is present. Recent evidence suggests that aflatoxin may be an underlying determinant of stunted child growth, and may lower cell-mediated immunity, thereby increasing disease susceptibility. However, a causal relationship between aflatoxin exposure and these latter adverse health outcomes has not been established, and the biological mechanisms involved have not been elucidated, prompting further research. Furthermore, there is a dearth of information regarding the health effects of co-exposure to aflatoxin and other mycotoxins. Recent developments in biomarkers provide opportunities for important future research in this area.

Relevance: 90.00%

Abstract:

BACKGROUND: Falls and fall-related injuries are symptomatic of an aging population. This study aimed to design, develop, and deliver a novel method of balance training, using an interactive game-based system to promote engagement, with the inclusion of older adults at both high and low risk of experiencing a fall.

STUDY DESIGN: Eighty-two older adults (65 years of age and older) were recruited from sheltered accommodation and local activity groups. Forty volunteers were randomly selected and received 5 weeks of balance game training (5 males, 35 females; mean, 77.18 ± 6.59 years), whereas the remaining control participants recorded levels of physical activity (20 males, 22 females; mean, 76.62 ± 7.28 years). The effect of balance game training was measured on levels of functional balance and balance confidence in individuals with and without quantifiable balance impairments.

RESULTS: Balance game training had a significant effect on levels of functional balance and balance confidence (P < 0.05). This was further demonstrated in participants who were deemed at high risk of falls. The overall pattern of results suggests the training program is effective and suitable for individuals at all levels of ability and may therefore play a role in reducing the risk of falls.

CONCLUSIONS: Commercial hardware can be modified to deliver engaging methods of effective balance assessment and training for the older population.

Relevance: 90.00%

Abstract:

The identification of subjects at high risk for Alzheimer's disease is important for prognosis and early intervention. We investigated the polygenic architecture of Alzheimer's disease and the accuracy of Alzheimer's disease prediction models, including and excluding the polygenic component in the model. This study used genotype data from the powerful dataset comprising 17,008 cases and 37,154 controls obtained from the International Genomics of Alzheimer's Project (IGAP). Polygenic score analysis tested whether the alleles identified as associated with disease in one sample set were significantly enriched in the cases relative to the controls in an independent sample. The disease prediction accuracy was investigated in a subset of the IGAP data, a sample of 3049 cases and 1554 controls (for whom APOE genotype data were available), by means of sensitivity, specificity, area under the receiver operating characteristic curve (AUC) and positive and negative predictive values. We observed significant evidence for a polygenic component enriched in Alzheimer's disease (P = 4.9 × 10⁻²⁶). This enrichment remained significant after APOE and other genome-wide associated regions were excluded (P = 3.4 × 10⁻¹⁹). The best prediction accuracy, AUC = 78.2% (95% confidence interval 77-80%), was achieved by a logistic regression model with APOE, the polygenic score, sex and age as predictors. In conclusion, Alzheimer's disease has a significant polygenic component, which has predictive utility for Alzheimer's disease risk and could be a valuable research tool complementing experimental designs, including preventative clinical trials, stem cell selection and high/low risk clinical studies. In modelling a range of sample disease prevalences, we found that polygenic scores almost double case prediction from chance, with increased prediction at the polygenic extremes.
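As background to the method, a polygenic score is a weighted sum of risk-allele counts across SNPs, with weights taken from GWAS effect sizes (log odds ratios). The sketch below uses simulated genotypes and weights; the SNP set and effect sizes are assumptions, not IGAP results.

```python
# A minimal sketch of a polygenic score: genotype matrix (allele counts
# 0/1/2 per SNP) multiplied by per-allele effect weights. All values are
# simulated placeholders, not real GWAS data.
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_snps = 1000, 5000

genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))  # risk-allele counts
weights = rng.normal(0, 0.05, size=n_snps)                 # log-OR per allele

polygenic_score = genotypes @ weights  # one score per subject
print(polygenic_score[:5])
```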

Relevance: 80.00%

Abstract:

Normally, populations of brown trout are genetically highly variable. Two adjacent populations from NW Scotland, which had previously been found to be monomorphic for 46 protein-coding loci, were studied by higher resolution techniques. Analyses of mitochondrial DNA, multilocus DNA fingerprints and eight specific minisatellite loci revealed no genetic variation among individuals or genetic differences between the two populations. Continual low effective population sizes or severe repeated bottlenecks, as a result of low or variable recruitment, probably explain the atypical absence of genetic variation in these trout populations. Growth data do not provide any evidence of a reduction in fitness in trout from these populations.