282 results for acute diarrhea


Relevance:

20.00%

Publisher:

Abstract:

Wilms' tumor gene 1 (WT1) is overexpressed in the majority (70-90%) of acute leukemias and has been identified as an independent adverse prognostic factor, a convenient minimal residual disease (MRD) marker, and a potential therapeutic target in acute leukemia. We examined WT1 expression patterns in childhood acute lymphoblastic leukemia (ALL), where its clinical implication remains unclear. Using a real-time quantitative PCR assay designed according to Europe Against Cancer Program recommendations, we evaluated WT1 expression in 125 consecutively enrolled patients with childhood ALL (106 BCP-ALL, 19 T-ALL) and compared it with physiologic WT1 expression in normal and regenerating bone marrow (BM). In childhood B-cell precursor (BCP)-ALL, we detected a wide range of WT1 levels (spanning 5 logs), with a median WT1 expression close to that of normal BM. WT1 expression in childhood T-ALL was significantly higher than in BCP-ALL (P<0.001). Patients with the MLL-AF4 translocation showed marked WT1 overexpression (P<0.01) compared with patients with other or no chromosomal aberrations. Older children (≥10 years) expressed higher WT1 levels than children under 10 years of age (P<0.001), while there was no difference in WT1 expression between patients with a peripheral blood leukocyte count (WBC) ≥50 × 10⁹/L and those with lower counts. Analysis of relapsed cases (14/125) indicated that an abnormal increase or decrease in WT1 expression was associated with a significantly increased risk of relapse (P=0.0006), and this prognostic impact of WT1 was independent of other main risk factors (P=0.0012). In summary, our study suggests that WT1 expression in childhood ALL is highly variable and much lower than in AML or adult ALL. WT1 is therefore unlikely to be a useful marker for MRD detection in childhood ALL; however, it does represent a potential independent risk factor. Interestingly, a proportion of childhood ALL patients express WT1 at levels below normal physiologic BM WT1 expression, and this reduced WT1 expression appears to be associated with a higher risk of relapse.
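
The abstract notes that WT1 was quantified by real-time quantitative PCR following Europe Against Cancer (EAC) Program recommendations. As a rough, hedged illustration of how such measurements are commonly normalized, the sketch below (Python) converts Ct values to copy numbers via log-linear standard curves and reports WT1 copies per 10^4 control-gene copies; the choice of ABL as the control gene, the curve parameters, and the example Ct values are assumptions for illustration, not details taken from this study.

    # Sketch of EAC-style normalized expression. Assumptions (not from the abstract):
    # the control gene is ABL, copy numbers come from log-linear plasmid standard
    # curves, and results are reported as WT1 copies per 10^4 ABL copies.
    def copies_from_ct(ct: float, slope: float, intercept: float) -> float:
        """Convert a Ct value to copy number using Ct = slope*log10(copies) + intercept."""
        return 10 ** ((ct - intercept) / slope)

    def normalized_wt1(wt1_ct: float, abl_ct: float,
                       wt1_curve=(-3.33, 40.0), abl_curve=(-3.33, 40.0)) -> float:
        """Return WT1 copies per 10^4 ABL copies (curve parameters are hypothetical)."""
        wt1_copies = copies_from_ct(wt1_ct, *wt1_curve)
        abl_copies = copies_from_ct(abl_ct, *abl_curve)
        return 1e4 * wt1_copies / abl_copies

    # Example with hypothetical Ct values, not patient data from the study
    print(f"Normalized WT1 = {normalized_wt1(31.2, 26.5):.0f} copies per 10^4 ABL copies")

A normalized value can then be compared against the physiologic range measured in normal and regenerating bone marrow, which is how the study interprets abnormally high or low WT1 expression.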

Relevance:

20.00%

Publisher:

Abstract:

A 41-year-old woman received a syngeneic bone marrow transplant (BMT) for chronic lymphocytic leukemia (CLL) and subsequently developed acute skin graft-versus-host disease (GVHD). Transfusion-related allogeneic GVHD was excluded on the basis of an unchanged HLA type in circulating lymphocytes. Short tandem repeat PCR was used to confirm syngeneicity between donor and recipient. The patient had a personal and family history of autoimmune disease, which may have made her particularly susceptible to the development of syngeneic GVHD. The distinction between allogeneic and syngeneic or autologous GVHD is important because of its therapeutic implications.

Relevance:

20.00%

Publisher:

Abstract:

Hematopoietic chimerism was analyzed in serial bone marrow samples taken from 28 children following T-cell-depleted unrelated donor bone marrow transplants (UD BMT) for acute lymphoblastic leukemia (ALL). Chimeric status was determined by polymerase chain reaction (PCR) of simple tandem repeat (STR) sequences (maximal sensitivity, 0.1%). At least two serial samples were examined in 23 patients. Of these, two had evidence of complete donor engraftment at all times and eight showed stable low-level mixed chimerism (MC) (<1% recipient hematopoiesis). All 10 of these patients remain in remission with a minimum follow-up of 24 months. By contrast, 13 patients demonstrated a progressive return of recipient hematopoiesis. Five of these relapsed (4 to 9 months post BMT), one died of cytomegalovirus pneumonitis, and seven remain in remission with a minimum follow-up of 24 months. Five children were excluded from serial analysis because two serial samples were not collected before either relapse (n=3) or graft rejection (n=2). We conclude that, as with sibling transplants, ex vivo T-cell-depleted UD BMT in children with ALL is associated with a high incidence of MC. Stable donor engraftment and low-level MC always correlated with continued remission. However, detection of a progressive return of recipient cells did not universally correlate with relapse, but highlighted those patients at greatest risk. Serial chimerism analysis by PCR of STRs provides a rapid and simple screening technique for the detection of relapse and the identification of patients with progressive MC who might benefit from detailed molecular analysis for minimal residual disease following matched volunteer UD BMT for childhood ALL.
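
As background to how PCR of STR sequences yields a chimerism estimate, the sketch below (Python) computes the percentage of recipient hematopoiesis from the peak areas of recipient- and donor-specific alleles at a single informative locus, a widely used approach; the locus choice, peak areas, and sampling times shown are hypothetical and are not data from this study.

    # Sketch of mixed-chimerism estimation at one informative STR locus.
    # Peak areas would come from capillary electrophoresis of the PCR products;
    # all values below are hypothetical.
    def recipient_percentage(recipient_peak_area: float, donor_peak_area: float) -> float:
        """Percent recipient hematopoiesis at one informative STR locus."""
        total = recipient_peak_area + donor_peak_area
        if total == 0:
            raise ValueError("no signal at this locus")
        return 100.0 * recipient_peak_area / total

    # Hypothetical serial samples from one patient after transplant
    samples = {"day +30": (120, 98000), "day +90": (900, 95000), "day +180": (6500, 88000)}
    for day, (rec, don) in samples.items():
        print(f"{day}: {recipient_percentage(rec, don):.2f}% recipient")
    # A rising recipient percentage across serial samples is the pattern the study
    # describes as progressive mixed chimerism.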

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Dietary cocoa is an important source of flavonoids and is associated with favorable cardiovascular disease effects, such as improvements in vascular function and lipid profiles, in nondiabetic adults. Type 2 diabetes (T2D) is associated with adverse effects on postprandial serum glucose, lipids, inflammation, and vascular function.

OBJECTIVE: We examined the hypothesis that cocoa reduces metabolic stress in obese T2D adults after a high-fat fast-food-style meal.

METHODS: Adults with T2D [n = 18; age (mean ± SE): 56 ± 3 y; BMI (in kg/m²): 35.3 ± 2.0; 14 women, 4 men] were randomly assigned to receive a cocoa beverage (960 mg total polyphenols; 480 mg flavanols) or a flavanol-free placebo (110 mg total polyphenols; <0.1 mg flavanols) with a high-fat fast-food-style breakfast [766 kcal, 50 g fat (59% of energy)] in a crossover trial. After an overnight fast (10-12 h), participants consumed the breakfast with cocoa or placebo, and blood sample collection [glucose, insulin, lipids, and high-sensitivity C-reactive protein (hsCRP)] and vascular measurements were conducted at 0.5, 1, 2, 4, and 6 h postprandially on each study day. Insulin resistance was evaluated by homeostasis model assessment.
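
The abstract states only that insulin resistance was evaluated by homeostasis model assessment. Assuming the classic HOMA1-IR formula was used (the abstract does not specify HOMA1 versus the computer-based HOMA2 model), the sketch below (Python) shows the calculation; the glucose and insulin values in the example are hypothetical, not trial data.

    # HOMA1-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5
    def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
        return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

    def mgdl_to_mmoll(glucose_mg_dl: float) -> float:
        """Convert glucose from mg/dL to mmol/L (divide by 18.0)."""
        return glucose_mg_dl / 18.0

    # Example with hypothetical fasting values
    glucose = mgdl_to_mmoll(140.0)   # about 7.8 mmol/L
    insulin = 15.0                   # uU/mL (numerically equal to mU/L)
    print(f"HOMA-IR = {homa_ir(glucose, insulin):.1f}")  # about 5.2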

RESULTS: Over the 6-h study, and specifically at 1 and 4 h, cocoa increased HDL cholesterol compared with placebo (overall Δ: 1.5 ± 0.8 mg/dL; P ≤ 0.01) but had no effect on total or LDL cholesterol, triglycerides, glucose, or hsCRP. Cocoa increased serum insulin concentrations overall (Δ: 5.2 ± 3.2 mU/L; P < 0.05) and specifically at 4 h, but had no overall effect on insulin resistance (except at 4 h, P < 0.05), systolic or diastolic blood pressure, or small artery elasticity. However, large artery elasticity was lower overall after cocoa than after placebo (Δ: -1.6 ± 0.7 mL/mm Hg; P < 0.05), with the difference significant only at 2 h.

CONCLUSION: Acute cocoa supplementation showed no clear overall benefit in T2D patients after a high-fat fast-food-style meal challenge. Although HDL cholesterol and insulin remained higher throughout the 6-h postprandial period, an overall decrease in large artery elasticity was found after cocoa consumption. This trial was registered at clinicaltrials.gov as NCT01886989.

Relevance:

20.00%

Publisher:

Abstract:

Rationale: IL-17A is purported to help drive early pathogenesis in acute respiratory distress syndrome (ARDS) by enhancing neutrophil recruitment. Whilst IL-17A is the archetypal cytokine of T helper 17 (Th17) cells, it is produced by a number of lymphocyte populations, and its cellular source during ARDS is unknown.

Objectives: To identify the cellular source and the role of IL-17A in the early phase of lung injury.

Methods: Lung injury was induced in wild-type (WT; C57BL/6) and IL-17A knockout (KO) mice with aerosolised LPS (100 µg) or Pseudomonas aeruginosa infection. Detailed phenotyping of the cells expressing RORγt, the transcriptional regulator of IL-17 production, in the mouse lung at 24 hours was carried out by flow cytometry.

Measurements and Main Results: A 100-fold reduction in neutrophil infiltration was observed in the lungs of IL-17A KO mice compared with WT mice. The majority of RORγt+ cells in the mouse lung were the recently identified type 3 innate lymphoid cells (ILC3s). Detailed characterisation revealed these pulmonary ILC3s (pILC3s) to be discrete from those described in the gut. The critical role of these cells was verified by inducing injury in Rag2 KO mice, which lack T cells but retain ILCs. No amelioration of pathology was observed in the Rag2 KO mice, indicating that T cells are dispensable for this early response.

Conclusions: IL-17 is rapidly produced during lung injury and contributes significantly to early immunopathogenesis. This is orchestrated largely by a distinct population of pILC3s. Modulating pILC3 activity may enable early control of the inflammatory dysregulation seen in ARDS, opening up new therapeutic targets.

Relevance:

20.00%

Publisher:

Abstract:

Aim: The aim of the study is to evaluate the factors that enable or constrain the implementation and service delivery of early warning systems and acute care training in practice.

Background: To date there is limited evidence to support the effectiveness of acute care initiatives (early warning systems, acute care training, outreach) in reducing the number of adverse events (cardiac arrest, death, unanticipated Intensive Care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reasons posited are that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme without considering the social factors (the organisation, the people, external influences) which may have affected the process of implementation and hence the measured end-points. Further research which considers these social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4].

Method: The design is a multiple case study of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships, and institutional balance and infrastructures in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period to ascertain whether ALERT principles were followed.

Discussion & progress to date: This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings from this study will provide evidence for policy-makers to make informed decisions regarding the direction of strategic and service planning of acute care services, to improve the level of care provided to acutely ill patients in hospital.

References:
1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidian A, et al. Investigating the effectiveness of Critical Care Outreach Services: a systematic review. Intensive Care Medicine 2006; 32: 1713-1721.
2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com
3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: a systematic review. Critical Care Medicine 2007; 35(5): 1238-43.
4. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.

Relevance:

20.00%

Publisher:

Abstract:

Statement of purpose: The purpose of this concurrent session is to present the main findings and recommendations from a five-year study evaluating the implementation of Early Warning Systems (EWS) and the Acute Life-threatening Events: Recognition and Treatment (ALERT) course in Northern Ireland. The presentation will provide delegates with an understanding of the factors that enable and constrain successful implementation of EWS and ALERT in practice, in order to provide an impetus for change.

Methods: The research design was a multiple case study of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine rapid response system (RRS) programme theory [1]. The stages included identifying the programme theories underpinning EWS and ALERT, generating hypotheses, gathering empirical evidence, and refining the programme theories. This approach used a variety of mixed methods, including individual and focus group interviews, observation, and documentary analysis of EWS compliance data and ALERT training records. A within- and across-case comparison facilitated the development of mid-range theories from the research evidence.

Results: The official RRS theories developed from the realist synthesis were critically evaluated and compared with the study findings to develop a mid-range theory explaining what works, for whom, and in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of EWS and ALERT implementation. Each concept is presented as 'context, mechanism and outcome' configurations to provide an understanding of how context impacts on individual reasoning or behaviour to produce certain outcomes.

Conclusion: These findings highlight the combination of factors that can improve the implementation and sustainability of EWS and ALERT, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for future policy development.

References:
1. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.

Type of submission: Concurrent session
Source of funding: Sandra Ryan Fellowship, funded by the School of Nursing & Midwifery, Queen's University Belfast

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Acute promyelocytic leukaemia is a chemotherapy-sensitive subgroup of acute myeloid leukaemia characterised by the presence of the PML-RARA fusion transcript. The present standard of care, chemotherapy and all-trans retinoic acid (ATRA), results in a high proportion of patients being cured. In this study, we compare a chemotherapy-free ATRA and arsenic trioxide treatment regimen with the standard chemotherapy-based regimen (ATRA and idarubicin) in both high-risk and low-risk patients with acute promyelocytic leukaemia.

METHODS: In the randomised, controlled, multicentre AML17 trial, eligible patients (aged ≥16 years) with acute promyelocytic leukaemia, confirmed by the presence of the PML-RARA transcript and without significant cardiac or pulmonary comorbidities or active malignancy, and who were not pregnant or breastfeeding, were enrolled from 81 UK hospitals and randomised 1:1 to receive treatment with ATRA and arsenic trioxide or ATRA and idarubicin. ATRA was given to participants in both groups in a daily divided oral dose of 45 mg/m² until remission, or until day 60, and then in a 2 weeks on-2 weeks off schedule. In the ATRA and idarubicin group, idarubicin was given intravenously at 12 mg/m² on days 2, 4, 6, and 8 of course 1, and then at 5 mg/m² on days 1-4 of course 2; mitoxantrone at 10 mg/m² on days 1-4 of course 3; and idarubicin at 12 mg/m² on day 1 of the final (fourth) course. In the ATRA and arsenic trioxide group, arsenic trioxide was given intravenously at 0·3 mg/kg on days 1-5 of each course, and at 0·25 mg/kg twice weekly in weeks 2-8 of course 1 and weeks 2-4 of courses 2-5. High-risk patients (those presenting with a white blood cell count >10 × 10⁹ cells per L) could receive an initial dose of the immunoconjugate gemtuzumab ozogamicin (6 mg/m² intravenously). Neither maintenance treatment nor CNS prophylaxis was given to patients in either group. All patients were monitored by real-time quantitative PCR. Allocation was by central computer minimisation, stratified by age, performance status, and de-novo versus secondary disease. The primary endpoint was quality of life on the European Organisation for Research and Treatment of Cancer (EORTC) QLQ-C30 global health status. All analyses are by intention to treat. This trial is registered with the ISRCTN registry, number ISRCTN55675535.
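
The regimen doses above are expressed per square metre of body surface area (BSA). As a purely illustrative aid, and assuming the Mosteller BSA formula (the AML17 protocol may specify a different formula or local rounding rules), the sketch below (Python) converts a per-m² dose to an absolute dose; the patient height and weight are hypothetical.

    # Convert a dose in mg/m2 to an absolute dose using the Mosteller BSA formula.
    # Formula choice and patient measurements are assumptions, not AML17 protocol details.
    import math

    def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
        """BSA (m^2) = sqrt(height_cm * weight_kg / 3600)."""
        return math.sqrt(height_cm * weight_kg / 3600.0)

    def absolute_dose(dose_per_m2: float, height_cm: float, weight_kg: float) -> float:
        return dose_per_m2 * bsa_mosteller(height_cm, weight_kg)

    # Example: ATRA 45 mg/m2 per day for a hypothetical 175 cm, 80 kg patient
    print(f"BSA = {bsa_mosteller(175, 80):.2f} m^2")                    # about 1.97 m^2
    print(f"ATRA ~ {absolute_dose(45, 175, 80):.0f} mg/day, in divided doses")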

FINDINGS: Between May 8, 2009, and Oct 3, 2013, 235 patients were enrolled and randomly assigned to ATRA and idarubicin (n=119) or ATRA and arsenic trioxide (n=116). Participants had a median age of 47 years (range 16-77; IQR 33-58) and included 57 high-risk patients. Quality of life did not differ significantly between the treatment groups (EORTC QLQ-C30 global functioning effect size 2·17 [95% CI -2·79 to 7·12]; p=0·39). Overall, 57 patients in the ATRA and idarubicin group and 40 patients in the ATRA and arsenic trioxide group reported grade 3-4 toxicities. After course 1 of treatment, grade 3-4 alopecia was reported in 23 (23%) of 98 patients in the ATRA and idarubicin group versus 5 (5%) of 95 in the ATRA and arsenic trioxide group, raised liver alanine transaminase in 11 (10%) of 108 versus 27 (25%) of 109, and oral toxicity in 22 (19%) of 115 versus 1 (1%) of 109. After course 2 of treatment, grade 3-4 alopecia was reported in 25 (28%) of 89 patients in the ATRA and idarubicin group versus 2 (3%) of 77 in the ATRA and arsenic trioxide group; no other toxicities reached the 10% level. Patients in the ATRA and arsenic trioxide group required significantly less supportive care in most respects than did those in the ATRA and idarubicin group.

INTERPRETATION: ATRA and arsenic trioxide is a feasible treatment in both low-risk and high-risk patients with acute promyelocytic leukaemia. It achieved a high cure rate, less relapse than ATRA and idarubicin, survival not different from that with ATRA and idarubicin, and a low incidence of liver toxicity. However, no improvement in quality of life was seen.