112 results for Hospitals
Abstract:
Blending Art and Science in Nurse Education: The Benefits and Impact of Creative Partnerships
This paper presents the benefits of an innovative education partnership between lecturers from the School of Nursing and Midwifery, Queen's University Belfast, and Arts Care, a unique Arts and Health Charity in Northern Ireland, to engage nursing students in the life sciences.
Nursing and Midwifery students often struggle to engage with life science modules because they lack confidence in their ability to study science. This project was funded by a Teaching Innovation Award from the School of Nursing and Midwifery, Queen's University Belfast, to explore creative ways of engaging year one undergraduate nursing students in learning anatomy and physiology. The project was facilitated through collaboration between teaching staff from the School of Nursing and Midwifery and Arts Care, Northern Ireland. This unique Arts and Health Charity believes in the benefits of creativity to well-being.
RESEARCH OBJECTIVE(S)
To explore creative ways of engaging year one undergraduate nursing students in learning anatomy and physiology.
METHODS AND METHODOLOGY
Students participated in a series of workshops designed to explore the cells, tissues and organs of the human body through the medium of felt. Facilitated by an Arts Care artist, and following self-directed preparation, students discussed and translated their learning of the cells, tissues and organs of the human body into striking felt images. During the project, students kept a reflective journal of their experience to document how participation in the project enhanced their learning and professional development.
RESULTS
Creativity transformed and brought to life the students' learning of the cells, tissues and organs of the human body.
The project culminated in a unique body of artwork which has been exhibited across Northern Ireland in hospitals and galleries and viewed by fellow students, teaching staff, nurses from practice, artists, friends, family and members of the public.
CONCLUSION
The impact of creative learning strategies in nurse education should be further explored.
Abstract:
Introduction: In this cohort study, we explored the relationship between fluid balance, intradialytic hypotension and outcomes in critically ill patients with acute kidney injury (AKI) who received renal replacement therapy (RRT).
Methods: We analysed prospectively collected registry data on patients older than 16 years who received RRT for at least two days in an intensive care unit at two university-affiliated hospitals. We used multivariable logistic regression to determine the relationship between mean daily fluid balance and intradialytic hypotension, both over seven days following RRT initiation, and the outcomes of hospital mortality and RRT dependence in survivors.
Results: In total, 492 patients were included (299 male (60.8%), mean (standard deviation (SD)) age 62.9 (16.3) years); 251 (51.0%) died in hospital. Independent risk factors for mortality were mean daily fluid balance (odds ratio (OR) 1.36 per 1000 mL positive (95% confidence interval (CI) 1.18 to 1.57)), intradialytic hypotension (OR 1.14 per 10% increase in days with intradialytic hypotension (95% CI 1.06 to 1.23)), age (OR 1.15 per five-year increase (95% CI 1.07 to 1.25)), maximum sequential organ failure assessment score on days 1 to 7 (OR 1.21 (95% CI 1.13 to 1.29)), and Charlson comorbidity index (OR 1.28 (95% CI 1.14 to 1.44)); higher baseline creatinine (OR 0.98 per 10 μmol/L (95% CI 0.97 to 0.996)) was associated with lower risk of death. Of 241 hospital survivors, 61 (25.3%) were RRT dependent at discharge. The only independent risk factor for RRT dependence was pre-existing heart failure (OR 3.13 (95% CI 1.46 to 6.74)). Neither mean daily fluid balance nor intradialytic hypotension was associated with RRT dependence in survivors. Associations between these exposures and mortality were similar in sensitivity analyses accounting for immortal time bias and dichotomising mean daily fluid balance as positive or negative. In the subgroup of patients with data on pre-RRT fluid balance, fluid overload at RRT initiation did not modify the association of mean daily fluid balance with mortality.
Conclusions: In this cohort of patients with AKI requiring RRT, a more positive mean daily fluid balance and intradialytic hypotension were associated with hospital mortality but not with RRT dependence at hospital discharge in survivors.
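To make the reported effect sizes concrete, the following sketch (illustrative only, not the study's actual analysis) fits a multivariable logistic regression for hospital mortality on synthetic data and converts the coefficients into odds ratios with 95% confidence intervals. The variable names, effect sizes and sample size are assumptions chosen for demonstration.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
fluid_balance_l = rng.normal(1.0, 1.5, n)   # mean daily fluid balance in litres (1 L = 1000 mL)
hypotension_pct = rng.uniform(0, 100, n)    # % of days with intradialytic hypotension
age = rng.normal(63, 16, n)                 # years

# Hypothetical data-generating process for hospital mortality
logit = -3 + 0.3 * fluid_balance_l + 0.013 * hypotension_pct + 0.03 * (age - 63)
mortality = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([fluid_balance_l, hypotension_pct, age]))
fit = sm.Logit(mortality, X).fit(disp=False)

# Exponentiated coefficients give odds ratios per unit of each covariate
labels = ["intercept", "fluid balance (per 1000 mL positive)",
          "intradialytic hypotension (per % of days)", "age (per year)"]
for label, coef, (lo, hi) in zip(labels, fit.params, fit.conf_int()):
    print(f"{label}: OR {np.exp(coef):.2f} (95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f})")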
Abstract:
Background
Despite the recognized importance of end-of-life (EOL) communication between patients and physicians, the extent and quality of such communication are lacking.
Objective
We sought to understand patient perspectives on physician behaviours during EOL communication.
Design
In this mixed methods study, we conducted quantitative and qualitative strands and then merged data sets during a mixed methods analysis phase. In the quantitative strand, we used the quality of communication tool (QOC) to measure physician behaviours that predict global rating of satisfaction in EOL communication skills, while in the qualitative strand we conducted semi-structured interviews. During the mixed methods analysis, we compared and contrasted qualitative and quantitative data.
Setting and Participants
Seriously ill inpatients at three tertiary care hospitals in Canada.
Results
We found convergence between qualitative and quantitative strands: patients desire candid information from their physician and a sense of familiarity. The quantitative results (n = 132) suggest a paucity of certain EOL communication behaviours in this seriously ill population with a limited prognosis. The qualitative findings (n = 16) suggest that at times, physicians did not engage in EOL communication despite patient readiness, while sometimes this may represent an appropriate deferral after assessment of a patient's lack of readiness.
Conclusions
Avoidance of certain EOL topics may not always be a failure if it is a result of an assessment of lack of patient readiness. This has implications for future tool development: a measure could be built in to assess whether physician behaviours align with patient readiness.
Abstract:
Past nuclear disasters, such as the atomic bombings in 1945 and major accidents at nuclear power plants, have highlighted similarities in potential public health effects of radiation in both circumstances, including health issues unrelated to radiation exposure. Although the rarity of nuclear disasters limits opportunities to undertake rigorous research on evidence-based interventions and strategies, identification of lessons learned and development of an effective plan to protect the public, minimise negative effects, and protect emergency workers from exposure to high-dose radiation are important. Additionally, research is needed to help decision makers to avoid premature deaths among patients already in hospitals and other vulnerable groups during evacuation. Since nuclear disasters can affect hundreds of thousands of people, a substantial number of people are at risk of physical and mental harm in each disaster. During the recovery period after a nuclear disaster, physicians might need to screen for psychological burdens and provide general physical and mental health care for many affected residents who might experience long-term displacement. Reliable communication of personalised risks has emerged as a challenge for health-care professionals beyond the need to explain radiation protection. To overcome difficulties of risk communication and provide decision aids to protect workers, vulnerable people, and residents after a nuclear disaster, physicians should receive training in nuclear disaster response. This training should include evidence-based interventions, support decisions to balance potential harms and benefits, and take account of scientific uncertainty in provision of community health care. An open and joint learning process is essential to prepare for, and minimise the effects of, future nuclear disasters.
Abstract:
This paper presents multilevel models that utilize the Coxian phase-type distribution in order to include a survival component in the model. The approach is demonstrated by modeling patient length of stay and in-hospital mortality in geriatric wards in Italy. The multilevel model provides a means of controlling for possible intra-ward correlations, which may make patients within a hospital more alike in terms of experienced outcome than patients coming from different hospitals, everything else being equal. Within this multilevel model we introduce the use of the Coxian phase-type distribution to create a covariate that represents patient length of stay or stage (of hospital care). Results demonstrate that the use of the multilevel model for representing in-patient mortality is successful and further enhanced by the inclusion of the Coxian phase-type distribution variable (stage covariate).
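As a rough illustration of the distribution used above (hypothetical rates, not the authors' fitted model), the Python sketch below simulates lengths of stay from a three-phase Coxian phase-type distribution: a patient moves through ordered phases of hospital care and, from each phase, either progresses to the next phase or leaves care (absorption). The phase in which the stay ends plays the role of the 'stage of hospital care' covariate described in the abstract.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical rates for a three-phase Coxian model
advance = [0.50, 0.20]        # rate of moving from phase i to phase i + 1 (per day)
absorb = [0.10, 0.05, 0.15]   # rate of leaving hospital care from phase i (per day)

def simulate_length_of_stay():
    t, phase = 0.0, 0
    while True:
        lam = advance[phase] if phase < len(advance) else 0.0
        mu = absorb[phase]
        t += rng.exponential(1.0 / (lam + mu))   # exponential holding time in this phase
        if rng.random() < mu / (lam + mu):       # absorbed: the stay ends here
            return t, phase + 1                  # length of stay and stage reached
        phase += 1                               # otherwise progress to the next phase

samples = [simulate_length_of_stay() for _ in range(10_000)]
los = np.array([s[0] for s in samples])
print(f"mean length of stay {los.mean():.1f} days, median {np.median(los):.1f} days")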
Abstract:
Institutions (and how they work) have long been the object of many investigations in the fields of media, cultural, and organizational studies. More recently, there has been a “linguistic” turn in the study of institutions with many language-focused explorations of how power and discourse may function in specific institutional and organizational settings, such as schools, courtrooms, corporations, clinics, hospitals, and prisons. Many of these studies have been concerned with the ways in which language is used to create and shape institutions and how institutions in turn have the capacity to create, shape, and impose discourses on people. Institutions thus have considerable control over the organizing of our routine experiences of the world and the way we classify that world. They also have the power to foster particular kinds of identities to suit their own purposes.
Abstract:
Background: To study the differences in ophthalmology resident training between China and the Hong Kong Special Administrative Region (HKSAR). Methods: Training programs were selected from among the largest and best-known teaching hospitals. Ophthalmology residents were sent an anonymous 48-item questionnaire by mail. Work satisfaction, time allocation between training activities and volume of surgery performed were determined. Results: 50/75 residents (66.7 %) from China and 20/26 (76.9 %) from HKSAR completed the survey. Age (28.9 ± 2.5 vs. 30.2 ± 2.9 years, p = 0.15) and number of years in training (3.4 ± 1.6 vs. 2.8 ± 1.5, p = 0.19) were comparable between groups. The number of cataract procedures performed by HKSAR trainees (extra-capsular, median 80.0, quartile range: 30.0, 100.0; phacoemulsification, median: 20.0, quartile range: 0.0, 100.0) exceeded that for Chinese residents (extra-capsular: median = 0, p < 0.0001; phacoemulsification: median = 0, p < 0.0001). Chinese trainees spent more time completing medical charts (>50 % of time on charts: 62.5 % versus 5.3 %, p < 0.0001) and received less supervision (≥90 % of training supervised: 4.4 % versus 65 %, p < 0.0001). Chinese residents were more likely to feel underpaid (96.0 % vs. 31.6 %, p < 0.0001) and to hope their children would not practice medicine (69.4 % vs. 5.0 %, p = 0.0001) compared with HKSAR residents. Conclusions: In this study, ophthalmology residents in China report strikingly less surgical experience and supervision, and lower satisfaction, than HKSAR residents. The HKSAR model of hands-on resident training might be useful in improving the low cataract surgical rate in China.
Abstract:
Purpose: As resident work hours policies evolve, residents’ off-duty time remains poorly understood. Despite assumptions about how residents should be using their postcall, off-duty time, there is little research on how residents actually use this time and the reasoning underpinning their activities. This study sought to understand residents’ nonclinical postcall activities when they leave the hospital, their decision-making processes, and their perspectives on the relationship between these activities and their well-being or recovery.
Method: The study took place at a Liaison Committee on Medical Education–accredited Canadian medical school from 2012 to 2014. The authors recruited a purposive and convenience sample of postgraduate year 1–5 residents from six surgical and nonsurgical specialties at three hospitals affiliated with the medical school. Using a constructivist grounded theory approach, semistructured interviews were conducted, audio-taped, transcribed, anonymized, and combined with field notes. The authors analyzed interview transcripts using constant comparative analysis and performed post hoc member checking.
Results: Twenty-four residents participated. Residents characterized their predominant approach to postcall decision making as one of making trade-offs between multiple, competing, seemingly incompatible, but equally valuable, activities. Participants exhibited two different trade-off orientations: being oriented toward maintaining a normal life or toward mitigating fatigue.
Conclusions: The authors’ findings on residents’ trade-off orientations suggest a dual recovery model with postcall trade-offs motivated by the recovery of sleep or of self. This model challenges the dominant viewpoint in the current duty hours literature and suggests that the duty hours discussion must be broadened to include other recovery processes.
Abstract:
INTRODUCTION: Following the introduction of work-hour restrictions, residents' workload has become an important theme in postgraduate training. The efficacy of restrictions on workload, however, remains controversial, as most research has only examined objective workload. The purpose of this study was to explore the less clearly understood component of subjective workload and, in particular, the factors that influenced residents' subjective workload.
METHOD: This study was conducted in Japan at three community teaching hospitals. We recruited a convenience sample of 31 junior residents in seven focus groups at the three sites. Audio-recorded and transcribed data were read iteratively and analyzed thematically, identifying, analyzing and reporting themes within the data and developing an interpretive synthesis of the topic.
RESULTS: Seven factors influenced residents' subjective workload: (1) interaction within the professional community, (2) feedback from patients, (3) being in control, (4) professional development, (5) private life, (6) interest and (7) protected free time.
DISCUSSION AND CONCLUSION: Our findings indicate that residents who have good interaction with colleagues and patients, are competent enough to control their work, experience personal development through working, have greater interest in their work, and have fulfilling private lives will have the least subjective workload.
Abstract:
Background: Rapid Response Systems (RRS) have been implemented nationally and internationally to improve patient safety in hospital. However, to date the majority of the RRS research evidence has focused on measuring the effectiveness of the intervention on patient outcomes. To evaluate RRS, it has been recommended that a multimodal approach is needed to address the broad range of process and outcome measures required to determine the effectiveness of the RRS concept.
Aim: The aim of this paper is to evaluate the official RRS programme theory's assumptions about how the programme is meant to work against actual practice, in order to determine what works.
Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory [1]. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews with a purposive sample of 75 nurses and doctors, observation of ward practices and documentary analysis. The findings from the case studies were analysed and compared within and across cases to identify what works for whom and in what circumstances.
Results: The RRS programme theories were critically evaluated and compared with study findings to develop a mid-range theory to explain what works, for whom, in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of RRS implementation.
Conclusion: These findings highlight the combination of factors that can improve the implementation of RRS, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for their success and sustainability.
References: 1. Pawson R and Tilley N. (1997) Realistic Evaluation. Sage Publications; London.
Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University of Belfast.
Abstract:
Realistic Evaluation of EWS and ALERT: factors enabling and constraining implementation
Background: The implementation of EWS and ALERT in practice is essential to the success of Rapid Response Systems but is dependent upon nurses utilising EWS protocols and applying ALERT best practice guidelines. To date there is limited evidence on the effectiveness of EWS or ALERT, as research has primarily focused on measuring patient outcomes (cardiac arrests, ICU admissions) following the implementation of a Rapid Response Team. Complex interventions in healthcare aimed at changing service delivery and the related behaviour of health professionals require a different research approach to evaluate the evidence. To understand how and why EWS and ALERT work, or might not work, research needs to consider the social, cultural and organisational influences that will impact on successful implementation in practice. This requires a research approach that considers both the processes and outcomes of complex interventions, such as EWS and ALERT, implemented in practice. Realistic Evaluation is such an approach and was used to explain the factors that enable and constrain the implementation of EWS and ALERT in practice [1].
Aim: The aim of this study was to evaluate factors that enabled and constrained the implementation and service delivery of early warning systems (EWS) and ALERT in practice, in order to provide direction for enabling their success and sustainability.
Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews, observation and documentary analysis in a two-stage process. A purposive sample of 75 key informants participated in individual and focus group interviews. Observation and documentary analysis of EWS compliance data and ALERT training records provided further evidence to support or refute the interview findings. Data were analysed using NVIVO8 to categorise interview findings and SPSS for ALERT documentary data. These findings were further synthesised by undertaking a within and cross case comparison to explain the factors enabling and constraining EWS and ALERT.
Results: A cross case analysis highlighted similarities, differences and factors enabling or constraining successful implementation across the case study sites. Findings showed that personal (confidence; clinical judgement; personality), social (ward leadership; communication), organisational (workload and staffing issues; pressure from managers to complete EWS audits and targets), educational (constraints on training; no clinical educator on ward) and cultural (routine task delegated) influences impact on EWS and acute care training outcomes. There were also differences noted between medical and surgical wards across both case sites.
Conclusions: Realist Evaluation allows refinement and development of the RRS programme theory to explain the realities of practice. These refined RRS programme theories are capable of informing the planning of future service provision and provide direction for enabling their success and sustainability.
References: 1. McGaughey J, Blackwood B, O'Halloran P, Trinder T.J. & Porter S. (2010) A realistic evaluation of Track and Trigger systems and acute care training for early recognition and management of deteriorating ward-based patients. Journal of Advanced Nursing 66 (4), 923-932.
Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University of Belfast.
Abstract:
Aim: The aim of the study is to evaluate factors that enable or constrain the implementation and service delivery of early warning systems and acute care training in practice.
Background: To date there is limited evidence to support the effectiveness of acute care initiatives (early warning systems, acute care training, outreach) in reducing the number of adverse events (cardiac arrest, death, unanticipated Intensive Care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reasons posited are that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme without considering the social factors (the organisation, the people, external influences) which may have affected the process of implementation and hence the measured end-points. Further research which considers the social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4].
Method: The design is a multiple case study approach of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships and institutional balance and infrastructures in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period to ascertain whether ALERT principles were followed.
Discussion & progress to date: This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings from this study will provide evidence for policy-makers to make informed decisions regarding the direction for strategic and service planning of acute care services to improve the level of care provided to acutely ill patients in hospital.
References:
1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidian A et al. Investigating the effectiveness of Critical Care Outreach Services: A systematic review. Intensive Care Medicine 2006; 32: 1713-1721.
2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com
3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: A systematic review. Critical Care Medicine 2007; 35(5): 1238-43.
4. Pawson R and Tilley N. Realistic Evaluation. London: Sage; 1997.
Abstract:
Statement of purpose: The purpose of this concurrent session is to present the main findings and recommendations from a five-year study evaluating the implementation of Early Warning Systems (EWS) and the Acute Life-threatening Events: Recognition and Treatment (ALERT) course in Northern Ireland. The presentation will provide delegates with an understanding of the factors that enable and constrain successful implementation of EWS and ALERT in practice, in order to provide an impetus for change.
Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory [1]. The stages included identifying the programme theories underpinning EWS and ALERT, generating hypotheses, gathering empirical evidence and refining the programme theories. This approach used a variety of mixed methods including individual and focus group interviews, observation and documentary analysis of EWS compliance data and ALERT training records. A within and across case comparison facilitated the development of mid-range theories from the research evidence.
Results: The official RRS theories developed from the realist synthesis were critically evaluated and compared with the study findings to develop a mid-range theory to explain what works, for whom, in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of EWS and ALERT implementation. Each concept is presented as ‘context, mechanism and outcome configurations’ to provide an understanding of how context impacts on individual reasoning or behaviour to produce certain outcomes.
Conclusion: These findings highlight the combination of factors that can improve the implementation and sustainability of EWS and ALERT, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for future policy development.
References: 1. Pawson R and Tilley N. (1997) Realistic Evaluation. Sage Publications; London.
Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University of Belfast.
Abstract:
Symposium Chair: Dr Jennifer McGaughey
Title: Early Warning Systems: problems, pragmatics and potential
Early Warning Systems (EWS) provide a mechanism for staff to recognise, refer and manage deteriorating patients on general hospital wards. Implementation of EWS in practice has required considerable change in the delivery of critical care across hospitals. Drawing on their experience of these changes, the authors will demonstrate the problems and potential of using EWS to improve patient outcomes.
The first paper (Dr Jennifer McGaughey: Early Warning Systems: what works?) reviews the research evidence regarding the factors that support or constrain the implementation of Early Warning Systems (EWS) in practice. These findings explain the processes which impact on the successful achievement of patient outcomes. In order to improve detection and standardise practice, National EWS have been implemented in the United Kingdom. The second paper (Catherine Plowright: The implementation of the National EWS in a District General Hospital) focuses on the process of implementing and auditing a National EWS. This process improvement is essential to contribute to future collaborative research and the collection of robust datasets to improve patient safety, as recommended by the Royal College of Physicians (RCP 2012). Successful implementation of NEWS in practice requires strategic planning and staff education. The practical issues of training staff are discussed in the third paper (Collette Laws-Chapman: Simulation as a modality to embed the use of Early Warning Systems), which focuses on using simulation and structured debrief to enhance learning in the early recognition and management of deteriorating patients. This session emphasises the importance of cognitive and social skills developed alongside practical skills in the simulated setting.
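As background to the symposium papers above, the sketch below shows in schematic form how an early-warning aggregate score can be computed from routine vital-sign observations. The scoring bands and points are hypothetical placeholders rather than the published Royal College of Physicians NEWS thresholds, and the trigger level for escalation would be set locally.

from dataclasses import dataclass

@dataclass
class Observations:
    respiratory_rate: int   # breaths per minute
    systolic_bp: int        # mmHg
    heart_rate: int         # beats per minute
    temperature: float      # degrees Celsius

def band_score(value, bands):
    # Return the points for the first band whose (low, high) range contains the value;
    # values outside every band score 3 (highest concern) in this illustrative scheme.
    for low, high, points in bands:
        if low <= value <= high:
            return points
    return 3

def early_warning_score(obs: Observations) -> int:
    # Hypothetical scoring bands: (low, high, points) per vital sign
    score = 0
    score += band_score(obs.respiratory_rate, [(12, 20, 0), (9, 11, 1), (21, 24, 2)])
    score += band_score(obs.systolic_bp, [(111, 219, 0), (101, 110, 1), (91, 100, 2)])
    score += band_score(obs.heart_rate, [(51, 90, 0), (41, 50, 1), (91, 110, 1), (111, 130, 2)])
    score += band_score(obs.temperature, [(36.1, 38.0, 0), (35.1, 36.0, 1), (38.1, 39.0, 1)])
    return score

# Example: an aggregate score above a locally agreed trigger would prompt escalation
print(early_warning_score(Observations(respiratory_rate=24, systolic_bp=95, heart_rate=118, temperature=38.6)))

The point of the schematic is that an EWS reduces a set of observations to a single aggregate score with agreed escalation triggers, which is what makes compliance auditable and training such as ALERT focusable on recognition and response.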
Abstract:
BACKGROUND: Acute promyelocytic leukaemia is a chemotherapy-sensitive subgroup of acute myeloid leukaemia characterised by the presence of the PML-RARA fusion transcript. The present standard of care, chemotherapy and all-trans retinoic acid (ATRA), results in a high proportion of patients being cured. In this study, we compare a chemotherapy-free ATRA and arsenic trioxide treatment regimen with the standard chemotherapy-based regimen (ATRA and idarubicin) in both high-risk and low-risk patients with acute promyelocytic leukaemia.
METHODS: In the randomised, controlled, multicentre, AML17 trial, eligible patients (aged ≥16 years) with acute promyelocytic leukaemia, confirmed by the presence of the PML-RARA transcript and without significant cardiac or pulmonary comorbidities or active malignancy, and who were not pregnant or breastfeeding, were enrolled from 81 UK hospitals and randomised 1:1 to receive treatment with ATRA and arsenic trioxide or ATRA and idarubicin. ATRA was given to participants in both groups in a daily divided oral dose of 45 mg/m² until remission, or until day 60, and then in a 2 weeks on-2 weeks off schedule. In the ATRA and idarubicin group, idarubicin was given intravenously at 12 mg/m² on days 2, 4, 6, and 8 of course 1, and then at 5 mg/m² on days 1-4 of course 2; mitoxantrone at 10 mg/m² on days 1-4 of course 3, and idarubicin at 12 mg/m² on day 1 of the final (fourth) course. In the ATRA and arsenic trioxide group, arsenic trioxide was given intravenously at 0·3 mg/kg on days 1-5 of each course, and at 0·25 mg/kg twice weekly in weeks 2-8 of course 1 and weeks 2-4 of courses 2-5. High-risk patients (those presenting with a white blood cell count >10 × 10⁹ cells per L) could receive an initial dose of the immunoconjugate gemtuzumab ozogamicin (6 mg/m² intravenously). Neither maintenance treatment nor CNS prophylaxis was given to patients in either group. All patients were monitored by real-time quantitative PCR. Allocation was by central computer minimisation, stratified by age, performance status, and de-novo versus secondary disease. The primary endpoint was quality of life on the European Organisation for Research and Treatment of Cancer (EORTC) QLQ-C30 global health status. All analyses are by intention to treat. This trial is registered with the ISRCTN registry, number ISRCTN55675535.
FINDINGS: Between May 8, 2009, and Oct 3, 2013, 235 patients were enrolled and randomly assigned to ATRA and idarubicin (n=119) or ATRA and arsenic trioxide (n=116). Participants had a median age of 47 years (range 16-77; IQR 33-58) and included 57 high-risk patients. Quality of life did not differ significantly between the treatment groups (EORTC QLQ-C30 global functioning effect size 2·17 [95% CI -2·79 to 7·12; p=0·39]). Overall, 57 patients in the ATRA and idarubicin group and 40 patients in the ATRA and arsenic trioxide group reported grade 3-4 toxicities. After course 1 of treatment, grade 3-4 alopecia was reported in 23 (23%) of 98 patients in the ATRA and idarubicin group versus 5 (5%) of 95 in the ATRA and arsenic trioxide group, raised liver alanine transaminase in 11 (10%) of 108 versus 27 (25%) of 109, and oral toxicity in 22 (19%) of 115 versus one (1%) of 109. After course 2 of treatment, grade 3-4 alopecia was reported in 25 (28%) of 89 patients in the ATRA and idarubicin group versus 2 (3%) of 77 in the ATRA and arsenic trioxide group; no other toxicities reached the 10% level. Patients in the ATRA and arsenic trioxide group had significantly less requirement for most aspects of supportive care than did those in the ATRA and idarubicin group.
INTERPRETATION: ATRA and arsenic trioxide is a feasible treatment in low-risk and high-risk patients with acute promyelocytic leukaemia, with a high cure rate, less relapse than ATRA and idarubicin, survival that was not different, and a low incidence of liver toxicity. However, no improvement in quality of life was seen.