Abstract:
Background
Treatments for open-angle glaucoma aim to prevent vision loss through lowering of intraocular pressure, but to our knowledge no placebo-controlled trials have assessed visual function preservation, and the observation periods of previous (unmasked) trials have typically been at least 5 years. We assessed vision preservation in patients given latanoprost compared with those given placebo.
Methods
In this randomised, triple-masked, placebo-controlled trial, we enrolled patients with newly diagnosed open-angle glaucoma at ten UK centres (tertiary referral centres, teaching hospitals, and district general hospitals). Eligible patients were randomly allocated (1:1) with a website-generated randomisation schedule, stratified by centre and with a permuted block design, to receive either latanoprost 0·005% (intervention group) or placebo (control group) eye drops. Drops were administered from identical bottles, once a day, to both eyes. The primary outcome was time to visual field deterioration within 24 months. Analyses were done in all individuals with follow-up data. The Data and Safety Monitoring Committee (DSMC) recommended stopping the trial on Jan 6, 2011 (last patient visit July, 2011), after an interim analysis, and suggested a change in primary outcome from the difference in proportions of patients with incident progression between groups to time to visual field deterioration within 24 months. This trial is registered, number ISRCTN96423140.
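As a rough illustration of the allocation procedure described above (1:1 randomisation, stratified by centre, with permuted blocks), the Python sketch below generates an independent schedule per centre. The block size of 4, the centre labels, and the function name are illustrative assumptions, not details from the trial, whose schedule was website-generated.

import random

def permuted_block_schedule(n_patients, block_size=4, seed=None):
    """Generate a 1:1 latanoprost/placebo allocation list using permuted blocks.

    Block size and labels are illustrative assumptions, not trial details.
    """
    rng = random.Random(seed)
    half = block_size // 2
    schedule = []
    while len(schedule) < n_patients:
        block = ["latanoprost"] * half + ["placebo"] * half
        rng.shuffle(block)            # permute each block independently
        schedule.extend(block)
    return schedule[:n_patients]

# Stratification by centre: an independent schedule per centre (hypothetical names).
centres = ["centre_01", "centre_02", "centre_03"]
schedules = {c: permuted_block_schedule(60, seed=i) for i, c in enumerate(centres)}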
Findings
We enrolled 516 individuals between Dec 1, 2006, and March 16, 2010. Baseline mean intraocular pressure was 19·6 mm Hg (SD 4·6) in 258 patients in the latanoprost group and 20·1 mm Hg (4·8) in 258 controls. At 24 months, mean reduction in intraocular pressure was 3·8 mm Hg (4·0) in 231 patients assessed in the latanoprost group and 0·9 mm Hg (3·8) in 230 patients assessed in the placebo group. Visual field preservation was significantly longer in the latanoprost group than in the placebo group: adjusted hazard ratio (HR) 0·44 (95% CI 0·28–0·69; p=0·0003). We noted 18 serious adverse events, none attributable to the study drug.
Interpretation
This is the first randomised placebo-controlled trial to show preservation of the visual field with an intraocular-pressure-lowering drug in patients with open-angle glaucoma. The study design enabled significant differences in vision to be assessed in a relatively short observation period.
Abstract:
Importance: Seriously ill hospitalized patients have identified communication and decision making about goals of care as high priorities for quality improvement in end-of-life care. Interventions to improve care are more likely to succeed if tailored to existing barriers.
Objective: To determine, from the perspective of hospital-based clinicians, (1) barriers impeding communication and decision making about goals of care with seriously ill hospitalized patients and their families and (2) their own willingness and the acceptability for other clinicians to engage in this process.
Design, Setting, and Participants: Multicenter survey of nurses, internal medicine residents, and staff physicians from participating medical teaching units at 13 university-based hospitals in 5 Canadian provinces.
Main Outcomes and Measures: Importance of 21 barriers to goals of care discussions rated on a 7-point scale (1 = extremely unimportant; 7 = extremely important).
Results: Between September 2012 and March 2013, questionnaires were returned by 1256 of 1617 eligible clinicians, for an overall response rate of 77.7% (512 of 646 nurses [79.3%], 484 of 634 residents [76.3%], 260 of 337 staff physicians [77.2%]). The following family member-related and patient-related factors were consistently identified by all 3 clinician groups as the most important barriers to goals of care discussions: family members' or patients' difficulty accepting a poor prognosis (mean [SD] score, 5.8 [1.2] and 5.6 [1.3], respectively), family members' or patients' difficulty understanding the limitations and complications of life-sustaining treatments (5.8 [1.2] for both groups), disagreement among family members about goals of care (5.8 [1.2]), and patients' incapacity to make goals of care decisions (5.6 [1.2]). Clinicians perceived their own skills and system factors as less important barriers. Participants viewed it as acceptable for all clinician groups to engage in goals of care discussions, including a role for advanced practice nurses, nurses, and social workers to initiate goals of care discussions and act as decision coaches.
Conclusions and Relevance: Hospital-based clinicians perceive family member-related and patient-related factors as the most important barriers to goals of care discussions. All health care professionals were viewed as playing important roles in addressing goals of care. These findings can inform the design of future interventions to improve communication and decision making about goals of care.
Abstract:
Objectives: To investigate the quality of end-of-life care for patients with metastatic non-small cell lung cancer (NSCLC). Design and participants: Retrospective cohort study of patients from first hospitalisation for metastatic disease until death, using hospital, emergency department and death registration data from Victoria, Australia, between 1 July 2003 and 30 June 2010. Main outcome measures: Emergency department and hospital use; aggressiveness of care including intensive care and chemotherapy in last 30 days; palliative and supportive care provision; and place of death. Results: Metastatic NSCLC patients underwent limited aggressive treatment such as intensive care (5%) and chemotherapy (< 1%) at the end of life; however, high numbers died in acute hospitals (42%) and 61% had a length of stay of greater than 14 days in the last month of life. Although 62% were referred to palliative care services, this occurred late in the illness. In a logistic regression model adjusted for year of metastasis, age, sex, metastatic site and survival, the odds ratio (OR) of dying in an acute hospital bed compared with death at home or in a hospice unit decreased with receipt of palliative care (OR, 0.25; 95% CI, 0.21–0.30) and multimodality supportive care (OR, 0.65; 95% CI, 0.56–0.75). Conclusion: Because early palliative care for patients with metastatic NSCLC is recommended, we propose that this group be considered a benchmark of quality end-of-life care. Future work is required to determine appropriate quality-of-care targets in this and other cancer patient cohorts, with particular focus on the timeliness of palliative care engagement.
Abstract:
Background: There is growing interest in the potential utility of real-time polymerase chain reaction (PCR) in diagnosing bloodstream infection by detecting pathogen deoxyribonucleic acid (DNA) in blood samples within a few hours. SeptiFast (Roche Diagnostics GmbH, Mannheim, Germany) is a multipathogen probe-based system targeting ribosomal DNA sequences of bacteria and fungi. It detects and identifies the commonest pathogens causing bloodstream infection. As background to this study, we report a systematic review of Phase III diagnostic accuracy studies of SeptiFast, which reveals uncertainty about its likely clinical utility based on widespread evidence of deficiencies in study design and reporting with a high risk of bias.
Objective: Determine the accuracy of SeptiFast real-time PCR for the detection of health-care-associated bloodstream infection, against standard microbiological culture.
Design: Prospective multicentre Phase III clinical diagnostic accuracy study using the Standards for the Reporting of Diagnostic Accuracy Studies (STARD) criteria.
Setting: Critical care departments within NHS hospitals in the north-west of England.
Participants: Adult patients requiring blood culture (BC) when developing new signs of systemic inflammation.
Main outcome measures: SeptiFast real-time PCR results at species/genus level compared with microbiological culture in association with independent adjudication of infection. Metrics of diagnostic accuracy were derived including sensitivity, specificity, likelihood ratios and predictive values, with their 95% confidence intervals (CIs). Latent class analysis was used to explore the diagnostic performance of culture as a reference standard.
Results: Of 1006 new patient episodes of systemic inflammation in 853 patients, 922 (92%) met the inclusion criteria and provided sufficient information for analysis. Index test assay failure occurred on 69 (7%) occasions. Adult patients had been exposed to a median of 8 days (interquartile range 4–16 days) of hospital care, had high levels of organ support activities and recent antibiotic exposure. SeptiFast real-time PCR, when compared with culture-proven bloodstream infection at species/genus level, had better specificity (85.8%, 95% CI 83.3% to 88.1%) than sensitivity (50%, 95% CI 39.1% to 60.8%). When compared with pooled diagnostic metrics derived from our systematic review, our clinical study revealed lower test accuracy of SeptiFast real-time PCR, mainly as a result of low diagnostic sensitivity. There was a low prevalence of BC-proven pathogens in these patients (9.2%, 95% CI 7.4% to 11.2%) such that the post-test probabilities of both a positive (26.3%, 95% CI 19.8% to 33.7%) and a negative SeptiFast test (5.6%, 95% CI 4.1% to 7.4%) indicate the potential limitations of this technology in the diagnosis of bloodstream infection. However, latent class analysis indicates that BC has a low sensitivity, questioning its relevance as a reference test in this setting. Using this analysis approach, the sensitivity of the SeptiFast test was low but also appeared significantly better than BC. Blood samples identified as positive by either culture or SeptiFast real-time PCR were associated with a high probability (> 95%) of infection, indicating higher diagnostic rule-in utility than was apparent using conventional analyses of diagnostic accuracy.
Conclusion: SeptiFast real-time PCR on blood samples may have rapid rule-in utility for the diagnosis of health-care-associated bloodstream infection, but the lack of sensitivity is a significant limiting factor. Innovations aimed at improving the diagnostic sensitivity of real-time PCR in this setting are urgently required. Future work recommendations include technology developments to improve the efficiency of pathogen DNA extraction and the capacity to detect a much broader range of pathogens and drug resistance genes, and the application of new statistical approaches able to more reliably assess test performance in situations where the reference standard (e.g. blood culture in the setting of high antimicrobial use) is prone to error.
Abstract:
The emergence of multidrug-resistant pathogens within the clinical environment is presenting a mounting problem in hospitals worldwide. The 'ESKAPE' pathogens (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa and Enterobacter spp.) have been highlighted as a group of causative organisms in a majority of nosocomial infections, presenting a serious health risk due to widespread antimicrobial resistance. The stagnating pipeline of new antibiotics requires alternative approaches to the control and treatment of nosocomial infections. Atmospheric pressure nonthermal plasma (APNTP) is attracting growing interest as an alternative infection control approach within the clinical setting. This study presents a comprehensive bactericidal assessment of an in-house-designed APNTP jet against both biofilms and planktonic bacteria of the ESKAPE pathogens. Standard plate counts and the XTT metabolic assay were used to evaluate the antibacterial effect of APNTP, with both methods demonstrating comparable eradication times. APNTP exhibited rapid antimicrobial activity against all of the ESKAPE pathogens in the planktonic mode of growth and provided efficient and complete eradication of ESKAPE pathogens in the biofilm mode of growth within 360 s, with the exception of A. baumannii, where a >4 log reduction in biofilm viability was observed. This demonstrates its effectiveness as a bactericidal treatment against these pathogens and further highlights its potential application in the clinical environment for the control of highly antimicrobial-resistant pathogens.
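For context, the log reduction in viability referred to above is typically computed from viable counts before and after treatment as log10(N_before / N_after). A minimal sketch, with made-up CFU values rather than data from this study:

import math

def log_reduction(cfu_before, cfu_after):
    """Log10 reduction in viable counts after treatment."""
    return math.log10(cfu_before / cfu_after)

# Hypothetical plate counts (CFU/mL), not values from the study.
print(round(log_reduction(1e8, 5e3), 2))  # 4.3, i.e. a >4 log reduction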
Abstract:
Blending Art and Science in Nurse Education: The Benefits and Impact of Creative Partnerships
This paper presents the benefits of an innovative education partnership between lecturers from the School of Nursing and Midwifery, Queen's University Belfast, and Arts Care, a unique Arts and Health Charity in Northern Ireland, to engage nursing students in the life sciences.
Nursing and midwifery students often struggle to engage with life science modules because they lack confidence in their ability to study science. This project was funded by a Teaching Innovation Award from the School of Nursing and Midwifery, Queen's University Belfast, to explore creative ways of engaging year one undergraduate nursing students in learning anatomy and physiology. The project was facilitated through collaboration between teaching staff from the School of Nursing and Midwifery and Arts Care, Northern Ireland. This unique Arts and Health Charity believes in the benefits of creativity to well-being.
RESEARCH OBJECTIVE(S)
To explore creative ways of engaging year one undergraduate nursing students in learning anatomy and physiology.
METHODS AND METHODOLOGY
Students participated in a series of workshops designed to explore the cells, tissues and organs of the human body through the medium of felt. Facilitated by an Arts Care artist, and following self-directed preparation, students discussed and translated their learning of the cells, tissues and organs of the human body into striking felt images. During the project students kept a reflective journal of their experience to document how participation in the project enhanced their learning and professional development.
RESULTS
Creativity transformed and brought to life the students' learning of the cells, tissues and organs of the human body.
The project culminated in a unique body of artwork that has been exhibited across Northern Ireland in hospitals and galleries and viewed by fellow students, teaching staff, nurses from practice, artists, friends, family and members of the public.
CONCLUSION
The impact of creative learning strategies in nurse education should be further explored.
Abstract:
Introduction: In this cohort study, we explored the relationship between fluid balance, intradialytic hypotension and outcomes in critically ill patients with acute kidney injury (AKI) who received renal replacement therapy (RRT).
Methods: We analysed prospectively collected registry data on patients older than 16 years who received RRT for at least two days in an intensive care unit at two university-affiliated hospitals. We used multivariable logistic regression to determine the relationship between mean daily fluid balance and intradialytic hypotension, both over seven days following RRT initiation, and the outcomes of hospital mortality and RRT dependence in survivors.
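A minimal sketch of the kind of multivariable logistic regression described above, using statsmodels; the column names and simulated data are hypothetical stand-ins for the registry variables, and the model is illustrative rather than the authors' exact specification.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
# Hypothetical patient-level data standing in for the registry variables.
df = pd.DataFrame({
    "died_in_hospital": rng.binomial(1, 0.5, n),
    "mean_daily_fluid_balance_l": rng.normal(1.0, 1.5, n),  # litres positive per day
    "pct_days_with_idh": rng.uniform(0, 100, n),            # % of days with intradialytic hypotension
    "age_years": rng.normal(63, 16, n),
    "max_sofa_day1_7": rng.integers(2, 20, n),
    "charlson_index": rng.integers(0, 8, n),
})

model = smf.logit(
    "died_in_hospital ~ mean_daily_fluid_balance_l + pct_days_with_idh"
    " + age_years + max_sofa_day1_7 + charlson_index",
    data=df,
).fit(disp=False)

odds_ratios = np.exp(model.params)   # OR per one-unit change in each covariate
or_95ci = np.exp(model.conf_int())   # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"), or_95ci], axis=1))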
Results: In total, 492 patients were included (299 male (60.8%), mean (standard deviation (SD)) age 62.9 (16.3) years); 251 (51.0%) died in hospital. Independent risk factors for mortality were mean daily fluid balance (odds ratio (OR) 1.36 per 1000 mL positive (95% confidence interval (CI) 1.18 to 1.57)), intradialytic hypotension (OR 1.14 per 10% increase in days with intradialytic hypotension (95% CI 1.06 to 1.23)), age (OR 1.15 per five-year increase (95% CI 1.07 to 1.25)), maximum Sequential Organ Failure Assessment score on days 1 to 7 (OR 1.21 (95% CI 1.13 to 1.29)), and Charlson comorbidity index (OR 1.28 (95% CI 1.14 to 1.44)); higher baseline creatinine (OR 0.98 per 10 μmol/L (95% CI 0.97 to 0.996)) was associated with lower risk of death. Of 241 hospital survivors, 61 (25.3%) were RRT dependent at discharge. The only independent risk factor for RRT dependence was pre-existing heart failure (OR 3.13 (95% CI 1.46 to 6.74)). Neither mean daily fluid balance nor intradialytic hypotension was associated with RRT dependence in survivors. Associations between these exposures and mortality were similar in sensitivity analyses accounting for immortal time bias and dichotomising mean daily fluid balance as positive or negative. In the subgroup of patients with data on pre-RRT fluid balance, fluid overload at RRT initiation did not modify the association of mean daily fluid balance with mortality.
Conclusions: In this cohort of patients with AKI requiring RRT, a more positive mean daily fluid balance and intradialytic hypotension were associated with hospital mortality but not with RRT dependence at hospital discharge in survivors.
Abstract:
Background
Despite the recognized importance of end-of-life (EOL) communication between patients and physicians, the extent and quality of such communication remain limited.
Objective
We sought to understand patient perspectives on physician behaviours during EOL communication.
Design
In this mixed methods study, we conducted quantitative and qualitative strands and then merged data sets during a mixed methods analysis phase. In the quantitative strand, we used the quality of communication tool (QOC) to measure physician behaviours that predict global rating of satisfaction in EOL communication skills, while in the qualitative strand we conducted semi-structured interviews. During the mixed methods analysis, we compared and contrasted qualitative and quantitative data.
Setting and Participants
Seriously ill inpatients at three tertiary care hospitals in Canada.
Results
We found convergence between qualitative and quantitative strands: patients desire candid information from their physician and a sense of familiarity. The quantitative results (n = 132) suggest a paucity of certain EOL communication behaviours in this seriously ill population with a limited prognosis. The qualitative findings (n = 16) suggest that at times, physicians did not engage in EOL communication despite patient readiness, while sometimes this may represent an appropriate deferral after assessment of a patient's lack of readiness.
Conclusions
Avoidance of certain EOL topics may not always be a failure if it is a result of an assessment of lack of patient readiness. This has implications for future tool development: a measure could be built in to assess whether physician behaviours align with patient readiness.
Abstract:
Past nuclear disasters, such as the atomic bombings in 1945 and major accidents at nuclear power plants, have highlighted similarities in potential public health effects of radiation in both circumstances, including health issues unrelated to radiation exposure. Although the rarity of nuclear disasters limits opportunities to undertake rigorous research of evidence-based interventions and strategies, identification of lessons learned and development of an effective plan to protect the public, minimise negative effects, and protect emergency workers from exposure to high-dose radiation is important. Additionally, research is needed to help decision makers to avoid premature deaths among patients already in hospitals and other vulnerable groups during evacuation. Since nuclear disasters can affect hundreds of thousands of people, a substantial number of people are at risk of physical and mental harm in each disaster. During the recovery period after a nuclear disaster, physicians might need to screen for psychological burdens and provide general physical and mental health care for many affected residents who might experience long-term displacement. Reliable communication of personalised risks has emerged as a challenge for health-care professionals beyond the need to explain radiation protection. To overcome difficulties of risk communication and provide decision aids to protect workers, vulnerable people, and residents after a nuclear disaster, physicians should receive training in nuclear disaster response. This training should include evidence-based interventions, support decisions to balance potential harms and benefits, and take account of scientific uncertainty in provision of community health care. An open and joint learning process is essential to prepare for, and minimise the effects of, future nuclear disasters.
Abstract:
This paper presents multilevel models that utilize the Coxian phase-type distribution in order to include a survival component in the model. The approach is demonstrated by modeling patient length of stay and in-hospital mortality in geriatric wards in Italy. The multilevel model is used to provide a means of controlling for the existence of possible intra-ward correlations, which may make patients within a hospital more alike in terms of experienced outcome than patients coming from different hospitals, everything else being equal. Within this multilevel model we introduce the use of the Coxian phase-type distribution to create a covariate that represents patient length of stay or stage (of hospital care). Results demonstrate that the use of the multilevel model for representing the in-patient mortality is successful and further enhanced by the inclusion of the Coxian phase-type distribution variable (stage covariate).
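For reference, the Coxian phase-type distribution mentioned above can be written in its standard textbook form (not necessarily the authors' exact parameterisation): a patient starts in phase 1 and, from phase i, either progresses to phase i+1 at rate \lambda_i or is absorbed (discharged or dies) at rate \mu_i. With initial vector \boldsymbol{\alpha} = (1, 0, \ldots, 0):

f(t) = \boldsymbol{\alpha}\, e^{S t}\, \mathbf{s},
\qquad
P(T > t) = \boldsymbol{\alpha}\, e^{S t}\, \mathbf{1},
\qquad
\mathbb{E}[T] = -\boldsymbol{\alpha}\, S^{-1} \mathbf{1},

S =
\begin{pmatrix}
-(\lambda_1 + \mu_1) & \lambda_1 & & \\
 & -(\lambda_2 + \mu_2) & \ddots & \\
 & & \ddots & \lambda_{n-1} \\
 & & & -\mu_n
\end{pmatrix},
\qquad
\mathbf{s} = -S\,\mathbf{1} = (\mu_1, \ldots, \mu_{n-1}, \mu_n)^{\mathsf{T}}.

The phase a patient occupies under this fitted distribution is what can serve as the length-of-stay or stage (of hospital care) covariate described above.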
Abstract:
Institutions (and how they work) have long been the object of many investigations in the fields of media, cultural, and organizational studies. More recently, there has been a “linguistic” turn in the study of institutions with many language-focused explorations of how power and discourse may function in specific institutional and organizational settings, such as schools, courtrooms, corporations, clinics, hospitals, and prisons. Many of these studies have been concerned with the ways in which language is used to create and shape institutions and how institutions in turn have the capacity to create, shape, and impose discourses on people. Institutions thus have considerable control over the organizing of our routine experiences of the world and the way we classify that world. They also have the power to foster particular kinds of identities to suit their own purposes.
Abstract:
Background: To study the differences in ophthalmology resident training between China and the Hong Kong Special Administrative Region (HKSAR). Methods: Training programs were selected from among the largest and best-known teaching hospitals. Ophthalmology residents were sent an anonymous 48-item questionnaire by mail. Work satisfaction, time allocation between training activities and volume of surgery performed were determined. Results: 50/75 residents (66.7 %) from China and 20/26 (76.9 %) from HKSAR completed the survey. Age (28.9 ± 2.5 vs. 30.2 ± 2.9 years, p = 0.15) and number of years in training (3.4 ± 1.6 vs. 2.8 ± 1.5, p = 0.19) were comparable between groups. The number of cataract procedures performed by HKSAR trainees (extra-capsular, median 80.0, quartile range: 30.0, 100.0; phacoemulsification, median: 20.0, quartile range: 0.0, 100.0) exceeded that for Chinese residents (extra-capsular: median = 0, p < 0.0001; phacoemulsification: median = 0, p < 0.0001). Chinese trainees spent more time completing medical charts (>50 % of time on charts: 62.5 % versus 5.3 %, p < 0.0001) and received less supervision (≥90 % of training supervised: 4.4 % versus 65 %, p < 0.0001). Chinese residents were more likely to feel underpaid (96.0 % vs. 31.6 %, p < 0.0001) and to hope their children would not practice medicine (69.4 % vs. 5.0 %, p = 0.0001) compared with HKSAR residents. Conclusions: In this study, ophthalmology residents in China report strikingly less surgical experience and supervision, and lower satisfaction than HKSAR residents. The HKSAR model of hands-on resident training might be useful in improving the low cataract surgical rate in China.
Abstract:
Purpose: As resident work hours policies evolve, residents’ off-duty time remains poorly understood. Despite assumptions about how residents should be using their postcall, off-duty time, there is little research on how residents actually use this time and the reasoning underpinning their activities. This study sought to understand residents’ nonclinical postcall activities when they leave the hospital, their decision-making processes, and their perspectives on the relationship between these activities and their well-being or recovery.
Method: The study took place at a Liaison Committee on Medical Education–accredited Canadian medical school from 2012 to 2014. The authors recruited a purposive and convenience sample of postgraduate year 1–5 residents from six surgical and nonsurgical specialties at three hospitals affiliated with the medical school. Using a constructivist grounded theory approach, semistructured interviews were conducted, audio-taped, transcribed, anonymized, and combined with field notes. The authors analyzed interview transcripts using constant comparative analysis and performed post hoc member checking.
Results: Twenty-four residents participated. Residents characterized their predominant approach to postcall decision making as one of making trade-offs between multiple, competing, seemingly incompatible, but equally valuable, activities. Participants exhibited two different trade-off orientations: being oriented toward maintaining a normal life or toward mitigating fatigue.
Conclusions: The authors’ findings on residents’ trade-off orientations suggest a dual recovery model with postcall trade-offs motivated by the recovery of sleep or of self. This model challenges the dominant viewpoint in the current duty hours literature and suggests that the duty hours discussion must be broadened to include other recovery processes.
Abstract:
INTRODUCTION: Following the introduction of work-hour restrictions, residents' workload has become an important theme in postgraduate training. The efficacy of restrictions on workload, however, remains controversial, as most research has only examined objective workload. The purpose of this study was to explore the less clearly understood component of subjective workload and, in particular, the factors that influenced residents' subjective workload.
METHOD: This study was conducted in Japan at three community teaching hospitals. We recruited a convenience sample of 31 junior residents in seven focus groups at the three sites. Audio-recorded and transcribed data were read iteratively and analyzed thematically, identifying, analyzing and reporting themes within the data and developing an interpretive synthesis of the topic.
RESULTS: Seven factors influenced residents' subjective workload: (1) interaction within the professional community, (2) feedback from patients, (3) being in control, (4) professional development, (5) private life, (6) interest and (7) protected free time.
DISCUSSION AND CONCLUSION: Our findings indicate that residents who have good interaction with colleagues and patients, are competent enough to control their work, experience personal development through working, have greater interest in their work, and have fulfilling private lives will have the least subjective workload.
Abstract:
Background: Rapid Response Systems (RRS) have been implemented nationally and internationally to improve patient safety in hospital. However, to date the majority of the RRS research evidence has focused on measuring the effectiveness of the intervention on patient outcomes. To evaluate RRS, it has been recommended that a multimodal approach is required to address the broad range of process and outcome measures needed to determine the effectiveness of the RRS concept. Aim: The aim of this paper is to evaluate the official RRS programme's theoretical assumptions about how the programme is meant to work against actual practice, in order to determine what works. Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory [1]. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews with a purposive sample of 75 nurses and doctors, observation of ward practices and documentary analysis. The findings from the case studies were analysed and compared within and across cases to identify what works, for whom and in what circumstances. Results: The RRS programme theories were critically evaluated and compared with study findings to develop a mid-range theory to explain what works, for whom and in what circumstances. The findings suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of RRS implementation. Conclusion: These findings highlight the combination of factors that can improve the implementation of RRS, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for their success and sustainability. References: 1. Pawson R, Tilley N. (1997) Realistic Evaluation. London: Sage Publications. Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University Belfast.