302 results for Nursing-home Patients
Abstract:
BACKGROUND Chemotherapy-induced nausea and vomiting (CINV) remain prevalent among cancer patients despite pharmacological advances in CINV therapy. Patients can initiate nonpharmacologic strategies, which potentially play an important role as adjuncts to pharmacological agents in alleviating CINV. Some studies have explored nausea and vomiting self-management (NVSM) behaviors among patients in Western settings; however, little is known about the NVSM behaviors of patients in China. OBJECTIVES This study examines NVSM behaviors of Chinese cancer patients. METHODS A cross-sectional survey was conducted in a specialist cancer hospital in southeast China. RESULTS A sample of 255 cancer patients was recruited. A mean of 8.56 (±3.15) NVSM behaviors was reported. Most NVSM behaviors were rated as moderately effective and were implemented with moderate self-efficacy. Higher distress levels, better functional status, previous similar symptom experiences, receiving chemotherapy as an inpatient, and greater support from multiple levels were related to greater engagement in NVSM; higher self-efficacy levels pertaining to NVSM behaviors were associated with reports of more relief from specific NVSM behaviors. CONCLUSIONS A range of NVSM strategies was initiated by Chinese cancer patients and provided some relief. A range of individual, health status, and environmental factors influenced engagement with and relief from NVSM behaviors. IMPLICATIONS FOR PRACTICE To enhance Chinese patients' NVSM, patients should be supported to engage in behaviors including taking antiemetics, modifying their diet, using psychological strategies, and creating a pleasant environment. The findings highlight the importance of enhancing patients' self-efficacy in NVSM, alleviating symptom distress, and improving social support to achieve better outcomes.
Abstract:
There is increasing momentum in cancer care to implement a two-stage assessment process that accurately determines the ability of older patients to cope with, and benefit from, chemotherapy. The two-stage approach aims to ensure that patients clearly fit for chemotherapy can be accurately identified and referred for treatment without undergoing a time- and resource-intensive comprehensive geriatric assessment (CGA). Ideally, this process removes the uncertainty of how to classify, and then appropriately treat, the older cancer patient. After trialling a two-stage screen-and-CGA process in the Division of Cancer Services at Princess Alexandra Hospital (PAH) in 2011-2012, we implemented a model of oncogeriatric care based on our findings. In this paper, we explore the methodological and practical aspects of implementing the PAH model and outline further work needed to refine the process in our treatment context.
Abstract:
Twelve patients receiving acute in-patient psychiatric care in Queensland, Australia, participated in semi-structured interviews to elicit their perceptions of seclusion. All respondents had experienced time in seclusion within the 7 days prior to interview. Interviews were audiotaped, transcribed and analysed using content analysis. Five major themes emerged: use of seclusion, emotional impact, sensory deprivation, maintaining control and staff-patient interaction. The prevailing negativity towards seclusion underscores the need for ongoing critical review of its use. In particular, the relationship between patient responses to seclusion and the circumstances in which seclusion takes place requires greater consideration. Interventions such as providing information to patients about seclusion, increased interaction with patients during seclusion, attention to privacy and effective debriefing following seclusion may help to reduce the emotional impact of the practice.
Abstract:
Introduction Malnutrition is common among hospitalised patients, with poor follow-up of nutrition support post-discharge. Published studies on the efficacy of ambulatory nutrition support (ANS) for malnourished patients post-discharge are scarce. The aims of this study were to evaluate the rate of dietetics follow-up of malnourished patients post-discharge, before (2008) and after (2010) implementation of a new ANS service, and to evaluate nutritional outcomes post-implementation. Materials and Methods Consecutive samples of 261 (2008) and 163 (2010) adult inpatients referred to dietetics and assessed as malnourished using Subjective Global Assessment (SGA) were enrolled. All subjects received inpatient nutrition intervention and dietetic outpatient clinic follow-up appointments. For the 2010 cohort, ANS was initiated to provide telephone follow-up and home visits for patients who failed to attend the outpatient clinic. Subjective Global Assessment, body weight, quality of life (EQ-5D VAS) and handgrip strength were measured at baseline and five months post-discharge. Paired t-test was used to compare pre- and post-intervention results. Results In 2008, only 15% of patients returned for follow-up with a dietitian within four months post-discharge. After implementation of ANS in 2010, the follow-up rate was 100%. Mean weight improved from 44.0 ± 8.5 kg to 46.3 ± 9.6 kg, EQ-5D VAS from 61.2 ± 19.8 to 71.6 ± 17.4 and handgrip strength from 15.1 ± 7.1 kg force to 17.5 ± 8.5 kg force; p<0.001 for all. Seventy-four percent of patients improved in SGA score. Conclusion Ambulatory nutrition support resulted in significant improvements in follow-up rate, nutritional status and quality of life of malnourished patients post-discharge.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at Royal Brisbane and Women's Hospital: AIN-only: Additional assistant-in-nursing (AIN) with dedicated nutrition role. PM-only: Multidisciplinary approach to meals, including Protected Mealtimes. PM+AIN: Combined intervention: AIN + multidisciplinary approach to meals. An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention to a pre-intervention group. 
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8), and 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had inadequate energy intakes to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted including improved mobility and functional independence, and better identification of swallowing difficulties. IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting, and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intake of elderly inpatients was observed; however, given the modest effect size and decreasing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
Abstract:
Background Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) assesses weight loss, dietary intake and muscle wastage, with the composite score of each used to determine risk of malnutrition. The aim of the study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients’ nutritional status using Subjective Global Assessment within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with sensitivity of 89% and specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was strong correlation between 3-MinNS and SGA (r=0.78, p<0.001). The agreement between the two nurses administering the 3-MinNS tool was 78.3%. Conclusion 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
Abstract:
The morbidity and mortality rates of renal disease in Indigenous Australians are significantly higher than those of non-Indigenous Australians, and are increasing. The dominant discourses of renal disease currently predicate this as essentially a client problem, rather than (for example) a health care system problem. These discourses are indicative of the dominant “white” paradigm of health care, which fosters an expectation of assimilation by the marginalised “other.” In this paper, we draw upon a sociological methodology (the actor network approach) and a qualitative method (discourse analysis) to tease out these issues in Indigenous renal disease. Based on empirical data, we explore on the one hand the requirements of the discourses, technologies and practices that have been developed for a particular type of renal patient and health system in Australia. On the other, we examine the cultural and practical specificities entailed in the performance of these technologies and practices in the Indigenous Australian context. The meeting of the praxiographic orientation of the actor network approach—which has been called “the politics of what” (Mol 2002)—and the sociocultural concerns of discourse analysis does provide a useful guide as to “what to do” when confronted with issues in health care that currently seem unfathomable. Our praxiographic analysis of the discourse enabled us to understand the difficulties involved in translating renal health care networks across cultural contexts in Australia and to understand the dynamic and contested nature of these networks. The actor network approach has its limitations, however, particularly in the articulation of possible strategies to align two disparate systems in a way that would ensure better health care for Indigenous renal patients. In this paper we will discuss some of the problems we encountered in drawing on this methodology in our attempt to unearth practical solutions to the conundrums our data presented.
Abstract:
This project was an observational study of outpatients following lower limb surgical procedures for removal of skin cancers. Findings highlight a previously unreported high surgical site failure rate. Results also identified four potential risk factors (increasing age, presence of leg pain, split skin graft and haematoma) that negatively impact surgical site healing in this population.
Abstract:
Objective To compare the diagnostic accuracy of the interRAI Acute Care (AC) Cognitive Performance Scale (CPS2) and the Mini-Mental State Examination (MMSE), against independent clinical diagnosis for detecting dementia in older hospitalized patients. Design, Setting, and Participants The study was part of a prospective observational cohort study of patients aged ≥70 years admitted to four acute hospitals in Queensland, Australia, between 2008 and 2010. Recruitment was consecutive and patients expected to remain in hospital for ≥48 hours were eligible to participate. Data for 462 patients were available for this study. Measurements Trained research nurses completed comprehensive geriatric assessments and administered the interRAI AC and MMSE to patients. Two physicians independently reviewed patients’ medical records and assessments to establish the diagnosis of dementia. Indicators of diagnostic accuracy included sensitivity, specificity, predictive values, likelihood ratios and areas under the receiver operating characteristic curve (AUC). Results Eighty-five patients (18.4%) were considered to have dementia according to independent clinical diagnosis. The sensitivity of the CPS2 [0.68 (95%CI: 0.58–0.77)] was not statistically different to the MMSE [0.75 (0.64–0.83)] in predicting physician-diagnosed dementia. The AUCs for the 2 instruments were also not statistically different: CPS2 AUC = 0.83 (95%CI: 0.78–0.89) and MMSE AUC = 0.87 (95%CI: 0.83–0.91), while the CPS2 demonstrated higher specificity [0.92 (95%CI: 0.89–0.95)] than the MMSE [0.82 (0.77–0.85)]. Agreement between the CPS2 and clinical diagnosis was substantial (87.4%; κ=0.61). Conclusion The CPS2 appears to be a reliable screening tool for assessing cognitive impairment in acutely unwell older hospitalized patients. These findings add to the growing body of evidence supporting the utility of the interRAI AC, within which the CPS2 is embedded.
The interRAI AC offers the advantage of being able to accurately screen for both dementia and delirium without the need to use additional assessments, thus increasing assessment efficiency.
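As background to the indicators reported in the abstract above, sensitivity, specificity, predictive values and likelihood ratios can all be derived from a 2×2 table of screening result against clinical diagnosis. A minimal sketch in Python, using illustrative counts (not the study's raw data) chosen only to roughly reproduce the CPS2 proportions reported above:

```python
# Illustrative 2x2 counts (hypothetical, NOT the study's data):
# 85 dementia cases and 377 non-cases, split to approximate the
# reported CPS2 sensitivity (0.68) and specificity (0.92).
tp, fn = 58, 27    # cases flagged / missed by the screening tool
fp, tn = 30, 347   # non-cases flagged / correctly cleared

sensitivity = tp / (tp + fn)              # proportion of true cases detected
specificity = tn / (tn + fp)              # proportion of non-cases cleared
ppv = tp / (tp + fp)                      # positive predictive value
npv = tn / (tn + fn)                      # negative predictive value
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} LR+={lr_pos:.1f}")
```

Note that the AUC quoted in the abstract cannot be computed from a single 2×2 table; it summarises sensitivity/specificity trade-offs across all possible cutoff scores of the instrument.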
Abstract:
Aim Few Australian studies have examined the impact of dementia on hospital outcomes. The aim of this study was to determine the relative contribution of dementia to adverse outcomes in older hospital patients. Method Prospective observational cohort study (n = 493) of patients aged ≥70 years admitted to four acute hospitals in Queensland. Trained research nurses completed comprehensive geriatric assessments using standardised instruments and collected data regarding adverse outcomes. The diagnosis of dementia was established by independent physician review of patients' medical records and assessments. Results Patients with dementia (n = 102, 20.7%) were significantly older (P = 0.01), had poorer functional ability (P < 0.01), and were more likely to have delirium at admission (P < 0.01) than patients without dementia. Dementia (odds ratio = 4.8, P < 0.001) increased the risk of developing delirium during the hospital stay. Conclusion Older patients with dementia are more impaired and vulnerable than patients without dementia and are at greater risk of adverse outcomes when hospitalised.
Abstract:
Lung cancer patients face poor survival and experience co-occurring chronic physical and psychological symptoms. These symptoms can result in significant burden, impaired physical and social function and poor quality of life. This paper provides a review of evidence based interventions that support best practice supportive and palliative care for patients with lung cancer. Specifically, interventions to manage dyspnoea, one of the most common symptoms experienced by this group, are discussed to illustrate the emerging evidence base in the field. Reviews of the pharmacological management of dyspnoea report that systemic opioids have the best available evidence to support their use. In particular, the evidence strongly supports systemic morphine, preferably initiated and continued as a once-daily sustained-release preparation. Evidence supporting the use of a range of other adjunctive non-pharmacological interventions in managing the symptom is also emerging. Interventions to improve breathing efficiency that have been reported to be effective include pursed lip breathing, diaphragmatic breathing, positioning and pacing techniques. Psychosocial interventions seeking to reduce anxiety and distress can also improve the management of breathlessness, although further studies are needed. In addition, evidence reviews have concluded that case management approaches and nurse-led follow-up programs are effective in reducing breathlessness and psychological distress, providing a useful model for supporting implementation of evidence based symptom management strategies. Optimal outcomes from supportive and palliative care interventions thus require a multilevel approach, involving interventions at the patient, health professional and health service level.
Abstract:
Anaemia is a chronic problem in patients with renal insufficiency, especially chronic renal failure (CRF). In patients with CRF, anaemia is primarily due to a deficiency in erythropoietin (EPO), a glycoprotein growth factor that stimulates RBC production. The long-term effects and burden of anaemia for patients with CRF can be physical, emotional and financial. With efficient, systematic management of anaemia, clinicians have the potential to realise not only better clinical outcomes for CRF patients but also significant cost savings for them and the health system. During the last decade, significant advances have been made in clinicians’ understanding of how best to manage anaemia in this vulnerable population. One of the most important efforts to improve clinical practice has been the development of best practice guidelines.
Abstract:
It has been postulated that susceptible individuals may acquire infection with nontuberculous mycobacteria (NTM) from water and aerosol exposure. This study examined household water and shower aerosols of patients with NTM pulmonary disease. The mycobacteria isolated from clinical samples from 20 patients included M. avium (5 patients), M. intracellulare (12 patients), M. abscessus (7 patients), M. gordonae (1 patient), M. lentiflavum (1 patient), M. fortuitum (1 patient), M. peregrinum (1 patient), M. chelonae (1 patient), M. triplex (1 patient), and M. kansasii (1 patient). One-liter water samples and swabs were collected from all taps and from any swimming pools or rainwater tanks. Shower aerosols were sampled using Andersen six-stage cascade impactors. For a subgroup of patients, real-time PCR was performed and high-resolution melt profiles were compared to those of ATCC control strains. Pathogenic mycobacteria were isolated from 19 homes. Species identified in the home matched that found in the patient in seven (35%) cases: M. abscessus (3 cases), M. avium (1 case), M. gordonae (1 case), M. lentiflavum (1 case), and M. kansasii (1 case). In an additional patient with M. abscessus infection, this species was isolated from potable water supplying her home. NTM grown from aerosols included M. abscessus (3 homes), M. gordonae (2 homes), M. kansasii (1 home), M. fortuitum complex (4 homes), M. mucogenicum (1 home), and M. wolinskyi (1 home). NTM causing human disease can be isolated from household water and aerosols. The evidence appears strongest for M. avium, M. kansasii, M. lentiflavum, and M. abscessus. Despite a predominance of disease due to M. intracellulare, we found no evidence for acquisition of infection from household water for this species.
Abstract:
Background: There is a paucity of research assessing health-related quality of life (HRQoL) and self-efficacy in caregivers of relatives with dementia in mainland China. Aims: To compare the level of HRQoL between caregivers and the general population in mainland China and to assess the role of caregiver self-efficacy in the relationship between caregiver social support and HRQoL. Methods: A cross-sectional study was conducted in Shanghai, China. The caregivers were recruited from the outpatient department of a teaching hospital. A total of 195 participants were interviewed, using a survey package including the Chinese version of the 36-Item Short-Form Health Survey (SF-36), demographic data, the variables associated with the impairments of care recipients, perceived social support and caregiver self-efficacy. The caregivers' SF-36 scores were compared with those of the general population in China. Results: The results indicated that the HRQoL of the caregivers was poorer compared with that of the general population when matched for age and gender. Multiple regression analyses revealed that caregiver self-efficacy is a partial mediator between social support and HRQoL, and a partial mediator between behavioral and psychological symptoms of dementia (BPSD) and caregiver mental health. Conclusion: Assisting with managing BPSD and enhancing caregiver self-efficacy can be considered integral parts of interventions to improve caregiver HRQoL.