130 results for Associated management
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.
Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery
The pre-operative rate of malnutrition in a convenience sample of people with Parkinson's disease (PWP) scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B) while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.
Phase I: Nutritional status in community-dwelling adults with Parkinson's disease
The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0kg/m2; MAC 29.1 vs 25.5cm; waist circumference 95.5 vs 82.5cm; calf circumference 36.5 vs 32.5cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated the well-nourished from the malnourished were lack of appetite, constipation, diarrhoea, swallowing problems and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease
There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools has been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) BMI cut-off (≤18.5kg/m2), age-specific BMI cut-offs (≤18.5kg/m2 for those under 65 years, ≤23.5kg/m2 for those 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA is recommended primarily as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.
Phase I: Predictors of poor nutritional status in people with Parkinson's disease
A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.
Phase II: Nutrition intervention
Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3kJ/kg vs 3.8kJ/kg and 0.3g/kg vs 0.15g/kg) increased in both groups, although the increase in protein intake was significant only in the SC group. The changes in intake did not differ significantly between the groups.
There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life improved over the 12 weeks as well, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
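The screening-tool evaluation in this thesis reduces to confusion-matrix arithmetic: sensitivity is the proportion of truly malnourished (SGA-B) patients the tool flags, and specificity the proportion of well-nourished (SGA-A) patients it clears. A minimal Python sketch, using made-up PG-SGA scores and SGA classifications (not the study's data) and the PG-SGA cut-off score of 4 mentioned above:

```python
def sens_spec(pred, gold):
    """Sensitivity and specificity of a screening tool against a gold standard.

    pred, gold: parallel lists of booleans (True = flagged / truly malnourished).
    """
    tp = sum(p and g for p, g in zip(pred, gold))
    tn = sum(not p and not g for p, g in zip(pred, gold))
    fp = sum(p and not g for p, g in zip(pred, gold))
    fn = sum(not p and g for p, g in zip(pred, gold))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical PG-SGA scores and gold-standard SGA classifications
# (True = malnourished, SGA-B) for 20 illustrative patients:
scores = [2, 5, 7, 1, 4, 9, 3, 6, 2, 8, 4, 1, 5, 3, 7, 2, 6, 4, 1, 10]
sga_b  = [False, True, True, False, False, True, False, True, False, True,
          True, False, True, False, True, False, True, False, False, True]

flagged = [s >= 4 for s in scores]   # screening rule: PG-SGA score of 4 or more
sn, sp = sens_spec(flagged, sga_b)   # here: Sn = 1.0, Sp = 0.8
```

Sweeping the cut-off and recomputing these two proportions is how the trade-off behind figures such as the reported Sn 100% / Sp 69.8% is explored.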
Abstract:
It is only in recent years that the critical role that spatial data can play in disaster management and in strengthening community resilience has been recognised. This recognition is particularly evident in Australia, where spatial data is now considered soft infrastructure. In the aftermath of each disaster this role has been further strengthened, with state agencies paying greater attention to ensuring the availability of accurate spatial data based on the lessons learnt. For example, the major flooding in Queensland during the summer of 2011 resulted in a comprehensive review of responsibilities and accountability for the provision of spatial information during such natural disasters. A high-level commission of enquiry completed a comprehensive investigation of the 2011 Brisbane flood inundation event and made specific recommendations concerning the collection of, and accessibility to, spatial information for disaster management and for strengthening community resilience during and after a natural disaster. The lessons learnt and processes implemented were subsequently tested by natural disasters in later years. This paper provides an overview of the practical implementation of the recommendations of the commission of enquiry. It focuses particularly on the measures adopted by the state agencies with the primary role for managing spatial data, and on the evolution of this role in Queensland, Australia. The paper concludes with a review of the development of this role and the increasing importance of spatial data as an infrastructure for disaster planning and management that promotes the strengthening of community resilience.
Abstract:
BACKGROUND Chemotherapy-induced nausea and vomiting (CINV) remain prevalent among cancer patients despite pharmacological advances in CINV therapy. Patients can initiate nonpharmacologic strategies, which potentially play an important role as adjuncts to pharmacological agents in alleviating CINV. Some studies have explored nausea and vomiting self-management (NVSM) behaviors among patients in Western settings; however, little is known about the NVSM behaviors of patients in China. OBJECTIVES This study examines NVSM behaviors of Chinese cancer patients. METHODS A cross-sectional survey was conducted in a specialist cancer hospital in southeast China. RESULTS A sample of 255 cancer patients was recruited. A mean of 8.56 (±3.15) NVSM behaviors was reported. Most NVSM behaviors were rated as moderately effective and were implemented with moderate self-efficacy. Higher distress levels, better functional status, previous similar symptom experiences, receiving chemotherapy as an inpatient, and greater support from multiple levels were related to greater engagement in NVSM; higher self-efficacy levels pertaining to NVSM behaviors were associated with reports of more relief from specific NVSM behaviors. CONCLUSIONS A range of NVSM strategies was initiated by Chinese cancer patients and provided some relief. A range of individual, health status, and environmental factors influenced engagement with and relief from NVSM behaviors. IMPLICATIONS FOR PRACTICE To enhance Chinese patients' NVSM, patients should be supported to engage in behaviors including taking antiemetics, modifying their diet, using psychological strategies, and creating a pleasant environment. The findings highlight the importance of enhancing patients' self-efficacy in NVSM, alleviating symptom distress, and improving social support to achieve better outcomes.
Abstract:
Since the pioneering work of Hough in 1902 (1), the term ‘delayed onset muscle soreness (DOMS)’ has dominated the field of athletic recovery. DOMS typically occurs after exercise-induced muscle damage (EIMD), particularly if the exercise is unaccustomed or involves a large amount of eccentric (muscle-lengthening) contractions. The symptoms of EIMD manifest as a temporary reduction in muscle force, disturbed proprioceptive acuity, increases in inflammatory markers both within the injured muscle and in the blood, as well as increased muscle soreness, stiffness and swelling. The intensity of discomfort and soreness associated with DOMS increases within the first 24 hours, peaks between 24 and 72 hours, and then subsides, eventually disappearing 5-7 days after the exercise. Consequently, DOMS may interfere with athletic training or competition, and several recovery interventions have been utilised by athletes and coaches in an attempt to offset these negative effects...
Abstract:
This paper extends research on the corporate governance practices of transitional economies by examining whether the ability of the audit committee to constrain earnings management in Chinese firms is associated with the listing environment and the presence of government officials on the audit committee. Despite considerable regulatory reform by the Chinese Securities Regulatory Commission, there remain incentives for Chinese firms to manage earnings. However, government initiatives to encourage domestic firms to cross-list on the Hong Kong Stock Exchange are accompanied by improved governance. We find that the expertise and independence of the audit committee for cross-listed (CL) Chinese firms are associated with lower abnormal accruals, our measure of earnings management. Both domestic-only listed firms and CL Chinese firms appoint government officials as independent members of the audit committee. However, due to the political connection between government officials and the controlling shareholder (the State), these appointments can severely compromise audit committee independence. Consequently, we find a significant and positive association between audit committee independence and expertise and earnings management when there are government officials on the audit committee.
Abstract:
Building Information Modeling (BIM) is the use of virtual building information models to develop building design solutions and design documentation and to analyse construction processes. Recent advances in IT have enabled advanced knowledge management, which in turn facilitates sustainability and improves asset management in the civil construction industry. There are several important qualifiers, and some disadvantages, of the current suite of technologies. This paper outlines the benefits, enablers and barriers associated with BIM and makes suggestions about how these issues may be addressed. The paper highlights the advantages of BIM, particularly the increased utility and speed, enhanced fault finding in all construction phases, and enhanced collaboration and visualisation of data. The paper also identifies a range of issues concerning the implementation of BIM, including intellectual property, liability, risks, contracts and the authenticity of users. Implementing BIM requires investment in new technology and skills training, the development of new ways of collaborating, and attention to Trade Practices concerns. However, when these challenges are overcome, BIM as a new information technology promises a new level of collaborative engineering knowledge management, designed to facilitate sustainability and asset management in design, construction, asset management practices and, eventually, decommissioning for the civil engineering industry.
Abstract:
The Australian Commission on Safety and Quality in Health Care commissioned this rapid review to identify recent evidence in relation to three key questions: 1. What is the current evidence of quality and safety issues regarding the hospital experience of people with cognitive impairment (dementia/delirium)? 2. What are the existing evidence-based pathways, best practice or guidelines for cognitive impairment in hospitals? 3. What are the key components of an ideal patient journey for a person with dementia and/or delirium? The purpose of this review is to identify best practice in caring for patients with cognitive impairment (CI) in acute hospital settings. CI refers to patients with dementia and delirium but can include other conditions. For the purposes of this report, ‘Hospitals’ is defined as acute care settings and includes care provided by acute care institutions in other settings (e.g. Multipurpose Services and Hospital in the Home). It does not include residential aged care settings or palliative care services that are not part of a service provided by an acute care institution. Method Both peer-reviewed publications and the grey literature were comprehensively searched for recent (primarily post-2010) publications, reports and guidelines that addressed the three key questions. The literature was evaluated and graded according to the National Health and Medical Research Council (NHMRC) levels of evidence criteria (see Evidence Summary – Appendix B). Results Thirty-one recent publications were retrieved in relation to quality and safety issues faced by people with CI in acute hospitals. The results indicate that CI is a common problem in hospitals (upwards of 30%; the rate increases with patient age), although this is likely to be an underestimate, in part due to the number of patients without a formal dementia diagnosis.
There is a large body of evidence showing that patients with CI have worse outcomes than patients without CI following hospitalisation, including increased mortality, more complications, longer hospital stays, increased system costs as well as functional and cognitive decline. To improve the care of patients with CI in hospital, best practice guidelines have been developed, of which sixteen recent guidelines/position statements/standards were identified in this review (Table 2). Four guidelines described standards or quality indicators for providing optimal care for the older person with CI in hospital in general, while three focused on delirium diagnosis, prevention and management. The remaining guidelines/statements focused on specific issues in relation to the care of patients with CI in acute hospitals, including hydration, nutrition, wandering and care in the Emergency Department (ED). A key message in several of the guidelines was that older patients should be assessed for CI at admission; this is particularly important in the case of delirium, which can indicate an emergency requiring prompt treatment. A second clear mess...
Abstract:
Early detection, clinical management and disease recurrence monitoring are critical areas in cancer treatment in which specific biomarker panels are likely to be very important. We have previously demonstrated that levels of alpha-2-Heremans-Schmid glycoprotein (AHSG), complement component C3 (C3), clusterin (CLI), haptoglobin (HP) and serum amyloid A (SAA) are significantly altered in serum from patients with squamous cell carcinoma of the lung. Here, we report the abundance levels of these proteins in serum samples from patients with advanced breast cancer, colorectal cancer (CRC) and lung cancer compared to healthy controls (age and gender matched) using commercially available enzyme-linked immunosorbent assay kits. Logistic regression (LR) models were fitted to the resulting data, and the classification ability of the proteins was evaluated using receiver-operating characteristic curves and leave-one-out cross-validation (LOOCV). The most accurate individual candidate biomarkers were C3 for breast cancer [area under the curve (AUC) = 0.89, LOOCV = 73%], CLI for CRC (AUC = 0.98, LOOCV = 90%), HP for small cell lung carcinoma (AUC = 0.97, LOOCV = 88%), C3 for lung adenocarcinoma (AUC = 0.94, LOOCV = 89%) and HP for squamous cell carcinoma of the lung (AUC = 0.94, LOOCV = 87%). The best dual combinations of biomarkers using LR analysis were AHSG + C3 (AUC = 0.91, LOOCV = 83%) for breast cancer, CLI + HP (AUC = 0.98, LOOCV = 92%) for CRC, C3 + SAA (AUC = 0.97, LOOCV = 91%) for small cell lung carcinoma and HP + SAA for both adenocarcinoma (AUC = 0.98, LOOCV = 96%) and squamous cell carcinoma of the lung (AUC = 0.98, LOOCV = 84%). The high AUC values reported here indicate that these candidate biomarkers have the potential to discriminate accurately between control and cancer groups, both individually and in combination with other proteins. Copyright © 2011 UICC.
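The AUC and LOOCV figures above follow standard definitions: the empirical AUC is the probability that a randomly chosen case scores higher than a randomly chosen control, and LOOCV refits the classifier n times, each time holding out one sample and classifying it. A self-contained Python sketch with invented serum-level data; the simple midpoint-threshold classifier here is a stand-in for the paper's logistic regression, used only to make the cross-validation loop concrete:

```python
def auc(scores, labels):
    """Empirical ROC AUC: probability that a random case scores above a
    random control (ties count as one half)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def loocv_accuracy(scores, labels):
    """Leave-one-out cross-validation of a threshold classifier: for each
    held-out sample, refit the threshold (midpoint of the two class means)
    on the remaining n-1 samples, then classify the held-out sample."""
    correct = 0
    for i in range(len(scores)):
        tr_s = scores[:i] + scores[i + 1:]
        tr_y = labels[:i] + labels[i + 1:]
        n_pos = sum(tr_y)
        mu_pos = sum(s for s, y in zip(tr_s, tr_y) if y) / n_pos
        mu_neg = sum(s for s, y in zip(tr_s, tr_y) if not y) / (len(tr_y) - n_pos)
        threshold = (mu_pos + mu_neg) / 2
        correct += (scores[i] > threshold) == bool(labels[i])
    return correct / len(scores)

# Invented serum protein levels: 4 healthy controls, then 4 cancer cases.
levels = [1.0, 1.2, 1.1, 0.9, 2.0, 2.2, 1.9, 2.1]
cancer = [0, 0, 0, 0, 1, 1, 1, 1]

roc_auc = auc(levels, cancer)            # perfectly separated toy data: 1.0
cv_acc = loocv_accuracy(levels, cancer)  # every held-out sample classified: 1.0
```

Real panels rarely separate this cleanly, which is why the paper reports AUCs below 1 and LOOCV accuracies in the 73-96% range.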
Abstract:
Background Hallux valgus (HV) has been linked to functional disability and increased falls risk in older adults. However, specific gait alterations in individuals with HV are unclear. This systematic review investigated gait parameters associated with HV in otherwise healthy adults. Methods Electronic databases (Medline, Embase, CINAHL) were searched to October 2011 for cross-sectional studies with clearly defined HV and non-HV comparison groups. Two investigators independently rated studies for methodological quality. Effect sizes (95% confidence intervals (CI)) were calculated as standardized mean differences (SMD) for continuous data and risk ratios (RR) for dichotomous data. Results Nine studies included a total of 589 participants. Three plantar pressure studies reported increased hallux loading (SMD 0.56 to 1.78) and medial forefoot loading (SMD 0.62 to 1.21), while one study found reduced first metatarsal loading (SMD −0.61, CI −1.19 to −0.03) in HV participants. HV participants demonstrated less ankle and rearfoot motion during terminal stance (SMD −0.81 to −0.63) and increased intrinsic muscle activity (RR 1.6, CI 1.1 to 2.2). Most studies reported no differences in spatio-temporal parameters; however, one study found reduced speed (SMD −0.73, CI −1.25 to −0.20), reduced step length (SMD −0.66 to −0.59) and less stable gait patterns (SMD −0.86 to −0.78) in older adults with HV. Conclusions HV impacts particular gait parameters, and further understanding of potentially modifiable factors is important for the prevention and management of HV. Cause and effect relationships cannot be inferred from cross-sectional studies; thus prospective studies are warranted to elucidate the relationship between HV and functional disability.
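The standardized mean differences (SMDs) reported above are the difference in group means divided by the pooled standard deviation (Cohen's d). A short Python sketch using illustrative walking-speed values (m/s), not data from any of the reviewed studies:

```python
import math

def smd(group1, group2):
    """Standardized mean difference (Cohen's d with pooled SD)."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    # Sample variances (n - 1 denominator):
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical walking speeds (m/s) for an HV group and a comparison group:
hv_group      = [1.0, 1.1, 0.9, 1.0]
control_group = [1.2, 1.3, 1.1, 1.2]

d = smd(hv_group, control_group)   # negative d: the HV group is slower
```

A negative SMD on speed, as in the review's reported −0.73, indicates the HV group's mean is below the comparison group's, expressed in pooled-SD units.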
Abstract:
Objective Analgesia and early quality of recovery may be improved by epidural analgesia. We aimed to assess the effect of receiving epidural analgesia on surgical adverse events and quality of life after laparotomy for endometrial cancer. Methods Patients were enrolled in an international, multicentre, prospective randomised trial of outcomes for laparoscopic versus open surgical treatment for the management of apparent stage I endometrial cancer (LACE trial). The current analysis focussed on patients who received an open abdominal hysterectomy via vertical midline incision only (n = 257), examining outcomes in patients who did (n = 108) and did not (n = 149) receive epidural analgesia. Results Baseline characteristics were comparable between patients with or without epidural analgesia. More patients without epidural (34%) ceased opioid analgesia 3–5 days after surgery compared to patients who had an epidural (7%; p < 0.01). Postoperative complications (any grade) occurred in 86% of patients with and in 66% of patients without an epidural (p < 0.01) but there was no difference in serious adverse events (p = 0.19). Epidural analgesia was associated with increased length of stay (up to 48 days compared to up to 34 days in the non-epidural group). There was no difference in postoperative quality of life up to six months after surgery. Conclusions Epidural analgesia was associated with an increase in any, but not serious, postoperative complications and length of stay after abdominal hysterectomy. Randomised controlled trials are needed to examine the effect of epidural analgesia on surgical adverse events, especially as the present data do not support a quality of life benefit with epidural analgesia. Keywords Endometrial cancer; Hysterectomy; Epidural; Adverse events
Abstract:
Akt, a Serine/Threonine protein kinase, mediates growth factor-associated cell survival. Constitutive activation of Akt (phosphorylated Akt, P-Akt) has been observed in several human cancers, including lung cancer, and may be associated with poor prognosis and with resistance to chemotherapy and radiotherapy. The clinical relevance of P-Akt in non-small cell lung cancer (NSCLC) is not well described. In the present study, we examined 82 surgically resected snap-frozen and paraffin-embedded stage I to IIIA NSCLC samples for P-Akt and Akt by Western blotting and for P-Akt by immunohistochemistry. P-Akt protein levels above the median, measured using reproducible semiquantitative band densitometry, correlated with a favorable outcome (P = 0.007). Multivariate analysis identified P-Akt as a significant independent favorable prognostic factor (P = 0.004). Although associated with a favorable prognosis, high P-Akt levels correlated with high tumor grade (P = 0.02). Adenocarcinomas were associated with low P-Akt levels (P = 0.039). Akt was not associated with either outcome or clinicopathologic variables. Cytoplasmic (CP-Akt) and nuclear (NP-Akt) P-Akt tumor cell staining was detected in 96% and 42% of cases, respectively. Both CP-Akt and NP-Akt correlated with well-differentiated tumors (P = 0.008 and 0.017, respectively). NP-Akt also correlated with nodal metastases (P = 0.022) and squamous histology (P = 0.037). These results suggest P-Akt expression is a favorable prognostic factor in NSCLC. Immunolocalization of P-Akt, however, may be relevant, as NP-Akt was associated with nodal metastases, a known poor prognostic feature in this disease. P-Akt may be a potential novel therapeutic target for the management of NSCLC. © 2005 American Association for Cancer Research.
Abstract:
Metastatic breast cancer (MBC) may present de novo but more commonly develops in women initially presenting with early breast cancer, despite the widespread use of adjuvant hormonal and cytotoxic chemotherapy. MBC is incurable. Hormone-sensitive MBC eventually becomes resistant to endocrine therapy in most women. Anthracyclines are the agents of choice in the treatment of endocrine-resistant MBC. With the widespread use of anthracyclines in the adjuvant setting, taxanes have become the agents of choice for many patients. Recently, capecitabine has become established as a standard of care for patients pretreated with anthracyclines and taxanes. However, a range of agents have activity as third-line treatment, including gemcitabine, vinorelbine and platinum analogues. The sequential use of non-cross-resistant single agents, rather than combination therapy, is preferable in most women with MBC. Even though combination therapy can improve response rates and increase the progression-free interval, there is no robust evidence to indicate an advantage in terms of overall survival. Moreover, combination therapy is associated with a higher toxicity rate and poorer quality of life. There is no role for dose-intense therapy, high-dose therapy or maintenance chemotherapy outside the context of a clinical trial. The introduction of trastuzumab, a monoclonal antibody targeting growth factor receptors, has improved the therapeutic options for women with tumours overexpressing HER2/neu. DNA microarray profiles of tumours can potentially help to individualise therapy in the future. Molecular targeted therapy has the potential to revolutionise the management of MBC.
Abstract:
Background: Charcot Neuro-Arthropathy (CN) is one of the more devastating complications of diabetes. To the best of the authors' knowledge, no clinical tools based on a systematic review of the existing literature have been developed to manage acute CN. Thus, the aim of this paper was to systematically review the existing literature and develop an evidence-based clinical pathway for the assessment, diagnosis and management of acute CN in patients with diabetes. Methods: Electronic databases (Medline, PubMed, CINAHL, Embase and Cochrane Library), reference lists, and relevant key websites were systematically searched for literature discussing the assessment, diagnosis and/or management of acute CN published between 2002 and 2012. At least two independent investigators then quality-rated and graded the evidence of each included paper. Consistent recommendations emanating from the included papers were then fashioned into a clinical pathway. Results: The systematic search identified 267 manuscripts, of which 117 (44%) met the inclusion criteria. Most manuscripts discussing the assessment, diagnosis and/or management of acute CN constituted level IV (case series) or EO (expert opinion) evidence. The included literature was used to develop an evidence-based clinical pathway for the assessment, investigation, diagnosis and management of acute CN. Conclusions: This research has assisted in developing a comprehensive, evidence-based clinical pathway to promote consistent and optimal practice in the assessment, diagnosis and management of acute CN. The pathway aims to support health professionals in making an early diagnosis and providing appropriate immediate management of acute CN, ultimately reducing associated complications such as amputations and hospitalisations.
Abstract:
In 2011, 366 million people suffered from diabetes worldwide, resulting in 4.6 million deaths at a cost of US$465 billion in direct healthcare expenditures1. India has the world’s second largest diabetic population at 61.8 million (8.3% of total population)1, while in Australia 8.1% of the population have been diagnosed with diabetes1. Diabetic foot ulcers (DFUs) affect up to 25% of diabetic patients, precipitating 85% of all diabetic amputations2,3. DFUs have significant social and economic impacts associated with increased hospitalisation rates, cost of care, and the reduced capacity of patients and carers to work. In isolated regions of Australia and India the incidence of DFU and associated infection is substantially increased, resulting in hospitalisation rates up to 4-fold that of major cities...
Abstract:
Aims This paper reports on the effectiveness of a self-management programme, based on the self-efficacy construct, in older people with heart failure. Background Heart failure is a major health problem worldwide, with high mortality and morbidity, making it a leading cause of hospitalization. Heart failure is associated with a complex set of symptoms that arise from problems in fluid and sodium retention. Hence, managing salt and fluid intake is important and can be enhanced by improving patients' self-efficacy in changing their behaviour. Design Randomized controlled trial. Methods Heart failure patients attending cardiac clinics in northern Taiwan from October 2006 to May 2007 were randomly assigned to two groups: control (n = 46) and intervention (n = 47). The intervention group received a 12-week self-management programme that emphasized self-monitoring of salt/fluid intake and heart failure-related symptoms. Data were collected at baseline as well as 4 and 12 weeks later. Data analysis to test the hypotheses used repeated-measures ANOVA models. Results Participants who received the intervention programme had significantly better self-efficacy for salt and fluid control and self-management behaviour, and significantly fewer heart failure-related symptoms, than participants in the control group. However, the two groups did not differ significantly in health service use. Conclusion The self-management programme improved self-efficacy for salt and fluid control and self-management behaviours, and decreased heart failure-related symptoms, in older Taiwanese outpatients with heart failure. Nursing interventions to improve health-related outcomes for patients with heart failure should emphasize self-efficacy in the self-management of their disease.
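The within-subject logic behind the repeated-measures ANOVA used in this trial is easiest to see in its two-timepoint special case, the paired t-test: each participant serves as their own control, and the analysis is of per-participant change scores. A minimal Python sketch with invented symptom scores (baseline vs. week 12), not the trial's data:

```python
import math

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for within-subject change
    (the two-timepoint special case of a repeated-measures design)."""
    diffs = [a - b for b, a in zip(before, after)]   # per-subject change
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n), n - 1

# Hypothetical symptom scores for six patients (lower is better):
baseline = [10, 12, 9, 11, 13, 10]
week_12  = [7, 9, 8, 8, 10, 7]

t_stat, df = paired_t(baseline, week_12)   # large negative t: symptoms fell
```

With three measurement occasions (baseline, week 4, week 12) and a between-group factor, the same idea generalises to the mixed repeated-measures ANOVA models the study reports.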