381 results for Median voter


Relevance:

10.00%

Publisher:

Abstract:

Purpose – This research has been conducted with the aim of determining if celebrity endorsers in political party advertising have a significant impact on UK voter intentions. The use of celebrity endorsements is commonplace in the USA, but little is known about its effects in the UK. This research also aims to incorporate the use of celebrity endorsements in political party advertising with the political salience construct. Political salience represents how prominent politics and political issues are in the minds of the eligible voter. Design/methodology/approach – A 2 (endorser: celebrity; non-celebrity) × 2 (political salience: high; low) between-subjects factorial design experiment was used. The results show that celebrity endorsements do play a significant role in attitudes towards the political advert, attitudes towards the endorser and voter intention. However, this effect is significantly moderated by political salience. Findings – The results show that low political salience respondents were significantly more likely to vote for the political party when a celebrity endorser is used. However, the inverse effect is found for high political salience respondents. Practical implications – The results offer significant insights into the effect that celebrity endorsers could have in future elections and the importance that political salience plays in the effectiveness of celebrity endorsement. If political parties are to target those citizens that do not actively engage with politics, then the use of celebrity endorsements would make a significant impact, given the results of this research. Originality/value – This research would be of particular interest to political party campaigners as well as academics studying the effects of advertising and identity salience.


Background Exercise for Health was a pragmatic, randomised, controlled trial comparing the effect of an eight-month exercise intervention on function, treatment-related side effects and quality of life following breast cancer, compared with usual care. The intervention commenced six weeks post-surgery, and two modes of delivering the same intervention were compared with usual care. The purpose of this paper is to describe the study design, along with outcomes related to recruitment, retention and representativeness, and intervention participation. Methods: Women newly diagnosed with breast cancer and residing in a major metropolitan city of Queensland, Australia, were eligible to participate. Consenting women were randomised to a face-to-face-delivered exercise group (FtF, n=67), telephone-delivered exercise group (Tel, n=67) or usual care group (UC, n=60) and were assessed pre-intervention (5 weeks post-surgery), mid-intervention (6 months post-surgery) and 10 weeks post-intervention (12 months post-surgery). Each intervention arm entailed 16 sessions with an Exercise Physiologist. Results: Of 318 potentially eligible women, 63% (n=200) agreed to participate, with a 12-month retention rate of 93%. Participants were similar to the Queensland breast cancer population with respect to disease characteristics, and the randomisation procedure was mostly successful at attaining group balance, with the few minor imbalances observed unlikely to influence intervention effects given balance in other related characteristics. Median participation was 14 (min, max: 0, 16) and 13 (min, max: 3, 16) intervention sessions for the FtF and Tel, respectively, with 68% of those in Tel and 82% in FtF participating in at least 75% of sessions. Discussion: Participation in both intervention arms during and following treatment for breast cancer was feasible and acceptable to women.
Future work, designed to inform translation into practice, will evaluate the quality of life, clinical, psychosocial and behavioural outcomes associated with each mode of delivery.


The figure Beets took exception to displays sex‐ and age‐specific median values of aggregated published expected values for pedometer determined physical activity.


Objective To assemble expected values for free-living steps/day in special populations living with chronic illnesses and disabilities. Method Studies identified since 2000 were categorized into similar illnesses and disabilities, capturing the original reference, sample descriptions, descriptions of instruments used (i.e., pedometers, piezoelectric pedometers, accelerometers), number of days worn, and mean and standard deviation of steps/day. Results Sixty unique studies were identified, representing: 1) heart and vascular diseases, 2) chronic obstructive lung disease, 3) diabetes and dialysis, 4) breast cancer, 5) neuromuscular diseases, 6) arthritis, joint replacement, and fibromyalgia, 7) disability (including mental retardation/intellectual difficulties), and 8) other special populations. A median steps/day was calculated for each category. Waist-mounted and ankle-mounted instruments were considered separately due to fundamental differences in assessment properties. For waist-mounted instruments, the lowest median values for steps/day are found in disabled older adults (1214 steps/day), followed by people living with COPD (2237 steps/day). The highest values were seen in individuals with Type 1 diabetes (8008 steps/day), mental retardation/intellectual disability (7787 steps/day), and HIV (7545 steps/day). Conclusion This review will be useful to researchers/practitioners who work with individuals living with chronic illness and disability and require such information for surveillance, screening, intervention, and program evaluation purposes. Keywords: Exercise; Walking; Ambulatory monitoring


Background: Ambulance ramping within the Emergency Department (ED) is a common problem both internationally and in Australia. Previous research has focused on various issues associated with ambulance ramping such as access block, ED overcrowding and ambulance bypass. However, limited research has been conducted on ambulance ramping and its effects on patient outcomes. Methods: A case-control design was used to describe, compare and predict patient outcomes of 619 ramped (cases) vs. 1238 non-ramped (control) patients arriving at one ED via ambulance from 1 June 2007 to 31 August 2007. Cases and controls were matched (on a 1:2 basis) on age, gender and presenting problem. Outcome measures included ED length of stay and in-hospital mortality. Results: The median ramp time for all 1857 patients was 11 (IQR 6–21) min. Compared to non-ramped patients, ramped patients had a significantly longer wait time to be triaged (10 min vs. 4 min). Ramped patients also comprised a significantly higher proportion of those access blocked (43% vs. 34%). No significant difference in the proportion of in-hospital deaths was identified (2% vs. 3%). Multivariate analysis revealed that the likelihood of having an ED length of stay greater than eight hours was 34% higher among patients who were ramped (OR 1.34, 95% CI 1.06–1.70, p = 0.014). In relation to in-hospital mortality, age was the only significant independent predictor of mortality (p < 0.0001). Conclusion: Ambulance ramping is one factor that contributes to prolonged ED length of stay and adds additional strain on ED service provision. The potential for adverse patient outcomes that may occur as a result of ramping warrants close attention by health care service providers.
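The abstract above reports odds ratios with 95% confidence intervals. As a minimal sketch (using made-up counts, not the study's data), an unadjusted odds ratio and its Wald confidence interval can be computed from a 2×2 table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not taken from the study
or_, lo, hi = odds_ratio_ci(300, 319, 496, 742)
```

Note that the study's reported OR of 1.34 is from a multivariate model; the sketch above covers only the unadjusted case.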


Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration. The models needed to be calibrated using data acquired at these locations, and their output validated with data acquired at these sites. Therefore, the outputs should be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than empiricism, which is the case for the macroscopic models currently used. And the models needed to be adaptable to variable operating conditions, so that they may be applied, where possible, to other similar systems and facilities. It was not possible to produce, in this single study, a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models to a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams. On-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed for on-ramps where traffic leaves signalised intersections and those where it leaves unsignalised intersections. Constant departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time, and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delay, which reach infinity at capacity. Minor stream delays were shown to be less when unsignalised intersections are located upstream of on-ramps than signalised intersections, and less still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration. From these, practical capacities can be estimated. Further calibration is required of traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing. A general relationship to predict proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and provide further insight into the nature of operations.
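The abstract describes calibrating Cowan's M3 headway model, with a minimum headway near 1 s and critical gaps of a few seconds. As a minimal sketch (with illustrative parameter values, not the thesis's calibration), the M3 survival function gives the probability that a major-stream headway exceeds a merging driver's critical gap:

```python
import math

def m3_headway_survival(t, q, alpha, delta):
    """P(headway > t) under Cowan's M3 model: a proportion alpha of
    vehicles are free, with shifted-exponential headways; the rest
    travel in bunches at the minimum headway delta. q = flow (veh/s)."""
    lam = alpha * q / (1.0 - delta * q)  # decay rate of free headways
    if t <= delta:
        return 1.0
    return alpha * math.exp(-lam * (t - delta))

# Illustrative values only: 900 veh/h kerb-lane flow, 70% free
# vehicles, 1 s minimum headway, 4 s critical gap
q = 900 / 3600.0
p_accept = m3_headway_survival(4.0, q, alpha=0.7, delta=1.0)
```

A merge-area capacity estimate would combine this acceptance probability with the follow-on time for successive on-ramp vehicles using the same gap.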


BACKGROUND. Physical symptoms are common in pregnancy and are predominantly associated with normal physiological changes. These symptoms have a social and economic cost, leading to absenteeism from work and additional medical interventions. There is currently no simple method for identifying common pregnancy related problems in the antenatal period. A validated tool, for use by pregnancy care providers, would be useful. AIM: The aim of the project was to develop and validate a Pregnancy Symptoms Inventory for use by healthcare professionals (HCPs). METHODS: A list of symptoms was generated via expert consultation with midwives and obstetrician gynaecologists. Focus groups were conducted with pregnant women in their first, second or third trimester. The inventory was then tested for face validity and piloted for readability and comprehension. For test-retest reliability, it was administered to the same women 2 to 3 days apart. Finally, outpatient midwives trialled the inventory for 1 month and rated its usefulness on a 10 cm visual analogue scale (VAS). The number of referrals to other health care professionals was recorded during this month. RESULTS: Expert consultation and focus group discussions led to the generation of a 41-item inventory. Following face validity and readability testing, several items were modified. Individual item test-retest reliability was between 0.51 and 1, with the majority (34 items) scoring ≥0.70. During the testing phase, 211 surveys were collected in the 1 month trial. Tiredness (45.5%), poor sleep (27.5%), back pain (19.5%) and nausea (12.6%) were experienced often. Among the women surveyed, 16.2% claimed to sometimes or often be incontinent. Referrals to the incontinence nurse increased more than 8-fold during the study period. The median rating by midwives of the ‘usefulness’ of the inventory was 8.4 (range 0.9 to 10).
CONCLUSIONS: The Pregnancy Symptoms Inventory (PSI) was well accepted by women in the 1 month trial and may be a useful tool for pregnancy care providers, aiding clinicians in early detection and subsequent treatment of symptoms. It shows promise for use in the research community for assessing the impact of lifestyle interventions in pregnancy.


Aim To identify relationships between preventive activities, psychosocial factors and leg ulcer recurrence in patients with chronic venous leg ulcers. Background Chronic venous leg ulcers are slow to heal and frequently recur, resulting in years of suffering and intensive use of health care resources. Methods A prospective longitudinal study was undertaken with a sample of 80 patients with a venous leg ulcer recruited when their ulcer healed. Data were collected from 2006–2009 from medical records on demographics, medical history and ulcer history; and from self-report questionnaires on physical activity, nutrition, preventive activities and psychosocial measures. Follow-up data were collected via questionnaires every three months for 12 months after healing. Median time to recurrence was calculated using the Kaplan-Meier method. A Cox proportional-hazards regression model was used to adjust for potential confounders and determine effects of preventive strategies and psychosocial factors on recurrence. Results There were 35 recurrences in a sample of 80 participants. Median time to recurrence was 27 weeks. After adjustment for potential confounders, a Cox proportional hazards regression model found that at least an hour/day of leg elevation, six or more days/week in Class 2 (20–25 mmHg) or 3 (30–40 mmHg) compression hosiery, higher social support scale scores and higher General Self-Efficacy scores remained significantly associated (p<0.05) with a lower risk of recurrence, while male gender and a history of DVT remained significant risk factors for recurrence. Conclusion Results indicate that leg elevation, compression hosiery, high levels of self-efficacy and strong social support will help prevent recurrence.
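Median time to recurrence in the study above was estimated with the Kaplan-Meier method, which handles censored follow-up (participants whose ulcers had not recurred by their last questionnaire). A small pure-Python sketch on toy data (not the study's records) illustrates the estimator:

```python
def km_median(times, events):
    """Kaplan-Meier estimate of median time-to-event from (time, event)
    pairs; event=1 is a recurrence, event=0 is censoring. Returns the
    first time at which the survival curve S(t) drops to <= 0.5."""
    data = sorted(zip(times, events))
    n = len(data)
    s = 1.0        # current survival probability
    at_risk = n    # participants still under observation
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in data if tt == t)             # all leaving at t
        if d:
            s *= 1.0 - d / at_risk
            if s <= 0.5:
                return t
        at_risk -= c
        i += c
    return None  # median not reached within follow-up

# Toy follow-up times in weeks (illustrative, not the study's data)
times  = [5, 8, 12, 12, 20, 27, 30, 33, 40, 52]
events = [1, 0,  1,  1,  0,  1,  1,  0,  0,  0]
```

Censored observations contribute to the at-risk count until their last follow-up, which is why the median differs from the naive median of the recurrence times alone.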


Abstract As regional and continental carbon balances of terrestrial ecosystems become available, it becomes clear that the soils are the largest source of uncertainty. Repeated inventories of soil organic carbon (SOC) organized in soil monitoring networks (SMN) are being implemented in a number of countries. This paper reviews the concepts and design of SMNs in ten countries, and discusses the contribution of such networks to reducing the uncertainty of soil carbon balances. Some SMNs are designed to estimate country-specific land use or management effects on SOC stocks, while others collect soil carbon and ancillary data to provide a nationally consistent assessment of soil carbon condition across the major land-use/soil type combinations. The former use a single sampling campaign of paired sites, while for the latter both systematic (usually grid based) and stratified repeated sampling campaigns (5–10 years interval) are used with densities of one site per 10–1,040 km². For paired sites, multiple samples at each site are taken in order to allow statistical analysis, while for the single sites, composite samples are taken. In both cases, fixed depth increments together with samples for bulk density and stone content are recommended. Samples should be archived to allow for re-measurement purposes using updated techniques. Information on land management, and where possible, land use history should be systematically recorded for each site. A case study of the agricultural frontier in Brazil is presented in which land use effect factors are calculated in order to quantify the CO2 fluxes from national land use/management conversion matrices. Process-based SOC models can be run for the individual points of the SMN, provided detailed land management records are available. These studies are still rare, as most SMNs have been implemented recently or are in progress. 
Examples from the USA and Belgium show that uncertainties in SOC change range from 1.6–6.5 Mg C ha⁻¹ for the prediction of SOC stock changes on individual sites to 11.72 Mg C ha⁻¹ (or 34% of the median SOC change) for soil/land use/climate units. For national SOC monitoring, stratified sampling appears to be the most straightforward way to attribute SOC values to units with similar soil/land use/climate conditions (i.e. a spatially implicit upscaling approach). Keywords: Soil monitoring networks; Soil organic carbon; Modeling; Sampling design


Loss of home is common to all people from a refugee background yet we have little understanding of the diversity of meaning associated with this important concept. A phenomenological approach was used to explore experiences of home amongst Karen and Chin refugees residing in Brisbane. In-depth, semi-structured interviews were conducted with nine participants from Karen and Chin backgrounds. The participants comprised five females and four males (mean age 40 years, median length of time in Australia 1.33 years). Participants described their migration stories, including pre- and post-migration history. Analysis was conducted using interpretative phenomenological analysis. Three superordinate themes, explicating the meaning of home for participants, were identified: home as the experience of a psychological space of safety and retreat; home as the socio-emotional space of relatedness to family; and home as geographical-emotional landscape. Loss of home was experienced as a multidimensional loss associated with emotional and physical disturbances. These findings, based upon a phenomenological paradigm, enhance understanding of the experience of being a refugee and of the suffering engendered by loss of home. They open up the possibility for conceptualizing refugee responses in terms of human suffering and meaning making.


Objective Laser Doppler imaging (LDI) was compared to wound outcomes in children's burns, to determine if the technology could be used to predict these outcomes. Methods Forty-eight patients with a total of 85 burns were included in the study. Patient median age was 4 years 10 months and scans were taken 0–186 h post-burn using the fast, low-resolution setting on the Moor LDI2 laser Doppler imager. Wounds were managed by standard practice, without taking into account the scan results. Time until complete re-epithelialisation, and whether or not grafting and scar management were required, were recorded for each wound. If wounds were treated with Silvazine™ or Acticoat™ prior to the scan, this was also recorded. Results The predominant colour of the scan was found to be significantly related to the re-epithelialisation, grafting and scar management outcomes and could be used to predict those outcomes. The prior use of Acticoat™ did not affect the relationship between scans and outcomes; however, the use of Silvazine™ did complicate the relationship for light blue and green scanned partial thickness wounds. Scans taken within the 24-h window after burn also appeared to be accurate predictors of wound outcome. Conclusion Laser Doppler imaging using the low-resolution fast-scan setting is accurate and effective in a paediatric population.


INTRODUCTION: Anorexia nervosa (AN) is a growing problem among young female Singaporeans. We studied the demographics and follow-up data of AN patients referred to dietitians for nutritional intervention. METHODS: A retrospective nutritional notes review was done on 94 patients seen from 1992 to 2004. All patients were given nutritional intervention, which included individualised counselling for weight gain, personalised diet plan, correction of poor dietary intake and correction of perception towards healthy eating. We collected data on body mass index (BMI), patient demographics and outcome. RESULTS: 96 percent of the patients were female and 86.2 percent were Chinese. The median BMI at initial consultation was 14.7 kilogramme per square metre (range, 8.6-18.8 kilogramme per square metre). 76 percent were between 13 and 20 years old. 83 percent of the patients came back for follow-up appointments with the dietitians in addition to consultation with the psychiatrist. Overall, there was significant improvement in weight and BMI from an average 37 kg to 41 kg and 14.7 kilogramme per square metre to 16.4 kilogramme per square metre, respectively, between the first and final consultations (p-value is less than 0.001). The average duration of follow-up was about eight months. Among the patients on follow-up, 68 percent showed improvement with an average weight gain of 6 kg. Patients that improved had more outpatient follow-up sessions with the dietitians (4.2 consultations versus 1.6 consultations; p-value is less than 0.05), lower BMI at presentation (14.2 kilogramme per square metre versus 15.7 kilogramme per square metre; p-value is less than 0.01) and shorter duration of disease at presentation (one year versus three years; p-value is less than 0.05) compared with those who did not improve. Seven patients with the disease for more than two years did not show improvement with follow-up.
CONCLUSION: We gained valuable understanding of the AN patients referred to our tertiary hospital for treatment, two-thirds of whom improved with adequate follow-up treatment. Patients that had suffered AN longer before seeking help appeared more resistant to improvement.


Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been widely reported. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well-documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in the Asian elderly and none has been validated in Singapore. Due to the multiethnicity, cultural, and language differences in Singapore older adults, the results from non-Asian validation studies may not be applicable. Therefore it is important to identify validated population and setting specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM IV criteria), depression (defined using a single question “Do you often feel sad or depressed?”), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over median LOS 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing decline in nutritional status (defined by weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in their nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. Results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to risks and consequences of malnutrition, it is important that they are systematically screened so timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As the older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.
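The thesis above validates screening tools against the SGA by diagnostic performance (e.g. sensitivity 84% and specificity 79% for the TTSH NST). A minimal sketch of how sensitivity and specificity are derived from paired screening and reference-standard results (toy data, not the study's) follows:

```python
def diagnostic_performance(screen_positive, reference_positive):
    """Sensitivity and specificity of a screening tool against a
    reference standard, from paired 0/1 results per participant."""
    tp = fn = fp = tn = 0
    for s, r in zip(screen_positive, reference_positive):
        if r:                 # reference says malnourished
            tp += s           # screen also positive
            fn += not s       # screen missed the case
        else:                 # reference says well-nourished
            fp += s           # screen falsely positive
            tn += not s       # screen correctly negative
    return tp / (tp + fn), tn / (tn + fp)

# Toy paired results (illustrative only)
screen = [1, 1, 1, 0, 0, 1, 0, 0, 0, 0]
ref    = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
sens, spec = diagnostic_performance(screen, ref)
```

The AUC reported in the study additionally sweeps the screening tool's score threshold; the sketch covers a single fixed cut-off.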


Aim: To review the management of heart failure in patients not enrolled in specialist multidisciplinary programs. Method: A prospective clinical audit of patients admitted to hospital with either a current or past diagnosis of heart failure and not enrolled in a specialist heart failure program or under the direct care of the cardiology unit. Results: 81 eligible patients were enrolled (1 August to 1 October 2008). The median age was 81 ± 9.4 years and 48% were male. Most patients (63%) were in New York Heart Association Class II or Class III heart failure. On discharge, 59% of patients were prescribed angiotensin converting enzyme inhibitors and 43% were prescribed beta-blockers. During hospitalisation, 8.6% of patients with a past diagnosis of heart failure were started on an angiotensin converting enzyme inhibitor and 4.9% on a beta-blocker. There was evidence of suboptimal dosage on admission and discharge for angiotensin converting enzyme inhibitors (19% and 7.4%) and beta-blockers (29% and 17%). The results compared well with international reports regarding the under-treatment of heart failure. Conclusion: The demonstrated practice gap provides excellent opportunities for the involvement of pharmacists to improve the continuation of care for heart failure patients discharged from hospital in the areas of medication management review, dose titration and monitoring.


Background:  The aims of this study were to determine the documentation of pharmacotherapy optimization goals in the discharge letters of patients with the principal diagnosis of chronic heart failure. Methods:  A retrospective practice audit of 212 patients discharged to the care of their local general practitioner from general medical units of a large tertiary hospital. Details of recommendations regarding ongoing pharmacological and non-pharmacological management were reviewed. The doses of medications on discharge were noted and whether they met current guidelines recommending titration of angiotensin-converting enzyme inhibitors and beta-blockers. Ongoing arrangements for specialist follow up were also reviewed. Results:  The mean age of patients whose letters were reviewed was 78.4 years (standard deviation ± 8.6); 50% were men. Patients had an overall median of six comorbidities and eight regular medications on discharge. Mean length of stay for each admission was 6 days. Discharge letters were posted a median of 4 days after discharge, with 25% not posted at 10 days. No discharge letter was sent in 9.4% (20) of the cases. Only six (2.8%) letters had any recommendations regarding future titration of angiotensin-converting enzyme inhibitors and 6.6% (14) for beta-blockers. Recommendations for future non-pharmacological management, for example, diuretic action plans, regular weight monitoring and exercise plans were not found in the letters in this audit. Conclusion:  Hospital discharge is an opportunity to communicate management plans for treatment optimization effectively, and while this opportunity is spurned, implementation gaps in the management of cardiac failure will probably remain.