923 results for Ward Round
Abstract:
Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results 184 patients were observed over four weeks; 32 were referred for MNT. Although baseline and post-MNT data for 20 participants (68 ± 17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms continued to limit intake. Conclusion In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants' estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding may be a solution for such patients.
Abstract:
Background Research is a major driver of health care improvement, and evidence-based practice is becoming the foundation of health care delivery. For health professions to develop within emerging models of health care delivery, it would seem imperative to develop and monitor the research capacity and evidence-based literacy of the health care workforce. This observational paper reports the research capacity levels of statewide populations of public-sector podiatrists at two time points twelve months apart. Methods The Research Capacity & Culture (RCC) survey was electronically distributed to all Queensland Health (Australia) employed podiatrists in January 2011 (n = 58) and January 2012 (n = 60). The RCC is a validated tool designed to measure indicators of research skill in health professionals. Participants rate skill levels against each individual, team and organisation statement on a 10-point scale (one = lowest, ten = highest). Chi-squared and Mann-Whitney U tests were used to determine any differences between the results of the two survey samples. A minimum significance of p < 0.05 was used throughout. Results Thirty-seven (64%) podiatrists responded to the 2011 survey and 33 (55%) to the 2012 survey. The 2011 respondents reported low skill levels (median < 4) on most individual research skill items, except for their ability to locate and critically review research literature (median > 6). In contrast, most rated their organisation's skills in performing and supporting research at much higher levels (median > 6). The 2012 respondents reported significantly higher skill ratings than the 2011 respondents for individuals' ability to secure research funding, submit ethics applications and provide research advice, as well as for their organisation's skills in supporting, funding, monitoring and mentoring research and engaging universities as research partners (p < 0.05).
Conclusions This study appears to report research capacity levels for the largest published populations of podiatrists. The 2011 survey findings indicate that podiatrists have similarly low research capacity skill levels to those reported in the allied health literature. Compared with the 2011 survey, the 2012 survey suggests podiatrists perceived greater skills and support to initiate research in 2012. This improvement coincided with the implementation of research capacity building strategies.
Abstract:
Aim To develop clinical practice guidelines for nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory. Background Numerous studies have reported that nurse-administered procedural sedation and analgesia is safe. However, the broad scope of existing guidelines for the administration and monitoring of patients who receive sedation during medical procedures without an anaesthetist present means there is a lack of specific guidance regarding optimal nursing practices for the unique circumstances in which nurse-administered procedural sedation and analgesia is used in the cardiac catheterisation laboratory. Methods A sequential mixed methods design was utilised. Initial recommendations were produced from three studies conducted by the authors: an integrative review; a qualitative study; and a cross-sectional survey. The recommendations were revised in accordance with responses from a modified Delphi study. The first Delphi round was completed by nine senior cardiac catheterisation laboratory nurses. All but one of the draft recommendations met the pre-determined cut-off point for inclusion. There were a total of 59 responses to the second round. Consensus was reached on all recommendations. Implications for nursing The guidelines that were derived from the Delphi study offer twenty-four recommendations within six domains of nursing practice: pre-procedural assessment; pre-procedural patient and family education; pre-procedural patient comfort; intra-procedural patient comfort; intra-procedural patient assessment and monitoring; and post-procedural patient assessment and monitoring. Conclusion These guidelines provide an important foundation towards the delivery of safe, consistent and evidence-based nursing care for the many patients who receive sedation in the cardiac catheterisation laboratory setting.
Abstract:
“Mental illness is a tough illness to survive, it is incurable but manageable. Living with the illness when at its full potency can disrupt your life at any moment.” Intensive care for patients experiencing acute psychiatric distress is an essential yet complex part of mental health services as a whole system. Psychiatric intensive care units remain a source of controversy, despite promising developments in health services that incorporate recovery goals and processes outlined by people with a mental illness themselves. In past decades, changes in the provision of mental health services have focused on the restoration of a meaningful and empowered life, with choice and hope as defining attributes of recovery. Yet what does recovery mean, and how are recovery principles accomplished in psychiatric intensive care arrangements for someone experiencing acute psychiatric distress?
Abstract:
Defence projects are typically undertaken within a multi-project management environment where a common agenda of project managers is to achieve higher project efficiency. This study adopted a multi-faceted qualitative approach to investigate factors contributing to or impeding project efficiency in the Defence sector. Semi-structured interviews were undertaken to identify factors additional to those compiled from the literature survey. This was followed by a three-round Delphi study to examine the perceived critical factors of project efficiency. The results showed that project efficiency in the Defence sector has moved beyond its traditional internally focused scope to one that is externally focused. As a result, effort is needed not only on factors related to individual projects but also on those related to project inter-dependencies and external customers. Managing these factors will help to enhance the efficiency of projects within the Defence sector.
The electrochemical corrosion behaviour of quaternary gold alloys when exposed to 3.5% NaCl solution
Abstract:
Lower-carat gold alloys, specifically 9-carat gold alloys containing less than 40% gold with alloying additions of silver, copper and zinc, are commonly used in many jewellery applications to offset the high cost and poor mechanical properties associated with pure gold. While gold is considered chemically inert, the presence of active alloying additions raises concerns about certain forms of corrosion, particularly selective dissolution of these alloys. The purpose of this study was to systematically investigate the corrosion behaviour of a series of quaternary gold–silver–copper–zinc alloys using dc potentiodynamic scanning in a saline (3.5% NaCl) environment. Full anodic/cathodic scans were conducted to determine the overall corrosion characteristics of each alloy, followed by selective anodic scans and subsequent morphological and compositional analysis of the alloy surface and corroding media to determine the extent of selective dissolution. Varying degrees of selective dissolution and associated corrosion rates were observed after anodic polarisation in 3.5% NaCl, depending on the alloy composition. The corrosion behaviour of the alloys was determined by the extent of anodic reactions, which induce (1) formation of oxide scales on the alloy surface and/or (2) dissolution of Zn and Cu species. In general, the improved corrosion characteristics of alloy #3 were attributed to its Zn/Cu composition and the resulting favourable microstructure, which promotes the formation of protective oxide/chloride scales and reduces the extent of Cu and Zn dissolution.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite such strategies being widely implemented in practice in the United Kingdom, and increasingly in Australia, there have been few studies examining the impact of Protected Mealtimes and dedicated feeding-assistant roles on the nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed specifically to address mealtime barriers and improve the energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at the Royal Brisbane and Women's Hospital: AIN-only (an additional assistant-in-nursing (AIN) with a dedicated nutrition role); PM-only (a multidisciplinary approach to meals, including Protected Mealtimes); and PM+AIN (a combined intervention: AIN plus the multidisciplinary approach to meals). An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention with a pre-intervention group.
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre-intervention and intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had energy intakes inadequate to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from the mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence and better identification of swallowing difficulties. IMPLICATIONS: This PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and decreasing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
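The odds ratio reported above (OR=3.4 for achieving adequate energy intake) comes from a 2×2 comparison of intervention versus pre-intervention groups. The sketch below shows how such an odds ratio is computed; the counts are hypothetical, chosen only to reproduce a value near the reported 3.4, and are not the study's data.

```python
# Hedged sketch: computing an odds ratio from a 2x2 table, as underlies the
# OR = 3.4 reported for achieving adequate energy intake.
# NOTE: counts below are hypothetical, chosen to yield an OR near 3.4;
# they are not the study's data.

# rows: intervention vs pre-intervention; cols: adequate vs inadequate intake
adequate_int, inadequate_int = 30, 100    # hypothetical intervention counts
adequate_pre, inadequate_pre = 9, 102     # hypothetical pre-intervention counts

# OR = (a * d) / (b * c) for the cross-product of the 2x2 table
odds_ratio = (adequate_int * inadequate_pre) / (inadequate_int * adequate_pre)
print(f"OR = {odds_ratio:.2f}")
```

In the study itself the OR would come from a multivariate model adjusting for the covariates listed in the Methods, not from a raw cross-product; the sketch shows only the unadjusted calculation.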
Abstract:
Background Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) assesses weight loss, dietary intake and muscle wastage, with the composite score used to determine the risk of malnutrition. The aim of this study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients' nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with a sensitivity of 89% and specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r=0.78, p<0.001). The agreement between the two nurses conducting 3-MinNS was 78.3%. Conclusion 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
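The cutoff selection described above can be illustrated with a small sketch: for each candidate 3-MinNS cutoff, compute sensitivity and specificity against the reference assessment, then choose the cutoff maximising Youden's index (sensitivity + specificity − 1), one common rule for reading a ROC curve. The scores and malnutrition labels below are hypothetical, not the study's data.

```python
# Illustrative sketch of choosing a screening cutoff from a ROC-style sweep.
# NOTE: scores and labels below are hypothetical, not the study's data.
scores       = [0, 1, 1, 2, 3, 3, 4, 5, 2, 6, 0, 4, 3, 5, 1, 2]
malnourished = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0]

def sens_spec(cutoff):
    """Sensitivity and specificity when scores >= cutoff are flagged at-risk."""
    tp = sum(1 for s, m in zip(scores, malnourished) if s >= cutoff and m)
    fn = sum(1 for s, m in zip(scores, malnourished) if s < cutoff and m)
    tn = sum(1 for s, m in zip(scores, malnourished) if s < cutoff and not m)
    fp = sum(1 for s, m in zip(scores, malnourished) if s >= cutoff and not m)
    return tp / (tp + fn), tn / (tn + fp)

# Pick the cutoff maximising Youden's index (sens + spec - 1)
best = max(range(1, 7), key=lambda c: sum(sens_spec(c)) - 1)
print("best cutoff:", best, "sens/spec:", sens_spec(best))
```

With this toy data the sweep lands on a cutoff of three, matching the abstract's result, but that agreement is by construction of the example.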
Abstract:
Natural disasters can have adverse effects on human lives. To raise awareness of research and better combat future events, it is important to identify recent research trends in the area of post-disaster reconstruction (PDR). The authors used a three-round literature review strategy to study journal papers published in the last decade that relate to PDR, applying specific conditions in the Scopus search engine. A wide range of PDR-related papers was examined from a general perspective in the first two rounds, while the final round established 88 papers as target publications through visual examination of abstracts, keywords and, as necessary, main texts. These papers were analysed in terms of research origins, active researchers, research organisations, most cited papers, regional concerns, major themes and deliverables, for clues to past trends and future directions. The need for appropriate PDR research is increasingly recognised. The number of publications increased fivefold from 2002 to 2012; for PDR research with a construction perspective, the increase was sixfold. Developing countries, such as those in Asia, attract almost 50% of researchers' attention in terms of regional concerns, while the US is the single most studied country (24%). Africa is hardly represented. Researchers in developed countries lead worldwide PDR research, which contrasts with the need for expertise in developing countries. Past work focused on waste management, stakeholder analysis, resourcing, infrastructure issues, resilience and vulnerability, reconstruction approaches, sustainable reconstruction and governance issues. Future research should address resourcing, integrated development, sustainability and resilience building to cover the gaps. By means of a holistic summary and structured analysis of key patterns, the authors hope to provide streamlined access to existing research findings and make predictions of future trends. They also hope to encourage a more holistic approach to PDR research and international collaborations.
Abstract:
The main focus of ‘Kaleidoscope: Reframing evaluation through a stakeholder approach to sustainable, cultural change in Higher Education’ is to develop a set of principles to guide user-led engagement in widespread organisational change and maximise its impact. The word kaleidoscope represents the unique lens through which each institution will need to view its cultural specificity and local context through an extensive process of collaboration and engagement, followed by communication and dissemination. Kaleidoscope has particular relevance when new approaches to learning and teaching evaluation are introduced by tertiary institutions. Building on the Reframe Project, which involved three years of user-led consultation and was designed to meet stakeholders' needs, QUT successfully introduced a new evaluation framework across the university in 2013. Reframe was evidence based, involved scholarly reflection and was founded on a strong theoretical framework. The evolution of the evaluation framework included analysis of scholarly literature and environmental scans across the higher education sector (Alderman, et al., 2012), research-based development of conceptual theory (Alderman, et al., in press 2013) and incorporation of the stakeholder voice, framed within project management principles (Alderman & Melanie, 2012). Kaleidoscope's objective is for QUT to develop its research-based stakeholder approach to distil the successful experience of the Reframe Project into a transferable set of guidelines for use by other tertiary institutions across the sector. These guidelines will assist others to design, develop and deploy their own culturally specific widespread organisational change informed by stakeholder engagement and organisational buy-in. It is intended that these guidelines will promote, support and enable other tertiary institutions to embark on their own projects and maximise their impact. In conjunction with our conference paper, this round table presents the draft guidelines and framework ready for external peer review by evaluation practitioners, as part of Kaleidoscope's dissemination (Hinton & Gannaway, 2011), applying illuminative evaluation theory (Parlett & Hamilton, 1976) through conference workshops and linked round table discussions (Shapiro, et al., 1983; Jacobs, 2000).
Abstract:
Purpose: This randomized, multicenter trial compared first-line trastuzumab plus docetaxel versus docetaxel alone in patients with human epidermal growth factor receptor 2 (HER2)-positive metastatic breast cancer (MBC). Patients and Methods: Patients were randomly assigned to six cycles of docetaxel 100 mg/m² every 3 weeks, with or without trastuzumab 4 mg/kg loading dose followed by 2 mg/kg weekly until disease progression. Results: A total of 186 patients received at least one dose of the study drug. Trastuzumab plus docetaxel was significantly superior to docetaxel alone in terms of overall response rate (61% v 34%; P = .0002), overall survival (median, 31.2 v 22.7 months; P = .0325), time to disease progression (median, 11.7 v 6.1 months; P = .0001), time to treatment failure (median, 9.8 v 5.3 months; P = .0001), and duration of response (median, 11.7 v 5.7 months; P = .009). There was little difference in the number and severity of adverse events between the arms. Grade 3 to 4 neutropenia was seen more commonly with the combination (32%) than with docetaxel alone (22%), and there was a slightly higher incidence of febrile neutropenia in the combination arm (23% v 17%). One patient in the combination arm experienced symptomatic heart failure (1%). Another patient experienced symptomatic heart failure 5 months after discontinuation of trastuzumab because of disease progression, while being treated with an investigational anthracycline for 4 months. Conclusion: Trastuzumab combined with docetaxel is superior to docetaxel alone as first-line treatment of patients with HER2-positive MBC in terms of overall survival, response rate, response duration, time to progression, and time to treatment failure, with little additional toxicity. © 2005 by American Society of Clinical Oncology.
Abstract:
Current housing design and construction practices do not meet the needs of many people with disability and older people, and limit their inclusion and participation in community and family life. In spite of a decade of advocacy for the regulation of access within residential environments, the Australian government has opted for a voluntary approach in which the housing industry takes responsibility. Housing industry leaders have indicated that they are willing to transform their established practice if it makes good business sense to do so and if there is demand from home buyers. To date, there has been minimal demand. In 2010, housing industry and community leaders formalised this commitment in an agreement, called Livable Housing Design, to transform housing design and construction practices, with a target of all new housing providing minimal access by 2020. This paper reports on a study which examined the assumption behind the Livable Housing Design agreement: that individuals in the housing industry will respond voluntarily and take responsibility for the provision of inclusive housing. From interviews with developers, designers and builders in Brisbane, Queensland, the study found a complex picture of competing demands and responsibilities. Instead of voluntarily changing their design and construction practices to meet the future needs of users over the life of the housing, industry participants are more likely to focus on their immediate contractual obligations and maintain the status quo. Contrary to the view of government and industry leaders, participants identified that an external regulatory framework would be required if Livable Housing Design's 2020 goal were to be met.
Abstract:
Osteocytes are the mature cells of bone and act as mechanosensors within it. The mechanical properties of osteocytes play an important role in fulfilling these functions. However, little research has been done to investigate the mechanical deformation properties of single osteocytes. Atomic force microscopy (AFM) is a state-of-the-art experimental technique for high-resolution imaging of tissues, cells and other surfaces, as well as for probing the mechanical properties of samples both qualitatively and quantitatively. In this paper, an AFM-based experimental study is first used to obtain force-indentation curves of single round osteocytes. A porohyperelastic (PHE) model of a single osteocyte is then developed, using inverse finite element analysis (FEA) to identify and extract mechanical properties from the experimental results. It was found that the PHE model is a good candidate for biomechanics studies of osteocytes.
Abstract:
Black et al. (2004) identified a systematic difference between LA–ICP–MS and TIMS measurements of 206Pb/238U in zircons, which they correlated with the incompatible trace element content of the zircon. We show that the offset between the LA–ICP–MS and TIMS measured 206Pb/238U correlates more strongly with total radiogenic Pb than with any incompatible trace element. This suggests that the cause of the 206Pb/238U offset is related to differences in radiation damage (alpha dose) between the reference zircons and unknowns. We test this hypothesis in two ways. First, we show that there is a strong correlation between the difference in the LA–ICP–MS and TIMS measured 206Pb/238U and the difference in the alpha dose received by the unknown and reference zircons. The LA–ICP–MS ages for the zircons we have dated range from 5.1% younger than their TIMS age to 2.1% older, depending on whether the unknown or reference received the higher alpha dose. Second, we show that by annealing both reference and unknown zircons at 850 °C for 48 h in air we can eliminate the alpha-dose-induced differences in measured 206Pb/238U. This was achieved by analysing six reference zircons a minimum of 16 times in two round-robin experiments: the first consisting of unannealed zircons and the second of annealed grains. The maximum offset between the LA–ICP–MS and TIMS measured 206Pb/238U for the unannealed zircons was 2.3%, which reduced to 0.5% for the annealed grains, as predicted by within-session precision based on counting statistics. Annealing unknown and reference zircons to the same state prior to analysis holds the promise of reducing the 3% external error for the measurement of 206Pb/238U of zircon by LA–ICP–MS, indicated by Klötzli et al. (2009), to better than 1%, but more analyses of annealed zircons by other laboratories are required to evaluate the true potential of the annealing method.
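The first test described above is a correlation between two paired quantities: the LA–ICP–MS vs TIMS 206Pb/238U offset and the unknown-minus-reference difference in alpha dose. A minimal sketch of that check, using a plain Pearson correlation on hypothetical values (the sign convention follows the abstract: a higher dose in the unknown gives a younger apparent LA–ICP–MS age, i.e. a negative offset):

```python
# Sketch of the correlation test: Pearson r between alpha-dose difference
# (unknown minus reference) and the % offset of LA-ICP-MS vs TIMS 206Pb/238U.
# NOTE: all values below are hypothetical, not the paper's measurements.
import math

d_alpha = [-2.0, -1.2, -0.5, 0.0, 0.8, 1.5, 2.4]   # hypothetical dose differences
offset  = [ 2.1,  1.3,  0.6, 0.1, -0.9, -1.6, -2.6] # hypothetical % age offsets

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(d_alpha, offset)
print(f"r = {r:.3f}")  # strongly negative with this toy data
```

A strongly negative r in such a test would support the hypothesis that the offset is driven by differential radiation damage rather than trace element content.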
Abstract:
Objectives Current evidence to support non-medical prescribing is predominantly qualitative, with little evaluation of accuracy, safety and appropriateness. Our aim was to evaluate a new model of service for the Australian healthcare system, in which inpatient medication prescribing in an elective surgery preadmission clinic (PAC) is undertaken by a pharmacist, against usual care, using an endorsed performance framework. Design Single-centre, randomised controlled, two-arm trial. Setting Elective surgery PAC in a Brisbane-based tertiary hospital. Participants 400 adults scheduled for elective surgery were randomised to intervention or control. Intervention A pharmacist generated the inpatient medication chart to reflect the patient's regular medication, made a plan for perioperative medication management and prescribed venous thromboembolism (VTE) prophylaxis. In the control arm, the medication chart was generated by the Resident Medical Officers. Outcome measures The primary outcome was the frequency of omissions and prescribing errors when compared against the medication history; the clinical significance of omissions was also analysed. The secondary outcome was the appropriateness of VTE prophylaxis prescribing. Results There were significantly fewer unintended omissions of medications: 11 of 887 (1.2%) intervention orders compared with 383 of 1217 (31.5%) control orders (p<0.001). There were also significantly fewer prescribing errors involving selection of drug, dose or frequency: 2 in 857 (0.2%) intervention orders compared with 51 in 807 (6.3%) control orders (p<0.001). Orders with at least one component of the prescription missing, incorrect or unclear occurred in 208 of 904 (23%) intervention orders and 445 of 1034 (43%) control orders (p<0.001). VTE prophylaxis on admission to the ward was appropriate in 93% of intervention patients and 90% of controls (p=0.29). Conclusions Medication charts in the intervention arm contained fewer clinically significant omissions and prescribing errors than controls. There was no difference in the appropriateness of VTE prophylaxis on admission between the two groups.
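The primary comparison above (11 of 887 intervention orders with an unintended omission versus 383 of 1217 control orders) can be reproduced from the reported counts with a chi-squared test on the 2×2 table. The counts come from the abstract; the choice of a chi-squared test is an assumption, as the abstract does not state the exact statistical method used.

```python
# Chi-squared test on the 2x2 table built from the abstract's reported counts
# (11/887 intervention vs 383/1217 control unintended omissions).
# NOTE: the test choice is an assumption; the abstract does not name the method.
from scipy.stats import chi2_contingency

table = [
    [11, 887 - 11],      # intervention: omissions, no omission
    [383, 1217 - 383],   # control: omissions, no omission
]
chi2, p, dof, expected = chi2_contingency(table)

rate_int = 11 / 887      # ~1.2%, as reported in the abstract
rate_ctrl = 383 / 1217   # ~31.5%, as reported in the abstract
print(f"{rate_int:.1%} vs {rate_ctrl:.1%}, p = {p:.2e}")
```

The resulting p-value is far below 0.001, consistent with the significance level reported in the abstract.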