735 results for Training and pruning
Abstract:
Background: Few cancers pose greater challenges than head and neck (H&N) cancer. Residual effects following treatment include body image changes, pain, fatigue and difficulties with appetite, swallowing and speech. Depression is a common comorbidity. There is limited evidence about ways to assist patients to achieve optimal adjustment after completion of treatment. In this study, we aim to examine the effectiveness and feasibility of a model of survivorship care to improve the quality of life of patients who have completed treatment for H&N cancer. Methods: This is a preliminary study in which 120 patients will be recruited. A prospective randomised controlled trial of the H&N Cancer Survivor Self-management Care Plan (HNCP) involving pre- and post-intervention assessments will be used. Consecutive patients who have completed a defined treatment protocol for H&N cancer will be recruited from two large cancer services and randomly allocated to one of three study arms: (1) usual care, (2) information in the form of a written resource or (3) the HNCP delivered by an oncology nurse who has participated in manual-based training and skill development in patient self-management support. The trained nurses will meet patients in a face-to-face interview lasting up to 60 minutes to develop an individualised HNCP, based on principles of chronic disease self-management. Participants will be assessed at baseline, 3 and 6 months. The primary outcome measure is quality of life. The secondary outcome measures include mood, self-efficacy and health-care utilisation. The feasibility of implementing this intervention in routine clinical care will be assessed through semistructured interviews with participating nurses, managers and administrators. Interviews with patients who received the HNCP will explore their perceptions of the HNCP, including factors that assisted them in achieving behavioural change. Discussion: In this study, we aim to improve the quality of life of a patient population with unique needs by means of a tailored self-management care plan developed upon completion of treatment. Delivery of the intervention by trained oncology nurses is likely to be acceptable to patients and, if successful, will be a model of care that can be implemented for diverse patient populations.
Abstract:
We hypothesized that industry-based learning and teaching, especially through industry-assigned student projects or training programs, is an integral part of science, technology, engineering and mathematics (STEM) education. In this paper we show that industry-based student training and experience increases students' academic performance independent of organizational parameters and contexts. The literature on industry-based student training focuses on employability and the industry dimension, and in many ways neglects the academic dimension. We observed that the associations between academic attributes and the contributions of industry-based student training are central and vital to the technological learning experience. We explore international initiatives and statistics collected on student projects in two categories: industry-based learning performance and on-campus performance. The data were collected from five universities in different industrialized countries: Australia (N=545), Norway (N=279), Germany (N=74), France (N=107) and Spain (N=802). We analyzed industry-based student training and company-assigned student projects in comparison with on-campus performance. The data suggest a strong correlation between industry-based student training and improved performance profiles or increased motivation, indicating that industry-based student training increases student academic performance independent of organizational parameters and contexts. The programs we augmented were orthogonal to each other; however, the trends in the students' academic performance are identical. An isolated cohort among the reported countries that opposed our hypothesis warrants further investigation.
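As a hedged illustration of the kind of cohort-level correlation analysis this abstract describes (not the paper's actual code or data), the sketch below computes a Pearson correlation between an industry-based training score and an on-campus performance score for each country cohort. Only the cohort sizes come from the abstract; all scores, variable names and the assumed relationship are synthetic placeholders.

```python
# Hedged sketch: per-cohort Pearson correlation between industry-based training
# scores and on-campus performance. All scores below are synthetic placeholders,
# not the study's data; only the cohort sizes are taken from the abstract.
import numpy as np
from scipy import stats

cohort_sizes = {"Australia": 545, "Norway": 279, "Germany": 74, "France": 107, "Spain": 802}

rng = np.random.default_rng(seed=1)
for country, n in cohort_sizes.items():
    industry_score = rng.normal(loc=70, scale=10, size=n)               # hypothetical project marks
    campus_score = 0.6 * industry_score + rng.normal(25, 8, size=n)     # hypothetical campus marks
    r, p = stats.pearsonr(industry_score, campus_score)
    print(f"{country}: n={n}, Pearson r={r:.2f}, p={p:.3g}")
```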
Abstract:
The formality and informality of HRM practices in small firms, by Rowena Barrett and Susan Mayson. Introduction: The nature of human resource management in small firms is understood to be characterized by ad hoc and idiosyncratic practices. The liability of smallness (Heneman and Berkley, 1999) and resource poverty (Welsh and White, 1981) present unique challenges to managing human resources in small firms. The inability to achieve economies of scale can mean that implementing formalized HRM practices is costly in terms of time and money for small firms (Sels et al., 2006a; 2006b). These constraints, combined with small firm owner–managers' lack of strategic capabilities and awareness (Hannon and Atherton, 1998) and a lack of managerial resources and expertise in HRM (Cardon and Stevens, 2004), can lead to informal and ad hoc HRM practices. For some, this state of affairs is problematic because the normative, formalized HRM practices in the areas of recruitment, selection, appraisal, training and rewards are not present (see Marlow, 2006 and Taylor, 2006 for a critique). However, a more nuanced analysis of the small firm and its practices in their context can tell a different story (Barrett and Rainnie, 2002; Harney and Dundon, 2006). In this chapter we contribute to our understanding of small firm management practices by investigating a series of questions in relation to HRM in small firms.
Abstract:
Purpose: Are eccentric hamstring strength and between-limb imbalance in eccentric strength, measured during the Nordic hamstring exercise, risk factors for hamstring strain injury (HSI)? Methods: Elite Australian footballers (n=210) from five different teams participated. Eccentric hamstring strength during the Nordic was measured at the commencement and conclusion of preseason training and in season. Injury history and demographic data were also collected. Reports on prospectively occurring HSIs were completed by team medical staff. Relative risk (RR) was determined for univariate data, and logistic regression was employed for multivariate data. Results: Twenty-eight HSIs were recorded. Eccentric hamstring strength below 256 N at the start of preseason and below 279 N at the end of preseason increased the risk of future HSI 2.7-fold (relative risk, 2.7; 95% confidence interval, 1.3 to 5.5; p = 0.006) and 4.3-fold (relative risk, 4.3; 95% confidence interval, 1.7 to 11.0; p = 0.002), respectively. A between-limb imbalance in strength of greater than 10% did not increase the risk of future HSI. Univariate analysis did not reveal a significantly greater relative risk of future HSI in athletes who had sustained a lower limb injury of any kind within the previous 12 months. Logistic regression revealed interactions between both athlete age and history of HSI with eccentric hamstring strength, whereby the likelihood of future HSI in older athletes or athletes with a history of HSI was reduced if the athlete had high levels of eccentric strength. Conclusion: Low levels of eccentric hamstring strength increased the risk of future HSI. Interaction effects suggest that the additional risk of future HSI associated with advancing age or previous injury was mitigated by higher levels of eccentric hamstring strength.
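As a hedged illustration of the univariate statistic reported above (not the study's code or data), the sketch below shows how a relative risk and its 95% confidence interval can be computed from a 2x2 exposure-by-outcome table using the Katz log method. The counts, function name and grouping threshold are hypothetical placeholders.

```python
# Hedged illustration: relative risk with a 95% CI from a 2x2 table (Katz log method).
# The counts used in the example are hypothetical, not taken from the study.
import math

def relative_risk(a, b, c, d):
    """a = injured & below threshold, b = uninjured & below threshold,
       c = injured & above threshold, d = uninjured & above threshold."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    # Standard error of log(RR), Katz log method
    se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, (lo, hi)

# Placeholder counts for illustration only.
rr, ci = relative_risk(a=15, b=60, c=13, d=122)
print(f"RR = {rr:.1f}, 95% CI {ci[0]:.1f} to {ci[1]:.1f}")
```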
Abstract:
This research is focused on realizing productivity benefits for the delivery of transport infrastructure in the Australian construction industry through the use of building information modeling (BIM), virtual design and construction (VDC) and integrated project delivery (IPD). Specific objectives include: (I) building an understanding of the institutional environment, business systems and support mechanisms (e.g., training and skilling) which impact on the uptake of BIM/VDC; (II) gathering data to undertake a cross-country analysis of these environments; and (III) providing strategic and practical outcomes to guide the uptake of such processes in Australia. Activities which will inform this research include a review of academic literature and industry documentation, semi-formal interviews in Australia and Sweden, and a cross-country comparative analysis to determine factors affecting uptake and associated productivity improvements. These activities will seek to highlight the gaps between current practice and best practice that are impeding widespread adoption of BIM/VDC and IPD. Early findings will be discussed, and the intended outcomes of this research will be used to: inform a national public procurement strategy; provide guidelines for new contractual frameworks; and contribute to closing skill gaps. Keywords: building information modeling (BIM); virtual design and construction (VDC); integrated project delivery (IPD); transport infrastructure; Australia; procurement
Abstract:
Pain is common in residential aged care facilities (RACFs). In 2005, the Australian Pain Society developed 27 recommendations for good practice in the identification, assessment, and management of pain in these settings. This study aimed to address implementation of the standards and evaluate outcomes. Five facilities in Australia participated in a comprehensive evaluation of RACF pain practice and outcomes. Pre-existing pain management practices were compared with the 27 recommendations, before an evidence-based pain management program was introduced that included training and education for staff and revised in-house pain-management procedures. Post-implementation audits evaluated the program's success. Aged care staff teams also were assessed on their reports of self-efficacy in pain management. The results show that before the implementation program, the RACFs demonstrated full compliance with 6 to 12 standards. By the project's completion, the RACFs demonstrated full compliance with 10 to 23 standards and major improvements toward compliance with the remaining standards. After implementation, the staff also reported better understanding of the standards (p < .001) and of facility pain management guidelines (p < .001), increased confidence in therapies for pain management (p < .001), and increased confidence in their training to assess pain (p < .001) and recognize pain in residents with dementia who are nonverbal (p = .003). The results show that improved evidence-based practice in RACFs can be achieved with appropriate training and education. Investing resources in the aged care workforce via this implementation program has shown improvements in staff self-efficacy and practice.
Abstract:
Pain is common in individuals living in residential aged care facilities (RACFs), and a number of obstacles have been identified as recurring barriers to adequate pain management. To address this, the Australian Pain Society developed 27 recommendations for comprehensive good practice in the identification, assessment, and management of pain. This study reviewed preexisting pain management practice at five Australian RACFs and identified changes needed to implement the recommendations and then implemented an evidence-based program that aimed to facilitate better pain management. The program involved staff training and education and revised in-house pain-management procedures. Reviews occurred before and after the program and included the assessment of 282 residents for analgesic use and pain status. Analgesic use improved after the program (P<.001), with a decrease in residents receiving no analgesics (from 15% to 6%) and an increase in residents receiving around-the-clock plus as-needed analgesics (from 24% to 43%). There were improvements in pain relief for residents with scores indicative of pain, with Abbey pain scale (P=.005), Pain Assessment in Advanced Dementia Scale (P=.001), and Non-communicative Patient's Pain Assessment Instrument scale (P<.001) scores all improving. Although physical function declined as expected, Medical Outcomes Study 36-item Short-Form Survey bodily pain scores also showed improvement (P=.001). Better evidence-based practice and outcomes in RACFs can be achieved with appropriate training and education. Investing resources in the aged care workforce using this program improved analgesic practice and pain relief in participating sites. Further attention to the continued targeted pain management training of aged care staff is likely to improve pain-focused care for residents.
Abstract:
BACKGROUND: Over the past 10 years, the use of saliva as a diagnostic fluid has gained attention and has become a translational research success story. Some of the current nanotechnologies have been demonstrated to have the analytical sensitivity required for the use of saliva as a diagnostic medium to detect and predict disease progression. However, these technologies have not yet been integrated into current clinical practice and work flow. CONTENT: As a diagnostic fluid, saliva offers advantages over serum because it can be collected noninvasively by individuals with modest training, and it offers a cost-effective approach for the screening of large populations. Gland-specific saliva can also be used for diagnosis of pathology specific to one of the major salivary glands. There is minimal risk of contracting infections during saliva collection, and saliva can be used in clinically challenging situations, such as obtaining samples from children or handicapped or anxious patients, in whom blood sampling could be a difficult act to perform. In this review we highlight the production and secretion of saliva, the salivary proteome, the transportation of biomolecules from blood capillaries to salivary glands, and the diagnostic potential of saliva for use in detection of cardiovascular disease and oral and breast cancers. We also highlight the barriers to application of saliva testing and its advancement in clinical settings. SUMMARY: Saliva has the potential to become a first-line diagnostic sample of choice owing to the advancements in detection technologies coupled with combinations of biomolecules with clinical relevance. (C) 2011 American Association for Clinical Chemistry
Abstract:
There is an increasing desire and emphasis to integrate assessment tools into the everyday training environment of athletes. These tools are intended to fine-tune athlete development, enhance performance and aid in the development of individualised programmes for athletes. The areas of workload monitoring, skill development and injury assessment are expected to benefit from such tools. This paper describes the development of an instrumented leg press and its application to testing leg dominance with a cohort of athletes. The developed instrumented leg press is a 45° reclining sled-type leg press with dual force plates, a displacement sensor and a CCD camera. A custom software client was developed using C#. The software client enabled near-real-time display of forces beneath each limb together with displacement of the quad track roller system and video feedback of the exercise. In recording mode, the collection of athlete particulars is prompted at the start of the exercise, and pre-set thresholds are used subsequently to separate the data into epochs from each exercise repetition. The leg press was evaluated in a controlled study of a cohort of physically active adults who performed a series of leg press exercises. The leg press exercises were undertaken at a set cadence with nominal applied loads of 50%, 100% and 150% of body weight without feedback. A significant asymmetry in loading of the limbs was observed in healthy adults during both the eccentric and concentric phases of the leg press exercise (P < .05). Mean forces were significantly higher beneath the non-dominant limb (4–10%) and during the concentric phase of the muscle action (5%). Given that symmetrical loading is often emphasized during strength training and remains a common goal in sports rehabilitation, these findings highlight the clinical potential for this instrumented leg press system to monitor symmetry in lower-limb loading during progressive strength training and sports rehabilitation protocols.
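By way of a hedged sketch (not the study's C# client), the snippet below illustrates one way the processing described above could work: a force threshold separates the summed force-plate signal into repetition epochs, and a per-repetition symmetry index compares the mean force beneath each limb. The threshold, sampling rate, signals and function names are all assumptions made for illustration.

```python
# Hedged sketch: threshold-based segmentation of a dual force-plate recording into
# repetition epochs, plus a per-repetition limb symmetry index. Signals, threshold
# and sampling rate below are hypothetical placeholders, not the study's data.
import numpy as np

def segment_repetitions(total_force, threshold):
    """Return (start, end) index pairs where total force exceeds the threshold."""
    above = total_force > threshold
    padded = np.concatenate(([False], above, [False]))
    diff = np.diff(padded.astype(int))
    starts = np.flatnonzero(diff == 1)
    ends = np.flatnonzero(diff == -1)   # exclusive end indices
    return list(zip(starts, ends))

def symmetry_index(left, right):
    """Symmetry index (%): positive when the left limb produces more mean force."""
    return 100.0 * (left.mean() - right.mean()) / ((left.mean() + right.mean()) / 2)

# Placeholder signals standing in for the two force-plate channels (1000 Hz assumed).
fs = 1000
t = np.arange(0, 10, 1 / fs)
left = 400 + 300 * np.clip(np.sin(2 * np.pi * 0.5 * t), 0, None)   # N, hypothetical
right = 380 + 280 * np.clip(np.sin(2 * np.pi * 0.5 * t), 0, None)  # N, hypothetical
total = left + right

for start, end in segment_repetitions(total, threshold=900):
    si = symmetry_index(left[start:end], right[start:end])
    print(f"rep {start / fs:5.2f}-{end / fs:5.2f} s: symmetry index {si:+.1f}%")
```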
Abstract:
Recent literature acknowledges the need for new career development models to support the way that careers evolve in the 21st century workplace (Bloch 2005). This is particularly so within temporary organisation forms, and for those pursuing a career in project management (Hölzle 2010). Our research explores how project managers working on projects and within temporary organisation forms, and those working on project-linked contracts, access the development opportunities they require to remain employable in an era of project-by-project employment. Set in Australia, where a project-based economy (Crawford, French and Lloyd-Walker 2013) and contract work have led to casualisation of the workforce (Connell & Burgess, 2006; McKeown & Hanley, 2009), the results suggest new approaches to career development may be required.
Abstract:
Purpose: Cognitive alterations are reported in breast cancer patients receiving chemotherapy. This has adverse effects on patients' quality of life and function. This systematic review investigates the effectiveness of pharmacologic and non-pharmacologic interventions to manage cognitive alterations associated with breast cancer treatment. Methods: Medline via EBSCOhost, CINAHL and Cochrane CENTRAL were searched for the period January 1999 to May 2014 for prospective randomized controlled trials related to the management of chemotherapy-associated cognitive alterations. Included studies investigated the management of chemotherapy-associated cognitive alterations and used subjective or objective measures in patients with breast cancer during or after chemotherapy. Two authors independently extracted data and assessed the risk of bias. Results: Thirteen studies involving 1138 participants were included. Overall, the risk of bias for the 13 studies was either high (n=11) or unclear (n=2). Pharmacologic interventions included psychostimulants (n=4), epoetin alfa (n=1), and Ginkgo biloba (n=1). Non-pharmacologic interventions were cognitive training (n=5) and physical activity (n=2). Pharmacologic agents were ineffective except for self-reported cognitive function in an epoetin alfa study. Cognitive training interventions demonstrated benefits in self-reported cognitive function, memory, verbal function and language, and orientation/attention. Physical activity interventions were effective in improving executive function and self-reported concentration. Conclusion: Current evidence does not favor the pharmacologic management of cognitive alterations associated with breast cancer treatment. Cognitive training and physical activity interventions appear promising, but additional studies are required to establish their efficacy. Further research is needed to overcome methodological shortfalls such as heterogeneity in participant characteristics and non-standardized neuropsychological outcome measures.
Abstract:
Introduction: Systematic reviews, through the synthesis of multiple primary research studies, can be powerful tools in enabling evidence-informed public health policy debate, development and action. In seeking to optimize the utility of these reviews, it is important to understand the needs of those using them. Previous work has emphasized that researchers should adopt methods that are appropriate to the problems that public health decision-makers are grappling with, as well as to the policy context in which they operate [1,2]. Meeting these demands poses significant methodological challenges for review authors and prompts a reconsideration of the resources, training and support structures available to facilitate the efficient and timely production of useful, comprehensive reviews. The Cochrane Public Health Group (CPHG) was formed in 2008 to support reviews of complex, upstream public health topics. The majority of CPHG authors are from the UK, which has historically been at the forefront of efforts to promote the production and use of systematic reviews of research relevant to public health decision-makers. The UK therefore provides a suitably mature national context in which to examine (i) the current and future demands of decision-makers to increase the use, value and impact of evidence syntheses; (ii) the implications this has for the scope and methods of reviews and (iii) the required action to build and support capacity to conduct such reviews.
Abstract:
This paper addresses the research question, ‘What are the diffusion determinants for green urbanism innovations in Australia?’ This is a significant topic given the global movement towards green urbanism. The study reported here is based on desktop research that provides new insights through (1) synthesis of the latest research findings on green urbanism innovations and (2) interpretation of diffusion issues through our innovation system model. Although innovation determinants have been studied extensively overseas and in Australia, there is presently a gap in the literature when it comes to these determinants for green urbanism in Australia. The current paper fills this gap. Using a conceptual framework drawn from the innovation systems literature, this paper synthesises and interprets the literature to map the current state of green urbanism innovations in Australia and to analyse the drivers for, and obstacles to, their optimal diffusion. The results point to the importance of collaboration between project-based actors in the implementation of green urbanism. Education, training and regulation across the product system are also required to improve the cultural and technical context for implementation. The results are limited by their exploratory nature and future research is planned to quantify barriers to green urbanism.
Abstract:
Biological factors underlying individual variability in fearfulness and anxiety have important implications for stress-related psychiatric illness, including PTSD and major depression. Using an advanced intercross line (AIL) derived from the C57BL/6 and DBA/2J mouse strains and behavioral selection over three generations, we established two lines exhibiting High or Low fear behavior after fear conditioning. Across the selection generations, the two lines showed clear differences in training and in tests for contextual and conditioned fear. Before fear conditioning training, there were no differences between the lines in baseline freezing to a novel context. However, after fear conditioning, High line mice demonstrated pronounced freezing in a new context, suggestive of poor context discrimination. Fear generalization was not restricted to contextual fear: High fear mice froze to a novel acoustic stimulus, while freezing in the Low line did not increase over baseline. Enhanced fear learning and generalization are consistent with transgenic and pharmacological disruption of the hypothalamic-pituitary-adrenal (HPA) axis (Brinks, 2009; Thompson, 2004; Kaouane, 2012). To determine whether there were differences in HPA-axis regulation between the lines, morning urine samples were collected to measure basal corticosterone. Levels of secreted corticosterone in the circadian trough were analyzed by corticosterone ELISA. High fear mice were found to have higher basal corticosterone levels than Low line animals. Examination of hormonal stress response components by qPCR revealed increased expression of CRH mRNA and decreased mRNA for MR and CRHR1 in the hypothalamus of High fear mice. These alterations may contribute to both the behavioral phenotype and the higher basal corticosterone in High fear mice. To determine basal brain activity in vivo in High and Low fear mice, we used manganese-enhanced magnetic resonance imaging (MEMRI). Analysis revealed a pattern of basal brain activity comprising amygdala, cortical and hippocampal circuits that was elevated in the High line. Ongoing studies also seek to determine the relative balance of excitatory and inhibitory tone in the amygdala and hippocampus and the structure of their neurons. While these heterogeneous lines were selected on fear memory expression, HPA-axis alterations and differences in hippocampal activity segregate with the behavioral phenotypes. These differences are detectable in a basal state, strongly suggesting that these are biological traits underlying the behavioral phenotype (Johnson et al., 2011).