838 results for Training time
Abstract:
In 1993 the Australian Broadcasting Corporation was contracted by the Australian Government to assist in reshaping the South African Broadcasting Corporation from a state-run broadcaster into a respected and trusted national broadcaster for all people in the newly democratic South Africa. Broadcast journalism training was identified by ABC consultant Bob Wurth as possibly the greatest need for SABC Radio. This thesis examines the ABC's role in South Africa and the effectiveness of its radio journalism training project in light of the organisational, structural, cultural and political constraints of the SABC. Through interviews and participant observation, it shows the difficulties of instilling Western liberal journalism values at the SABC within the time constraints set by the Australian Government-funded project and the particular South African mores.
Abstract:
Background: Palliative care should be provided according to the individual needs of the patient, caregiver and family, so that the type and level of care provided, as well as the setting in which it is delivered, are dependent on the complexity and severity of individual needs, rather than prognosis or diagnosis. This paper presents a study designed to assess the feasibility and efficacy of an intervention to assist in the allocation of palliative care resources according to need, within the context of a population of people with advanced cancer. ---------- Methods/design: People with advanced cancer and their caregivers completed bi-monthly telephone interviews over a period of up to 18 months to assess unmet needs, anxiety and depression, quality of life, satisfaction with care and service utilisation. The intervention, introduced after at least two baseline phone interviews, involved a) training medical, nursing and allied health professionals at each recruitment site in the use of the Palliative Care Needs Assessment Guidelines and the Needs Assessment Tool: Progressive Disease - Cancer (NAT: PD-C); and b) health professionals completing the NAT: PD-C with participating patients approximately monthly for the rest of the study period. Changes in outcomes will be compared pre- and post-intervention. ---------- Discussion: The study will determine whether the routine, systematic and regular use of the Guidelines and NAT: PD-C in a range of clinical settings is a feasible and effective strategy for facilitating the timely provision of needs-based care.
Abstract:
Skeletal muscle displays enormous plasticity to respond to contractile activity with muscle from strength- (ST) and endurance-trained (ET) athletes representing diverse states of the adaptation continuum. Training adaptation can be viewed as the accumulation of specific proteins. Hence, the altered gene expression that allows for changes in protein concentration is of major importance for any training adaptation. Accordingly, the aim of the present study was to quantify acute subcellular responses in muscle to habitual and unfamiliar exercise. After 24-h diet/exercise control, 13 male subjects (7 ST and 6 ET) performed a random order of either resistance (8 × 5 maximal leg extensions) or endurance exercise (1 h of cycling at 70% peak O2 uptake). Muscle biopsies were taken from vastus lateralis at rest and 3 h after exercise. Gene expression was analyzed using real-time PCR with changes normalized relative to preexercise values. After cycling exercise, peroxisome proliferator-activated receptor-γ coactivator-1α (ET ∼8.5-fold, ST ∼10-fold, P < 0.001), pyruvate dehydrogenase kinase-4 (PDK-4; ET ∼26-fold, ST ∼39-fold), vascular endothelial growth factor (VEGF; ET ∼4.5-fold, ST ∼4-fold), and muscle atrophy F-box protein (MAFbx) (ET ∼2-fold, ST ∼0.4-fold) mRNA increased in both groups, whereas MyoD (∼3-fold), myogenin (∼0.9-fold), and myostatin (∼2-fold) mRNA increased in ET but not in ST (P < 0.05). After resistance exercise PDK-4 (∼7-fold, P < 0.01) and MyoD (∼0.7-fold) increased, whereas MAFbx (∼0.7-fold) and myostatin (∼0.6-fold) decreased in ET but not in ST. We conclude that prior training history can modify the acute gene responses in skeletal muscle to subsequent exercise.
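The normalization step described above (mRNA changes expressed relative to pre-exercise values) can be sketched with the widely used 2^-ΔΔCt method for real-time PCR data; this method is an assumption for illustration, not necessarily the authors' exact quantification pipeline, and the Ct values below are invented:

```python
def fold_change(ct_target_pre, ct_ref_pre, ct_target_post, ct_ref_post):
    """Relative expression by the 2^-ddCt method (assumed here).

    Ct = qPCR cycle threshold for the target gene and a reference
    (housekeeping) gene, at rest and 3 h post-exercise.
    """
    d_ct_pre = ct_target_pre - ct_ref_pre     # normalise to reference gene
    d_ct_post = ct_target_post - ct_ref_post
    dd_ct = d_ct_post - d_ct_pre              # change relative to pre-exercise
    return 2 ** -dd_ct                        # >1 means up-regulation

# Hypothetical example: target gene amplifies 3 cycles earlier post-exercise
print(fold_change(25.0, 18.0, 22.0, 18.0))  # → 8.0
```

A value of ~8.5-fold for PGC-1α, for instance, would correspond to the target gene reaching threshold roughly three cycles earlier after exercise, once reference-gene drift is accounted for.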
Abstract:
Aim: Whilst motorcycle rider training is commonly incorporated into licensing programs in many developed nations, little empirical support has been found in previous research to prescribe it as an effective road safety countermeasure. It has been posited that the lack of effect of motorcycle rider training on crash reduction may, in part, be due to the predominant focus on skills-based training, with little attention devoted to addressing attitudes and motives that influence subsequent risky riding. However, little past research has actually endeavoured to measure attitudinal and motivational factors as a function of rider training. Accordingly, this study was undertaken to assess the effect of a commercial motorcycle rider training program on psychosocial factors that have been shown to influence risk taking by motorcyclists. Method: Four hundred and thirty-eight motorcycle riders attending a competency-based licence training course in Brisbane, Australia, voluntarily participated in the study. A self-report questionnaire adapted from the Rider Risk Assessment Measure (RRAM) was administered to participants at the commencement of training, then again at the conclusion of training. Participants were informed of the independent nature of the research and that their responses would in no way affect their chance of obtaining a licence. To minimise potential demand characteristics, participants were instructed to seal completed questionnaires in envelopes and place them in a sealed box accessible only by the research team (i.e. not able to be viewed by instructors). Results: Significant reductions in the propensity for thrill seeking and intentions to engage in risky riding in the next 12 months were found at the end of training. In addition, a significant improvement in attitudes to safety was found. Conclusions: These findings indicate that rider training may have a positive short-term influence on riders' propensity for risk taking.
However, such findings must be interpreted with caution in regard to the subsequent safety of riders, as these factors may be subject to further influence once riders are licensed and actively engage with peers during on-road riding. This highlights a challenge for road safety education/training programs in regard to the adoption of safety practices, and the need for behavioural follow-up over time to ascertain long-term effects. This study was the initial phase of an ongoing program of research into rider training and risk taking framed around Theory of Planned Behaviour concepts. A subsequent 12-month follow-up of the study participants has been undertaken, with data analysis pending.
Abstract:
Learning Outcome: Gain knowledge in the area of dietetic training in Australia and the benefits of collaborative partnerships between government and universities to achieve improvements in dietetic service delivery, evidence-based practice, and student placements. Prisoners have high rates of chronic disease; however, dietetic services and research in this sector are limited. Securing high-quality professional practice placements for dietetic training in Australia is competitive, and prisons provide exciting opportunities. Queensland University of Technology (QUT) has a unique twenty-year partnership with Queensland Corrective Services (QCS), with a service learning model placing final-year dietetic students within prisons. Building on this partnership, in 2007 a new joint position was funded to establish dietetic services to over 5500 prisoners and support viable best-practice dietetic education. Evaluation of the past three years of this partnership has shown an expansion of QUT student placements in Queensland prisons, with a third of final-year students each undertaking 120 hours of foodservice management practicum. Student evaluations of placement over this period are much higher than the University average. Through the joint position, student projects have been targeted at strategic areas to support nutrition and dietetic policy and practice. Projects have been broadened from menu reviews to more comprehensive quality improvement and dietetic research activities, with all student learning activities transferable to other foodservice settings. Student practice in the prisons has been extended beyond foodservice management to include group education and dietetic counselling. For QCS, student placements have equated to close to a full-time dietitian position, with nutrition policy now being implemented as an outcome of this support.
This innovative partnership has achieved a sustainable student placement model and supported research, whilst delivering dietetic services to a difficult-to-access group. Funding Disclosure: None
Abstract:
Listening is a basic and complementary skill in second language learning. The term listening is used in language teaching to refer to a complex process that allows us to understand spoken language. Listening, the most widely used language skill, is often employed in conjunction with the other skills of speaking, reading and writing. Listening is not only a skill area in first language (L1) performance, but is also a critical means of acquiring a second language (L2). Listening is the channel through which we process language in real time, employing pacing, units of encoding and decoding (the two processes central to interpretation and meaning-making) and pausing (which allows for reflection), features that are unique to spoken language. Despite the wide range of areas investigated in listening strategy training, there is a lack of research looking specifically at how effectively L1 listening strategy training may transfer to L2. To investigate the development of any such transfer patterns, the instructional design and implementation of L1 listening strategy training will be critical.
Abstract:
While strengthened partnerships between universities and schools have been proposed in recent reviews of teacher education (House of Representatives Standing Committee on Education and Vocational Training, 2007; Caldwell & Sutton, 2010; Donaldson, 2010), there is a need to understand the benefits and challenges for participants in these partnerships. The Teacher Education Centre of Excellence (TECE) in this study is a preservice teacher preparation partnership between a Queensland university, the Queensland Department of Education, Training and Employment (DETE) and an Education Queensland school. It was established in response to a mandated reform within the Improving Teacher Quality National Partnership Agreement (Department of Education, Employment and Workplace Relations, 2011). High-achieving Bachelor of Education preservice teachers apply to be part of the 18-month program in the third year of their four-year Education degree. These preservice teachers experience mentoring in partner schools in addition to course work designed and delivered by a DETE-appointed Head of Mentoring and a university academic. On completion of the program, graduates will be appointed to South West Queensland rural and remote Education Queensland schools. This paper analyses participant perspectives from the first phase of this partnership, in particular identifying the benefits and challenges experienced by the preservice teachers and the leaders of the program from the participating institutions. A sociocultural theoretical perspective (Wenger, 1998) informed the analysis, examining how preservice teachers experience a sense of becoming a professional teacher within a specific employment context. Data from interviews with 6 preservice teachers and 8 program leaders were analysed inductively through coding of interview records.
Findings indicate the importance of strong relationships and opportunity for reciprocal learning through ongoing professional conversations as contexts for preservice teachers to develop an identity as an emerging professional. This research has significance for the ongoing development of this partnership as well as informing the principles for the design of future similar partnerships.
Abstract:
Echocardiography is the commonest form of non-invasive cardiac imaging and is fundamental to patient management. However, due to its methodology, it is also operator dependent. There are well-defined pathways in training and ongoing accreditation to achieve and maintain competency. To satisfy these requirements, significant time has to be dedicated to scanning patients, often in the time-pressured clinical environment. Alternative, computer-based training methods are being considered to augment echocardiographic training. Numerous advances in technology have resulted in the development of interactive programmes and simulators to teach trainees the skills to perform particular procedures, including transthoracic and transoesophageal echocardiography. Eighty-two sonographers and TOE proceduralists utilised an echocardiographic simulator and assessed its utility using defined criteria. Forty trainee sonographers assessed the simulator and were taught how to obtain an apical 2 chamber (A2C) view and image the superior vena cava (SVC). 100% and 88% found the simulator useful for obtaining the SVC and A2C views, respectively. All users found it easy to use and the majority found it helped with image acquisition and interpretation. Forty-two attendees of a TOE training day utilising the simulator assessed it, with 100% finding it easy to use and agreeing that the augmented reality graphics benefited image acquisition; 90% felt that it was realistic. This study revealed that both trainee sonographers and TOE proceduralists found the simulation process realistic, that it helped in image acquisition and that it improved assessment of spatial relationships. Echocardiographic simulators may play an important role in the future training of echocardiographic skills.
Abstract:
Background: Currently in the Australian higher education sector, higher productivity from allied health clinical education placements is a contested issue. This paper reports the results of a study that investigated output changes associated with occupational therapy and nutrition/dietetics clinical education placements in Queensland, Australia. Supervisors' and students' time use during placements, and how this changes for supervisors compared to when students are not present in the workplace, is also presented. Methodology/Principal Findings: A cohort design was used with students from four Queensland universities, and their supervisors employed by Queensland Health. There was an increasing trend in the number of occasions of service delivered when the students were present, and a statistically significant increase in the daily mean length of occasions of service delivered during the placement compared to pre-placement levels. For project-based placements that were not directly involved in patient care, supervisors' project activity time decreased during placements, with students undertaking considerably more time in project activities. Conclusions/Significance: A novel method for estimating productivity and time use changes during clinical education programs for allied health disciplines has been applied. During clinical education placements there was a net increase in outputs, suggesting supervisors engage in longer consultations with patients for the purpose of training students, while maintaining patient numbers. Other activities are reduced. This is the first time these data have been presented, and they form a good basis for future assessments of the economic impact of student placements for allied health disciplines.
Abstract:
Management of groundwater systems requires realistic conceptual hydrogeological models as a framework for numerical simulation modelling, but also for system understanding and communicating this to stakeholders and the broader community. To help overcome these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages, can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity, displaying dynamic processes. Time and space variations can be presented using a range of contouring and colour mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example, demographics and cultural information, can also be readily incorporated. The GVS software can execute on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. 
Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes and visualisation of simulation piezometric surfaces. Both alluvial system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research, for coal seam gas groundwater management and community information and consultation. The “virtual” groundwater systems in these 3D GVS models can be interactively interrogated by standard functions, plus production of 2D cross-sections, data selection from the 3D scene, a back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community/stakeholder understanding and knowledge of groundwater systems and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and serves as a tool for interpretation of groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera ‘fly-throughs’ and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering and gestural interaction systems. To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted on online sites are included in the references.
Abstract:
Granulysin is a cytolytic granule protein released by natural killer cells and activated cytotoxic T lymphocytes. The influence of exercise training on circulating granulysin concentration is unknown, as is the relationship between granulysin concentration, natural killer cell number and natural killer cell cytotoxicity. We examined changes in plasma granulysin concentration, natural killer cell number and cytotoxicity following acute exercise and different training loads. Fifteen highly trained male cyclists completed a baseline 40-km cycle time trial (TT401) followed by five weeks of normal training and a repeat time trial (TT402). The cyclists then completed four days of high intensity training followed by another time trial (TT403) on day five. Following one final week of normal training, cyclists completed another time trial (TT404). Fasting venous blood was collected before and after each time trial to determine granulysin concentration, natural killer cell number and natural killer cell cytotoxicity. Granulysin concentration increased significantly after each time trial (P<0.001). Pre-exercise granulysin concentration for TT403 was significantly lower than the pre-exercise concentration for TT401 (-20.3 +/- 7.5%, P<0.026), TT402 (-16.7 +/- 4.3%, P<0.003) and TT404 (-21 +/- 4.2%, P<0.001). Circulating natural killer cell numbers also increased significantly post-exercise for each time trial (P<0.001); however, there was no significant difference across the four time trials (P>0.05). Exercise did not significantly alter natural killer cell cytotoxicity on a per cell basis, and there were no significant differences between the four time trials. In conclusion, plasma granulysin concentration increases following moderate duration, strenuous exercise and is decreased in response to a short-term period of intensified training.
Abstract:
The purpose of the present study was to examine the influence of 3 different high-intensity interval training regimens on the first and second ventilatory thresholds (VT1 and VT2), anaerobic capacity (ANC), and plasma volume (PV) in well-trained endurance cyclists. Before and after 2 and 4 weeks of training, 38 well-trained cyclists (VO2peak = 64.5 +/- 5.2 ml·kg-1·min-1) performed (a) a progressive cycle test to measure VO2peak, peak power output (PPO), VT1, and VT2; (b) a time to exhaustion test (Tmax) at their VO2peak power output (Pmax); and (c) a 40-km time-trial (TT40). Subjects were assigned to 1 of 4 training groups (group 1: n = 8, 8 x 60% Tmax at Pmax, 1:2 work-recovery ratio; group 2: n = 9, 8 x 60% Tmax at Pmax, recovery at 65% maximum heart rate; group 3: n = 10, 12 x 30 seconds at 175% PPO, 4.5-minute recovery; control group: n = 11). The TT40 performance, VO2peak, VT1, VT2, and ANC were all significantly increased in groups 1, 2, and 3 (p < 0.05) but not in the control group. However, PV did not change in response to the 4-week training program. Changes in TT40 performance were modestly related to the changes in VO2peak, VT1, VT2, and ANC (r = 0.41, 0.34, 0.42, and 0.40, respectively; all p < 0.05). In conclusion, the improvements in TT40 performance were related to significant increases in VO2peak, VT1, VT2, and ANC but were not accompanied by significant changes in PV. Thus, peripheral adaptations rather than central adaptations are likely responsible for the improved performances witnessed in well-trained endurance athletes following various forms of high-intensity interval training programs.
Abstract:
PURPOSE: The purpose of this study was to examine the influence of three different high-intensity interval training (HIT) regimens on endurance performance in highly trained endurance athletes. METHODS: Before, and after 2 and 4 wk of training, 38 cyclists and triathletes (mean +/- SD; age = 25 +/- 6 yr; mass = 75 +/- 7 kg; VO(2peak) = 64.5 +/- 5.2 mL x kg(-1) min(-1)) performed: 1) a progressive cycle test to measure peak oxygen consumption (VO(2peak)) and peak aerobic power output (PPO), 2) a time to exhaustion test (T(max)) at their VO(2peak) power output (P(max)), as well as 3) a 40-km time-trial (TT(40)). Subjects were matched and assigned to one of four training groups (G(1), N = 8, 8 x 60% T(max) at P(max), 1:2 work:recovery ratio; G(2), N = 9, 8 x 60% T(max) at P(max), recovery at 65% HR(max); G(3), N = 10, 12 x 30 s at 175% PPO, 4.5-min recovery; G(CON), N = 11). In addition to G(1), G(2), and G(3) performing HIT twice per week, all athletes maintained their regular low-intensity training throughout the experimental period. RESULTS: All HIT groups improved TT(40) performance (+4.4 to +5.8%) and PPO (+3.0 to +6.2%) significantly more than G(CON) (-0.9 to +1.1%; P < 0.05). Furthermore, G(1) (+5.4%) and G(2) (+8.1%) improved their VO(2peak) significantly more than G(CON) (+1.0%; P < 0.05). CONCLUSION: The present study has shown that when HIT incorporates P(max) as the interval intensity and 60% of T(max) as the interval duration, already highly trained cyclists can significantly improve their 40-km time trial performance. Moreover, the present data confirm prior research, in that repeated supramaximal HIT can significantly improve 40-km time trial performance.
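The interval prescriptions above reduce to simple arithmetic on each athlete's test results. A minimal sketch of the G(1) and G(3) session parameters follows; the group rules are taken from the abstract, while the function and field names are illustrative, not from the study:

```python
def g1_session(t_max_s, p_max_w):
    """G(1): 8 x 60% Tmax at Pmax, 1:2 work:recovery ratio."""
    work_s = 0.6 * t_max_s
    return {"reps": 8, "work_s": work_s, "power_w": p_max_w,
            "recovery_s": 2 * work_s}

def g3_session(ppo_w):
    """G(3): 12 x 30 s at 175% PPO, 4.5-min recovery."""
    return {"reps": 12, "work_s": 30, "power_w": 1.75 * ppo_w,
            "recovery_s": 4.5 * 60}

# Hypothetical athlete: Tmax = 300 s at Pmax = 400 W, PPO = 420 W
print(g1_session(300, 400))  # 8 reps of 180 s at 400 W, 360 s recovery
print(g3_session(420))       # 12 reps of 30 s at 735 W, 270 s recovery
```

Individualizing the interval duration via Tmax, rather than using a fixed work time, is the key design choice the conclusion highlights.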
Abstract:
Background: Heart failure is a serious condition estimated to affect 1.5-2.0% of the Australian population, with a point prevalence of approximately 1% in people aged 50-59 years, 10% in people aged 65 years or more and over 50% in people aged 85 years or over (National Heart Foundation of Australia and the Cardiac Society of Australia and New Zealand, 2006). Sleep disturbances are a common complaint of persons with heart failure. Disturbances of sleep can worsen heart failure symptoms, impair independence, reduce quality of life and lead to increased health care utilisation in patients with heart failure. Previous studies have identified exercise as a possible treatment for poor sleep in patients without cardiac disease; however, there is limited evidence of the effect of this form of treatment in heart failure. Aim: The primary objective of this study was to examine the effect of a supervised, hospital-based exercise training programme on subjective sleep quality in heart failure patients. Secondary objectives were to examine the association between changes in sleep quality and changes in depression, exercise performance and body mass index. Methods: The sample for the study was recruited from metropolitan and regional heart failure services across Brisbane, Queensland. Patients with a recent heart failure related hospital admission who met study inclusion criteria were recruited. Participants were screened by specialist heart failure exercise staff at each site to ensure exercise safety prior to study enrolment. Demographic data, medical history, medications, Pittsburgh Sleep Quality Index score, Geriatric Depression Score, exercise performance (six minute walk test), weight and height were collected at baseline. Pittsburgh Sleep Quality Index score, Geriatric Depression Score, exercise performance and weight were repeated at 3 months.
One hundred and six patients admitted to hospital with heart failure were randomly allocated to a 3-month disease-based management programme of education and self-management support including standard exercise advice (Control) or to the same disease management programme as the Control group with the addition of a tailored physical activity program (Intervention). The intervention consisted of 1 hour of aerobic and resistance exercise twice a week. Programs were designed and supervised by an exercise specialist. The main outcome measure was achievement of a clinically significant change (≥3 points) in global Pittsburgh Sleep Quality Index score. Results: Intervention group participants reported significantly greater clinical improvement in global sleep quality than Control (p=0.016). These patients also exhibited significant improvements in component sleep disturbance (p=0.004), component sleep quality (p=0.015) and global sleep quality (p=0.032) after 3 months of supervised exercise intervention. Improvements in sleep quality correlated with improvements in depression (p<0.001) and six minute walk distance (p=0.04). When study results were examined categorically, with subjects classified as either "poor" or "good" sleepers, subjects in the Control group were significantly more likely to report "poor" sleep at 3 months (p=0.039), while Intervention participants tended to report "good" sleep at this time (p=0.08). Conclusion: Three months of supervised, hospital-based aerobic and resistance exercise training improved subjective sleep quality in patients with heart failure. This is the first randomised controlled trial to examine the role of aerobic and resistance exercise training in the improvement of sleep quality for patients with this disease. While this study establishes exercise as a therapy for poor sleep quality, further research is needed to investigate the effect of exercise training on objective parameters of sleep in this population.
Abstract:
The objective of exercise training is to initiate desirable physiological adaptations that ultimately enhance physical work capacity. Optimal training prescription requires an individualized approach, with an appropriate balance of training stimulus and recovery and optimal periodization. Recovery from exercise involves integrated physiological responses. The cardiovascular system plays a fundamental role in facilitating many of these responses, including thermoregulation and delivery/removal of nutrients and waste products. As a marker of cardiovascular recovery, cardiac parasympathetic reactivation following a training session is highly individualized. It appears to parallel the acute/intermediate recovery of the thermoregulatory and vascular systems, as described by the supercompensation theory. The physiological mechanisms underlying cardiac parasympathetic reactivation are not completely understood. However, changes in cardiac autonomic activity may provide a proxy measure of the changes in autonomic input into organs and (by default) the blood flow requirements to restore homeostasis. Metaboreflex stimulation (e.g. muscle and blood acidosis) is likely a key determinant of parasympathetic reactivation in the short term (0–90 min post-exercise), whereas baroreflex stimulation (e.g. exercise-induced changes in plasma volume) probably mediates parasympathetic reactivation in the intermediate term (1–48 h post-exercise). Cardiac parasympathetic reactivation does not appear to coincide with the recovery of all physiological systems (e.g. energy stores or the neuromuscular system). However, this may reflect the limited data currently available on parasympathetic reactivation following strength/resistance-based exercise of variable intensity. 
In this review, we quantitatively analyse post-exercise cardiac parasympathetic reactivation in athletes and healthy individuals following aerobic exercise, with respect to exercise intensity and duration, and fitness/training status. Our results demonstrate that the time required for complete cardiac autonomic recovery after a single aerobic-based training session is up to 24 h following low-intensity exercise, 24–48 h following threshold-intensity exercise and at least 48 h following high-intensity exercise. Based on limited data, exercise duration is unlikely to be the greatest determinant of cardiac parasympathetic reactivation. Cardiac autonomic recovery occurs more rapidly in individuals with greater aerobic fitness. Our data lend support to the concept that in conjunction with daily training logs, data on cardiac parasympathetic activity are useful for individualizing training programmes. In the final sections of this review, we provide recommendations for structuring training microcycles with reference to cardiac parasympathetic recovery kinetics. Ultimately, coaches should structure training programmes tailored to the unique recovery kinetics of each individual.
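The headline recovery windows above can be summarised as a simple lookup when structuring training microcycles. The thresholds are taken directly from the review's stated findings; the function itself is only an illustrative sketch, not a validated prescription tool:

```python
def autonomic_recovery_window_h(intensity):
    """Approximate time to complete cardiac parasympathetic recovery
    after a single aerobic session, in hours; None = open-ended."""
    windows = {
        "low": (0, 24),         # up to 24 h
        "threshold": (24, 48),  # 24-48 h
        "high": (48, None),     # at least 48 h
    }
    return windows[intensity]

# Schedule the next hard session no sooner than the window's lower bound
print(autonomic_recovery_window_h("threshold"))  # → (24, 48)
```

In practice these windows would be adjusted per athlete, since the review notes that fitter individuals recover autonomic function more rapidly.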