889 results for Four-day week


Relevance: 20.00%

Abstract:

This thesis investigates the experiences of teachers who trialled an electronic curriculum and assessment tool in the wider context of text-mediated ruling relations organising their work. Problematised as policy and text, this tool is interrogated as a 'solution' to problems perceived in teachers' work in an era of increased accountability. The thesis provides evidence that teachers' work is shaped by forces operating outside their control and mediated by the policy discourses and subjectivities available to them.

Relevance: 20.00%

Abstract:

Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, and 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68 ± 17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p < 0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants' estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
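
The reported gain in median intake is the kind of paired, non-parametric comparison commonly made with a Wilcoxon signed-rank test. The abstract does not name the test used, so the following Python sketch, with made-up paired intake values, is only an assumed illustration of how such a comparison could be run:

```python
from scipy.stats import wilcoxon

# Hypothetical paired 24-hour energy intakes (kJ/day) at baseline and post-MNT for a few
# patients; the study's raw data are not reported in the abstract, so these values are invented.
baseline = [2100, 2400, 1900, 2600, 2250, 2000, 2500, 2300]
post_mnt = [3400, 3800, 3100, 4100, 3600, 3300, 3900, 3500]

stat, p = wilcoxon(baseline, post_mnt)  # paired, non-parametric test of the median change
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}")
```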

Relevance: 20.00%

Abstract:

Introduction: As part of ongoing quality assurance, all university programs must be regularly reviewed to ensure the curriculum is current, meets university and national standards, and, for medical science, the criteria for AIMS Accreditation. With recent developments at the national and international level also signalling change, a course design team (CDT) was assembled and tasked with developing and implementing a new four-year Bachelor of Medical Laboratory Science (BMLS) course at QUT. Method: A whole-of-course approach was adopted, incorporating an inverted curriculum and a Capstone experience. First, the course vision and desired graduate profile are defined as course learning outcomes (CLO), i.e. the skills, knowledge, behaviours and attributes graduates must demonstrate. CLO are then back-mapped into introductory, developmental and expected phases from fourth to first year on a course plan and assessment map. Unit learning outcomes (ULO) are then defined, and finally, each unit (subject) is designed, directly aligned with assessment. Results: The resulting BMLS course represents a deliberate program of study across four years which, from day one, focuses on the professional aspects of MLS, the clinical pathology disciplines, and incrementally developing and assessing the skills, knowledge, behaviours and attributes required to undertake the Work Integrated Learning Internship (WILI) and Capstone experience in final year and, subsequently, graduate from the program. Conclusions: At the start of the year, the BMLS commenced with higher than anticipated enrolments. To date, survey data and feedback are positive, with particular emphasis on the directed nature of the course. The method of course design also ensures that university and national standards, and the criteria for AIMS Accreditation, have been met.

Relevance: 20.00%

Abstract:

Background Transfusion-related acute lung injury (TRALI) is a serious and potentially fatal consequence of transfusion. A two-event TRALI model demonstrated that date-of-expiry blood products, day (D) 5 platelet (PLT) and D42 packed red blood cell (PRBC) supernatants (SN), induced TRALI in LPS-treated sheep. We have adapted a whole blood transfusion culture model as an investigative bridge between the ovine TRALI model and human responses to transfusion. Methods A whole blood transfusion model was adapted to replicate the ovine model, specifically ± 0.23 μg/mL LPS as the first event and 10% SN volume (transfusion) as the second event. Four pooled SN from blood products previously used in the ovine TRALI model were investigated: D1-PLT, D5-PLT, D1-PRBC, and D42-PRBC. Fresh human whole blood (recipient) was mixed with combinations of LPS and BP-SN stimuli and incubated in vitro for 6 h. Addition of a Golgi plug enabled measurement of monocyte cytokine production (IL-6, IL-8, IL-10, IL-12, TNF-α, IL-1α, CXCL-5, IP-10, MIP-1α, MCP-1) using multi-colour flow cytometry. Responses for 6 recipients were assessed. Results In the presence of LPS, D42-PRBC-SN significantly increased monocyte IL-6 (P=0.031), IL-8 (P=0.016) and IL-1α (P=0.008) production compared to D1-PRBC-SN. This response to D42-PRBC-SN was LPS-dependent and was not evident in non-LPS-stimulated controls. The response was also specific to D42-PRBC-SN, as similar changes were not evident for D5-PLT-SN compared to D1-PLT-SN, regardless of the presence of LPS. D5-PLT-SN did, however, significantly increase IL-12 production (P=0.024) compared to D1-PLT-SN, and this response was again LPS-dependent. Conclusions These data demonstrate a novel two-event mechanism of monocyte inflammatory response that was dependent upon both the presence of date-of-expiry blood product SN and LPS. Further, these results demonstrate different cytokine responses induced by date-of-expiry PLT-SN and PRBC-SN. These data are consistent with evidence from the ovine TRALI model, enhancing its relevance to transfusion-related changes in humans.

Relevance: 20.00%

Abstract:

This paper presents an account of an autonomous mobile robot deployment in a densely crowded public event attended by thousands of people from different age groups. The robot operated for eight hours on an open floor surrounded by tables, chairs and massive touchscreen displays. Due to the large number of people in the close vicinity of the robot, several safety measures were implemented, including the use of no-go zones that prevent the robot from blocking emergency exits or moving too close to the display screens. The paper presents the lessons learnt and experiences obtained from this experiment, and provides a discussion about the state of mobile service robots in such crowded environments.
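
No-go zones of the kind mentioned above are typically realised by marking forbidden regions as impassable in the navigation costmap so the planner never routes through them. The paper does not describe its implementation, so the Python sketch below, with an assumed grid-based costmap and illustrative zone coordinates, only shows the general idea:

```python
import numpy as np

LETHAL = 254  # cost value treated as impassable by the planner (assumed convention)

def apply_no_go_zones(costmap, zones, resolution, origin=(0.0, 0.0)):
    """Mark rectangular no-go zones (in metres) as lethal cells in a grid costmap.

    costmap   : 2D numpy array of cell costs (rows = y, cols = x)
    zones     : list of (x_min, y_min, x_max, y_max) rectangles in world coordinates
    resolution: metres per cell
    origin    : world coordinates of cell (0, 0)
    """
    for x_min, y_min, x_max, y_max in zones:
        c0 = int((x_min - origin[0]) / resolution)
        c1 = int((x_max - origin[0]) / resolution)
        r0 = int((y_min - origin[1]) / resolution)
        r1 = int((y_max - origin[1]) / resolution)
        # Clamp to the map and write lethal cost so the planner never enters the zone.
        r0, r1 = max(r0, 0), min(r1, costmap.shape[0])
        c0, c1 = max(c0, 0), min(c1, costmap.shape[1])
        costmap[r0:r1, c0:c1] = LETHAL
    return costmap

# Example: keep the robot away from an emergency exit and a display wall.
grid = np.zeros((200, 300), dtype=np.uint8)              # 10 m x 15 m at 0.05 m/cell
grid = apply_no_go_zones(grid, [(0.0, 8.0, 3.0, 10.0),    # emergency exit corridor
                                (12.0, 0.0, 15.0, 10.0)], # touchscreen display wall
                         resolution=0.05)
```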

Relevance: 20.00%

Abstract:

The overarching aim of this programme of work was to evaluate the effectiveness of the existing learning environment within the Australian Institute of Sport (AIS) elite springboard diving programme. Unique to the current research programme is the application of ideas from an established theory of motor learning, specifically ecological dynamics, to an applied high performance training environment. In this research programme, springboard diving is examined as a complex system, where individual, task, and environmental constraints are continually interacting to shape performance. As a consequence, this thesis presents some necessary and unique insights into representative learning design and movement adaptations in a sample of elite athletes. The questions examined in this programme of work relate to how best to structure practice, which is central to developing an effective learning environment in a high performance setting. Specifically, the series of studies reported in the chapters of this doctoral thesis: (i) provide evidence for the importance of designing representative practice tasks in training; (ii) establish that completed and baulked (prematurely terminated) take-offs are not different enough to justify aborting a planned dive; and (iii) confirm that elite athletes performing complex skills are able to adapt their movement patterns to achieve consistent performance outcomes from variable dive take-off conditions. Chapters One and Two of the thesis provide an overview of the theoretical ideas framing the programme of work, and include a review of literature pertinent to the research aims and subsequent empirical chapters. Chapter Three examined the representativeness of take-off tasks completed in the two AIS diving training facilities routinely used in springboard diving. Results highlighted differences in the preparatory phase of reverse dive take-offs completed by elite divers during normal training tasks in the dry-land and aquatic training environments. The most noticeable differences in dive take-off between environments began during the hurdle (step, jump, height and flight), where the diver generates the necessary momentum to complete the dive. Consequently, greater step lengths, jump heights and flight times resulted in greater board depression prior to take-off in the aquatic environment, where the dives required greater amounts of rotation. The differences observed between the preparatory phases of reverse dive take-offs completed in the dry-land and aquatic training environments are arguably a consequence of the constraints of the training environment. Specifically, differences in the environmental information available to the athletes, and the need to alter the landing (feet-first vs. wrist-first landing) from the take-off, resulted in a decoupling of important perception and action information and a decomposition of the dive take-off task. In attempting to only practise high quality dives, many athletes have followed a traditional motor learning approach (Schmidt, 1975) and tried to eliminate take-off variations during training. Chapter Four examined whether observable differences existed between the movement kinematics of elite divers in the preparation phases of baulked (prematurely terminated) and completed take-offs that might justify this approach to training.
Qualitative and quantitative analyses of variability within conditions revealed greater consistency and less variability when dives were completed, and greater variability amongst baulked take-offs for all participants. Based on these findings, it is probable that athletes choose to abort a planned take-off when they detect small variations from the movement patterns (e.g., step lengths, jump height, springboard depression) of highly practised, comfortable dives. However, with no major differences in coordination patterns (topology of the angle-angle plots), and the potential for negative performance outcomes in competition, there appears to be no training advantage in baulking on unsatisfactory take-offs during training, except when a threat of injury is perceived by the athlete. Instead, it was considered that enhancing the athletes' movement adaptability would be a more functional motor learning strategy. In Chapter Five, a twelve-week training programme was conducted to determine whether a sample of elite divers were able to adapt their movement patterns and complete dives successfully, regardless of the perceived quality of their preparatory movements on the springboard. The data indeed suggested that elite divers were able to adapt their movements during the preparatory phase of the take-off and complete good quality dives under more varied take-off conditions, displaying greater consistency and stability in the key performance outcome (dive entry). These findings are in line with previous research findings from other sports (e.g., shooting, triple jump and basketball) and demonstrate how functional or compensatory movement variability can afford greater flexibility in task execution. Because divers had previously only practised dives with good quality take-offs, it can be argued that they had developed strong couplings between information and movement only under very specific performance circumstances. As a result, this sample was sometimes characterised by poor performance in competition when the athletes experienced a suboptimal take-off. Throughout this training programme, where divers were encouraged to minimise baulking and attempt to complete every dive, they demonstrated that it was possible to strengthen the information and movement coupling in a variety of performance circumstances, widening the basin of performance solutions and providing alternative couplings to solve a performance problem even when the take-off was not ideal. The results of this programme of research provide theoretical and experimental implications for understanding representative learning design and movement pattern variability in applied sports science research. Theoretically, this PhD programme contributes empirical evidence to demonstrate the importance of representative design in the training environments of high performance sports programmes. Specifically, this thesis advocates for the design of learning environments that effectively capture and enhance functional and flexible movement responses representative of performance contexts. Further, data from this thesis showed that elite athletes performing complex tasks were able to adapt their movements in the preparatory phase and complete good quality dives under more varied take-off conditions. This finding signals some significant practical implications for athletes, coaches and sports scientists.
As such, it is recommended that care be taken by coaches when designing practice tasks, since the clear implication is that athletes need to practise adapting movement patterns during ongoing regulation of multi-articular coordination tasks. For example, volleyball servers can adapt to small variations in the ball toss phase, long jumpers can visually regulate gait as they prepare for the take-off, and springboard divers need to continue to practise adapting their take-off from the hurdle step. In summary, the studies of this programme of work have confirmed that the task constraints of training environments in elite sport performance programmes need to provide a faithful simulation of a competitive performance environment in order that performance outcomes may be stabilised with practice. Further, it is apparent that training environments can be enhanced by ensuring the representative design of task constraints, which have high action fidelity with the performance context. Ultimately, this study recommends that the traditional coaching adage 'perfect practice makes perfect' be reconsidered, instead advocating that practice should be, as Bernstein (1967) suggested, "repetition without repetition".
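
A simple way to quantify the within-condition movement variability discussed in Chapter Four is the coefficient of variation of take-off variables across trials. The thesis's own analyses used motion-capture kinematics and angle-angle plot topology; the Python sketch below, with invented values, only illustrates the kind of summary statistic involved:

```python
import numpy as np

def coefficient_of_variation(values):
    """CV (%) of a set of trial measurements: 100 * SD / mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Illustrative trial data (metres / seconds); real values would come from motion capture.
takeoffs = {
    "completed": {"step_length": [0.92, 0.94, 0.93, 0.95], "flight_time": [0.41, 0.42, 0.41, 0.42]},
    "baulked":   {"step_length": [0.88, 0.97, 0.85, 0.99], "flight_time": [0.37, 0.45, 0.36, 0.46]},
}

for condition, variables in takeoffs.items():
    for name, trials in variables.items():
        print(f"{condition:9s} {name:12s} CV = {coefficient_of_variation(trials):.1f}%")
```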

Relevance: 20.00%

Abstract:

Background Heat-related impacts may have greater public health implications as climate change continues. It is important to appropriately characterise the relationship between heatwaves and health outcomes. However, it is unclear whether a case-crossover design can be effectively used to assess event- or episode-related health effects. This study examined the association between exposure to heatwaves and mortality and emergency hospital admissions (EHAs) from non-external causes in Brisbane, Australia, using both case-crossover and time series analysis approaches. Methods Poisson generalised additive model (GAM) and time-stratified case-crossover analyses were used to assess the short-term impact of heatwaves on mortality and EHAs. Heatwaves exhibited a significant impact on mortality and EHAs after adjusting for air pollution, day of the week, and season. Results For the time-stratified case-crossover analysis, odds ratios of mortality and EHAs during heatwaves were 1.62 (95% confidence interval (CI): 1.36–1.94) and 1.22 (95% CI: 1.14–1.30) at lag 1, respectively. Time series GAM models gave similar results. Relative risks of mortality and EHAs ranged from 1.72 (95% CI: 1.40–2.11) to 1.81 (95% CI: 1.56–2.10) and from 1.14 (95% CI: 1.06–1.23) to 1.28 (95% CI: 1.21–1.36) at lag 1, respectively. The risk estimates gradually attenuated after a lag of one day for both case-crossover and time series analyses. Conclusions The risk estimates from both case-crossover and time series models were consistent and comparable. This finding may have implications for future research on the assessment of event- or episode-related (e.g., heatwave) health effects.
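
For the time series arm, a common way to estimate such effects is a Poisson regression of daily counts on a lagged heatwave indicator with confounder terms. The sketch below uses simulated data and a plain Poisson GLM rather than the GAM with smooth terms used in the study; all column names and values are illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated daily series: death counts, a heatwave indicator, PM10 and calendar terms.
df = pd.DataFrame({
    "deaths":   rng.poisson(20, 365),
    "heatwave": np.r_[np.zeros(180), np.ones(5), np.zeros(180)],
    "pm10":     rng.gamma(4.0, 5.0, 365),
    "dow":      np.tile(np.arange(7), 53)[:365],
    "season":   np.repeat([0, 1, 2, 3], [90, 92, 92, 91]),
})
df["heatwave_lag1"] = df["heatwave"].shift(1).fillna(0)  # lag-1 exposure, as in the abstract

fit = smf.glm("deaths ~ heatwave_lag1 + pm10 + C(dow) + C(season)",
              data=df, family=sm.families.Poisson()).fit()
rr = np.exp(fit.params["heatwave_lag1"])          # relative risk on the day after a heatwave
ci = np.exp(fit.conf_int().loc["heatwave_lag1"])  # 95% confidence interval
print(f"RR = {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```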

Relevance: 20.00%

Abstract:

The formalin test is increasingly applied as a model of inflammatory pain using high formalin concentrations (5–15%). However, little is known about the effects of low formalin concentrations on related behavioural responses. To examine this, rat pups were subjected to various concentrations of formalin at four developmental stages: 7, 13, 22, and 82 days of age. At postnatal day (PND) 7, sex differences in flinching but not licking responses were observed, with 0.5% formalin evoking higher flinching in males than in females. A dose response was evident in that 0.5% formalin also produced higher licking responses compared to 0.3% or 0.4% formalin. At PND 13, a concentration of 0.8% formalin evoked a biphasic response. At PND 22, a concentration of 1.1% evoked higher flinching and licking responses during the late phase (10–30 min) in both males and females. During the early phase (0–5 min), 1.1% evoked higher licking responses compared to 0.9% or 1% formalin. Formalin at 1.1% also produced a biphasic response that was not evident with 0.9% or 1%. At PND 82, rats displayed a biphasic pattern in response to three formalin concentrations (1.25%, 1.75% and 2.25%), with the presence of an interphase for both 1.75% and 2.25% but not for 1.25%. These data suggest that low formalin concentrations induce fine-tuned responses that are not apparent with the high formalin concentrations commonly used in the formalin test. These data also show that the developing nociceptive system is very sensitive to subtle changes in formalin concentration.

Relevance: 20.00%

Abstract:

Background The association between temperature and mortality has been examined mainly in North America and Europe. However, less evidence is available in developing countries, especially in Thailand. In this study, we examined the relationship between temperature and mortality in Chiang Mai city, Thailand, during 1999–2008. Method A time series model was used to examine the effects of temperature on cause-specific mortality (non-external, cardiopulmonary, cardiovascular, and respiratory) and age-specific non-external mortality (≤64, 65–74, 75–84, and ≥85 years), while controlling for relative humidity, air pollution, day of the week, season and long-term trend. We used a distributed lag non-linear model to examine the delayed effects of temperature on mortality up to 21 days. Results We found non-linear effects of temperature on all mortality types and age groups. Both hot and cold temperatures resulted in an immediate increase in all mortality types and age groups. Generally, the hot effects on all mortality types and age groups were short-term, while the cold effects lasted longer. The relative risk of non-external mortality associated with cold temperature (19.35°C, 1st percentile of temperature) relative to 24.7°C (25th percentile of temperature) was 1.29 (95% confidence interval (CI): 1.16, 1.44) for lags 0–21. The relative risk of non-external mortality associated with high temperature (31.7°C, 99th percentile of temperature) relative to 28°C (75th percentile of temperature) was 1.11 (95% CI: 1.00, 1.24) for lags 0–21. Conclusion This study indicates that exposure to both hot and cold temperatures was related to increased mortality. Both cold and hot effects occurred immediately, but cold effects lasted longer than hot effects. This study provides useful data for policy makers to better prepare local responses to manage the impact of hot and cold temperatures on population health.
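
The cumulative relative risks over lags 0–21 come from summing lag-specific contributions in the distributed lag model. The Python sketch below builds a lagged exposure matrix and shows the arithmetic for a cumulative RR under a deliberately simplified, linear-in-temperature model with made-up coefficients; the study itself used spline-based non-linear terms:

```python
import numpy as np
import pandas as pd

def lagged_matrix(series, max_lag):
    """Return a DataFrame whose columns hold the exposure at lags 0..max_lag."""
    return pd.DataFrame({f"lag{l}": series.shift(l) for l in range(max_lag + 1)})

# Hypothetical daily mean temperature series (degrees C); values are illustrative only.
temp = pd.Series(24 + 4 * np.sin(np.linspace(0, 12 * np.pi, 3650)))
X = lagged_matrix(temp, max_lag=21)   # one regressor per lag (0-21 day window); these
                                      # columns would feed a Poisson regression of daily deaths

# Given (assumed, purely illustrative) per-degree log-RR coefficients for each lag, the
# cumulative RR of 19.35 C versus the 24.7 C reference is the exp of the summed contributions.
betas = np.full(22, -0.003)
cum_rr = np.exp(np.sum(betas * (19.35 - 24.7)))
print(f"Cumulative RR over lags 0-21: {cum_rr:.2f}")
```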

Relevance: 20.00%

Abstract:

Purpose Commencing selected workouts with low muscle glycogen availability augments several markers of training adaptation compared with undertaking the same sessions with normal glycogen content. However, low glycogen availability reduces the capacity to perform high-intensity (>85% of peak aerobic power (V̇O2peak)) endurance exercise. We determined whether a low dose of caffeine could partially rescue the reduction in maximal self-selected power output observed when individuals commenced high-intensity interval training with low (LOW) compared with normal (NORM) glycogen availability. Methods Twelve endurance-trained cyclists/triathletes performed four experimental trials using a double-blind Latin square design. Muscle glycogen content was manipulated via exercise–diet interventions so that two experimental trials were commenced with LOW and two with NORM muscle glycogen availability. Sixty minutes before an experimental trial, subjects ingested a capsule containing anhydrous caffeine (CAFF, 3 mg·kg-1 body mass) or placebo (PLBO). Instantaneous power output was measured throughout high-intensity interval training (8 × 5-min bouts at maximum self-selected intensity with 1-min recovery). Results There were significant main effects for both preexercise glycogen content and caffeine ingestion on power output. LOW reduced power output by approximately 8% compared with NORM (P < 0.01), whereas caffeine increased power output by 2.8% and 3.5% for NORM and LOW, respectively (P < 0.01). Conclusion We conclude that caffeine enhanced power output independently of muscle glycogen concentration but could not fully restore power output to levels commensurate with those observed when subjects commenced exercise with normal glycogen availability. However, the reported increase in power output does provide a likely performance benefit and may provide a means to further enhance the already augmented training response observed when selected sessions are commenced with reduced muscle glycogen availability. It has long been known that endurance training induces a multitude of metabolic and morphological adaptations that improve the resistance of the trained musculature to fatigue and enhance endurance capacity and/or exercise performance (13). Accumulating evidence now suggests that many of these adaptations can be modified by nutrient availability (9–11,21). Growing evidence suggests that training with reduced muscle glycogen using a "train twice every second day" compared with a more traditional "train once daily" approach can enhance the acute training response (29) and markers representative of endurance training adaptation after short-term (3–10 wk) training interventions (8,16,30). Of note is that the superior training adaptation in these previous studies was attained despite a reduction in maximal self-selected power output (16,30). The most obvious factor underlying the reduced intensity during a second training bout is the reduction in muscle glycogen availability. However, there is also the possibility that other metabolic and/or neural factors may be responsible for the power drop-off observed when two exercise bouts are performed in close proximity. Regardless of the precise mechanism(s), there remains the intriguing possibility that the magnitude of training adaptation previously reported in the face of a reduced training intensity (Hulston et al. (16) and Yeo et al.) might be further augmented, and/or other aspects of the training stimulus better preserved, if power output was not compromised.
Caffeine ingestion is a possible strategy that might “rescue” the aforementioned reduction in power output that occurs when individuals commence high-intensity interval training (HIT) with low compared with normal glycogen availability. Recent evidence suggests that, at least in endurance-based events, the maximal benefits of caffeine are seen at small to moderate (2–3 mg·kg-1 body mass (BM)) doses (for reviews, see Refs. (3,24)). Accordingly, in this study, we aimed to determine the effect of a low dose of caffeine (3 mg·kg-1 BM) on maximal self-selected power output during HIT commenced with either normal (NORM) or low (LOW) muscle glycogen availability. We hypothesized that even under conditions of low glycogen availability, caffeine would increase maximal self-selected power output and thereby partially rescue the reduction in training intensity observed when individuals commence HIT with low glycogen availability.
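
The reported "significant main effects" for glycogen and caffeine are the kind of result a two-way repeated-measures analysis would produce. The abstract does not specify the statistical procedure, so the Python sketch below, with simulated power outputs whose effect sizes loosely mirror the abstract (~8% lower with LOW, ~3% higher with CAFF), is only an assumed illustration:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
subjects, glycogen, caffeine = np.arange(12), ["NORM", "LOW"], ["PLBO", "CAFF"]

# Simulated mean power output (W) per trial; values are illustrative only.
rows = []
for s in subjects:
    base = rng.normal(300, 25)
    for g in glycogen:
        for c in caffeine:
            power = base * (0.92 if g == "LOW" else 1.0) * (1.03 if c == "CAFF" else 1.0)
            rows.append({"subject": s, "glycogen": g, "caffeine": c,
                         "power": power + rng.normal(0, 5)})
df = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA: main effects of glycogen availability and caffeine.
print(AnovaRM(df, depvar="power", subject="subject",
              within=["glycogen", "caffeine"]).fit())
```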

Relevance: 20.00%

Abstract:

Quantity and timing of protein ingestion are major factors regulating myofibrillar protein synthesis (MPS). However, the effect of specific ingestion patterns on MPS throughout a 12 h period is unknown. We determined how different distributions of protein feeding during 12 h of recovery after resistance exercise affect anabolic responses in skeletal muscle. Twenty-four healthy trained males were assigned to three groups (n = 8/group) and undertook a bout of resistance exercise followed by ingestion of 80 g of whey protein throughout 12 h of recovery in one of the following protocols: 8 × 10 g every 1.5 h (PULSE); 4 × 20 g every 3 h (intermediate: INT); or 2 × 40 g every 6 h (BOLUS). Muscle biopsies were obtained at rest and at 1, 4, 6, 7 and 12 h post-exercise. Resting and post-exercise MPS (L-[ring-13C6] phenylalanine), muscle mRNA abundance and cell signalling were assessed. All ingestion protocols increased MPS above rest throughout the 1–12 h recovery period (88–148%, P < 0.02), but INT elicited greater MPS than PULSE and BOLUS (31–48%, P < 0.02). In general, signalling showed a BOLUS > INT > PULSE hierarchy in the magnitude of phosphorylation. MuRF-1 and SLC38A2 mRNA were differentially expressed with BOLUS. In conclusion, 20 g of whey protein consumed every 3 h was superior to either PULSE or BOLUS feeding patterns for stimulating MPS throughout the day. This study provides novel information on the effect of modulating the distribution of protein intake on anabolic responses in skeletal muscle and has the potential to maximise outcomes of resistance training for attaining peak muscle mass.

Relevance: 20.00%

Abstract:

Purpose: It is not known whether it is possible to repeatedly supercompensate muscle glycogen stores after exhaustive exercise bouts undertaken within several days. Methods: We evaluated the effect of repeated exercise-diet manipulation on muscle glycogen and intramuscular triacylglycerol (IMTG) metabolism and exercise capacity in six well-trained subjects who completed an intermittent, exhaustive cycling protocol (EX) on three occasions separated by 48 h (i.e., days 1, 3, and 5) in a 5-d period. Twenty-four hours before day 1, subjects consumed a moderate (6 g·kg-1) carbohydrate (CHO) diet, followed by 5 d of a high (12 g·kg-1·d-1) CHO diet. Muscle biopsies were taken at rest, immediately post-EX on days 1, 3, and 5, and after 3 h of recovery on days 1 and 3. Results: Compared with day 1, resting muscle [glycogen] was elevated on day 3 but not day 5 (435 ± 57 vs 713 ± 60 vs 409 ± 40 mmol·kg-1, P < 0.001). [IMTG] was reduced by 28% (P < 0.05) after EX on day 1, but post-EX levels on days 3 and 5 were similar to rest. EX capacity was enhanced on days 3 and 5 compared with day 1 (31.9 ± 2.5 and 35.4 ± 3.8 vs 24.1 ± 1.4 kJ·kg-1, P < 0.05). Glycogen synthase activity at rest and immediately post-EX was similar between trials. Additionally, the rates of muscle glycogen accumulation were similar during the 3-h recovery period on days 1 and 3. Conclusion: We show that well-trained men cannot repeatedly supercompensate muscle [glycogen] after glycogen-depleting exercise and 2 d of a high-CHO diet, suggesting that the mechanisms responsible for glycogen accumulation are attenuated as a consequence of successive days of glycogen-depleting exercise.

Relevance: 20.00%

Abstract:

Disengaged and disruptive students have been an ongoing concern for teachers for many years. Teaching is complex: complex students with complex lives and complex behaviours. How best to help these students is an ever-present question without a simple answer. Solutions need to be found. Under a positive behaviour support framework, when serious, disruptive behaviour requires intervention, an individualised positive behaviour support plan (PBS plan) is developed and implemented. This multicase study (Stake, 2006) investigated how task engagement was changed for boys from year four to year seven who demonstrated serious, disruptive behaviour. The individualised PBS plan was the primary tool of behaviour intervention in each of the five cases. Using the Behaviour Support Plan Quality Evaluation Scoring Guide II (BSP-QE) (Browning-Wright, Saren & Mayer, 2003), the five PBS plans were evaluated prior to implementation and rated highly in terms of technical quality. Positive changes in student task engagement were forthcoming in all five cases. Eleven advisory visiting teachers in behaviour and eleven classroom teachers, five of whom were case-study participants, took part in this study. The classroom teachers were employed in south-east Queensland primary schools located in suburbs of economic disadvantage. All 22 participants expressed very similar perceptions of serious, disruptive behaviour, emphasising the collateral impact upon teaching and learning. Data obtained through direct observations, surveys and semi-structured interviews confirmed previous research to reveal a strong link between integrity of PBS plan implementation and student behaviour change. While classroom teachers, in the main, effectively managed the implementation of the PBS plan, three enablers of effective teacher implementation were identified: social validity of goals, procedures and effects; in-class technical assistance; and performance feedback. While the purpose of each PBS plan was to influence change in student behaviour, this study found that changing teacher behaviour was also instrumental in achieving positive student outcomes. Changing teacher behaviour and building capacity were facilitated by trusting, collaborative partnerships established between the Advisory Visiting Teacher-Behaviour and the classroom teacher responsible for the plan implementation. The Advisory Visiting Teacher-Behaviour provides assistance to teachers dealing with students who demonstrate ongoing, problematic behaviour. The inclusion of a teaching component as part of the implementation stage of the consultation process appeared to have considerable influence upon successful intervention. Results substantiated earlier understandings of the importance of teacher instruction, highlighting the value of explicit teaching and performance feedback to the delivery of effective behaviour intervention. Conclusions drawn from this study have had a major impact upon the work of a regional team of Advisory Visiting Teachers-Behaviour. The focus of behaviour intervention has moved from being primarily upon the individual student to include a greater emphasis upon the critical role of the teacher. Procedures and processes are being re-evaluated to align with evidence-based practice and to include a collaborative consultation approach to improve teacher assistance. The framework and content of staff development and training are being created directly from the findings of this study. 
This practical application of the results has informed better ways of providing behaviour intervention for students demonstrating serious, disruptive behaviour. What this study has clearly shown is that, when it comes to behaviour intervention, the important role of the teacher should not be underestimated.

Relevance: 20.00%

Abstract:

Background Cancer-related malnutrition is associated with increased morbidity, poorer tolerance of treatment, decreased quality of life, increased hospital admissions, and increased health care costs (Isenring et al., 2013). This study's aim was to determine whether a novel, automated screening system was a useful tool for nutrition screening when compared against a full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Methods A single site, observational, cross-sectional study was conducted in an outpatient oncology day care unit within a Queensland tertiary facility, with three hundred outpatients (51.7% male, mean age 58.6 ± 13.3 years). Eligibility criteria: ≥18 years, receiving anticancer treatment, able to provide written consent. Patients completed the Malnutrition Screening Tool (MST). Nutritional status was assessed using the PG-SGA. Data for the automated screening system were extracted from the pharmacy software program Charm and included body mass index (BMI) and weight records dating back up to six months. Results The prevalence of malnutrition was 17%. Any weight loss over the three to six weeks prior to the most recent weight record, as identified by the automated screening system, predicted PG-SGA-classified malnutrition with 56.52% sensitivity, 35.43% specificity, 13.68% positive predictive value and 81.82% negative predictive value. An MST score of 2 or greater was a stronger predictor of PG-SGA-classified malnutrition (70.59% sensitivity, 69.48% specificity, 32.14% positive predictive value, 92.02% negative predictive value). Conclusions Both the automated screening system and the MST fell short of the accepted professional standards for sensitivity (80%) or specificity (60%) when compared to the PG-SGA. However, although the MST remains the better predictor of malnutrition in this setting, uptake of this tool in the Oncology Day Care Unit remains challenging.
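
The sensitivity, specificity and predictive values above are the standard 2×2 screening metrics, computed against PG-SGA-classified malnutrition as the reference. The Python sketch below shows the arithmetic; the counts are purely illustrative, since the abstract reports only the derived percentages, not the raw table:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening metrics, expressed as percentages."""
    return {
        "sensitivity": 100 * tp / (tp + fn),  # malnourished patients the screen flags
        "specificity": 100 * tn / (tn + fp),  # well-nourished patients the screen passes
        "PPV":         100 * tp / (tp + fp),  # flagged patients who are truly malnourished
        "NPV":         100 * tn / (tn + fn),  # passed patients who are truly well nourished
    }

# Illustrative counts only (screen result vs PG-SGA reference); not the study's data.
print(screening_metrics(tp=30, fp=70, fn=20, tn=180))
```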

Relevance: 20.00%

Abstract:

In view of the upcoming Sydney Olympics and several recent reports describing the experience at the Atlanta Olympics, we report the findings of the only Australian study which, to our knowledge, measured the impact of a large-scale sporting event on a public hospital. The study also provided an avenue for increased surveillance for communicable diseases. We prospectively assessed the utilisation of the Royal Darwin Hospital (RDH) by visiting athletes, officials and spectators during the 1997 Arafura Games, a biennial, seven-day international sporting event which attracts some 4,000 athletes and their supporters from across Australia, South-East Asia and the Pacific. The RDH Emergency Department (ED) is the only free, 24-hour medical facility in Darwin and no additional staff or resources were provided during the Games period. Official facilities included two privately operated sports medicine clinics for the sole use of athletes with sporting injuries during prescribed hours in the week of competition, and the presence of St John Ambulance at venues...