847 results for Faculty Workload
Abstract:
In 2012, Queensland University of Technology (QUT) committed to the massive project of revitalizing its Bachelor of Science (ST01) degree. Like most universities in Australia, QUT has begun work to align all courses by 2015 to the requirements of the updated Australian Qualifications Framework (AQF), which is regulated by the Tertiary Education Quality and Standards Agency (TEQSA). From the very start of the redesigned degree program, students approach scientific study through an exciting mix of theory and highly topical real-world examples via their chosen “grand challenge.” These challenges, Fukushima and nuclear energy for example, are the lenses used to explore science and lead to 21st-century learning outcomes for students. For the teaching and learning support staff, our grand challenge is to expose all science students to multidisciplinary content with a strong emphasis on embedding information literacies into the curriculum. With ST01, QUT is taking the initiative to rethink not only content but also how units are delivered, and even how the faculty, the library, and learning and teaching support work together. This was the desired outcome, but as we move from design to implementation, has this goal been achieved? A main component of the new degree is to ensure scaffolding of information literacy skills throughout the entirety of the three-year course. However, with the strong focus on problem-based learning and group-work skills, many issues arise for both students and lecturers. A move away from a traditional lecture style is necessary but affects academics’ workload and comfort levels. Therefore, academics, in collaboration with librarians and other learning support staff, must draw on each other's expertise to ensure that pedagogy, assessments and targeted classroom activities are mapped within and between units.
This partnership can counteract the tendency of isolated, unsupported academics to concentrate on day-to-day teaching at the expense of consistency between units and big-picture objectives. Support staff may have a more holistic view of a course or degree than the coordinators of individual units, making communication and truly collaborative planning even more critical. In addition, due to staffing time pressures, the design and delivery of new curricula is generally done quickly, with no opportunity for the designers to stop and reflect on the experience and outcomes. It is vital that we take this unique opportunity to closely examine what QUT has and hasn't achieved, so as to recommend a better way forward. This presentation will discuss these important issues and stumbling blocks, to provide a set of best-practice guidelines for QUT and other institutions. The aim is to help improve collaboration within the university, as well as to maximize students' ability to put information literacy skills into action. As our students embark on their own grand challenges, we must challenge ourselves to honestly assess our own work.
Abstract:
Moderation of student assessment is a critical component of teaching and learning in contemporary universities. In Australia, moderation is mandated through university policies and through the new national university accreditation authority, the Tertiary Education Quality and Standards Agency (TEQSA), which began operations in late January 2012 (TEQSA, 2012). The TEQSA requirement to declare details of moderation and any other arrangements used to support consistency and reliability of assessment and grading across each subject in a course of study is a radical step intended to move toward heightened accountability and greater transparency in the tertiary sector, as well as entrenching evidence-based practice in the management of Australian academic programs. In light of this reform, the purpose of this project was to investigate and analyse current moderation practices operating within a faculty of education at a large urban university in Queensland, Australia. This qualitative study involved interviews with the unit coordinators (n=21) and tutors (n=8) of core undergraduate education units and graduate diploma units within the faculty. Four distinct discourses of moderation that academics drew on to discuss their practices were identified in the study. These were: equity, justification, community building, and accountability. These discourses, together with recommendations for changes to moderation practices, are discussed in this paper.
Abstract:
Executive Summary Emergency Departments (EDs) locally, nationally and internationally are becoming increasingly busy. Within this context, it can be challenging to deliver a health service that is safe, of high quality and cost-effective. Whilst various models described in the literature aim to measure ED ‘work’ or ‘activity’, they are often not linked to a measure of the cost of providing that activity. It is important for hospital and ED managers to understand and apply this link so that optimal staffing and financial resourcing can be justifiably sought. This research is timely given that Australia has moved towards a national Activity Based Funding (ABF) model for ED activity. ABF is believed to increase transparency of care and fairness (i.e. equal work receives equal pay). ABF involves a person-, performance- or activity-based payment system, and thus a move away from historical “block payment” models that do not incentivise efficiency and quality. The aim of the Statewide Workforce and Activity-Based Funding Modelling Project in Queensland Emergency Departments (SWAMPED) is to identify and describe best-practice Emergency Department (ED) workforce models within the current context of ED funding under an ABF model. The study comprises five distinct phases. This monograph (Phase 1) comprises a systematic review of the literature that was completed in June 2013. The remaining phases include a detailed survey of Queensland hospital EDs' resource levels, activity and operational models of care; development of new resource models; development of a user-friendly modelling interface for ED managers; and production of a final report that identifies policy implications. The anticipated deliverable of this research is an ABF-based Emergency Workforce Modelling Tool that will enable ED managers to profile both their workforce and their operational models of care.
Additionally, the tool will assist with the ability to more accurately determine adequate staffing numbers required in the future, inform planning of expected expenditures, and be used for standardisation and benchmarking across similar EDs.

Summary of the Findings Within the remit of this review of the literature, the main findings include:

1. EDs are becoming busier and more congested. Rising demand, barriers to ED throughput and transitions of care all contribute to ED congestion. In addition, requests from organisational managers and the community continue to broaden the scope of services required of the ED, further increasing demand. As the population lives longer with more lifestyle diseases, its propensity to require ED care continues to grow.

2. Various models of care within EDs exist. Models often vary to account for site-specific characteristics such as staffing profile, ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Existing and new models implemented within EDs often depend on the target outcome requiring change; generally this focuses on addressing issues at the input, throughput or output areas of the ED. Even for models targeting a similar demographic or illness, the structure and process elements underpinning the model can vary, which can affect outcomes and introduce variance in the patient and carer experience between and within EDs. Major models of care to manage throughput inefficiencies include:

A. Workforce Models of Care, which focus on the appropriate level of staffing for a given workload to provide prompt, timely and clinically effective patient care within an emergency care setting. The studies reviewed suggest that the early involvement of a senior medical decision-maker and/or specialised nursing roles such as Emergency Nurse Practitioners and Clinical Initiatives Nurses, or primary-contact or extended-scope Allied Health Practitioners, can facilitate patient flow and improve key indicators such as length of stay and the number of patients who did not wait to be seen, amongst others.

B. Operational Models of Care, which focus on mechanisms for streaming (e.g. fast-tracking) or otherwise grouping patient care based on acuity and complexity, to minimise throughput inefficiencies. While studies generally support the positive impact of these models, they appear to be most effective when adequately resourced.

3. Various methods of measuring ED activity exist. Measuring ED activity requires careful consideration of models of care and staffing profile, and the ability to account for factors including patient census, acuity, length of stay, intensity of intervention and department skill mix, plus an adjustment for non-patient-care time.

4. Gaps in the literature. Continued ED growth calls for new and innovative care delivery models that are safe, clinically effective and cost-effective. New roles and stand-alone service delivery models are often evaluated in isolation, without considering the global and economic impact on staffing profiles. Whilst various models for accounting for and measuring health care activity exist, costing and cost-effectiveness studies are lacking for EDs, making accurate and reliable assessments of care models difficult. There is a need to further understand, refine and account for measures of ED complexity that define a workload upon which resourcing and appropriate staffing determinations can be made into the future. There is also a need for continued monitoring and comprehensive evaluation of newly implemented workforce modelling tools.

This research acknowledges those gaps and aims to:
• undertake a comprehensive and integrated whole-of-department workforce profiling exercise relative to resources in the context of ABF;
• inform workforce requirements based on traditional quantitative markers (e.g. volume and acuity) combined with qualitative elements of ED models of care; and
• develop a comprehensive and validated workforce calculation tool that can be used to better inform, or at least guide, workforce requirements in a more transparent manner.
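As a purely illustrative sketch of the kind of activity measure described in finding 3 (this is not the SWAMPED tool, and every weight and field name below is an invented assumption), an acuity-weighted workload index might combine census, acuity, length of stay and intervention intensity, then gross the result up for non-patient-care time:

```python
# Hypothetical illustration of an acuity-weighted ED workload index.
# The weights, field names and adjustment factor are invented for this
# sketch; they are NOT taken from the SWAMPED project or its tool.

# Relative workload weight per triage (acuity) category, most urgent first.
ACUITY_WEIGHTS = {1: 5.0, 2: 3.0, 3: 2.0, 4: 1.2, 5: 1.0}

def workload_index(presentations, non_patient_care_fraction=0.25):
    """Sum acuity- and intensity-weighted care hours across presentations,
    then gross up for the fraction of staff time spent on non-patient-care
    tasks (handover, documentation, teaching, etc.)."""
    care_hours = sum(
        ACUITY_WEIGHTS[p["acuity"]] * p["los_hours"] * p["intensity"]
        for p in presentations
    )
    return care_hours / (1.0 - non_patient_care_fraction)

# Example: three presentations on one shift.
shift = [
    {"acuity": 2, "los_hours": 4.0, "intensity": 1.5},  # complex patient
    {"acuity": 4, "los_hours": 1.0, "intensity": 1.0},  # fast-track patient
    {"acuity": 5, "los_hours": 0.5, "intensity": 1.0},  # minor complaint
]
print(round(workload_index(shift), 2))  # 26.27 weighted care hours
```

A real tool would calibrate such weights against observed staffing and cost data rather than fixing them a priori, which is precisely the gap the literature review identifies.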
Abstract:
Objective To examine the impact of applying for funding on personal workloads, stress and family relationships. Design Qualitative study of researchers preparing grant proposals. Setting Web-based survey on applying for the annual National Health and Medical Research Council (NHMRC) Project Grant scheme. Participants Australian researchers (n=215). Results Almost all agreed that preparing their proposals always took top priority over other work (97%) and personal (87%) commitments. Almost all researchers agreed that they became stressed by the workload (93%) and restricted their holidays during the grant-writing season (88%). Most researchers agreed that they submitted proposals because chance is involved in being successful (75%), because of performance requirements at their institution (60%) and because of pressure from colleagues to submit proposals (53%). Almost all researchers supported changes to the current processes for submitting proposals (95%) and for peer review (90%). Most researchers (59%) provided extensive comments on the impact of writing proposals on their work life and home life. Six major work-life themes were: (1) top priority; (2) career development; (3) stress at work; (4) benefits at work; (5) time spent at work; and (6) pressure from colleagues. Six major home-life themes were: (1) restricting family holidays; (2) time spent on work at home; (3) impact on children; (4) stress at home; (5) impact on family and friends; and (6) impact on partner. Additional impacts on the mental health and well-being of researchers were identified. Conclusions The process of preparing grant proposals for a single annual deadline is stressful, time-consuming and in conflict with family responsibilities. The timing of the funding cycle could be shifted to minimise applicant burden and give Australian researchers more time to work on actual research and to be with their families.
Abstract:
Introduction The acute health effects of heatwaves in a subtropical climate and their impact on emergency departments (EDs) are not well known. The purpose of this study is to examine overt heat-related presentations to EDs associated with heatwaves in Brisbane. Methods Data were obtained for the summer seasons (December to February) from 2000 to 2012. Heatwave events were defined as two or more successive days with a daily maximum temperature ≥34°C (HWD1) or ≥37°C (HWD2). A Poisson generalised additive model was used to assess the effect of heatwaves on heat-related visits (International Classification of Diseases (ICD) 10 codes T67 and X30; ICD-9 codes 992 and E900.0). Results Overall, 628 cases presented for heat-related illnesses. Presentations significantly increased on heatwave days under HWD1 (relative risk (RR) = 4.9, 95% confidence interval (CI): 3.8, 6.3) and HWD2 (RR = 18.5, 95% CI: 12.0, 28.4). The RRs in different age groups ranged from 3.0 to 9.2 (HWD1) and from 7.5 to 37.5 (HWD2). High-acuity visits significantly increased under HWD1 (RR = 4.7, 95% CI: 2.3, 9.6) and HWD2 (RR = 81.7, 95% CI: 21.5, 310.0). Average length of stay in the ED significantly increased, by more than 1 hour (HWD1) and more than 2 hours (HWD2). Conclusions Heatwaves significantly increase ED visits and workload even in a subtropical climate. The degree of impact is directly related to the extent of the temperature increase and varies with the socio-demographic characteristics of the patients. Heatwave action plans should be tailored to the population's needs and level of vulnerability. EDs should have plans to increase their surge capacity during heatwaves.
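The study fits a Poisson generalised additive model, but for a single binary heatwave indicator the relative risk reduces to a ratio of daily presentation rates, with a Wald confidence interval on the log scale. A minimal sketch with synthetic counts (the numbers below are invented to mirror the reported HWD1 effect size; they are not the study's data):

```python
import math

def relative_risk(cases_exposed, days_exposed, cases_unexposed, days_unexposed):
    """Rate ratio for Poisson counts under a binary exposure, with a
    95% Wald confidence interval computed on the log scale."""
    rr = (cases_exposed / days_exposed) / (cases_unexposed / days_unexposed)
    # Standard error of log(RR) for independent Poisson counts.
    se_log_rr = math.sqrt(1 / cases_exposed + 1 / cases_unexposed)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, (lo, hi)

# Synthetic example: 98 heat-related visits over 20 heatwave days
# versus 540 visits over 540 ordinary summer days.
rr, (lo, hi) = relative_risk(98, 20, 540, 540)
print(round(rr, 1))  # 4.9
```

A generalised additive model extends this by adjusting for smooth functions of temperature, season and long-term trend, which is why the published intervals differ from what this unadjusted ratio would give.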
Abstract:
Interest in utilising multiple heterogeneous Unmanned Aerial Vehicles (UAVs) in close proximity is growing rapidly. This presents many challenges in the effective coordination and management of these UAVs: converting the current n-to-1 paradigm (n operators operating a single UAV) to the 1-to-n paradigm (one operator managing n UAVs). This paper introduces an Information Abstraction methodology used to produce the functional capability framework initially proposed by Chen et al. and its Level Of Detail (LOD) indexing scale. The framework was validated by comparing operator workload and Situation Awareness (SA) across three experimental scenarios involving multiple heterogeneous autonomous UAVs. The first scenario used a high-LOD configuration with highly abstracted UAV functional information; the second used a mixed-LOD configuration; and the final scenario used a low-LOD configuration with maximal UAV functional information. Results show a statistically significant decrease in operator workload when a UAV's functional information is displayed in its physical form (low LOD, maximal information) compared with the mixed-LOD configuration.
Abstract:
The objective of this experimental study is to capture the dynamic temporal processes that occur in changing work settings and to test how work control and individuals' motivational predispositions interact to predict reactions to these changes. To this end, we examine the moderating effects of global self-determined and non-self-determined motivation, at different levels of work control, on participants' adaptation and stress reactivity to changes in workload during four trials of an inbox activity. Workload was increased or decreased at Trial 3, and adaptation to this change was examined via fluctuations in anxiety, coping, motivation, and performance. In support of the hypotheses, results revealed that, for non-self-determined individuals, low work control was stress-buffering and high work control was stress-exacerbating when predicting anxiety and intrinsic motivation. In contrast, for self-determined individuals, high work control facilitated the adaptive use of planning coping in response to a change in workload. Overall, this pattern of results demonstrates that, while high work control was anxiety-provoking and demotivating for non-self-determined individuals, self-determined individuals used high work control to implement an adaptive antecedent-focused emotion regulation strategy (i.e., planning coping) to meet situational demands. Other interactive effects of global motivation emerged on anxiety, active coping, and task performance. These results and their practical implications are discussed.
Abstract:
This study investigated the effects of workload, control, and general self-efficacy on affective task reactions (i.e., demands-abilities fit, active coping, and anxiety) during a work simulation. The main goals were: (1) to determine the extent to which general self-efficacy moderates the effects of demand and control on affective task reactions; and (2) to determine whether this varies as a function of changes in workload. Participants (N=141) completed an inbox activity under conditions of low or high control and within low and high workload conditions. The order of trials varied so that workload either increased or decreased. Results revealed that individuals with high general self-efficacy reported better demands-abilities fit and more active coping, as well as less anxiety. Three interactive effects were found. First, high control increased demands-abilities fit from trial 1 to trial 2, but only when workload decreased. Second, low-efficacy individuals' active coping increased in trial 2, but only under high control. Third, high control helped high-efficacy individuals manage anxiety when workload decreased. However, for individuals with low general self-efficacy, neither high nor low control alleviated anxiety, whether workload increased or decreased over time.
Abstract:
The nature of differences in salaries between academic faculty and Certified Registered Nurse Anesthetists (CRNAs) working in clinical positions is explored using recently collected data. The differences in median salaries among program directors, assistant program directors, academic faculty, and clinical faculty are large. Furthermore, survey results imply that the most important barrier to recruiting teaching faculty is the salary differential. Part 1 of this 2-part column discusses salaries, recruitment, and retention of CRNA faculty; Part 2, to be published in the June 2008 issue, will focus on clinical faculty contributions to the education of CRNAs.
Abstract:
This article enhances existing approaches to present-day asynchronous awareness concepts by providing the means to explicitly represent and mediate contextual information. The resulting concept of contextual awareness takes different notions of the term context into account. Following a human-centered approach, the proposed methods serve as mediators of context between persons rather than detecting context automatically. Based on this variant of awareness, the atmosphere framework is introduced, providing mechanisms to deal with the problem of workload in tandem with contextual information. Atmosphere provides a highly tailorable structure and interface to accommodate a wide range of user and organizational requirements. The article closes with the description of a partial implementation of the framework and its evaluation.
Abstract:
Safety at railway level crossings (RLXs) is one part of the wider picture of safety within the whole transport system. Governments, the rail industry and road organisations have used a variety of countermeasures for many years to improve RLX safety. New types of interventions are required to reduce the number of crashes and the associated social costs at railway crossings. This paper presents the results of a large research program which aimed to assess the effectiveness of emerging Intelligent Transport Systems (ITS) interventions, both on-road and in-vehicle, in improving the safety of car drivers at RLXs in Australia. The three most promising technologies selected from the literature review and focus groups were tested in an advanced driving simulator to provide a detailed assessment of their effects on driver behaviour. The three interventions were: (i) an in-vehicle visual warning using a GPS/smartphone navigation-like system; (ii) an in-vehicle audio warning; and (iii) an on-road intervention known as the valet system (warning lights on the road surface activated as a train approaches). The effects of these technologies on 57 participants were assessed in a systematic approach focusing on the safety of the intervention, effects on the road traffic around the crossings and drivers' acceptance of the technology. Given that the ITS interventions were likely to provide a benefit by improving drivers' awareness of the crossing status in low-visibility conditions, such conditions were investigated using curves in the track before the crossing. ITS interventions were also expected to improve driver behaviour at crossings with high traffic (the blocking-back issue), which was also investigated at active crossings.
The key findings are: (i) interventions at passive crossings are likely to provide safety benefits; (ii) the benefits of ITS interventions on driver behaviour at active crossings are limited; (iii) the trialled ITS interventions did not show any issues in terms of driver distraction, driver acceptance or traffic delays; (iv) these interventions are easy to use and do not increase driver workload substantially; (v) participants' intention to use the technology is high; and (vi) participants saw most value in succinct messages about approaching trains, as opposed to knowing the RLX locations or the imminence of a collision with a train.
Abstract:
Introduction Intense exercise-induced acidosis occurs from the accumulation of hydrogen ions as by-products of anaerobic metabolism. Oral ingestion of β-alanine, a limiting precursor of carnosine, an intracellular physicochemical buffer in skeletal muscle, may counteract any detrimental effect of acidosis and benefit performance. The aim of this study was to investigate the effect of β-alanine as an ergogenic aid during high-intensity exercise performance in healthy males. Methods Five males ingested either β-alanine (BAl) (4.8 g/day for 4 weeks, then 6.4 g/day for 2 weeks) or placebo (Pl) (CaCO3) in a crossover design with a 6-week washout between. Following supplementation, participants performed two different intense exercise protocols over consecutive days. On the first day a repeated sprint ability (RSA) test of 5 × 6 s sprints, with 24 s rest periods, was performed. On the second day a cycling capacity test measuring time to exhaustion (TTE) was performed at 110% of the maximum workload achieved in a pre-supplementation maximal test (CCT110%). Non-invasive quantification of carnosine in the soleus and gastrocnemius, prior to and following each supplementation, was performed with magnetic resonance spectroscopy. Time to fatigue (CCT110%), peak and mean power (RSA), blood pH, and plasma lactate were measured. Results Muscle carnosine concentration was not different prior to β-alanine supplementation and, with 6 weeks of supplementation, increased by 18% in the soleus and 26% in the gastrocnemius. There was no difference in the measured performance variables during the RSA test (peak and average power output). TTE during the CCT110% was significantly enhanced following the ingestion of BAl (155 s ± 19.03) compared to Pl (134 s ± 26.16). No changes were observed in blood pH during either exercise protocol or during recovery from exercise. Plasma lactate in the BAl condition was significantly higher than Pl only from the 15th minute following exercise during the CCT110%. [Figure 1: Changes in carnosine concentration in the gastrocnemius prior to and after 6 weeks of chronic supplementation with placebo and β-alanine. Values expressed as mean; * p<0.05 vs Pl at 6 weeks; # p<0.05 vs pre-supplementation.] Conclusion/Discussion Greater muscle carnosine content following 6 weeks of β-alanine supplementation enhanced the potential for intracellular buffering capacity. However, this translated into enhanced performance only during the CCT110% high-intensity cycling protocol, with no change observed during the RSA test. The absence of differences in post-exercise and recovery plasma lactate and blood pH indicates that 6 weeks of β-alanine supplementation has no effect on anaerobic metabolism during multiple-bout high-intensity exercise. The changes in plasma lactate during recovery suggest, however, that β-alanine supplementation may affect anaerobic metabolism during single-bout high-intensity exercise.
Abstract:
Intense exercise-induced acidosis occurs after the accumulation of hydrogen ions as by-products of anaerobic metabolism. Oral ingestion of β-alanine, a limiting precursor of carnosine, an intracellular physicochemical buffer in skeletal muscle, may counteract detrimental effects of acidosis and benefit performance. This study aimed to investigate the effect of β-alanine as an ergogenic aid during high-intensity exercise performance. Five healthy males ingested either β-alanine (BAl) or placebo (Pl) (CaCO3) in a crossover design with a 6-week washout between. Participants performed two different intense exercise protocols over consecutive days. On the first day a repeated sprint ability (RSA) test was performed. On the second day a cycling capacity test measuring time to exhaustion (TTE) was performed at 110% of the maximum workload achieved in a pre-supplementation maximal test (CCT110%). Non-invasive quantification of carnosine in the soleus and gastrocnemius muscles, prior to and following each supplementation, was performed with in vivo magnetic resonance spectroscopy. Time to fatigue (CCT110%), peak and mean power (RSA), blood pH, and plasma lactate were measured. Muscle carnosine concentration was not different prior to β-alanine supplementation and, after supplementation, increased by 18% in the soleus and 26% in the gastrocnemius. There was no difference in the measured performance variables during the RSA test (peak and average power output). TTE during the CCT110% was significantly enhanced following the ingestion of BAl (155 s ± 19.03) compared to Pl (134 s ± 26.16). No changes were observed in blood pH during either exercise protocol or during recovery from exercise. Plasma lactate after BAl was significantly higher than Pl only from the 15th minute following exercise during the CCT110%. Greater muscle carnosine content following 6 weeks of β-alanine supplementation enhanced the potential for intracellular buffering capacity. This translated into enhanced performance during the CCT110% high-intensity cycling protocol but not during the RSA test. The lack of change in plasma lactate or blood pH indicates that 6 weeks of β-alanine supplementation has no effect on anaerobic metabolism during multiple-bout high-intensity exercise. The changes measured in plasma lactate during recovery support the hypothesis that β-alanine supplementation may affect anaerobic metabolism, particularly during single-bout high-intensity exercise.
Abstract:
Purpose To establish whether a passive or an active technique of planning target volume (PTV) definition and treatment for non-small cell lung cancer (NSCLC) delivers the most effective results. This literature review assesses the advantages and disadvantages reported in recent studies of each, while assessing the validity of the two approaches for planning and treatment. Methods A systematic review of literature focusing on the planning and treatment of radiation therapy for NSCLC tumours. Different approaches published in recent articles are subjected to critical appraisal in order to determine their relative efficacy. Results Free breathing (FB) is the optimal method for performing planning scans for patients and departments, as it involves no significant increase in cost, workload or education. Maximum intensity projection (MIP) is the fastest form of delineation; however, it is noted to be less accurate than the ten-phase overlap approach for computed tomography (CT). Although gating has been shown to reduce margins and facilitate sparing of organs at risk, treatment times can be longer and planning time can be as much as 15 times higher for intensity-modulated radiation therapy (IMRT). This raises issues with patient comfort and stabilisation, increasing the chance of geometric miss. Stereotactic treatments can take up to 3 hours to deliver, along with increased planning and treatment time and the additional hardware, software and training required. Conclusion Four-dimensional computed tomography (4DCT) is superior to 3DCT, with the passive FB approach optimal for PTV delineation and treatment. Departments should use a combination of MIP with visual confirmation to ensure coverage for stage 1 disease. Stages 2-3 should be delineated using ten phases overlaid. Stereotactic and gated treatments for early-stage disease should be used accordingly; FB-IMRT is optimal for later-stage disease.