840 results for constant work-rate
Abstract:
Construction and demolition (C&D) waste occupies the largest share of overall waste generation in many countries. However, waste management practices and outcomes differ between countries. For instance, in Australia, C&D waste recovery has improved continuously in recent years, but the amount of C&D waste generated increases every year, as there has been little improvement in waste avoidance and minimization. In contrast, in Germany, waste generation has remained constant over many years despite continuous economic growth, and the waste recycling rate is one of the highest in the world. However, most recycled waste comes from demolition work rather than from waste generated during new construction, and specific laws still need to be developed to further reduce the landfilling of non-recycled waste. Despite these differences, C&D waste generation and recovery in both countries depend on the effectiveness of the statutory framework that regulates waste management practices. This is an issue in other parts of the world as well; therefore, countries can learn from each other to improve their current statutory frameworks for C&D waste management. Taking Germany and Australia as examples, this paper identifies possible measures to improve current C&D waste management practices through better statutory tools. After providing an overview of the statutory frameworks of both countries and their status in waste generation and recovery, a SWOT analysis is conducted to identify the strengths, weaknesses, opportunities and threats of the statutory tools. Recommendations to improve the current statutory frameworks, in order to achieve less waste generation and more waste recovery in the construction industry, are provided for the German and Australian governments; these can also be transferred to other countries.
Abstract:
Background: Quality of work life (QWL) is defined as the extent to which an employee is satisfied with personal and working needs through participating in the workplace while achieving the organisation's goals. QWL has been found to influence the commitment and productivity of employees in healthcare organisations, as well as in other industries. However, reliable information on the QWL of primary health care (PHC) nurses is limited. The purpose of this study was to assess the QWL among PHC nurses in the Jazan region, Saudi Arabia. Methods: A descriptive research design, namely a cross-sectional survey, was used in this study. Data were collected using Brooks' survey of quality of nursing work life (QNWL) and demographic questions. A convenience sample was recruited from 143 PHC centres in the Jazan region, located in the southern part of Saudi Arabia. A response rate of 91% (N = 532/585) was achieved (effective response rate = 87%, n = 508). Data analysis consisted of descriptive statistics, t-tests and one-way analysis of variance. Total scores and sub-scores for QWL items and item summary statistics were computed and reported using SPSS version 17 for Windows. Results: Findings suggested that the respondents were dissatisfied with their work life. The major influencing factors were unsuitable working hours/shifts, lack of facilities for nurses, inability to balance work with family needs, inadequacy of family-leave time, poor staffing, management and supervision practices, lack of professional development opportunities, and an inappropriate working environment in terms of the level of security, patient care supplies and equipment, and recreation facilities (break areas). Other essential factors included the community's view of nursing and inadequate salary. More positively, the majority of nurses were satisfied with their co-workers, satisfied to be nurses and had a sense of belonging in their workplaces. Significant differences were found according to gender, age, marital status, dependent children, dependent adults, nationality, ethnicity, nursing tenure, organisational tenure, positional tenure, and monthly pay. No significant differences were found according to education level or PHC location. Conclusions: These findings can be used by PHC managers and policy makers to develop and appropriately implement successful plans to improve QWL. This will help to enhance the home and work environments, improve individual and organisational performance and increase nurses' commitment.
Abstract:
Purpose. To determine whether Australia's Walk to Work Day media campaign resulted in behavioural change among targeted groups. Methods. Pre- and post-campaign telephone surveys of a cohort of adults aged 18 to 65 years (n = 1100, 55% response rate) were randomly sampled from major Australian metropolitan areas. Tests for dependent samples were applied (McNemar's χ2 or paired t-test). Results. Among participants who did not usually actively commute to work, there was a significant decrease in car-only use and an increase in walking combined with public transport. Among those who were employed, there was a significant increase in total time spent walking (+16 min/wk; t[780] = 2.04, p < .05) and in other moderate physical activity (+120 min/wk; t[1087] = 4.76, p < .005), resulting in a significant decrease in the proportion who were inactive (χ2(1) = 6.1, p < .05). Conclusion. Although nonexperimental, the Walk to Work Day initiative elicited short-term changes in targeted behaviours among target groups. Reinforcement by integrating worksite health promotion strategies may be required for sustained effects.
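The abstract reports McNemar's test results without showing the computation; as a hedged illustration only, the sketch below applies the continuity-corrected McNemar test to a hypothetical pre/post table of paired commuting outcomes. The function name and the discordant counts are assumptions, not the study's data.

```python
import math

def mcnemar_test(b, c):
    """McNemar's chi-squared test (continuity-corrected) for paired binary
    outcomes; b and c are the two discordant cell counts, e.g. respondents
    who switched inactive -> active and active -> inactive between surveys.
    The df=1 chi-square p-value is obtained via the complementary error function."""
    stat = (abs(b - c) - 1) ** 2 / (b + c)
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical discordant counts (not the study's data):
stat, p = mcnemar_test(60, 35)
print(f"chi2(1) = {stat:.2f}, p = {p:.3f}")  # chi2(1) = 6.06, p = 0.014
```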
Abstract:
A frame-rate stereo vision system, based on non-parametric matching metrics, is described. Traditional metrics, such as normalized cross-correlation, are expensive in terms of logic. Non-parametric measures require only simple, parallelizable, functions such as comparators, counters and exclusive-or, and are thus very well suited to implementation in reprogrammable logic.
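The abstract does not name the specific non-parametric measure; the census transform with Hamming-distance matching (Zabih and Woodfill's metric) is one widely used example built entirely from comparators, exclusive-or and counters, so a minimal software sketch of it is given below for illustration. The function names are assumptions, not the system's actual interface.

```python
import numpy as np

def census_transform(img, window=3):
    """Census transform: encode each pixel as a bit string recording which
    neighbours in the window are darker than the centre pixel."""
    r = window // 2
    out = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return out  # border pixels wrap around here; a real system would pad instead

def census_cost(left, right, disparity):
    """Matching cost at one disparity: Hamming distance between census codes,
    i.e. XOR followed by a population count -- only comparators, XOR and
    counters, which is why the metric maps well onto reprogrammable logic."""
    x = left ^ np.roll(right, disparity, axis=1)
    cost = np.zeros(x.shape, dtype=np.uint8)
    for _ in range(64):  # popcount, one bit at a time
        cost += (x & np.uint64(1)).astype(np.uint8)
        x >>= np.uint64(1)
    return cost
```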
Abstract:
Visual abnormalities, both at the sensory input and higher interpretive levels, have been associated with many of the symptoms of schizophrenia. Individuals with schizophrenia typically experience distortions of sensory perception, resulting in perceptual hallucinations and delusions that are related to the observed visual deficits. Disorganised speech, thinking and behaviour are commonly experienced by sufferers of the disorder, and have also been attributed to perceptual disturbances associated with anomalies in visual processing. Compounding these issues are marked deficits in cognitive functioning, observed in approximately 80% of those with schizophrenia. Cognitive impairments associated with schizophrenia include difficulty with concentration and memory (working, visual and verbal), an impaired ability to process complex information, impaired response inhibition, and deficits in processing speed and in visual and verbal learning. Deficits in sustained attention or vigilance, and poor executive functioning such as poor reasoning, problem solving and social cognition, are all influenced by impaired visual processing. These symptoms impact on the internal perceptual world of those with schizophrenia and hamper their ability to navigate their external environment. Visual processing abnormalities in schizophrenia are likely to worsen personal, social and occupational functioning. Binocular rivalry provides a unique opportunity to investigate the processes involved in visual awareness and visual perception. Binocular rivalry is the alternation of perceptual images that occurs when conflicting visual stimuli are presented to each eye in the same retinal location: the observer perceives the opposing images in an alternating fashion, despite the sensory input to each eye remaining constant. Binocular rivalry tasks have been developed to investigate specific parts of the visual system. The research presented in this Thesis provides an exploratory investigation into binocular rivalry in schizophrenia, using the method of Pettigrew and Miller (1998) and comparing individuals with schizophrenia to healthy controls. This method allows manipulation of the spatial and temporal frequency, luminance contrast and chromaticity of the visual stimuli. Manipulations of the rival stimuli affect the rate of binocular rivalry alternations and the time spent perceiving each image (dominance duration). Binocular rivalry rates and dominance durations therefore provide useful measures for investigating aspects of visual neural processing that lead to the perceptual disturbances and cognitive dysfunction attributed to schizophrenia. However, despite this promise, the binocular rivalry phenomenon has not been extensively explored in schizophrenia to date. Following a review of the literature, the research in this Thesis examined individual variation in binocular rivalry. The initial study (Chapter 2) explored the effect of systematically altering the properties of the stimuli (i.e. spatial and temporal frequency, luminance contrast and chromaticity) on binocular rivalry rates and dominance durations in healthy individuals (n=20). The findings showed that altering the temporal frequency and luminance contrast of the stimuli significantly affected rivalry rate. This is significant because the processing of temporal frequency and luminance contrast has consistently been demonstrated to be abnormal in schizophrenia. The current research then explored binocular rivalry in schizophrenia.
The primary research question was: "Are binocular rivalry rates and dominance durations recorded in participants with schizophrenia different to those of controls?" In this second study, binocular rivalry data collected using low- and high-strength binocular rivalry stimuli were compared with alternations recorded during a monocular rivalry task, the Necker cube task, to replicate and advance the work of Miller et al. (2003). Participants with schizophrenia (n=20) recorded fewer alternations (i.e. slower alternation rates) than control participants (n=20) on both binocular rivalry tasks; however, no difference was observed between the groups on the Necker cube task. The magnocellular and parvocellular visual pathways, thought to be abnormal in schizophrenia, were also investigated in binocular rivalry. The binocular rivalry stimuli used in this third study (Chapter 4) were altered to bias the task towards one of these two pathways. Participants with schizophrenia recorded slower binocular rivalry rates than controls in both binocular rivalry tasks. Using a within-subject design, binocular rivalry data were compared with data collected from a backward-masking task widely accepted to bias both these pathways. Based on these data, a model of binocular rivalry was developed, based on the magnocellular and parvocellular pathways that contribute to the dorsal and ventral visual streams. Binocular rivalry rates were also compared with performance on Benton's Judgment of Line Orientation task in individuals with schizophrenia and healthy controls (Chapter 5). Benton's Judgment of Line Orientation task is widely accepted to be processed within the right cerebral hemisphere, making it an appropriate task for investigating the role of the cerebral hemispheres in binocular rivalry, and for testing the inter-hemispheric switching hypothesis of binocular rivalry proposed by Pettigrew and Miller (1998, 2003). The data were suggestive of intra-hemispheric rather than inter-hemispheric visual processing in binocular rivalry. Neurotransmitter involvement in binocular rivalry, backward masking and Judgment of Line Orientation in schizophrenia was investigated using a genetic indicator of dopamine receptor distribution and functioning: the presence of the Taq1 allele of the dopamine D2 receptor (DRD2) gene. This final study (Chapter 6) explored whether the presence of the Taq1 allele, and thus, by inference, the distribution of dopamine receptors and dopamine function, accounted for the large individual variation in binocular rivalry. The presence of the Taq1 allele was associated with the slower binocular rivalry rates and poorer performance on the backward-masking and Judgment of Line Orientation tasks seen in the group with schizophrenia. This Thesis has contributed to what is known about binocular rivalry in schizophrenia. Consistently slower binocular rivalry rates were observed in participants with schizophrenia, indicating abnormally slow visual processing in this group. These data support previous studies reporting visual processing abnormalities in schizophrenia and suggest that a slow binocular rivalry rate is not a feature specific to bipolar disorder, but may be a feature of disorders with psychotic features generally. The contributions of the magnocellular or dorsal pathways and the parvocellular or ventral pathways to binocular rivalry, and therefore to perceptual awareness, were also investigated.
The data presented supported the view that the magnocellular system initiates perceptual awareness of an image and the parvocellular system maintains the perception of the image, making it available to higher-level processing occurring within the cortical hemispheres. Abnormal magnocellular and parvocellular processing may both contribute to the perceptual disturbances that ultimately underlie the cognitive dysfunction associated with schizophrenia. An alternative model of binocular rivalry based on these observations was proposed.
Abstract:
Aims: This paper describes the development of a risk adjustment (RA) model predictive of individual lesion treatment failure in percutaneous coronary interventions (PCI) for use in a quality monitoring and improvement program. Methods and results: Prospectively collected data for 3972 consecutive revascularisation procedures (5601 lesions) performed between January 2003 and September 2011 were studied. Data on procedures to September 2009 (n = 3100) were used to identify factors predictive of lesion treatment failure. Factors identified included lesion risk class (p < 0.001), occlusion type (p < 0.001), patient age (p = 0.001), vessel system (p < 0.04), vessel diameter (p < 0.001), unstable angina (p = 0.003) and presence of major cardiac risk factors (p = 0.01). A Bayesian RA model was built using these factors, with the predictive performance of the model tested on the remaining procedures (area under the receiver operating characteristic curve: 0.765, Hosmer–Lemeshow p value: 0.11). Cumulative sum (CUSUM), exponentially weighted moving average (EWMA) and funnel plots were constructed using the RA model and subjectively evaluated. Conclusion: An RA model was developed and applied to statistical process control (SPC) monitoring for lesion failure in a PCI database. If linked to appropriate quality improvement governance response protocols, SPC using this RA tool might improve quality control and risk management by identifying variation in performance based on a comparison of observed and expected outcomes.
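The abstract names CUSUM and EWMA charts without giving their form; as a hedged sketch only, the code below implements one common risk-adjusted CUSUM (a Steiner-style log-likelihood-ratio chart) that accumulates evidence that the odds of lesion failure have shifted relative to the RA model's predictions. The odds ratio, control limit and function name are illustrative assumptions, not values from the paper.

```python
import math

def risk_adjusted_cusum(outcomes, predicted_risks, odds_ratio=2.0, limit=4.5):
    """Risk-adjusted CUSUM: log-likelihood-ratio chart for binary outcomes.

    outcomes        : 1 if the lesion treatment failed, else 0
    predicted_risks : failure probability for each lesion from the RA model
    odds_ratio      : alternative hypothesis -- odds of failure shifted by this factor
    limit           : control limit; crossing it signals a performance shift
    """
    s, path, signals = 0.0, [], []
    for y, p in zip(outcomes, predicted_risks):
        denom = 1 - p + odds_ratio * p
        # log-likelihood ratio weight for this case (failure vs success)
        w = math.log(odds_ratio / denom) if y == 1 else math.log(1 / denom)
        s = max(0.0, s + w)  # chart is reset at zero, as in a standard CUSUM
        path.append(s)
        signals.append(s > limit)
    return path, signals
```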
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control ambiguity resolution quality. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk: in practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and user requirements. Missed detection of incorrect integers leads to a hazardous result and should be strictly controlled; in ambiguity resolution, the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the table. The method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis-testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
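For readers unfamiliar with the mechanics, the sketch below shows the conventional ratio test that the paper builds on: the best integer candidate is accepted only if the second-best candidate fits the float solution sufficiently worse. In the fixed failure rate approach the threshold would come from the precomputed criteria table rather than being hard-coded; the function name and arguments here are assumptions for illustration.

```python
import numpy as np

def ratio_test(a_float, candidates, Q_a_inv, threshold):
    """Conventional GNSS ambiguity ratio test.

    a_float    : float ambiguity estimate (n-vector)
    candidates : integer candidate vectors (e.g. from an integer search)
    Q_a_inv    : inverse of the float ambiguity covariance matrix
    threshold  : acceptance criterion (in the fixed failure rate approach,
                 looked up from the precomputed criteria table)
    """
    # squared distance of each candidate to the float solution,
    # measured in the metric of the ambiguity covariance
    d = np.array([(a_float - z) @ Q_a_inv @ (a_float - z) for z in candidates])
    order = np.argsort(d)
    best, second = d[order[0]], d[order[1]]
    accept = (second / best) >= threshold  # fix only when clearly discriminated
    return accept, candidates[order[0]]
```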
Abstract:
Three native freshwater crayfish Cherax species are farmed in Australia, namely redclaw (Cherax quadricarinatus), marron (C. tenuimanus) and yabby (C. destructor). A lack of appropriate data on the specific nutrient requirements of each species, however, has constrained the development of species-specific formulated diets; hence the current use of over-formulated feeds or expensive marine shrimp feeds limits profitability. A number of studies have investigated nutritional requirements in redclaw, focusing on replacing expensive fish meal in formulated feeds with non-protein, less expensive substitutes, including plant-based ingredients. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests a potential ability to utilize complex carbohydrates such as cellulose as nutrient sources in their diet. To date, studies have been limited to C. quadricarinatus and C. destructor, and no studies have compared the relative ability of each species to utilize soluble cellulose in their diets. Individual feeding trials of late juveniles of each species were conducted separately in an automated recirculating culture system over 12-week cycles. Animals were fed either a test diet (TD) containing 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant, optimum levels for each species. Animals were fed at 3% of their body weight twice daily, and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested and measured, and midgut gland extracts were assayed for alpha-amylase, total protease and cellulase activity levels. After the trial period, redclaw fed the RD showed a significantly higher (p<0.05) specific growth rate (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different (p>0.05). Cellulase expression levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activities in all three species were significantly higher in animals fed the RD (Table 1). These results indicate that all three species can utilize starch better than dietary soluble cellulose, and that inclusion of 20% soluble cellulose in diets does not appear to have any significant negative effect on growth rate; survival, however, was impacted in C. quadricarinatus but not in C. tenuimanus or C. destructor.
Abstract:
In most developing countries, the overall quality of labourers' livelihoods, workplace environments and the implementation of labour rights do not progress at the same rate as industrial development. To address this situation, the ILO has initiated the concept of 'decent work' to assist regulators in articulating labour-related social policy goals. Against this backdrop, this article assesses the Bangladesh Labour Law 2006 by reference to the four social principles developed by the ILO for ensuring 'decent work'. It explains the impact of the absence of these principles from this Law on labour administration in the ready-made garment and ship-breaking industries. It finds that an appropriate legislative framework needs to be based on the principles of 'decent work' to establish a solid platform for sound labour regulation in Bangladesh.
Abstract:
The increasing prevalence of obesity in society has been associated with a number of atherogenic risk factors, such as insulin resistance. Aerobic training is often recommended as a strategy to induce weight loss, with high-intensity training having a greater impact on cardiovascular function and insulin sensitivity, and moderate-intensity training a greater impact on fat oxidation. Anaerobic high-intensity (supramaximal) interval training has been advocated to improve cardiovascular function, insulin sensitivity and fat oxidation. However, obese individuals tend to have a lower tolerance of high-intensity exercise due to discomfort. Furthermore, some obese individuals may compensate for the increased energy expenditure by eating more and/or becoming less active. Recently, both moderate- and high-intensity aerobic interval training have been advocated as alternative approaches. However, it is still uncertain which approach is more effective in increasing fat oxidation, given the issues with levels of fitness and motivation, and compensatory behaviours. Accordingly, the objective of this thesis was to compare the influence of moderate- and high-intensity interval training (MIIT and HIIT) on fat oxidation and eating behaviour in overweight/obese men. Two exercise interventions were undertaken by 10-12 overweight/obese men to compare their responses on the study variables, including fat oxidation and eating behaviour, during MIIT and HIIT. The acute training intervention was a methodological study designed to examine the validity of using exercise intensity from a graded exercise test (GXT), which measured the intensity eliciting maximal fat oxidation (FATmax), to prescribe interval training during 30-min MIIT. The 30-min MIIT session involved 5-min repetitions of workloads 20% below and 20% above FATmax. The acute intervention was extended to include HIIT in a cross-over design, to compare the influence of MIIT and HIIT on eating behaviour using subjective appetite sensations and food preference through a liking and wanting test. The HIIT consisted of 15-sec intervals at 85% VO2peak interspersed with 15-sec unloaded recovery, with total mechanical work equal to that of MIIT. The medium-term training intervention was a cross-over 4-week (12-session) MIIT and HIIT training program with a 6-week detraining washout period. The MIIT sessions consisted of 5-min cycling stages at ±20% of the mechanical work at 45% VO2peak, and the HIIT sessions consisted of repeated 30-sec work bouts at 90% VO2peak with 30-sec rest intervals, during identical exercise sessions of between 30 and 45 min. Assessments included a constant-load test (45% VO2peak for 45 min) followed by 60-min recovery, at baseline and at the end of the 4-week training, to determine the fat oxidation rate. Participants' responses to exercise were measured using blood lactate (BLa), heart rate (HR) and rating of perceived exertion (RPE), recorded during the constant-load test and in the first intervention training session of every week during training. Eating behaviour responses were assessed by measuring subjective appetite sensations, liking and wanting, and ad libitum energy intake. Results of the acute intervention showed that FATmax is a valid method to estimate VO2 and BLa, but not HR and RPE, in the MIIT session.
While the average rate of fat oxidation during 30-min MIIT was comparable with the rate of fat oxidation at FATmax (0.16 ±0.09 and 0.14 ±0.08 g/min, respectively), fat oxidation was significantly higher at minute 25 of MIIT (P≤0.01). In addition, there was no significant difference between MIIT and HIIT in appetite sensations after exercise, although there was a tendency towards lower hunger after HIIT. Different intensities of interval exercise also did not affect explicit liking or implicit wanting. Results of the medium-term intervention indicated that the interval training did not affect body composition, fasting insulin or fasting glucose. Maximal aerobic capacity increased significantly (P≤0.01) during the GXT (2.8% and 7.0% after MIIT and HIIT, respectively), and fat oxidation increased significantly (P≤0.01) during the acute constant-load exercise test (96% and 43% after MIIT and HIIT, respectively). RPE decreased significantly more after HIIT than after MIIT (P≤0.05), and the decrease in BLa during the constant-load test was greater after HIIT than after MIIT, although this difference did not reach statistical significance (P=0.09). In addition, following constant-load exercise, exercise-induced hunger and desire to eat decreased more after HIIT than after MIIT, but the differences were not significant (the p value for desire to eat was 0.07). Exercise-induced liking of high-fat sweet (HFSW) and high-fat non-sweet (HFNS) foods increased after MIIT and decreased after HIIT (the p value for HFNS was 0.09). The intervention explained 12.4% of the change in fat intake (p = 0.07). This research is significant in that it confirmed two points in the acute study: while the rate of fat oxidation increased during MIIT, the average rate of fat oxidation during 30-min MIIT was comparable with the rate at FATmax; and manipulating the intensity of acute interval exercise did not affect appetite sensations or liking and wanting. In the medium-term intervention, constant-load exercise-induced fat oxidation increased significantly after interval training, independent of exercise intensity. In addition, desire to eat, explicit liking for HFNS foods and fat intake collectively indicated that MIIT is accompanied by greater compensatory eating behaviour than HIIT. Findings from this research will assist in developing exercise strategies that provide obese men with various training options. Moreover, the finding that overweight/obese men reported lower RPE and decreased BLa after HIIT compared with MIIT is contrary to the view that obese individuals may not tolerate high-intensity interval training. Therefore, high-intensity interval training can be advocated for the obese adult male population. Future studies may extend this work by using a longer-term intervention.
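The abstract does not state how the fat oxidation rate was derived from the constant-load test; a common approach, given here purely as a hedged illustration, is indirect calorimetry using Frayn's (1983) stoichiometric equations on measured VO2 and VCO2, with protein oxidation neglected. The function name and sample values are assumptions, not study data.

```python
def substrate_oxidation(vo2, vco2):
    """Estimate whole-body substrate oxidation rates (g/min) from gas
    exchange using Frayn's (1983) equations, protein oxidation neglected.

    vo2, vco2 : oxygen uptake and carbon dioxide output in L/min (STPD)
    """
    fat = 1.67 * vo2 - 1.67 * vco2   # fat oxidation, g/min
    cho = 4.55 * vco2 - 3.21 * vo2   # carbohydrate oxidation, g/min
    return fat, cho

# Illustrative values for moderate-intensity cycling (not study data):
fat, cho = substrate_oxidation(vo2=2.0, vco2=1.85)
print(f"fat: {fat:.2f} g/min, CHO: {cho:.2f} g/min")
```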
Abstract:
In this paper we explore the relationship between monthly random breath testing (RBT) rates (per 1000 licensed drivers) and alcohol-related traffic crash (ARTC) rates over time, across two Australian states: Queensland and Western Australia. We analyse the RBT, ARTC and licensed driver rates across 12 years; however, due to administrative restrictions, we model ARTC rates against RBT rates for the period July 2004 to June 2009. The Queensland data reveal that the monthly ARTC rate is almost flat over the five-year period, with an average of 5.5 ARTCs per 100,000 licensed drivers observed across the study period. For the same period, the monthly rate of RBTs per 1000 licensed drivers is decreasing across the study, with the analysis revealing no significant variations in the data. The comparison between Western Australia and Queensland shows that Queensland's ARTC monthly percent change (MPC) is 0.014, compared with an MPC of 0.47 for Western Australia; while Queensland maintains a relatively flat ARTC rate, the ARTC rate in Western Australia is increasing. Our analysis reveals an inverse relationship between ARTC and RBT rates: for every 10% increase in the ratio of RBTs to licensed drivers, there is a 0.15 decrease in the rate of ARTCs per 100,000 licensed drivers. Moreover, in Western Australia, if the 2011 ratio of 1:2 (RBTs to annual number of licensed drivers) were to double to a ratio of 1:1, we estimate the number of monthly ARTCs would reduce by approximately 15. Based on these findings, we believe that as the number of RBTs conducted increases, the number of drivers willing to risk being detected for drink driving decreases, because the perceived risk of detection is greater. This in turn reduces the number of ARTCs. The results of this study provide an important evidence base for policy decisions on RBT operations.
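As a hedged illustration of the reported dose-response estimate (a 0.15 drop in ARTCs per 100,000 licensed drivers for every 10% increase in the RBT-to-driver ratio), the sketch below applies that coefficient to the Western Australia scenario described above. The function name and the driver count are illustrative assumptions, not values from the paper.

```python
def projected_artc_reduction(coef_per_10pct, pct_increase, licensed_drivers):
    """Apply the paper's estimated coefficient: each 10% increase in the
    RBT-to-driver ratio lowers ARTCs per 100,000 drivers by coef_per_10pct."""
    drop_per_100k = coef_per_10pct * (pct_increase / 10)
    return drop_per_100k * licensed_drivers / 100_000

# Doubling a 1:2 ratio to 1:1 is a 100% increase; the driver count below is
# an assumed round figure chosen only to show the arithmetic.
print(projected_artc_reduction(0.15, 100, 1_000_000))  # -> 15.0 fewer monthly ARTCs
```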
Abstract:
Beginning in the second half of the 20th century, ICTs transformed many societies from industrial societies in which manufacturing was the central focus, into knowledge societies in which dealing effectively with data and information has become a central element of work (Anderson, 2008). To meet the needs of the knowledge society, universities must reinvent their structures and processes, their curricula and pedagogic practices. In addition to this, of course higher education is itself subject to the sweeping influence of ICTs. But what might effective higher education look like in the 21st century? In designing higher education systems and learning experiences which are responsive to the learning needs of the future and exploit the possibilities offered by ICTs, we can learn much from the existing professional development strategies of people who are already successful in 21st century fields, such as digital media. In this study, I ask: (1) what are the learning challenges faced by digital media professionals in the 21st century? (2) what are the various roles of formal and informal education in their professional learning strategies at present? (3) how do they prefer to acquire needed capabilities? In-depth interviews were undertaken with successful Australian digital media professionals working in micro businesses and SMEs to answer these questions. The strongest thematic grouping that emerged from the interviews related to the need for continual learning and relearning because of the sheer rate of change in the digital media industries. Four dialectical relationships became apparent from the interviewees’ commentaries around the learning imperatives arising out of the immense and continual changes occurring in the digital content industries: (1) currency vs best practice (2) diversification vs specialisation of products and services (3) creative outputs vs commercial outcomes (4) more learning opportunities vs less opportunity to learn. These findings point to the importance of ‘learning how to learn’ as a 21st century capability. The interviewees were ambivalent about university courses as preparation for professional life in their fields. Higher education was described by several interviewees as having relatively little value-add beyond what one described as “really expensive credentialling services.” For all interviewees in this study, informal learning strategies were the preferred methods of acquiring the majority of knowledge and skills, both for ongoing and initial professional development. Informal learning has no ‘curriculum’ per se, and tends to be opportunistic, unstructured, pedagogically agile and far more self-directed than formal learning (Eraut, 2004). In an industry impacted by constant change, informal learning is clearly both essential and ubiquitous. Inspired by the professional development strategies of the digital media professionals in this study, I propose a 21st century model of the university as a broad, open learning ecology, which also includes industry, professionals, users, and university researchers. If created and managed appropriately, the university learning network becomes the conduit and knowledge integrator for the latest research and industry trends, which students and professionals alike can access as needed.
Abstract:
Heavy-vehicle driving involves a challenging work environment and a high crash rate. We investigated the associations of sleepiness, sleep disorders, and work environment (including truck characteristics) with the risk of crashing between 2008 and 2011 in the Australian states of New South Wales and Western Australia. We conducted a case-control study of 530 heavy-vehicle drivers who had recently crashed and 517 heavy-vehicle drivers who had not. Drivers' crash histories, truck details, driving schedules, payment rates, sleep patterns, and measures of health were collected. Subjects wore a nasal flow monitor for 1 night to assess for obstructive sleep apnea. Driving schedules that included the period between midnight and 5:59 am were associated with an increased likelihood of crashing (odds ratio = 3.42, 95% confidence interval: 2.04, 5.74), as were carrying an empty load (odds ratio = 2.61, 95% confidence interval: 1.72, 3.97) and being a less experienced driver (odds ratio = 3.25, 95% confidence interval: 2.37, 4.46). Not taking regular breaks and the lack of vehicle safety devices were also associated with increased crash risk. Despite the high prevalence of obstructive sleep apnea, it was not associated with the risk of a nonfatal, nonsevere heavy-vehicle crash. Scheduling driving to avoid the midnight-to-dawn period and taking more frequent rest breaks are likely to reduce the risk of nonfatal, nonsevere heavy-vehicle crashes two- to threefold.
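For readers who want to see how such case-control estimates are formed, the sketch below computes an odds ratio and its Wald 95% confidence interval from a 2×2 exposure table. The cell counts are made-up illustrations, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts: night-schedule exposure among cases vs controls.
print(odds_ratio_ci(120, 410, 45, 472))
```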
Abstract:
The public health and community nutrition workforce in Queensland has experienced significant change. These changes followed the election of a new government in 2012 and its response to the National Health and Hospitals Reform and budget constraints. This research documented and analysed the current roles and activities of the preventative health nutrition workforce. An online survey was conducted with all positions known to be working in nutrition prevention (n=320). The sample population was generated using existing databases, which were validated by comparison with workforce data from the Queensland Health Department and by sector consultation; snowballing was also used. 128 practitioners responded to the survey (response rate = 40%): those whose job title included the words "nutritionist" or "dietitian" (n=64) and those whose job title did not (n=61); three respondents did not supply a title. Ninety-four practitioners had a nutrition or dietetic qualification, indicating that a number of the workforce have shifted to more generalist positions. Between 2009 and 2013 there was a 90% reduction in the state-funded nutrition prevention workforce. The existing, reduced workforce is now dispersed across a range of organisations. Areas of workforce growth such as Medicare Locals tend to attract less experienced practitioners (50% had ≤5 years' experience). These changes present challenges for the coordination and communication of nutrition work and for equity and access in service delivery. This research highlights the need for adaptability in the public health and community nutrition workforce. These issues require consideration by the profession.
Abstract:
Objective: To identify predictors of initiating and maintaining active commuting (AC) to work following the 2003 Australia's Walk to Work Day (WTWD) campaign. Methods: Pre- and post-campaign telephone surveys of a cohort of working-age (18–65 years) adults (n = 1100, 55% response rate). Two dependent campaign outcomes were assessed: initiating or maintaining AC (i.e., walking/cycling and public transport) on a single day (WTWD), and increasing or maintaining a health-enhancing active commuting (HEAC) level (≥30 min/day) in a usual week following the WTWD campaign. Results: A significant population-level increase in HEAC (3.9%) was observed (McNemar's χ2 = 6.53, p = 0.01), with 136 (19.0%) achieving HEAC post-campaign. High confidence in incorporating walking into the commute, being active pre-campaign and younger age (<46 years) were positively associated with both outcomes. The perceived utility of AC for avoiding parking hassles (AOR = 2.1, 95% CI: 1.2–3.6), reducing expense (AOR = 1.8, 95% CI: 1.1–3.1), improving one's health (AOR = 2.5, 95% CI: 1.1–5.6) and contributing to clean air (AOR = 2.2, 95% CI: 1.0–4.4) predicted the HEAC outcome, whereas avoiding the stress of driving (AOR = 2.6, 95% CI: 1.4–5.0) and the hassle of parking predicted single-day AC. Conclusions: Transportation interventions targeting parking and costs could be further enhanced by emphasizing the health benefits of AC. AC was less likely to occur among inactive employees.