849 results for Power tool industry


Relevance: 20.00%

Publisher:

Abstract:

The contribution of risky behaviour to the increased crash and fatality rates of young novice drivers is recognised in the road safety literature around the world. Exploring such risky driver behaviour has led to the development of tools like the Driver Behaviour Questionnaire (DBQ) to examine driving violations, errors, and lapses [1]. Whilst the DBQ has been utilised in young novice driver research, some items within this tool seem specifically designed for older, more experienced drivers, whilst others appear to assess both behaviour and related motives. The current study was prompted by the need for a risky behaviour measurement tool that can be utilised with young drivers holding a provisional driving licence. Sixty-three items exploring young driver risky behaviour, developed from the road safety literature, were incorporated into an online survey. These items assessed driver, passenger, journey, car, and crash-related issues. A sample of 476 drivers aged 17-25 years (M = 19, SD = 1.59 years) with a provisional driving licence, matched for age, gender, and education, was drawn from a state-wide sample of 761 young drivers who completed the survey. Factor analysis based upon a principal components extraction, followed by an oblique rotation, was used to investigate the underlying dimensions of young novice driver risky behaviour. A five-factor solution comprising 44 items was identified, accounting for 55% of the variance in young driver risky behaviour. Factor 1 accounted for 32.5% of the variance and appeared to measure driving violations that were transient in nature: risky behaviours following risky decisions made during the journey (e.g., speeding). Factor 2 accounted for 10.0% of the variance and appeared to measure driving violations that were fixed in nature, with the risky decisions being made before the journey (e.g., drink driving). Factor 3 accounted for 5.4% of the variance and appeared to measure misjudgement (e.g., misjudging the speed of an oncoming vehicle). Factor 4 accounted for 4.3% of the variance and appeared to measure risky driving exposure (e.g., driving at night with friends as passengers). Factor 5 accounted for 2.8% of the variance and appeared to measure driver emotions or mood (e.g., anger). Given that the aim of the study was to create a research tool, the factors informed the development of five subscales and one composite scale. The composite scale had very high internal consistency (Cronbach's alpha = .947). Self-reported data on police-detected driving offences, crash involvement, and intentions to break road rules within the next year were also collected. While the composite scale was only weakly correlated with self-reported crashes (r = .16, p < .001), it was moderately correlated with offences (r = .26, p < .001) and highly correlated with intentions to break road rules (r = .57, p < .001). Further application of the developed scale is needed to confirm the factor structure in other samples of young drivers, both in Australia and in other countries. In addition, future research could explore the applicability of the scale for investigating the behaviour of other types of drivers.
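
The following is a minimal sketch, not the authors' code, of the analysis pipeline the abstract describes: principal-components extraction with an oblique (oblimin) rotation, followed by Cronbach's alpha for a composite scale. The data, item names, and the use of the factor_analyzer package are assumptions for illustration only.

```python
# Hedged sketch of PCA-based factor extraction with oblique rotation and
# Cronbach's alpha; data and column names are hypothetical.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor-analyzer

rng = np.random.default_rng(0)
# Hypothetical responses: 476 drivers x 63 Likert-type risky-behaviour items.
items = pd.DataFrame(rng.integers(1, 6, size=(476, 63)),
                     columns=[f"item_{i + 1}" for i in range(63)])

# Principal-components extraction followed by an oblique rotation.
fa = FactorAnalyzer(n_factors=5, method="principal", rotation="oblimin")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
proportion_variance = fa.get_factor_variance()[1]  # variance explained per factor


def cronbach_alpha(scale: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = scale.shape[1]
    item_variances = scale.var(axis=0, ddof=1)
    total_variance = scale.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)


print(proportion_variance)
print(f"composite alpha: {cronbach_alpha(items):.3f}")
```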

Relevance: 20.00%

Publisher:

Abstract:

Obese children move less and with greater difficulty than their normal-weight counterparts but expend comparable energy. Increased metabolic costs have been attributed to poor biomechanics, but few studies have investigated the influence of obesity on the mechanical demands of gait. This study sought to assess three-dimensional lower extremity joint powers at two walking cadences in 28 obese and normal-weight children. 3D motion analysis was conducted for five trials of barefoot walking at self-selected and 30% greater than self-selected cadences. Mechanical power was calculated at the hip, knee, and ankle in the sagittal, frontal, and transverse planes. Significant group differences were seen for all power phases in the sagittal plane, for hip and knee power at weight acceptance and hip power at propulsion in the frontal plane, and for knee power during mid-stance in the transverse plane. After adjusting for body weight, group differences existed in hip and knee power phases at weight acceptance in the sagittal and frontal planes, respectively. Differences between cadences existed for all hip joint powers in the sagittal plane and for frontal plane hip power at propulsion. Frontal plane knee power at weight acceptance and sagittal plane knee power at propulsion also differed significantly between cadences. Larger joint powers in obese children contribute to difficulty performing locomotor tasks, potentially decreasing motivation to exercise.
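
The joint powers referred to above are conventionally derived as the product of the net joint moment and the joint angular velocity, per plane and summed across planes. The sketch below shows that calculation with hypothetical knee data, an assumed sampling rate, and body-mass normalisation as one common way of adjusting for body weight; it is illustrative only and not the study's processing pipeline.

```python
# Hedged sketch of 3D joint power calculation; signals, sampling rate,
# and body mass are hypothetical.
import numpy as np

fs = 200.0                        # sampling rate in Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)

# Hypothetical net knee moment (N*m) and knee angle (rad);
# columns are the sagittal, frontal, and transverse planes.
moment = np.column_stack([40.0 * np.sin(2 * np.pi * t),
                          10.0 * np.sin(2 * np.pi * t + 0.5),
                          5.0 * np.sin(2 * np.pi * t + 1.0)])
angle = np.column_stack([0.60 * np.sin(2 * np.pi * t + 0.2),
                         0.10 * np.sin(2 * np.pi * t),
                         0.05 * np.sin(2 * np.pi * t)])

angular_velocity = np.gradient(angle, 1.0 / fs, axis=0)  # rad/s
plane_power = moment * angular_velocity                  # W, per plane
total_power = plane_power.sum(axis=1)                    # W, 3D joint power

# Dividing by body mass (W/kg) is one common way of adjusting for body weight.
body_mass_kg = 55.0                                      # hypothetical
normalised_power = total_power / body_mass_kg
print(normalised_power.max())
```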

Relevance: 20.00%

Publisher:

Abstract:

In this chapter, we are particularly concerned with making visible the general principles underlying the transmission of Social Studies curriculum knowledge, and with considering it in light of a high-stakes mandated national assessment task. Specifically, we draw on Bernstein's theoretical concept of pedagogic models as a tool for analysing orientations to teaching and learning. We introduce a case in point from the Australian context: one state Social Studies curriculum vis-à-vis one part of the Year Three national assessment measure for reading. We use our findings to consider the implications for the disciplinary knowledge of Social Studies in the communities in which we are undertaking our respective Australian Research Council Linkage project work (Glasswell et al.; Woods et al.). We propose that Social Studies disciplinary knowledge is being constituted, in part, through power struggles between the different agencies responsible for the production and relay of official forms of state curriculum and national literacy assessment. This is particularly the case when assessment instruments are used to compare and contrast school results in highly visible web-based league tables (see, for example, http://myschoolaustralia.ning.com/).

Relevance: 20.00%

Publisher:

Abstract:

Anecdotal evidence from the infrastructure and building sectors highlights issues of drugs and alcohol and their association with safety risk on construction sites. Operating machinery and mobile equipment, proximity to live traffic, congested sites, electrical equipment, and working at heights combine to accentuate the potential adverse impact of drugs and alcohol in the workplace. While most Australian jurisdictions have identified this as a critical safety issue, information is limited regarding the prevalence of alcohol and other drugs in the workplace, and there is little evidence-based guidance on how to address the issue effectively and efficiently. No known study has scientifically evaluated the relationship between the use of drugs and alcohol and safety impacts in construction, and there has been only limited adoption of nationally coordinated strategies, supported by employers and employees, to render it socially unacceptable to arrive at a construction workplace with judgement impaired by drugs and alcohol. A nationally consistent, collaborative approach across the construction workforce, involving employers and employees, clients, unions, contractors, and sub-contractors, is required to engender a cultural change in the construction workforce, in a similar manner to the ongoing initiative to secure a cultural change in attitudes to drink-driving in our society, where peer intervention and support are encouraged. This study has four key objectives. Firstly, using the standard World Health Organisation AUDIT, a national qualitative and quantitative assessment of the use of drugs and alcohol will be carried out; this will build upon similar studies carried out in the Australian energy and mining sectors. Secondly, an appropriate industry policy, taking a non-punitive and rehabilitative approach, will be developed in consultation with employers and employees across the infrastructure and building sectors, with the aim that it be adopted nationally at the construction workplace. Thirdly, an industry-specific cultural change management program will be developed through a nationally collaborative approach to reducing the risk of impaired performance on construction sites and increasing workers' commitment to drug and alcohol safety. Finally, an implementation plan will be developed from data gathered from both managers and construction employees. Such an approach stands to benefit not only occupational health and safety, through a greater understanding of the safety impacts of alcohol and other drugs at work, but also the wider community health issue of alcohol and drug use. This paper provides an overview of the background and significance of the study and outlines the proposed methodology that will be used to evaluate the safety impacts of alcohol and other drugs in the construction industry.
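
For context on the AUDIT instrument mentioned in the first objective: it is a ten-item screen with each item scored 0 to 4, giving totals of 0 to 40. The sketch below tallies a set of item scores and maps the total to commonly cited WHO risk bands; the cut-offs, sample scores, and helper names are illustrative assumptions and not part of the proposed study.

```python
# Hedged, illustrative sketch of AUDIT-style scoring; not study code.
from typing import Sequence


def audit_total(item_scores: Sequence[int]) -> int:
    """Sum the ten AUDIT item scores (each expected to be in the range 0-4)."""
    if len(item_scores) != 10 or any(not 0 <= s <= 4 for s in item_scores):
        raise ValueError("AUDIT expects ten items, each scored 0-4")
    return sum(item_scores)


def audit_band(total: int) -> str:
    """Map a total score to a commonly used risk band (illustrative cut-offs)."""
    if total <= 7:
        return "lower risk"
    if total <= 15:
        return "hazardous"
    if total <= 19:
        return "harmful"
    return "possible dependence"


scores = [3, 2, 1, 0, 0, 1, 0, 2, 0, 0]   # hypothetical responses
print(audit_band(audit_total(scores)))     # -> "hazardous"
```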

Relevance: 20.00%

Publisher:

Abstract:

The reliability of critical infrastructure is considered a fundamental expectation of modern societies. These large-scale socio-technical systems have always, due to their complex nature, faced threats challenging their ongoing functioning. However, increasing uncertainty, in addition to the trend of infrastructure fragmentation, has made reliable service provision not only a key organisational goal but a major continuity challenge, especially given the highly interdependent network conditions that exist both regionally and globally. The notion of resilience as an adaptive capacity supporting infrastructure reliability under conditions of uncertainty and change has emerged as a critical capacity for systems of infrastructure and the organisations responsible for their reliable management. This study explores infrastructure reliability through the lens of resilience from an organisation and system perspective, using two recognised resilience-enhancing management practices, High Reliability Theory (HRT) and Business Continuity Management (BCM), to better understand how this phenomenon manifests within a partially fragmented (corporatised) critical infrastructure industry: the Queensland Electricity Industry. The methodological approach involved a single case study design (the industry) with embedded sub-units of analysis (organisations), utilising in-depth interviews and document analysis to elicit findings. Derived from a detailed assessment of BCM and reliability-enhancing characteristics, the findings suggest that the industry as a whole exhibits resilient functioning; however, this was found to manifest at different levels across the industry and in different combinations. Whilst there were distinct differences with respect to resilient capabilities at the organisational level, differences were less marked at the systems (industry) level, with many common understandings carried over from the pre-corporatised operating environment. These Heritage Factors were central to understanding the systems-level cohesion noted in the work. The findings of this study are intended to contribute to a body of knowledge encompassing resilience and high reliability in critical infrastructure industries. The research also has value from a practical perspective, as it suggests a range of opportunities to enhance resilient functioning under increasingly interdependent, networked conditions.

Relevance: 20.00%

Publisher:

Abstract:

Product placement is a fast-growing, multi-billion dollar industry, yet measures of its effectiveness, which influence the critical area of pricing, have been problematic. Past attempts to measure the effect of a placement, and therefore to provide a basis for pricing placements, have been confounded by the effect on consumers of multiple prior exposures to a brand name across all marketing communications. Virtual product placement offers certain advantages: it provides a tool to measure the effectiveness of product placements; it helps address the lack of audience selectivity in traditional product placement; it allows different audiences to be tested for brands; and it addresses a gap in the existing academic literature by focusing on the impact of product placement on recall and recognition of new brands.

Relevance: 20.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a non-optimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of the wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local dominant ridge directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
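
To make two of the steps above concrete (the wavelet packet decomposition and the generalized Gaussian model of subband coefficients), here is a minimal sketch; the synthetic image, wavelet choice, decomposition depth, and the use of PyWavelets and SciPy are assumptions rather than the thesis implementation.

```python
# Hedged sketch: 2-D wavelet packet decomposition plus a generalized Gaussian
# fit to one subband; placeholders throughout, not the thesis code.
import numpy as np
import pywt                       # pip install PyWavelets
from scipy.stats import gennorm   # generalized Gaussian (generalized normal)

rng = np.random.default_rng(0)
image = rng.normal(size=(256, 256))   # stand-in for a grey-level fingerprint

# Full wavelet packet tree to level 2; the thesis instead designs a specific
# fixed structure from criteria such as information packing and the 2-D PSD.
wp = pywt.WaveletPacket2D(data=image, wavelet="bior4.4",
                          mode="periodization", maxlevel=2)
subband = wp["hh"].data.ravel()       # one level-2 detail subband

# Fit a generalized Gaussian to the subband coefficients; gennorm's shape
# parameter beta plays the role of the model's shape parameter (beta = 2 is
# Gaussian, beta = 1 is Laplacian). Note this fit is maximum likelihood,
# whereas the thesis formulates a least-squares estimator.
beta, loc, scale = gennorm.fit(subband, floc=0.0)
print(f"estimated shape parameter: {beta:.2f}, scale: {scale:.4f}")
```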