Abstract:
The successful management of workplace safety has many benefits for employees, employers and the community. As with other areas of job performance, safety performance can be enhanced through appropriate and well-designed training. The foundation of effective training development is a thorough training needs analysis (TNA). Currently, the application of psychometrically valid TNA practices to the management of workplace safety is under-researched, and limited guidance is available for implementing appropriate strategies. To address this gap in the literature, this chapter will provide an overview of TNA practices, including the purpose of, and benefits associated with, implementing the systematic procedure. A case study will then be presented to illustrate how the TNA process was successfully applied to investigate the training needs of Australasian rail incident investigators, culminating in an industry-approved national training package. Recommendations will be made to assist practitioners with implementing TNA practices, with the goal of enhancing workplace safety management through targeted workforce development.
Abstract:
The conventional approach to setting a milling unit is essentially based on the desire to achieve a particular bagasse moisture content or fibre fill in each nip of the mill. This approach relies on selecting the speed at which the mill will operate for the selected fibre rate. There is rarely any check that the selected speed or fibre fill is actually achieved, and the same set of assumptions is generally carried over for use in the following year. The conventional approach largely ignores the fact that the selection of mill settings actually determines the speed at which the mill will operate: making an adjustment intended to change the performance of the mill often also changes the speed of the mill as an unintended consequence. This paper presents an alternative approach to mill setting. The approach makes use of mill feeding theory to define the relationship between fibre rate, mill speed and mill settings, and uses that theory to provide an alternative means of determining the settings in some nips of the mill. Mill feeding theory shows that, as the feed work opening reduces, roll speed increases. The theory also shows that there is an optimal underfeed opening and Donnelly chute exit opening that will minimise roll speed, and that the current South African guidelines appear to be well away from those optimal values.
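To make the link between settings and speed concrete, a minimal continuity-style sketch (an illustrative simplification with assumed symbols, not the paper's full mill feeding theory) treats the fibre rate through a nip as the product of fibre fill and the escribed volume swept per unit time:

$$\dot{m}_f = \phi \, W \, L \, S \qquad\Rightarrow\qquad S = \frac{\dot{m}_f}{\phi \, W \, L},$$

where $\dot{m}_f$ is the fibre rate, $\phi$ the fibre fill (fibre mass per unit escribed volume), $W$ the work opening, $L$ the roll length and $S$ the roll surface speed. Under this simplification, holding the fibre rate and fibre fill fixed while reducing the work opening forces the roll speed up, consistent with the behaviour the paper attributes to mill feeding theory.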
Abstract:
Public acceptance is consistently identified as having an enormous impact on the implementation and success of a congestion charge scheme. This paper investigates public acceptance of such a scheme in Australia. Surveys were conducted in Brisbane and Melbourne, the two fastest growing Australian cities. Using an ordered logit modeling approach, the survey data, including stated preferences, were analyzed to pinpoint the important factors influencing people’s attitudes to a congestion charge and, in turn, their transport mode choices. To accommodate the panel nature of the data and account for the resulting heterogeneity, random effects were included in the models. As expected, this study found that the amount of the congestion charge and the financial benefits of implementing it have a significant influence on respondents’ support for the charge and on the likelihood of their taking a bus to city areas. However, respondents’ current primary transport mode for travelling to the city areas has a more pronounced impact. Respondents’ perceptions of the congestion charge’s role in protecting the environment by reducing vehicle emissions, and of the extent to which the charge would mean that they travelled less frequently to the city for shopping or entertainment, also have a significant impact on their level of support for its implementation. We also found and explained notable differences across the two cities. Finally, the findings are discussed in relation to the literature.
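For context, a generic random-effects ordered logit of the kind described can be written (a standard textbook form; the exact specification estimated in the paper may differ) as

$$y_{it}^{*} = \mathbf{x}_{it}'\boldsymbol{\beta} + u_i + \varepsilon_{it}, \qquad y_{it} = k \;\Longleftrightarrow\; \tau_{k-1} < y_{it}^{*} \le \tau_k,$$

where $y_{it}^{*}$ is the latent support for the charge of respondent $i$ on response occasion $t$, $\mathbf{x}_{it}$ collects the explanatory variables (charge amount, financial benefits, current transport mode, perceptions), $u_i$ is a respondent-specific random effect capturing panel heterogeneity, $\varepsilon_{it}$ follows a logistic distribution, and the $\tau_k$ are threshold parameters.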
Abstract:
Product Ecosystem theory is an emerging theory which shows that disruptive, “game changing” innovation is only possible when the entire ecosystem is considered. When environmental variables change faster than products or services can adapt, disruptive innovation is required to keep pace. This has many parallels with natural ecosystems, where species that cannot keep up with changes to the environment will struggle or become extinct. In this case the environment is the city, the environmental pressures are pollution and congestion, the product is the car, and the product ecosystem comprises roads, bridges, traffic lights, legislation, refuelling facilities and so on. Each one of these components is the responsibility of a different organisation, so any change that affects the whole ecosystem requires a transdisciplinary approach. As a simple example, cars that communicate wirelessly with traffic lights are only of value if wireless-enabled traffic lights exist, and vice versa. Cars that drive themselves are technically possible, but legislation in most places does not allow their use. According to innovation theory, incremental innovation tends to chase ever-diminishing returns and becomes increasingly unable to tackle the “big issues.” Eventually “game changing” disruptive innovation comes along and solves the “big issues” and/or provides new opportunities. Seen through this lens, the environmental pressures of urban traffic congestion and pollution are the “big issues.” It can be argued that the design of cars and the other components of the product ecosystem follow an incremental innovation approach, which is why the “big issues” remain unresolved. This paper explores the problems of pollution and congestion in urban environments from a Product Ecosystem perspective. From this, a strategy will be proposed for a transdisciplinary approach to develop and implement solutions.
Abstract:
Entertainment is a key cultural category. Yet the definition of entertainment can differ depending upon whom one asks. This article maps out understandings of entertainment in three key areas. Within industrial discourses, entertainment is defined by a commercial business model. Within evaluative discourses used by consumers and critics, it is understood through an aesthetic system that privileges emotional engagement, story, speed and vulgarity. Within academia, entertainment has not been a key organizing concept within the humanities, despite the fact that it is one of the central categories used by producers and consumers of culture. It has been important within psychology, where entertainment is understood in a solipsistic sense as being anything that an individual finds entertaining. Synthesizing these approaches, the authors propose a cross-sectoral definition of entertainment as ‘audience-centred commercial culture’.
Abstract:
With the introduction of the Personally Controlled Electronic Health Record (PCEHR), the Australian public is being asked to accept greater responsibility for its healthcare. Although the PCEHR is well designed, constructed and intentioned, policy and privacy concerns have resulted in an eHealth model that may impact future health information sharing requirements. Thus, the opportunity to transform the beleaguered Australian PCEHR into a sustainable, on-demand technology consumption model for patient safety must be explored further. Moreover, the current clerical focus of healthcare practitioners must be renegotiated to establish a shared knowledge-creation landscape of action for safer patient interventions. Achieving this potential, however, requires a platform that will facilitate efficient and trusted unification of all health information available in real time across the continuum of care. As a conceptual paper, the goal of the authors is to deliver insights into the antecedents of usage influencing superior patient outcomes within an eHealth-as-a-Service framework. To achieve this, the paper attempts to distil key concepts and identify common themes drawn from a preliminary literature review of eHealth and cloud computing concepts, specifically cloud service orchestration, to establish a conceptual framework and a research agenda. Initial findings support the authors’ view that an eHealth-as-a-Service (eHaaS) construct will serve as a disruptive paradigm shift in the aggregation and transformation of health information for use as real-world knowledge in patient care scenarios. Moreover, the strategic value of extending the community Health Record Bank (HRB) model lies in the ability to automatically draw on a multitude of relevant data repositories and sources to create a single source of practice-based evidence and to engage market forces to create financial sustainability.
Abstract:
This paper translates the concept of sustainable production into three dimensions of economic, environmental and ecological sustainability to analyze optimal production scales by solving optimization problems. Economic optimization seeks input-output combinations that maximize profits. Environmental optimization searches for input-output combinations that minimize the polluting effects of the materials balance on the surrounding environment. Ecological optimization looks for input-output combinations that minimize the cumulative destruction of the entire ecosystem. Using an aggregate input-output space, the framework illustrates that these optimal scales are often not identical because markets fail to account for all negative externalities. Profit-maximizing firms normally operate at scales larger than the optimal scales from the viewpoints of environmental and ecological sustainability; hence policy interventions are favoured. The framework offers a useful tool for efficiency studies and policy implication analysis. The paper provides an empirical investigation using a data set of rice farms in South Korea.
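A stylised way to contrast the economic and environmental problems (illustrative notation only; the paper's actual formulation may differ) is

$$\text{Economic: } \max_{x,\,y}\; p'y - w'x \;\; \text{s.t. } (x,y)\in T, \qquad \text{Environmental: } \min_{x,\,y}\; z = a'x - b'y \;\; \text{s.t. } (x,y)\in T,$$

where $T$ is the production technology, $p$ and $w$ are output and input prices, and $z$ is the residual implied by the materials balance condition, with $a$ and $b$ the material contents of inputs and outputs respectively. The ecological problem additionally penalises cumulative damage to the wider ecosystem. Because the profit-maximisation problem ignores $z$, its solution generally sits at a larger scale than the environmentally or ecologically optimal scales, which is the wedge that motivates policy intervention.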
Abstract:
This article integrates material/energy flow analysis into a production frontier framework to quantify resource efficiency (RE). The emergy content of natural resources, rather than their mass, is used to construct aggregate inputs. Using the production frontier approach, aggregate inputs are optimised relative to given output quantities to derive RE measures. This framework is superior to existing RE indicators in the literature: using the exergy/emergy content to construct aggregate material or energy flows overcomes the criticism that mass content cannot capture the differing quality of different types of resources, and the derived RE measures are both ‘qualitative’ and ‘quantitative’, whereas existing RE indicators are only qualitative. An empirical examination of the RE of 116 economies was undertaken to illustrate the practical applicability of the new framework. The results showed that economies, on average, could reduce their consumption of resources by more than 30% without any reduction in per capita gross domestic product (GDP); this calculation was made after adjusting for differences in the purchasing power of national currencies. High variation in RE across economies was found, and RE was positively correlated with labour force participation, population density, urbanisation, and GDP growth over the past five years. The results also showed that economies in higher income groups achieved higher RE, and that economies more dependent on imports and primary industries tend to have lower RE performance.
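One standard way to operationalise "optimising aggregate inputs relative to given outputs" is an input-oriented frontier (data envelopment analysis) measure; the following is a generic sketch, not necessarily the estimator used in the article:

$$RE_o = \min_{\theta,\,\lambda}\;\theta \quad \text{s.t.}\quad \sum_j \lambda_j x_j \le \theta x_o,\;\; \sum_j \lambda_j y_j \ge y_o,\;\; \lambda_j \ge 0,$$

where $x_j$ is the emergy-based aggregate input and $y_j$ the output (e.g., per capita GDP) of economy $j$. A score of $RE_o = 1$ places economy $o$ on the frontier, while a score of, say, 0.7 would imply it could produce the same output with roughly 30% fewer aggregate resources, which is the sense in which the 30% average reduction above should be read.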
Abstract:
This paper demonstrates a refined procedure for quantifying surface-enhanced Raman scattering (SERS) enhancement factors with improved precision. The principle of the method is to deduct the resonance Raman scattering (RRS) contribution from the surface-enhanced resonance Raman scattering (SERRS) signal, leaving the surface enhancement (SERS) effect alone. We employed 1,8,15,22-tetraaminophthalocyanato-cobalt(II) (4α-CoIITAPc), a resonance Raman- and electrochemically redox-active chromophore, as the probe molecule for the RRS and SERRS experiments. The number of 4α-CoIITAPc molecules contributing to the RRS and SERRS phenomena on plasmon-inactive glassy carbon (GC) and plasmon-active GC/Au surfaces, respectively, has been precisely estimated by cyclic voltammetry experiments. Furthermore, the SERS substrate enhancement factor (SSEF) quantified by our approach is compared with that obtained by traditionally employed methods. We also demonstrate that the present approach to SSEF quantification can be applied to many different SERS substrates by choosing an appropriate laser line and probe molecule.
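For reference, the conventional SERS substrate enhancement factor against which such procedures are benchmarked is usually defined (the standard definition in the SERS literature; the RRS-corrected variant described here modifies the measured intensities) as

$$\mathrm{SSEF} = \frac{I_{\mathrm{SERS}}/N_{\mathrm{surf}}}{I_{\mathrm{RS}}/N_{\mathrm{vol}}},$$

where $I_{\mathrm{SERS}}$ and $I_{\mathrm{RS}}$ are the surface-enhanced and normal Raman intensities and $N_{\mathrm{surf}}$ and $N_{\mathrm{vol}}$ are the numbers of probe molecules contributing to each signal. In the approach described above, the number of contributing molecules is determined electrochemically by cyclic voltammetry and the resonance (RRS) contribution is subtracted from the SERRS signal so that only the surface enhancement remains.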
Abstract:
In this paper we propose a new multivariate GARCH model with a time-varying conditional correlation structure. The time-varying conditional correlations change smoothly between two extreme states of constant correlations according to a predetermined or exogenous transition variable. An LM test is derived to test the constancy of the correlations, and LM and Wald tests to test the hypothesis of partially constant correlations. Analytical expressions for the test statistics and the required derivatives are provided to make the computations feasible. An empirical example based on daily return series of five frequently traded stocks in the S&P 500 stock index completes the paper.
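As an indicative sketch of this class of specification (a smooth-transition conditional correlation form; the exact parameterisation in the paper may differ), the conditional correlation matrix can be written as a convex combination of two constant correlation matrices,

$$P_t = \bigl(1 - G(s_t)\bigr)P_{(1)} + G(s_t)P_{(2)}, \qquad G(s_t) = \bigl(1 + e^{-\gamma (s_t - c)}\bigr)^{-1}, \quad \gamma > 0,$$

where $s_t$ is the predetermined or exogenous transition variable, $G(\cdot)$ is a logistic transition function, and $P_{(1)}$ and $P_{(2)}$ are the two extreme correlation states. Constancy of correlations corresponds to $P_{(1)} = P_{(2)}$, which is the restriction examined by the LM test.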
Abstract:
Abnormal event detection has attracted considerable attention in the computer vision research community in recent years due to the increased focus on automated surveillance systems to improve security in public places. Because training data are scarce and the definition of an abnormality depends on context, abnormal event detection is generally formulated as a data-driven approach in which activities are modelled in an unsupervised fashion during the training phase. In this work, we use a Gaussian mixture model (GMM) to cluster the activities during the training phase, and propose a Gaussian mixture model based Markov random field (GMM-MRF) to estimate the likelihood scores of new videos in the testing phase. Furthermore, we propose two new features, optical acceleration and the histogram of optical flow gradients, to detect the presence of abnormal objects and speed violations in the scene. We show that our proposed method outperforms other state-of-the-art abnormal event detection algorithms on the publicly available UCSD dataset.
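As a minimal illustration of the unsupervised scoring idea, the sketch below fits a GMM to synthetic "normal" feature vectors and flags low-likelihood test samples; it uses scikit-learn's GaussianMixture and does not reproduce the paper's GMM-MRF model or its optical-flow features.

```python
# Minimal sketch of GMM-based anomaly scoring (illustrative only; the paper's
# full GMM-MRF model and its optical-flow features are not reproduced here).
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_activity_model(train_features, n_components=8, seed=0):
    """Fit a GMM to feature vectors extracted from normal training video."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=seed)
    gmm.fit(train_features)
    return gmm

def abnormality_scores(gmm, test_features):
    """Low log-likelihood under the model of normal activity => abnormal."""
    return -gmm.score_samples(test_features)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.normal(0.0, 1.0, size=(500, 4))          # stand-in per-patch features
    test = np.vstack([rng.normal(0.0, 1.0, size=(5, 4)),
                      rng.normal(6.0, 1.0, size=(5, 4))])  # last 5 rows are "abnormal"
    model = fit_activity_model(normal)
    scores = abnormality_scores(model, test)
    threshold = np.percentile(abnormality_scores(model, normal), 99)
    print(scores > threshold)  # the outlying rows should be flagged
```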
Abstract:
In the past few years, green initiatives related to data centers have attracted steadily increasing attention and focus. While various energy-aware measures have been developed for data centers, the accompanying requirement of improving the performance efficiency of application assignment has yet to be fulfilled. For instance, many energy-aware measures applied to data centers maintain a trade-off between energy consumption and Quality of Service (QoS). To address this problem, this paper presents a novel concept of profiling to facilitate offline optimization of a deterministic assignment of applications to virtual machines. A profile-based model is then established for obtaining near-optimal allocations of applications to virtual machines with consideration of three major objectives: energy cost, CPU utilization efficiency and application completion time. From this model, a profile-based and scalable matching algorithm is developed to solve the profile-based model. The assignment efficiency of our algorithm is then compared with that of the Hungarian algorithm, which gives the optimal solution but does not scale well.
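To illustrate the scalability comparison, the sketch below contrasts the optimal Hungarian assignment (via SciPy) with a hypothetical greedy heuristic standing in for a scalable matcher; the synthetic cost matrix and the greedy rule are assumptions, not the paper's profile-based algorithm or objectives.

```python
# Illustrative baseline only: the optimal Hungarian assignment the paper compares
# against, plus a hypothetical greedy heuristic standing in for a scalable matcher.
# The cost matrix below is synthetic; the paper's profile-based objectives
# (energy cost, CPU utilisation efficiency, completion time) are not modelled here.
import numpy as np
from scipy.optimize import linear_sum_assignment

def hungarian_assignment(cost):
    """Optimal one-to-one assignment of applications (rows) to VMs (columns)."""
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols)), cost[rows, cols].sum()

def greedy_assignment(cost):
    """Cheapest-available-VM heuristic: fast, but not guaranteed optimal."""
    taken, pairs, total = set(), [], 0.0
    for app in range(cost.shape[0]):
        vm = min((c for c in range(cost.shape[1]) if c not in taken),
                 key=lambda c: cost[app, c])
        taken.add(vm)
        pairs.append((app, vm))
        total += cost[app, vm]
    return pairs, total

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cost = rng.uniform(1.0, 10.0, size=(6, 6))  # combined (weighted) objective values
    print(hungarian_assignment(cost)[1], greedy_assignment(cost)[1])
```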
Abstract:
Driver training is one of the interventions aimed at mitigating the number of crashes that involve novice drivers. Our failure to understand what is really important for learners, in terms of risky driving, is one of the many drawbacks preventing us from building better training programs. Currently, there is a need to develop and evaluate Advanced Driving Assistance Systems that can comprehensively assess driving competencies. The aim of this paper is to present a novel Intelligent Driver Training System (IDTS) that analyses crash risk for a given driving situation, providing avenues for the improvement and personalisation of driver training programs. The analysis takes into account numerous variables acquired synchronously from the Driver, the Vehicle and the Environment (DVE), and the system then segments out the manoeuvres within a drive. The paper further presents the use of fuzzy set theory to develop the safety inference rules for each manoeuvre executed during the drive. Finally, it presents a framework, and an associated prototype, that can be used to view and assess complex driving manoeuvres and to provide a comprehensive analysis of the drive as feedback to novice drivers.
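As a minimal illustration of fuzzy safety inference, the sketch below evaluates one hypothetical rule with assumed membership functions; it is not the IDTS rule base described in the paper.

```python
# A minimal fuzzy-inference sketch for one hypothetical safety rule
# ("IF speed is high AND following distance is short THEN risk is high").
# The membership functions and the rule itself are illustrative assumptions,
# not the IDTS rule base described in the paper.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def manoeuvre_risk(speed_kmh, gap_s):
    """Min-AND rule firing strength, used directly as a crisp risk indicator."""
    speed_high = tri(speed_kmh, 60.0, 100.0, 140.0)   # "speed is high"
    gap_short = tri(gap_s, 0.0, 0.5, 2.0)             # "following distance is short"
    return min(speed_high, gap_short)                 # degree to which "risk is high"

if __name__ == "__main__":
    print(manoeuvre_risk(speed_kmh=95.0, gap_s=0.8))  # rule fires strongly
    print(manoeuvre_risk(speed_kmh=50.0, gap_s=3.0))  # rule does not fire
```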
Abstract:
Background: Multi-attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, a HSCS must first be derived. This typically involves selecting a subset of domains and items, because HRQOL questionnaires usually have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim: To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA for deriving a HSCS from the European Organisation for Research and Treatment of Cancer’s core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods: QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and the views of both patients and clinicians on which items are most relevant. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results: CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker–Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion: CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS for a proposed new cancer-specific MAUI. Our findings suggest that CFA should generally be recommended when deriving a preference-based measure from a HRQOL measure that has an established domain structure.
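For reference, the fit statistics reported above are standard CFA indices; for example, the root mean square error of approximation is commonly computed (a textbook definition, not a formula given in the abstract) as

$$\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2 - df,\,0)}{df\,(N-1)}},$$

where $\chi^2$ is the model chi-square, $df$ the model degrees of freedom and $N$ the sample size; values around 0.04, as reported here, are conventionally taken to indicate close fit.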
Abstract:
Heterogeneous health data is a critical issue when managing health information for quality decision making processes. In this paper we examine the efficient aggregation of lifestyle information through a data warehousing architecture lens. We present a proof of concept for a clinical data warehouse architecture that enables evidence based decision making processes by integrating and organising disparate data silos in support of healthcare services improvement paradigms.