839 results for Effective performance
Abstract:
This Capstone provides an overview of the generic performance evaluation process, describes the characteristics of Generation X and Y employees in the workplace, surveys first- and second-hand research in the area of effective performance evaluations for Generation X and Y employees, and recommends different approaches to performance evaluations for these employees to increase their effectiveness.
Abstract:
Increasing global competition, rapid technological change, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high-quality products at lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. Appropriate measurement of lean supply chain performance has therefore become imperative. Many lean tools are available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high-volume products but not for low-volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of a supply chain using both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics defined in the SCOR framework, have been investigated. A lean supply chain model based on the SCOR metric framework is then developed, in which non-lean and lean, as well as quantitative and qualitative, metrics are incorporated as appropriate.
The values of the appropriate metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data were collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method was applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative so as to maximise similarity to the positive ideal solution and minimise similarity to the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products were evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS-based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
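The TOPSIS step described above ranks alternatives by their closeness to the positive ideal solution and distance from the negative ideal solution. As a rough illustration only (not the thesis's implementation, which operates on triangular fuzzy numbers), a minimal crisp TOPSIS in Python might look like the sketch below; the decision matrix, weights and benefit flags in any usage are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    # matrix: rows = alternatives, columns = criteria
    # benefit[j] is True if a higher value is better for criterion j
    ncols = len(weights)
    # vector-normalise each criterion column, then apply weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncols)] for row in matrix]
    # positive ideal = best value per criterion, negative ideal = worst
    pis = [max(r[j] for r in v) if benefit[j] else min(r[j] for r in v) for j in range(ncols)]
    nis = [min(r[j] for r in v) if benefit[j] else max(r[j] for r in v) for j in range(ncols)]
    def dist(row, ideal):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ideal)))
    # closeness coefficient: 1.0 means identical to the positive ideal
    return [dist(r, nis) / (dist(r, pis) + dist(r, nis)) for r in v]
```

For example, with time, quality and flexibility as three equally weighted benefit criteria, the alternative closest to the positive ideal receives the highest closeness coefficient and is ranked first.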
Abstract:
We addressed four research questions, each relating to the training and assessment of the competencies associated with the performance of ultrasound-guided axillary brachial plexus blockade (USgABPB). These were: (i) What are the most important determinants of learning of USgABPB? (ii) What is USgABPB, and what are the errors most likely to occur when trainees learn to perform this procedure? (iii) How should end-user input be applied to the development of a novel USgABPB simulator? (iv) Does structured simulation-based training positively influence novice learning of the procedure? We demonstrated that the most important determinants of learning USgABPB are: (a) access to a formal structured training programme; (b) frequent exposure to clinical learning opportunities in an appropriate setting; and (c) the presence of an appropriate patient, trainee and teacher at the same time, in an appropriate environment. We carried out a comprehensive description of the procedure. We performed a formal task analysis of USgABPB, identifying (i) 256 specific tasks associated with the safe and effective performance of the procedure, and (ii) the 20 most critical errors likely to occur in this setting. We described a methodology for applying end-user input and collected data based on detailed, sequential evaluation of prototypes by trainees in anaesthesia. We carried out a pilot randomised controlled trial assessing the effectiveness of a USgABPB simulator during its development. Our data did not enable us to draw a reliable conclusion to this question; the trial did, however, provide important new learning (as a pilot) to inform future investigation of this question. We believe that the ultimate goal of designing effective simulation-based training and assessment of ultrasound-guided regional anaesthesia is closer to realisation as a result of this work.
It remains to be proven whether this approach will have a positive impact on procedural performance and, more importantly, improve patient outcomes.
Abstract:
Top management at retail banks must delegate authority to lower-level managers to operate branches and service centers. In doing so, they must navigate conflicts of interest, asymmetric information and limited monitoring when designing compensation plans for these agents. Pursuant to this delegation, banks adopt a system of performance targets and incentives to align the interests of senior management and unit managers. This paper evaluates the causal relationship between performance-based salaries and managers' effective performance. We use a fixed-effects estimator to analyze an unbalanced panel of data from one of the largest Brazilian retail banks over the period from January 2007 to June 2009. The results indicate that agents with guaranteed variable salary contracts demonstrate inferior performance compared with agents who have performance-based compensation packages. We conclude that there is a moral hazard observable in the behavior of agents who are subject to guaranteed variable salary contracts.
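The fixed-effects (within) estimator used in studies like this removes time-invariant, entity-specific effects, such as an individual manager's baseline ability, by demeaning each variable within each entity before running OLS. A minimal single-regressor sketch, using a toy panel rather than the paper's bank data; the function name and data are illustrative only:

```python
from collections import defaultdict

def within_slope(entities, x, y):
    # Fixed-effects (within) estimator for a single regressor:
    # demean x and y within each entity, then fit OLS on the demeaned data.
    groups = defaultdict(list)
    for i, e in enumerate(entities):
        groups[e].append(i)
    xd, yd = list(x), list(y)
    for idx in groups.values():
        mx = sum(x[i] for i in idx) / len(idx)
        my = sum(y[i] for i in idx) / len(idx)
        for i in idx:
            xd[i] -= mx
            yd[i] -= my
    # OLS slope through the origin on the demeaned data
    return sum(a * b for a, b in zip(xd, yd)) / sum(a * a for a in xd)
```

Because each entity's fixed intercept is subtracted out, the slope reflects only within-entity variation, which is what allows the estimator to separate contract effects from unobserved manager-level heterogeneity.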
Abstract:
Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the approaches of the historicised case study and statistical analysis of large populations, which examine the relationship between environment and organisation strategy and/or structure while ignoring the product-process relationship. This study combines the historicised case study and the multi-variable, ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure, and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period to provide a sector perspective of organisational adaptation.
The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.
Abstract:
The author looks at trends in software and systems, and the current and likely implications of these trends for the discipline of performance engineering. In particular, he examines software complexity growth and its consequences for performance engineering: enhanced understanding, more efficient analysis and effective performance improvement. The pressures for adaptive and autonomous systems introduce further opportunities for performance innovation. The promise of aspect-oriented software development technologies for assisting with some of these challenges is introduced.
Abstract:
Construction teams and construction organisations have their own distinctive cultures. There also exists an infrastructure, both social and contractual, which ensures that the projects within which the teams operate are completed successfully. It is these issues which this research has addressed. The project was instigated by the Queensland Department of Main Roads, Public Works and the John Holland Group in order to address how they might better implement relationship management (RM) on their construction projects. The project was devised initially to facilitate a change in culture which would allow projects to be run in a relational manner and would lead to effective performance in terms of the KPIs that the organisations set for themselves, described as "business better than usual". This report describes the project, its outcomes and deliverables, and indicates the changes that were made to the project during the research process. Hence, the initial premise of the project, and the problem to investigate, was the implementation of relational contracting:
• throughout a range of projects;
• with a focus on client body staff.
The additions that were made to the project, and documented in the variations to the project, included two major additional areas of study:
• client management and stakeholder management;
• a live case study of an alliancing project.
The context within which the research was undertaken is important. The research was driven by Main Roads and their desire to improve their operations by focusing on the relationship between the major project participants (however, stakeholder and client organisation management became an obvious issue as the research progressed, hence the variations). The context was initially focussed on the Main Roads, Public Works and John Holland Group organisations, but it became clear very quickly that this was in fact an industry-wide issue and not one specific solely to the project participants.
Hence, the context within which this research took place can be described as below: The deliverables from the project are a toolkit for determining RM needs in an organisation, a monograph describing the practical implementation of RM, and the outline for an RM CPD and Masters course.
Abstract:
This thesis is a study of new design methods for allowing evolutionary algorithms to be utilised more effectively in aerospace optimisation applications where computation needs are high and computation platform space may be restrictive. It examines the applicability of special hardware computational platforms known as field-programmable gate arrays and shows that, with the right implementation methods, they can offer significant benefits. This research is a step forward towards the advancement of efficient and highly automated aircraft systems for meeting compact physical constraints in aerospace platforms and providing effective performance speedups over traditional methods.
Abstract:
Data mining involves the nontrivial process of extracting knowledge or patterns from large databases. Genetic algorithms are efficient and robust search and optimization methods that are used in data mining. In this paper we propose a Self-Adaptive Migration Model GA (SAMGA), in which the population size, the number of crossover points and the mutation rate for each population are fixed adaptively. Further, the migration of individuals between populations is decided dynamically. This paper gives a mathematical schema analysis of the method, stating and showing that the algorithm exploits previously discovered knowledge for a more focused and concentrated search of heuristically high-yielding regions while simultaneously performing a highly explorative search on the other regions of the search space. The effective performance of the algorithm is then shown using standard testbed functions and a set of actual classification data mining problems. A Michigan-style classifier was used to build the classifier, and the system was tested with machine learning databases including the Pima Indian Diabetes database, the Wisconsin Breast Cancer database and a few others. The performance of our algorithm is better than that of the other methods.
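The migration model underlying SAMGA can be pictured as an island-model GA: several populations evolve independently and periodically exchange their best individuals. The sketch below is a plain island-model GA with fixed parameters minimising a numeric function, not the self-adaptive scheme of the paper; all parameter values and the ring-migration topology are illustrative:

```python
import random

def island_ga(fitness, dim, n_islands=4, pop_size=20, gens=50, migrate_every=5):
    # Each island is a list of real-valued individuals; lower fitness is better.
    islands = [[[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
               for _ in range(n_islands)]
    for g in range(gens):
        for pop in islands:
            pop.sort(key=fitness)
            elite = pop[:pop_size // 2]          # keep the better half
            children = []
            while len(children) < pop_size - len(elite):
                a, b = random.sample(elite, 2)
                cut = random.randrange(1, dim) if dim > 1 else 0  # one-point crossover
                child = a[:cut] + b[cut:]
                child[random.randrange(dim)] += random.gauss(0, 0.1)  # mutate one gene
                children.append(child)
            pop[:] = elite + children
        if g % migrate_every == 0:
            # Ring migration: island i's worst member is replaced by
            # a copy of island i-1's best member.
            bests = [min(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop[pop.index(max(pop, key=fitness))] = bests[i - 1][:]
    return min((min(pop, key=fitness) for pop in islands), key=fitness)
```

In the paper's self-adaptive variant, quantities held constant here (population size, crossover points, mutation rate) would instead be tuned per population during the run, and migration decisions would be made dynamically rather than on a fixed schedule.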
Abstract:
In this paper, we propose a Self-Adaptive Migration Model for Genetic Algorithms, in which the population size, the number of crossover points and the mutation rate for each population are fixed adaptively. Further, the migration of individuals between populations is decided dynamically. This paper gives a mathematical schema analysis of the method, stating and showing that the algorithm exploits previously discovered knowledge for a more focused and concentrated search of heuristically high-yielding regions while simultaneously performing a highly explorative search on the other regions of the search space. The effective performance of the algorithm is then shown using standard testbed functions, in comparison with the Island Model GA (IGA) and the Simple GA (SGA).
Abstract:
Increasing nitrate concentrations in ground water are deleterious to human health, as ingestion of such water can cause methemoglobinemia in infants and even cancer in adults (desirable limit for nitrate as NO3: 45 mg/L, IS code 10500-1991). Excess nitrate in ground water is caused mainly by the disposal of sewage and the excessive use of fertilizers. Though numerous technologies exist, such as reverse osmosis, ion exchange, electro-dialysis and permeable reactive barriers using zerovalent iron, nitrate removal continues to be a challenging issue because the nitrate ion is highly mobile within the soil strata. Tapping the denitrification potential of soil denitrifiers, which are inherently available in the soil matrix, is the most sustainable approach to mitigating the accumulation of nitrate in ground water. The in-situ denitrification of sand and of bentonite-enhanced sand (bentonite content = 5%) in the presence of an easily assimilable organic carbon source such as ethanol was studied. Batch studies showed that nitrate reduction by sand follows first-order kinetics with a rate constant of 5.3x10^-2 hr^-1, while a rate constant of 4.3x10^-2 hr^-1 was obtained for bentonite-enhanced sand (BS) at 25 degrees C. Filter columns (height = 5 cm, diameter = 8.2 cm) were constructed using sand and bentonite-enhanced sand as filter media. The filtration rate through both filter columns was maintained at an average value of 2.60 cm/h. The nitrate removal rates through both filter media were assessed for a solution containing 22.6 mg NO3-N/L while keeping the C/N mass ratio at 3. For the sand filter column, the nitrate removal efficiency reached an average value of 97.6% after passing 50 pore volumes of the nitrate solution. For the bentonite-enhanced sand filter column, the average nitrate removal efficiency was 83.5%.
The time required for effective operation of the sand filter bed was 100 hours, while the bentonite-enhanced sand filter bed did not require a maturation period for effective performance, because the presence of micropores in bentonite increases the hydraulic retention time of the solution inside the filter bed.
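The first-order kinetics reported above imply an exponential decay of nitrate concentration, C(t) = C0·e^(-kt), with a half-life of ln(2)/k. A small sketch using the rate constant reported for sand (5.3x10^-2 hr^-1); the function name and the example concentration are illustrative only:

```python
import math

def nitrate_remaining(c0_mg_per_l, k_per_hr, t_hr):
    # First-order decay: C(t) = C0 * exp(-k * t)
    return c0_mg_per_l * math.exp(-k_per_hr * t_hr)

# With k = 0.053 hr^-1, the half-life is ln(2)/k, roughly 13 hours,
# so a 22.6 mg/L solution would halve about every 13 hours in a batch setting.
```

Note that this describes the batch kinetics only; in the flow-through filter columns, removal additionally depends on the hydraulic retention time, which is why the micropores of bentonite matter for the bed's start-up behaviour.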
Abstract:
Mechanical principles of fibre-optic disc accelerometers (FODA) different from those assumed in previous calculation methods are presented. An FODA with a high sensitivity of 82 rad/g and a resonance frequency of 360 Hz is designed and tested. In this system, the minimum measurable demodulation phase of the phase-generated carrier (PGC) is 10^-5 rad, and the minimum acceleration theoretically reaches 120 ng. This kind of FODA, with its high responsivity, all-optic-fibre configuration, small size, light weight and stiff shell housing, ensures effective performance in practice.