65 results for task performance benchmarking
Abstract:
Higher education in business school environments is increasingly focused on how best to equip students with the skills necessary for leadership in the global workplace. This paper examines the impact of two particularly important cognitive capabilities, task reflexivity and intercultural sensitivity, on academic performance in an MBA programme. It was hypothesised that in an intercultural learning environment, task reflexivity would be associated with higher academic performance, and that this relationship would be mediated by intercultural sensitivity. Questionnaire data from 77 MBA students were analysed alongside academic performance. Results demonstrated that task reflexivity was indirectly related to academic performance through intercultural sensitivity. These findings suggest that engagement in task reflexivity enables students to develop greater levels of intercultural sensitivity, allowing them to reap the positive effects of diversity in their peer group for their own learning and performance. Limitations and practical implications of the research for professional practice are discussed. © 2014 Society for Research into Higher Education.
Abstract:
Purpose – The purpose of this paper is to measure the performance of commercial virtual learning environment (VLE) systems, helping decision makers select the appropriate system for their institutions. Design/methodology/approach – This paper develops an integrated multiple criteria decision making approach, combining the analytic hierarchy process (AHP) and quality function deployment (QFD), to evaluate and select the best system. The evaluating criteria are derived from the requirements of those who use the system. A case study is provided to demonstrate how the integrated approach works. Findings – The major advantage of the integrated approach is that the evaluating criteria are of interest to the stakeholders, which ensures that the selected system will meet their requirements and best satisfy them. Another advantage is that the approach can ensure that the benchmarking is consistent and reliable. The case study shows that the VLE system currently in use at the university performs best; the university should therefore continue to run the system in order to support and facilitate both teaching and learning. Originality/value – No previous study appears to have measured the performance of VLE systems, so decision makers may have difficulty evaluating and selecting systems for their institutions.
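At the core of any AHP-based evaluation is deriving priority weights from a pairwise-comparison matrix of criteria. The sketch below is purely illustrative (the matrix entries and criteria names are assumptions, not taken from the paper) and uses the common row geometric-mean approximation to the principal eigenvector:

```python
import math

# Hypothetical pairwise-comparison matrix on Saaty's 1-9 scale for three
# evaluation criteria (say usability, reliability, cost). Entry A[i][j] is
# how much more important criterion i is judged than criterion j, with
# reciprocal entries below the diagonal.
A = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric-mean method."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]  # normalise so the weights sum to 1

w = ahp_weights(A)
print(w)  # highest weight goes to the criterion judged most important
```

Candidate systems can then be scored per criterion the same way and ranked by the weighted sum of their criterion scores.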
Abstract:
The n-tuple recognition method was tested on 11 large real-world data sets and its performance was compared with that of 23 other classification algorithms. On 7 of these data sets, the results show no systematic performance gap between the n-tuple method and the others. Evidence was found to support a possible explanation for why the n-tuple method yields poor results on certain data sets. Preliminary empirical results of a study of the confidence interval (the difference between the two highest scores) are also reported. These suggest a counter-intuitive correlation between the confidence interval distribution and the overall classification performance of the system.
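The n-tuple method itself is simple enough to sketch. The following minimal WISARD-style recogniser is purely illustrative (the parameters and toy data are ours, not the implementation evaluated in the study): each tuple reads a random subset of input bits as an address into a per-class lookup table, and a class scores one point per tuple whose address was seen during training.

```python
import random

def make_tuples(n_bits, n_tuples, tuple_size, rng):
    """Randomly assign input bit positions to each tuple."""
    return [rng.sample(range(n_bits), tuple_size) for _ in range(n_tuples)]

class NTupleClassifier:
    def __init__(self, tuples, classes):
        self.tuples = tuples
        # one lookup table (set of seen addresses) per tuple, per class
        self.memory = {c: [set() for _ in tuples] for c in classes}

    def _addresses(self, x):
        # each tuple reads its assigned bits and forms an integer address
        return [sum(x[p] << i for i, p in enumerate(pos)) for pos in self.tuples]

    def fit(self, X, y):
        for x, c in zip(X, y):
            for table, addr in zip(self.memory[c], self._addresses(x)):
                table.add(addr)

    def scores(self, x):
        # class score = number of tuples whose address was seen in training;
        # the gap between the two highest scores is the "confidence" measure
        addrs = self._addresses(x)
        return {c: sum(a in t for t, a in zip(tables, addrs))
                for c, tables in self.memory.items()}

    def predict(self, x):
        s = self.scores(x)
        return max(s, key=s.get)

rng = random.Random(0)
clf = NTupleClassifier(make_tuples(8, 4, 2, rng), classes=[0, 1])
clf.fit([[0] * 8, [1] * 8], [0, 1])
print(clf.predict([0] * 8), clf.predict([1] * 8))
```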
Abstract:
The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.
Abstract:
Identifying frameworks of critical information systems (IS) management issues, as a means of benchmarking an organization's position and anticipating an uncertain future, is becoming an increasingly important research task for both academics and industrialists. This paper provides a description and summary of previous work on identifying IS issues frameworks by reviewing 20 research investigations in terms of what they studied and how they were conducted. It also suggests some possible directions and methodologies for future research. The summary and suggestions for further work are applicable to issues framework research in the IS management field as well as in other business and management areas.
Abstract:
Purpose – The purpose of the paper is to develop an integrated framework for performance management of healthcare services. Design/methodology/approach – This study develops a performance management framework for healthcare services using a combined analytic hierarchy process (AHP) and logical framework (LOGFRAME). The framework is then applied to the intensive care units of three different hospitals in developing nations. Numerous focus group discussions were undertaken, involving experts from the specific area under investigation. Findings – The study reveals that a combination of outcome-, structure- and process-based critical success factors, together with a combined AHP and LOGFRAME-based performance management framework, helps manage the performance of healthcare services. Practical implications – The proposed framework could be applied in hospital-based healthcare services. Originality/value – Conventional approaches to healthcare performance management are either outcome-based or process-based, and cannot appropriately reveal the improvement measures needed to assure superior performance. Additionally, they lack mechanisms for planning, implementing and evaluating the improvement projects identified through performance measurement. This study presents an integrated approach to performance measurement together with a framework for implementing improvement projects.
Abstract:
Effective management of projects is becoming increasingly important for any type of organization to remain competitive in today's dynamic business environment under the pressures of globalization. The use of benchmarking is widening as a technique for supporting project management. Benchmarking can be described as the search for best practices, leading to the superior performance of an organization. However, the effectiveness of benchmarking depends on the use of tools for collecting and analyzing information and for deriving subsequent improvement projects. This study demonstrates how the analytic hierarchy process (AHP), a multiple attribute decision-making technique, can be used for benchmarking project management practices. The methodology has been applied to benchmark the project management practices of Caribbean public sector organizations against organizations in the Indian petroleum sector, the infrastructure sector of Thailand, and the UK. This study demonstrates the effectiveness of the proposed AHP-based benchmarking model, identifies problems and issues of Caribbean public sector project management, and suggests improvement measures for effective project management.
Abstract:
People and their performance are key to an organization's effectiveness. This review describes an evidence-based framework of the links between some key organizational influences and staff performance, health and well-being. This preliminary framework integrates management and psychological approaches, with the aim of assisting future explanation, prediction and organizational change. Health care is taken as the focus of this review, as there are concerns internationally about health care effectiveness. The framework considers empirical evidence for links between the following organizational levels: 1. Context (organizational culture and inter-group relations; resources, including staffing; physical environment) 2. People management (HRM practices and strategies; job design, workload and teamwork; employee involvement and control over work; leadership and support) 3. Psychological consequences for employees (health and stress; satisfaction and commitment; knowledge, skills and motivation) 4. Employee behaviour (absenteeism and turnover; task and contextual performance; errors and near misses) 5. Organizational performance; patient care. This review contributes to an evidence base for policies and practices of people management and performance management. Its usefulness will depend on future empirical research, using appropriate research designs, sufficient study power and measures that are reliable and valid.
Abstract:
This is a review of studies that have investigated the proposed rehabilitative benefit of tinted lenses and filters for people with low vision. Currently, eye care practitioners have to rely on marketing literature and anecdotal reports from users when making recommendations for tinted lens or filter use in low vision. Our main aim was to locate a prescribing protocol that was scientifically based and could assist low vision specialists with tinted lens prescribing decisions. We also wanted to determine if previous work had found any tinted lens/task or tinted lens/ocular condition relationships, i.e. were certain tints or filters of use for specific tasks or for specific eye conditions. Another aim was to provide a review of previous research in order to stimulate new work using modern experimental designs. Past studies of tinted lenses and low vision have assessed effects on visual acuity (VA), grating acuity, contrast sensitivity (CS), visual field, adaptation time, glare, photophobia and TV viewing. Objective and subjective outcome measures have been used. However, very little objective evidence has been provided to support anecdotal reports of improvements in visual performance. Many studies are flawed in that they lack controls for investigator bias, and placebo, learning and fatigue effects. Therefore, the use of tinted lenses in low vision remains controversial and eye care practitioners will have to continue to rely on anecdotal evidence to assist them in their prescribing decisions. Suggestions for future research, avoiding some of these experimental shortcomings, are made. © 2002 The College of Optometrists.
Abstract:
Visual detection performance (d') is usually an accelerating function of stimulus contrast, which could imply a smooth, threshold-like nonlinearity in the sensory response. Alternatively, Pelli (1985, Journal of the Optical Society of America A, 2, 1508-1532) developed the 'uncertainty model' in which responses were linear with contrast, but the observer was uncertain about which of many noisy channels contained the signal. Such internal uncertainty effectively adds noise to weak signals, and predicts the nonlinear psychometric function. We re-examined these ideas by plotting psychometric functions (as z-scores) for two observers (SAW, PRM) with high precision. The task was to detect a single, vertical, blurred line at the fixation point, or identify its polarity (light vs dark). Detection of a known polarity was nearly linear for SAW but very nonlinear for PRM. Randomly interleaving light and dark trials reduced performance and rendered it nonlinear for SAW, but had little effect for PRM. This occurred for both single-interval and 2AFC procedures. The whole pattern of results was well predicted by our Monte Carlo simulation of Pelli's model, with only two free parameters. SAW (highly practised) had very low uncertainty. PRM (with little prior practice) had much greater uncertainty, resulting in lower contrast sensitivity, nonlinear performance, and no effect of external (polarity) uncertainty. For SAW, identification was about √2 better than detection, implying statistically independent channels for stimuli of opposite polarity, rather than an opponent (light-dark) channel. These findings strongly suggest that noise and uncertainty, rather than sensory nonlinearity, limit visual detection.
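The uncertainty model lends itself to a compact Monte Carlo sketch. The toy simulation below is illustrative only (the channel count, contrasts and trial counts are assumed values, not the study's two fitted parameters): responses are strictly linear in contrast, yet because the observer monitors many noisy channels and takes the maximum, accuracy for weak signals rises non-linearly with contrast.

```python
import random

def max_channel_response(contrast, n_channels, rng):
    """Signal adds linearly to one channel; the observer sees only the
    maximum response across all monitored channels."""
    responses = [rng.gauss(0.0, 1.0) for _ in range(n_channels)]
    responses[0] += contrast  # linear transduction, no sensory nonlinearity
    return max(responses)

def percent_correct_2afc(contrast, n_channels, trials, rng):
    """2AFC Monte Carlo: the signal interval wins when its maximum response
    exceeds the blank interval's maximum response."""
    wins = sum(
        max_channel_response(contrast, n_channels, rng)
        > max_channel_response(0.0, n_channels, rng)
        for _ in range(trials)
    )
    return wins / trials

rng = random.Random(1)
# With high uncertainty (many monitored channels), a weak signal is swamped
# by the noise maxima, so accuracy stays near chance; a strong signal is not.
low = percent_correct_2afc(0.5, 100, 2000, rng)
high = percent_correct_2afc(3.0, 100, 2000, rng)
print(low, high)
```

Reducing `n_channels` toward 1 in this sketch makes performance approach the near-linear behaviour reported for the highly practised observer.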
Abstract:
Research into social facilitation effects reveals three factors affecting response performance: type of task, type of audience and type of actor. This study attempts to establish a minimal baseline for task and audience type in order to examine differences between personality types in the actors. Results indicate that performance of both extraverts and introverts increases under the minimal condition of the mere presence of another person whilst carrying out a simple mathematical task. These results are interpreted by relating Zajonc's (1965) drive theory to Eysenck's (1967) personality theory, indicating that, with further investigation, performance curves might be devised for introverts and extraverts performing under a variety of task and audience conditions.
Abstract:
Benchmarking techniques have evolved since Xerox's pioneering visits to Japan in the late 1970s, and the focus of benchmarking has shifted during this period. Tracing in detail the evolution of benchmarking in one specific area of business activity, supply and distribution management, as seen by the participants in that evolution, creates a picture of a movement from single-function, cost-focused, competitive benchmarking, through cross-functional, cross-sectoral, value-oriented benchmarking, to process benchmarking. As process efficiency and effectiveness become the primary foci of benchmarking activities, the measurement parameters used to benchmark performance converge with the factors used in business process modelling. The possibility is therefore emerging of modelling business processes and then feeding the models with actual data from benchmarking exercises. This would overcome the most common criticism of benchmarking, namely that it intrinsically lacks the ability to move beyond current best practice. Indeed, the combined power of modelling and benchmarking may prove to be the basic building block of informed business process re-engineering.
Abstract:
As the backbone of e-business, the Enterprise Resource Planning (ERP) system plays an important role in today's competitive business environment. Few publications discuss the application of ERP systems in a virtual enterprise (VE). A VE is defined as a dynamic partnership among enterprises that can bring together the complementary core competencies needed to achieve a business task. Since a VE strongly emphasises partner cooperation, specific issues arise in the implementation of ERP systems in a VE. This paper discusses the use of a VE Performance Measurement System (VEPMS) to coordinate the ERP systems of VE partners. It also defines the framework of a 'Virtual Enterprise Resource Planning (VERP) system', and identifies research avenues in this field.
Abstract:
We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we showed, using nonlinearity management considerations, that optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails, and accurately computed marginal probability density functions for soliton parameters by numerically modelling the Fokker-Planck equation with the MMC simulation technique. Moreover, applying the MMC method we studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. We also present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.