896 results for Team Evaluation Models
Abstract:
This thesis has been concerned with obtaining evidence to explore the proposition that the provision of occupational health services as arranged at the present time represents a misallocation of resources. The research has been undertaken within the occupational health service of a large Midlands food factory. As the research progressed it became evident that questions were being raised about the nature and scope of occupational health as well as the contribution, in combating danger at work, that occupational health services can make to the health and safety team. These questions have been scrutinized in depth, as they are clearly important, and a resolution of the problem of the definition of occupational health has been proposed. I have taken the approach of attempting to identify specific objectives or benefits of occupational health activities so that it is possible to assess how far these objectives are being achieved. I have looked at three aspects of occupational health: audiometry, physiotherapy and pre-employment medical examinations, as these activities embody crucial concepts which are common to all activities in an occupational health programme. A three-category classification of occupational health activities is proposed such that the three activities provide examples within each category. These are called personnel therapy, personnel input screening and personnel throughput screening. I conclude that I have not shown audiometry to be cost-effective. My observations of the physiotherapy service lead me to support the suggestion that there is a decline in sickness absence rates due to physiotherapy in industry. With pre-employment medical examinations I have shown that the service is product safety oriented and that benefits are extremely difficult to identify.
In regard to the three services studied, in the one factory investigated, and because of the immeasurability of certain activities, I find support for the proposition that the mix of occupational health services as provided at the present time represents a misallocation of resources.
Abstract:
How are innovative new business models established if organizations constantly compare themselves against existing criteria and expectations? The objective is to address this question from the perspective of innovators and their ability to redefine established expectations and evaluation criteria. The research questions ask whether there are discernible patterns of discursive action through which innovators theorize institutional change and what role such theorizations play in mobilizing support and realizing change projects. These questions are investigated through a case study on a critical area of enterprise computing software, Java application servers. In the present case, business practices and models were already well established among incumbents, with critical market areas allocated to a few dominant firms. Fringe players started experimenting with a new business approach of selling services around freely available open-source application servers. While most new players struggled, one new entrant succeeded in leading incumbents to adopt and compete on the new model. The case demonstrates that innovative and substantially new models and practices are established in organizational fields when innovators are able to redefine expectations and evaluation criteria within an organizational field. The study addresses the theoretical paradox of embedded agency: actors who are embedded in prevailing institutional logics and structures find it hard to perceive potentially disruptive opportunities that fall outside existing ways of doing things, and changing those logics and structures requires strategic and institutional work aimed at overcoming barriers to innovation. The study addresses this problem through the lens of (new) institutional theory, using a discourse methodology to trace the process through which innovators were able to establish a new social and business model in the field.
Abstract:
This thesis describes research that has developed the principles of a modelling tool for the analytical evaluation of a manufacturing strategy. The appropriate process of manufacturing strategy formulation is based on mental synthesis with formal planning processes supporting this role. Inherent to such processes is a stage where the effects of alternative strategies on the performance of a manufacturing system must be evaluated so that a choice of preferred strategy can be made. Invariably this evaluation is carried out by practitioners applying mechanisms of judgement, bargaining and analysis. This thesis makes a significant and original contribution to the provision of analytical support for practitioners in this role. The research programme commences by defining the requirements of analytical strategy evaluation from the perspective of practitioners. A broad taxonomy of models has been used to identify a set of potentially suitable techniques for the strategy evaluation task. Then, where possible, unsuitable modelling techniques have been identified on the basis of evidence in the literature and discarded from this set. The remaining modelling techniques have been critically appraised by testing representative contemporary modelling tools in an industrially based experimentation programme. The results show that individual modelling techniques exhibit various limitations in the strategy evaluation role, though some combinations do appear to provide the necessary functionality. On the basis of this comprehensive and in-depth knowledge a modelling tool has been specifically designed for this task. Further experimental testing has then been conducted to verify the principles of this modelling tool. This research has bridged the fields of manufacturing strategy formulation and manufacturing systems modelling and makes two contributions to knowledge.
Firstly, a comprehensive and in-depth platform of knowledge has been established about modelling techniques in manufacturing strategy evaluation. Secondly, the principles of a tool that supports this role have been formed and verified.
Abstract:
We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading but not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, characteristics of the data, and practical issues like availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be complementary methods when guided by theory development.
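The formal model selection advocated above can be illustrated with a minimal sketch. The data and the two candidate models below are invented for illustration, not taken from the paper: two accounts of a hypothetical patient series (constant performance vs. a linear decline with task difficulty) are fit by least squares and compared with AIC, quantifying the strength of evidence rather than forcing a binary hypothesis-test decision.

```python
import math

# Hypothetical accuracy scores for a series of patients across task difficulties.
difficulty = [1, 2, 3, 4, 5, 6]
accuracy = [0.95, 0.90, 0.84, 0.80, 0.71, 0.66]

def rss_constant(y):
    """Model A: performance is constant (one fitted parameter, the mean)."""
    mean = sum(y) / len(y)
    return sum((v - mean) ** 2 for v in y), 1

def rss_linear(x, y):
    """Model B: performance declines linearly with difficulty (two parameters)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    intercept = my - slope * mx
    rss = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    return rss, 2

def aic(rss, n, k):
    # Least-squares form of AIC; k + 1 counts the error-variance parameter too.
    return n * math.log(rss / n) + 2 * (k + 1)

n = len(accuracy)
rss_a, k_a = rss_constant(accuracy)
rss_b, k_b = rss_linear(difficulty, accuracy)
aic_a, aic_b = aic(rss_a, n, k_a), aic(rss_b, n, k_b)
best = "linear" if aic_b < aic_a else "constant"
print(best, round(aic_a - aic_b, 1))
```

The AIC difference, not just the winner, conveys how strongly the data favour one account, which is the kind of graded evidence the authors argue for.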
Abstract:
A number of professional sectors have recently moved away from their longstanding career model of up-or-out promotion and embraced innovative alternatives. Professional labor is a critical resource in professional service firms. Therefore, changes to these internal labor markets are likely to trigger other innovations, for example in knowledge management, incentive schemes and team composition. In this chapter we look at how new career models affect the core organizing model of professional firms and, in turn, their capacity for and processes of innovation. We consider how professional firms link the development of human capital and the division of professional labor to distinctive demands for innovation and how novel career systems help them respond to these demands.
Abstract:
This chapter argues that creative, innovative organizations are places where there is a firm and shared belief among most members in an inspirational vision of what the organization is trying to achieve. There is a high level of interaction, discussion, constructive debate, and influence among the members of the organization as they go about their work. Trust, cooperative orientations, and a sense of interpersonal safety characterize interpersonal and intergroup relationships. Members of the organization, particularly those at the upper echelons (and there are few echelons) are consistently positive and open to members' ideas for new and improved ways of working, providing both encouragement and the resources for innovation. Creativity is heralded as key for organizational survival and success. As global economic models become the norm and competitiveness assumes an international character, leaders realize that, in order to prosper in a highly challenging environment, companies must innovate. The source of organizational innovation is unquestionably the ideas generated by individuals and teams.
Abstract:
Although recent research highlights the role of team member goal orientation in team functioning, research has neglected the effects of diversity in goal orientation. In a laboratory study with groups working on a problem-solving task, we show that diversity in learning and performance orientation is related to decreased group performance. Moreover, we find that the effect of diversity in learning orientation is mediated by group information elaboration and the effect of diversity in performance orientation by group efficiency. In addition, we demonstrate that team reflexivity can counteract the negative effects of diversity in goal orientation. These results suggest that models of goal orientation in groups should incorporate the effects of diversity in goal orientation.
Abstract:
The appraisal and relative performance evaluation of nurses are very important and beneficial for both nurses and employers in an era of clinical governance, increased accountability and high standards of health care services. They enhance and consolidate the knowledge and practical skills of nurses by identifying training and career development plans, and support improvement in health care quality, increased job satisfaction and cost-effective use of resources. In this paper, a data envelopment analysis (DEA) model is proposed for the appraisal and relative performance evaluation of nurses. The model is validated on thirty-two nurses working at an Intensive Care Unit (ICU) at one of the most recognized hospitals in Lebanon. The DEA was able to classify nurses into efficient and inefficient ones. The set of efficient nurses was used to establish an internal best-practice benchmark and to project career development plans for improving the performance of the inefficient nurses. The DEA result confirmed the ranking of some nurses and highlighted injustice in other cases produced by the currently practiced appraisal system. Further, the DEA model is shown to be an effective talent management and motivational tool, as it can provide clear managerial plans related to promotion, training and development activities from the perspective of nurses, hence increasing their satisfaction, motivation and acceptance of appraisal results. Due to such features, the model is currently being considered for implementation at the ICU. Finally, the ratio of the number of DEA units to the number of input/output measures is revisited, with new suggested values for its upper and lower limits depending on the type of DEA model and the desired number of efficient units from a managerial perspective.
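The intuition behind DEA efficiency scores can be shown with a deliberately reduced sketch. In the single-input, single-output special case, the CCR efficiency score reduces to each unit's output-to-input ratio normalized by the best ratio in the sample; the paper's actual model handles multiple inputs and outputs via linear programming, and the nurse data below are invented for illustration.

```python
# Hypothetical nurse data: input = shift hours worked, output = care tasks completed.
# With one input and one output, CCR efficiency = (output/input) / max(output/input).
nurses = {
    "N1": (40, 120),
    "N2": (36, 126),
    "N3": (42, 105),
    "N4": (38, 133),
}

ratios = {name: out / inp for name, (inp, out) in nurses.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}

# Efficient nurses (score 1.0) form the internal best-practice benchmark;
# the others get a score showing how far they sit from that frontier.
efficient = sorted(n for n, e in efficiency.items() if abs(e - 1.0) < 1e-9)
print(efficient)
for name, e in sorted(efficiency.items()):
    print(name, round(e, 3))
```

In the full multi-measure model each nurse would instead get the weights most favourable to her, subject to no nurse exceeding a score of 1, which is what the linear-programming formulation computes.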
Abstract:
This paper describes the work undertaken in the Scholarly Ontologies Project. The aim of the project has been to develop a computational approach to support scholarly sensemaking, through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations, within and between conventional documents, in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the Scholarly Ontologies project. The tools used were ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such maps can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
Abstract:
In this paper, we explore the idea of social role theory (SRT) and propose a novel regularized topic model which incorporates SRT into the generative process of social media content. We assume that a user can play multiple social roles, and each social role serves to fulfil different duties and is associated with a role-driven distribution over latent topics. In particular, we focus on social roles corresponding to the most common social activities on social networks. Our model is instantiated on microblogs, i.e., Twitter and community question-answering (cQA), i.e., Yahoo! Answers, where social roles on Twitter include "originators" and "propagators", and roles on cQA are "askers" and "answerers". Both explicit and implicit interactions between users are taken into account and modeled as regularization factors. To evaluate the performance of our proposed method, we have conducted extensive experiments on two Twitter datasets and two cQA datasets. Furthermore, we also consider multi-role modeling for scientific papers where an author's research expertise area is considered as a social role. A novel application of detecting users' research interests through topical keyword labeling based on the results of our multi-role model has been presented. The evaluation results have shown the feasibility and effectiveness of our model.
Abstract:
Three human astroglioma lines, U251-MG, U373-MG and CCF-STTG1, have been evaluated further as possible models for astrocytotoxicity (GFAP and IL-6 release). The effects of bacterial lipopolysaccharide (LPS), chloroquine diphosphate (CQD) and acrylamide were studied on GFAP expression, and LPS, CQD, ethanol, trimethyltin chloride (TMTC) and acrylamide were examined for effects on interleukin-6 (IL-6) release in the U373-MG line only. At 4 h, LPS elevated GFAP (17.0 ± 5.0%, P < 0.05) above control in the U251-MG cell line only. CQD over 4 h in the U251-MG line resulted in increases in GFAP-IR to 20.3 ± 4.2% and 21.1 ± 4.1% above control levels at 0.1 µM (P < 0.05) and 1 µM (P < 0.05), respectively. CQD was associated with decreases in MTT turnover, particularly after 24 h incubation. With the U373-MG line, LPS (0.5 µg/ml) increased IL-6 expression 640% above control (P < 0.001), whilst CQD (100 µM), ethanol (10 mM) and TMTC (1 µM) also increased IL-6. It is possible that batteries of astrocytic human glioma cell lines may be applicable to the sensitive evaluation of toxicants on astrogliotic expression markers such as GFAP and IL-6.
Abstract:
The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited a satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms (q2, SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r2 and SEE ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package.
The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
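The leave-one-out cross-validated q2 reported above can be illustrated with a minimal sketch. The fragment below is not the ISC-PLS method itself (which fits partial least squares over amino-acid indicator variables); it substitutes a plain least-squares fit on invented one-dimensional data purely to show how q2 = 1 - PRESS/SS is computed from held-out predictions.

```python
# Invented (descriptor, affinity) pairs standing in for the real peptide data.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [1.1, 1.9, 3.2, 3.8, 5.1, 6.0, 6.8, 8.2]

def ols_fit(xs, ys):
    """Ordinary least-squares line; stands in for the PLS model here."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((a - mx) * (c - my) for a, c in zip(xs, ys)) / sum((a - mx) ** 2 for a in xs)
    return my - b * mx, b  # intercept, slope

# Leave-one-out: refit with each point held out, predict it, accumulate PRESS.
press = 0.0
for i in range(len(x)):
    xs = x[:i] + x[i + 1:]
    ys = y[:i] + y[i + 1:]
    a, b = ols_fit(xs, ys)
    press += (y[i] - (a + b * x[i])) ** 2

mean_y = sum(y) / len(y)
ss = sum((v - mean_y) ** 2 for v in y)
q2 = 1.0 - press / ss  # cross-validated analogue of r2
print(round(q2, 3))
```

Because every prediction is made for a point the model never saw, q2 is always at or below the non-cross-validated r2, which is why the abstract reports both.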
Abstract:
Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). The need for huge computing resources in terms of memory and CPU time in DEA is inevitable for a large-scale data set, especially with negative measures. In recent years, a wide range of studies has been conducted in the area of combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; therefore, it can be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
Abstract:
The finding that Pareto distributions are adequate to model Internet packet interarrival times has motivated the proposal of methods to evaluate steady-state performance measures of Pareto/D/1/k queues. Some limited analytical derivations for queue models have been proposed in the literature, but their solutions are often of great mathematical difficulty. To overcome such limitations, simulation tools that can deal with general queueing systems must be developed. Despite certain limitations, simulation algorithms provide a mechanism to obtain insight and good numerical approximations to parameters of queues. In this work, we give an overview of some of these methods and compare them with our simulation approach, which is suited to solving queues with Generalized-Pareto interarrival time distributions. The paper discusses the properties and use of the Pareto distribution. We propose a real-time trace simulation model for estimating the steady-state probability (showing the tail-raising effect), the loss probability and the delay of the Pareto/D/1/k queue, and make a comparison with the M/D/1/k queue. The background on Internet traffic helps to carry out the evaluation correctly. This model can be used to study long-tailed queueing systems. We close the paper with some general comments and offer thoughts about future work.
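The simulation idea can be sketched in a few lines of discrete-event code. The parameters below are hypothetical, the interarrival law is plain Pareto via inverse-transform sampling, and only the loss probability is estimated; the paper's own model additionally covers Generalized-Pareto interarrivals, trace-driven input, delay and steady-state probabilities.

```python
import random

def pareto_interarrival(alpha, xm, rng):
    # Inverse-transform sampling: X = xm / U**(1/alpha) follows a Pareto(alpha, xm) law.
    return xm / rng.random() ** (1.0 / alpha)

def simulate_pareto_d1k(alpha=1.5, xm=0.5, service=1.0, k=10,
                        n_arrivals=100_000, seed=42):
    """Single deterministic server, buffer of size k (including the job in service).
    Returns the fraction of arrivals lost."""
    rng = random.Random(seed)
    t = 0.0          # arrival clock
    departures = []  # scheduled departure times of jobs currently in the system
    lost = 0
    for _ in range(n_arrivals):
        t += pareto_interarrival(alpha, xm, rng)
        # Drop jobs that have already departed by the new arrival time t.
        departures = [d for d in departures if d > t]
        if len(departures) >= k:
            lost += 1  # buffer full: arrival is lost
        else:
            start = departures[-1] if departures else t  # service starts when server frees
            departures.append(max(start, t) + service)
    return lost / n_arrivals

loss = simulate_pareto_d1k()
print(round(loss, 4))
```

With these assumed parameters the offered load is about 0.67 (mean interarrival 1.5 vs. service 1.0), but the heavy-tailed gaps produce bursts that still overflow the finite buffer occasionally, which is the tail effect the paper studies.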