17 results for Internal working models
in Aston University Research Archive
Abstract:
Relationships with supervisors are a major source of negative emotions at work, but little is known about why this is so. The aim of the research was to use attachment theory (Bowlby, 1969, 1973, 1980) as a framework for investigating the nature and causes of employee negative emotional experiences, in the context of their supervisory relationships. The research was conducted in three stages. In Stage 1 two studies were conducted to develop a measure of employee perceptions of supervisor caregiving (SCS). Results indicated that the 20-item scale had good reliability and validity. Stage 2 required participants (N=183) to complete a questionnaire that was designed to examine the roles of supervisor caregiving and working models (specific and global) in determining cognitive and emotional responses to hypothetical supervisor behaviours. The results provided partial support for an Independent Effects Model. Supervisor caregiving predicted specific anxiety and avoidance. In turn, both dimensions of attachment predicted negative emotions, but this relationship was mediated by event interpretation only in the case of avoidance. Global models made a smaller but significant contribution to negative emotions overall. There was no support for an interaction effect between specific and global models in determining event interpretation. In Stage 3 a sub-sample of questionnaire respondents (N=24) was interviewed about 'real-life' caregiving and negative emotional experiences in their supervisory relationships. Secure individuals experienced supervisors as consistently warm, available, and responsive. They reported few negative events or emotions. Individuals with insecure specific working models experienced rejecting or inconsistent supervisor caregiving. They were sensitised to trust and closeness issues in their relationships, and reported negative events and emotions underpinned by these themes. Overall, results broadly supported attachment theory predictions. It is concluded that an attachment theory perspective provides new insight into the nature and causes of employee negative emotions in supervisory relationships.
Abstract:
This paper formulates several mathematical models for simultaneously determining the optimal sequence of component placements and the assignment of component types to feeders (the integrated scheduling problem) for a type of surface mount technology placement machine, called the sequential pick-and-place (PAP) machine. A PAP machine has multiple stationary feeders storing components, a stationary working table holding a printed circuit board (PCB), and a movable placement head to pick up components from feeders and place them on the board. The objective of the integrated problem is to minimize the total distance traveled by the placement head. Two integer nonlinear programming models are formulated first. Then, each of them is equivalently converted into an integer linear type. The models for the integrated problem are verified by two commercial packages. In addition, a hybrid genetic algorithm previously developed by the authors is adopted to solve the models. The algorithm not only generates the optimal solutions quickly for small-sized problems, but also outperforms the genetic algorithms developed by other researchers in terms of total traveling distance.
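The objective described above can be made concrete with a small sketch: for a given placement sequence and component-to-feeder assignment, the total distance traveled by the head is the sum of moves from its current position to the feeder holding the next component and from that feeder to the placement point. This is an illustrative sketch only, not the authors' formulation; all names and coordinates below are hypothetical.

```python
# Minimal sketch (not the authors' models): evaluating the integrated PAP
# scheduling objective, i.e. total head travel distance, for one candidate
# placement sequence and feeder assignment. All coordinates are hypothetical.
from math import hypot

def total_travel_distance(sequence, feeder_of, board_xy, feeder_xy, start=(0.0, 0.0)):
    """Sum of head moves: current position -> feeder -> placement point, repeated."""
    dist, pos = 0.0, start
    for point in sequence:
        fx, fy = feeder_xy[feeder_of[point]]      # move to the feeder and pick
        dist += hypot(pos[0] - fx, pos[1] - fy)
        bx, by = board_xy[point]                  # move to the board and place
        dist += hypot(fx - bx, fy - by)
        pos = (bx, by)
    return dist

board_xy = {0: (10.0, 5.0), 1: (12.0, 8.0), 2: (15.0, 4.0)}   # placement points on the PCB
feeder_xy = {"A": (0.0, 0.0), "B": (0.0, 10.0)}               # stationary feeders
print(total_travel_distance([0, 2, 1], {0: "A", 1: "B", 2: "A"}, board_xy, feeder_xy))
```

A search method such as the authors' hybrid genetic algorithm would then explore alternative sequences and feeder assignments to minimise this quantity.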
Abstract:
This paper examines the strategic implications of resource allocation models (RAMs). Four interrelated aspects of resource allocation are discussed: degree of centralisation, locus of strategic direction, cross-subsidy, and locus of control. The paper begins with a theoretical overview of these concepts, locating the study in the contexts of both the strategic management literature and the university. The concepts are then examined empirically, drawing upon a longitudinal study of three UK universities: Warwick, the London School of Economics and Political Science (LSE), and Oxford Brookes. Findings suggest that RAMs are historically and culturally situated within the context of each university, and this is associated with different patterns of strategic direction and forms of strategic control. As such, the RAM in use may be less a matter of best practice than one of internal fit. The paper concludes with some implications for theory and practice by discussing the potential trajectories of each type of RAM.
Abstract:
Purpose – Increasing turnover of frontline staff in call centres is detrimental to the delivery of quality service to customers. This paper aims to present the context for the rapid growth of the business process outsourcing (BPO) sector in India, and to address a critical issue faced by call centre organisations in this sector – high employee turnover. Design/methodology/approach – Following a triangulation approach, two separate empirical investigations are conducted to examine various aspects of high labour turnover rates in the call centre sector in India. Study one examines the research issue via 51 in-depth interviews in as many units. Study two reports results from a questionnaire survey with 204 frontline agents across 11 call centres regarding employee turnover. Findings – This research reveals a range of reasons, from monotonous work, a stressful work environment, adverse working conditions and a lack of career development opportunities to better job opportunities elsewhere, which emerge as the key causes of increasing attrition rates in the Indian call centre industry. Research limitations/implications – The research suggests that there are several issues that need to be handled carefully by the management of call centres in India to overcome the problem of increasing employee turnover, and that this also demands support from the Indian government. Originality/value – The contributions of this study untangle the issues underlying a key problem in the call centre industry, i.e. employee turnover, in the Indian context. Adopting an internal marketing approach, it provides useful information for both academics and practitioners, suggests internal marketing interventions, and identifies avenues for future research to combat the problem of employee turnover.
Abstract:
In many models of edge analysis in biological vision, the initial stage is a linear 2nd derivative operation. Such models predict that adding a linear luminance ramp to an edge will have no effect on the edge's appearance, since the ramp has no effect on the 2nd derivative. Our experiments did not support this prediction: adding a negative-going ramp to a positive-going edge (or vice-versa) greatly reduced the perceived blur and contrast of the edge. The effects on a fairly sharp edge were accurately predicted by a nonlinear multi-scale model of edge processing [Georgeson, M. A., May, K. A., Freeman, T. C. A., & Hesse, G. S. (in press). From filters to features: Scale-space analysis of edge and blur coding in human vision. Journal of Vision], in which a half-wave rectifier comes after the 1st derivative filter. But we also found that the ramp affected perceived blur more profoundly when the edge blur was large, and this greater effect was not predicted by the existing model. The model's fit to these data was much improved when the simple half-wave rectifier was replaced by a threshold-like transducer [May, K. A. & Georgeson, M. A. (2007). Blurred edges look faint, and faint edges look sharp: The effect of a gradient threshold in a multi-scale edge coding model. Vision Research, 47, 1705-1720.]. This modified model correctly predicted that the interaction between ramp gradient and edge scale would be much larger for blur perception than for contrast perception. In our model, the ramp narrows an internal representation of the gradient profile, leading to a reduction in perceived blur. This in turn reduces perceived contrast because estimated blur plays a role in the model's estimation of contrast. Interestingly, the model predicts that analogous effects should occur when the width of the window containing the edge is made narrower. This has already been confirmed for blur perception; here, we further support the model by showing a similar effect for contrast perception. © 2007 Elsevier Ltd. All rights reserved.
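A minimal numerical sketch of the mechanism described above (not the published model, and with arbitrary scales) illustrates how half-wave rectifying the 1st-derivative response narrows the internal gradient profile when an opposite-signed ramp is added, while a purely linear 2nd-derivative stage is unaffected by the ramp.

```python
# Illustrative sketch only: an opposite-signed luminance ramp narrows the
# half-wave rectified gradient profile (a proxy for perceived blur), whereas a
# linear 2nd-derivative operator is blind to the ramp. Scales are arbitrary.
import numpy as np

x = np.linspace(-5.0, 5.0, 2001)
edge = 0.5 * (1.0 + np.tanh(x / 0.7))    # positive-going blurred edge
ramp = -0.05 * x                         # negative-going luminance ramp

def rectified_gradient_spread(lum):
    g = np.gradient(lum, x)              # 1st-derivative stage
    r = np.maximum(g, 0.0)               # half-wave rectifier
    w = r / r.sum()                      # treat the profile as a distribution
    mu = (w * x).sum()
    return np.sqrt((w * (x - mu) ** 2).sum())   # spread of the gradient profile

print("edge alone :", rectified_gradient_spread(edge))
print("edge + ramp:", rectified_gradient_spread(edge + ramp))   # narrower profile

d2 = lambda lum: np.gradient(np.gradient(lum, x), x)
print("linear 2nd derivative unchanged by the ramp:", np.allclose(d2(edge), d2(edge + ramp)))
```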
Abstract:
In this paper the exchange rate forecasting performance of neural network models is evaluated against a random walk model and a range of time series models. There are no guidelines available that can be used to choose the parameters of neural network models, and therefore the parameters are chosen according to what the researcher considers to be best. Such an approach, however, implies that the risk of making bad decisions is extremely high, which could explain why in many studies neural network models do not consistently perform better than their time series counterparts. In this paper, through extensive experimentation, the level of subjectivity in building neural network models is considerably reduced, giving them a better chance of performing well. Our results show that, in general, neural network models perform better than traditionally used time series models in forecasting exchange rates.
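As a sketch of the evaluation set-up only (synthetic data and illustrative network settings, not the paper's experiments), a naive random walk forecast can be compared with a small neural network trained on lagged values using out-of-sample RMSE. On a purely random-walk series the naive forecast is hard to beat by construction, so the point here is the comparison framework rather than the outcome.

```python
# Illustrative comparison of a random walk forecast against a small neural
# network on a synthetic exchange rate series. All settings are hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
rate = np.cumsum(rng.normal(0.0, 0.005, 1500)) + 1.3   # synthetic exchange rate

lags = 5
X = np.column_stack([rate[i:len(rate) - lags + i] for i in range(lags)])
y = rate[lags:]
split = 1000
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

rw_pred = X_te[:, -1]   # random walk: tomorrow's rate = today's rate
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
nn_pred = nn.predict(X_te)

rmse = lambda pred: np.sqrt(np.mean((pred - y_te) ** 2))
print("random walk RMSE:", rmse(rw_pred), "| neural net RMSE:", rmse(nn_pred))
```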
Abstract:
The thesis offers a comparative interdisciplinary approach to the examination of the intellectual debates about the relationship between individual and society in the GDR under Honecker. It shows that there was not only a continuum of debate between the academic disciplines, but also from the radical critics of the GDR leadership such as Robert Havemann, Rudolf Bahro and Stefan Heym, through the social scientists, literary critics and legal theorists working in the academic institutions, to theorists close to the GDR leadership. It also shows that the official line and policy of the ruling party itself on the question of the individual and society was not static over the period, but changed in response to internal and external pressures. Over the period 1971-1989 greater emphasis was placed by many intellectuals on the individual, his needs and interests. It was increasingly recognised that conflicts could exist between the individual and society in GDR socialism. Whereas the radical critics argued that these conflicts were due to features of GDR society, such as the hierarchical system of labour functions and bureaucracy, and extrapolated from this a general conflict between the political leadership and the population, orthodox critics argued that conflicts existed between a specific individual and society and were largely due to external and historical factors. The internal critics also pointed to the social phenomena which were detrimental to the individual's development in the GDR, but they put forward less radical solutions. With the exception of a few radical young writers, all theorists studied in this thesis gave precedence to social interests over individual interests and so did not advocate a return to 'individualistic' positions. The continuity of sometimes quite controversial discussions in the GDR academic journals and the flexibility of the official line and policy suggest that it is inappropriate to refer to GDR society under Honecker simply as totalitarian, although it did have some totalitarian features. What the thesis demonstrates is the existence of 'Teilöffentlichkeiten' in which critical discussion was conducted even as the official, orthodox line was given out for public consumption in the high-circulation media.
Abstract:
The thesis reports on a study into the effect upon organisations of co-operative information systems (CIS) incorporating flexible communications, group support and group working technologies. A review of the literature leads to the development of a model of effect based upon co-operative business tasks. CIS have the potential to change how co-operative business tasks are carried out, and their principal effect (or performance) may therefore be evaluated by determining to what extent they are being employed to perform these tasks. A significant feature of CIS use identified is the extent to which they may be designed to fulfil particular tasks or, by contrast, may be applied creatively by users in an emergent fashion to perform tasks. A research instrument is developed using a survey questionnaire to elicit users' judgements of the extent to which a CIS is employed to fulfil a range of co-operative tasks. This research instrument is applied to a longitudinal study of the introduction of Novell GroupWise at Northamptonshire County Council, during which qualitative as well as quantitative data were gathered. A method of analysis of questionnaire results using principles from fuzzy mathematics and artificial intelligence is developed and demonstrated. Conclusions from the longitudinal study include the importance of early experiences in setting patterns of use for CIS, the persistence of patterns of use over time, and the dominance of designed usage of the technology over emergent use.
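The abstract does not spell out the fuzzy analysis method; purely as a generic illustration of the kind of technique involved, the sketch below maps Likert-style 'extent of use' responses onto triangular fuzzy numbers and aggregates them by centroid defuzzification. The scale values and ratings are hypothetical.

```python
# Generic illustration only (not the thesis's method): aggregating Likert-style
# questionnaire responses as triangular fuzzy numbers. Numbers are hypothetical.
import numpy as np

# Triangular fuzzy numbers (low, mode, high) for a 5-point "extent of use" scale
FUZZY_SCALE = {1: (0.00, 0.00, 0.25), 2: (0.00, 0.25, 0.50), 3: (0.25, 0.50, 0.75),
               4: (0.50, 0.75, 1.00), 5: (0.75, 1.00, 1.00)}

def aggregate(responses):
    """Average the fuzzy numbers and defuzzify by the centroid of the result."""
    tri = np.array([FUZZY_SCALE[r] for r in responses])
    low, mode, high = tri.mean(axis=0)
    return (low + mode + high) / 3.0   # centroid of a triangular fuzzy number

# Hypothetical ratings of how far a CIS is used for one co-operative task
print(aggregate([4, 5, 3, 4, 2]))
```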
Abstract:
This thesis presents an effective methodology for the generation of a simulation which can be used to increase the understanding of viscous fluid processing equipment and aid in their development, design and optimisation. The Hampden RAPRA Torque Rheometer internal batch twin rotor mixer has been simulated with a view to establishing model accuracies, limitations, practicalities and uses. As this research progressed, via several 'snap-shot' analyses of rotor configurations using the commercial code Polyflow, it became evident that the model was of some worth and that its predictions were in good agreement with the validation experiments; however, several major restrictions were identified. These included poor element form, high man-hour requirements for the construction of each geometry, and the absence of the transient term in these models. All, or at least some, of these limitations apply to the numerous attempts to model internal mixers by other researchers, and it was clear that there was no generally accepted methodology to provide a practical three-dimensional model which had been adequately validated. This research, unlike others, presents a full, complex, three-dimensional, transient, non-isothermal, generalised non-Newtonian simulation with wall slip which overcomes these limitations using unmatched gridding and sliding mesh technology adapted from CFX codes. This method yields good element form and, since only one geometry has to be constructed to represent the entire rotor cycle, is extremely beneficial for detailed flow field analysis when used in conjunction with user-defined programmes and automatic geometry parameterisation (AGP), and improves accuracy for investigating equipment design and operating conditions. Model validation has been identified as an area neglected by other researchers in this field, especially for time-dependent geometries, and has been rigorously pursued here in terms of qualitative and quantitative velocity vector analysis of the isothermal, full-fill mixing of generalised non-Newtonian fluids, as well as torque comparison, with a relatively high degree of success. This indicates that CFD models of this type can be accurate and perhaps have not been validated to this extent previously because of the inherent difficulties arising from most real processes.
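The abstract does not state which generalised non-Newtonian constitutive law was used; as an illustration of the class of shear-rate-dependent viscosity models such a simulation relies on, the sketch below evaluates a Carreau-Yasuda apparent viscosity over a range of shear rates, with hypothetical parameter values.

```python
# Illustration of a generalised Newtonian constitutive law (Carreau-Yasuda),
# not the specific model used in the thesis. All parameter values are
# hypothetical placeholders.
import numpy as np

def carreau_yasuda(shear_rate, eta0=1.0e4, eta_inf=10.0, lam=1.0, a=2.0, n=0.3):
    """Apparent viscosity (Pa.s) as a function of shear rate (1/s)."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)

for g in (0.1, 1.0, 10.0, 100.0):
    print(f"shear rate {g:6.1f} 1/s -> apparent viscosity {carreau_yasuda(g):10.1f} Pa.s")
```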
Abstract:
Formative measurement has seen increasing acceptance in organizational research since the turn of the 21st century. However, in more recent times, a number of criticisms of the formative approach have appeared. Such work argues that formatively-measured constructs are empirically ambiguous and thus flawed in a theory-testing context. The aim of the present paper is to examine the underpinnings of formative measurement theory in light of theories of causality and ontology in measurement in general. In doing so, a thesis is advanced which draws a distinction between reflective, formative, and causal theories of latent variables. This distinction is shown to be advantageous in that it clarifies the ontological status of each type of latent variable, and thus provides advice on appropriate conceptualization and application. The distinction also reconciles in part both recent supportive and critical perspectives on formative measurement. In light of this, advice is given on how most appropriately to model formative composites in theory-testing applications, placing the onus on the researcher to make clear their conceptualization and operationalization.
Abstract:
Numerous studies find that monetary models of exchange rates cannot beat a random walk model. Such a finding, however, is not surprising given that such models are built upon money demand functions, and traditional money demand functions appear to have broken down in many developed countries. In this paper we investigate whether using a more stable underlying money demand function results in improvements in forecasts from monetary models of exchange rates. More specifically, we use a sweep-adjusted measure of the US monetary aggregate M1, which has been shown to have a more stable money demand function than the official M1 measure. The results suggest that monetary models of exchange rates contain information about future movements of exchange rates, but the success of such models depends on the stability of money demand functions and the specifications of the models.
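For reference (an illustrative textbook form, not necessarily the exact specification estimated in the paper), the flexible-price monetary model combines a money demand function, $m_t - p_t = \phi y_t - \lambda i_t$ (with an analogous foreign-country equation and, for simplicity, identical parameters), with purchasing power parity, $s_t = p_t - p_t^*$, to give

$s_t = (m_t - m_t^*) - \phi (y_t - y_t^*) + \lambda (i_t - i_t^*)$,

so any instability in the money demand parameters $\phi$ and $\lambda$ carries over directly into the exchange rate equation; a sweep-adjusted M1 series simply replaces the official $m_t$ in such a specification.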
Abstract:
In this paper, we empirically examine how professional service firms are adapting their promotion and career models to new market and institutional pressures without losing the benefits of the traditional up-or-out tournament. Based on an in-depth qualitative study of 10 large UK-based law firms, we find that most of these firms do not have a formal up-or-out policy but that the up-or-out rule operates in practice. We also find that most firms have introduced alternative roles and a novel career policy that offers a holistic learning and development deal to associates without any expectation that unsuccessful candidates for promotion to partner should quit the firm. While this policy and the new roles formally contradict the principle of up-or-out by creating permanent non-partner positions, in practice they coexist. We conclude that the motivational power of the up-or-out tournament remains intact, notwithstanding the changes to the internal labour market structure of these professional service firms.
Abstract:
A number of professional sectors have recently moved away from their longstanding career model of up-or-out promotion and embraced innovative alternatives. Professional labor is a critical resource in professional service firms. Therefore, changes to these internal labor markets are likely to trigger other innovations, for example in knowledge management, incentive schemes and team composition. In this chapter we look at how new career models affect the core organizing model of professional firms and, in turn, their capacity for and processes of innovation. We consider how professional firms link the development of human capital and the division of professional labor to distinctive demands for innovation and how novel career systems help them respond to these demands.
Abstract:
This report is based on discussions and submissions from an expert working group consisting of veterinarians, animal care staff and scientists with expert knowledge relevant to the field and aims to facilitate the implementation of the Three Rs (replacement, reduction and refinement) in the use of animal models or procedures involving seizures, convulsions and epilepsy. Each of these conditions will be considered, the specific welfare issues discussed, and practical measures to reduce animal use and suffering suggested. The emphasis is on refinement since this has the greatest potential for immediate implementation, and some general issues for refinement are summarised to help achieve this, with more detail provided on a range of specific refinements.
Abstract:
The appraisal and relative performance evaluation of nurses are very important and beneficial for both nurses and employers in an era of clinical governance, increased accountability and high standards of health care services. They enhance and consolidate the knowledge and practical skills of nurses through the identification of training and career development plans, as well as improving the quality of health care services, increasing job satisfaction and making cost-effective use of resources. In this paper, a data envelopment analysis (DEA) model is proposed for the appraisal and relative performance evaluation of nurses. The model is validated on thirty-two nurses working at an Intensive Care Unit (ICU) at one of the most recognized hospitals in Lebanon. The DEA model was able to classify nurses as efficient or inefficient. The set of efficient nurses was used to establish an internal best-practice benchmark from which to project career development plans for improving the performance of the inefficient nurses. The DEA results confirmed the ranking of some nurses and highlighted injustice in other cases produced by the currently practiced appraisal system. Further, the DEA model is shown to be an effective talent management and motivational tool, as it can provide clear managerial plans related to promotion, training and development activities from the perspective of nurses, hence increasing their satisfaction, motivation and acceptance of appraisal results. Due to these features, the model is currently being considered for implementation at the ICU. Finally, the ratio of the number of DEA units to the number of input/output measures is revisited, with new suggested values for its upper and lower limits depending on the type of DEA model and the desired number of efficient units from a managerial perspective.
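As a sketch of the kind of model involved (a generic input-oriented CCR formulation with toy data, not necessarily the paper's exact specification), each nurse is treated as a decision-making unit and a score of 1 places her on the best-practice frontier used as the internal benchmark.

```python
# Generic input-oriented CCR DEA sketch (not necessarily the paper's model):
# each nurse is a decision-making unit (DMU); efficiency = 1 marks the
# best-practice frontier. Inputs, outputs and figures below are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Rows = nurses (DMUs); toy figures only
X = np.array([[40.0, 5.0], [38.0, 8.0], [45.0, 6.0], [36.0, 4.0]])    # inputs, e.g. hours worked, errors
Y = np.array([[30.0, 0.90], [28.0, 0.80], [35.0, 0.95], [25.0, 0.85]]) # outputs, e.g. patients treated, quality score

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR envelopment LP: minimise theta for DMU o."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                            # decision vars: [theta, lambdas]
    A_in = np.hstack([-X[[o]].T, X.T])                     # sum_j lam_j * x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])   # sum_j lam_j * y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(X.shape[0]):
    print(f"nurse {o}: CCR efficiency = {ccr_efficiency(o, X, Y):.3f}")
```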