28 results for D63 - Equity, Justice, Inequality, and Other Normative Criteria and Measurement
in Aston University Research Archive
Abstract:
Wage inequality is a particular focus of attention not only in public debates over the need for social regulation to support equity, but also in debates over the implications of social regulation for productive performance. The present paper employs panel techniques to examine the comparative historical relationship between wage inequality and hourly labour productivity growth in the manufacturing sectors of nine advanced industrialised nations over the period 1970-1995. The results show that whilst greater inequality in the top half of the wage distribution is associated with higher productivity growth, greater inequality in the bottom half is associated with lower productivity growth. It appears that whilst wage inequality in the top half of the distribution productively motivates higher earners, wage inequality in the bottom half is detrimental to productivity performance. The latter result is most likely attributable to the weak incentives to reorganise production where extremely low pay is feasible.
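The panel techniques the abstract refers to typically centre on a fixed-effects (within) estimator. A minimal sketch of that estimator follows; the unit names and values are invented for illustration and are not the paper's data.

```python
# Minimal sketch of a one-regressor fixed-effects (within) estimator,
# the kind of panel technique used to relate wage inequality to
# productivity growth. All data below are invented.

def within_estimator(panel):
    """panel: {unit: [(x, y), ...]} -> OLS slope on within-unit demeaned data."""
    dx, dy = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            dx.append(x - mx)   # demeaning removes fixed unit effects
            dy.append(y - my)
    return sum(a * b for a, b in zip(dx, dy)) / sum(a * a for a in dx)

panel = {
    "country_a": [(1.0, 2.1), (2.0, 4.0), (3.0, 6.2)],
    "country_b": [(1.5, 1.0), (2.5, 3.1), (3.5, 4.9)],
}
beta = within_estimator(panel)
```

Demeaning each unit's observations strips out time-invariant country effects, so the slope reflects only within-country co-movement.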
Abstract:
This article presents part of the findings of a multi-method study into employee perceptions of fairness in relation to the organisational career management (OCM) practices of a large financial retailer. It focuses on exploring how employees construct fairness judgements of their career experiences and the role played by the organisational context and, in particular, OCM practices in forming these judgements. It concludes that individuals can, and do, separate the source and content of (in)justice when evaluating these experiences. The relative roles of the employer, line manager and career development opportunities in influencing employee fairness evaluations are discussed. Conceptual links with organisational justice theory are proposed, and empirical evidence is offered to both academics and practitioners for a new theoretical framework for evaluating employee perceptions of, and reactions to, OCM practices.
Abstract:
Despite the increased attention on the impacts of globalisation, there has been little empirical investigation into the impact of multinational firms on the domestic labour market, and in particular on wage inequality. This is in spite of a rapid increase in foreign direct investment (FDI) at around the same time as rising inequality. Using UK panel data, this paper tests whether inward flows of FDI have contributed to increasing wage inequality. Even after controlling for the two most common explanations of wage inequality, technology and trade, we find that FDI has a significant effect upon wage inequality, with the overall impact of FDI explaining on average 11% of wage inequality. © 2003 Elsevier B.V. All rights reserved.
Abstract:
Although there is a large body of research on brand equity, little has been published by way of a literature review since Feldwick's (1996) paper. To address this gap, this paper brings together the scattered literature on the conceptualisation and measurement of consumer-based brand equity. Measures of consumer-based brand equity are classified as either direct or indirect. Indirect measures assess consumer-based brand equity through its demonstrable dimensions and are superior at a diagnostic level. The paper concludes with directions for future research and managerial pointers for setting up a brand equity measurement system.
Abstract:
This article explores the intersections of silence and transitional justice in Serbia, where, it is often suggested, the general public is silent and indifferent about human rights abuses that took place during the former Yugoslav conflicts. It considers both the 'silent' public and the ways in which transitional justice may be complicit in silencing it. Based on scholarship that suggests silences are not absences but rather sites of silent knowledge or a result of silencing, the article explores some of the dynamics hidden within the public's silence: shared knowledge, secret practices and inability to discuss violence. It also considers the ways in which audiences subvert and resist organized transitional justice initiatives or are caught up in a 'silent dilemma' in which they are unable to speak about the past under the discursive conditions created by transitional justice practitioners.
Abstract:
Background: There is a paucity of data describing the prevalence of childhood refractive error in the United Kingdom. The Northern Ireland Childhood Errors of Refraction study, along with its sister study the Aston Eye Study, are the first population-based surveys of children using both random cluster sampling and cycloplegic autorefraction to quantify levels of refractive error in the United Kingdom.
Methods: Children aged 6–7 years and 12–13 years were recruited from a stratified random sample of primary and post-primary schools, representative of the population of Northern Ireland as a whole. Measurements included assessment of visual acuity, oculomotor balance, ocular biometry and cycloplegic binocular open-field autorefraction. Questionnaires were used to identify putative risk factors for refractive error.
Results: 399 (57%) of the 6–7 year olds and 669 (60%) of the 12–13 year olds participated. School participation rates did not vary statistically significantly with the size of the school, whether the school is urban or rural, or whether it is in a deprived/non-deprived area. The gender balance, ethnicity and type of schooling of participants are reflective of the Northern Ireland population.
Conclusions: The study design, sample size and methodology will ensure accurate measures of the prevalence of refractive errors in the target population and will facilitate comparisons with other population-based refractive data.
Abstract:
This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings, and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on the reduction of many work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale, rather than a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
Abstract:
Whereas the competitive advantage of firms can arise from size and position within their industry as well as physical assets, the pattern of competition in advanced economies has increasingly come to favour those firms that can mobilise knowledge and technological skills to create novelty in their products. At the same time, regions are attracting growing attention as an economic unit of analysis, with firms increasingly locating their functions in select regions within the global space. This article introduces the concept of knowledge competitiveness, defined as an economy’s knowledge capacity, capability and sustainability, and the extent to which this knowledge is translated into economic value and transferred into the wealth of the citizens. The article discusses the way in which the knowledge competitiveness of regions is measured and further introduces the World Knowledge Competitiveness Index, which is the first composite and relative measure of the knowledge competitiveness of the globe’s best performing regions.
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods of little relevance in practice and sophisticated mathematical theory that is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and in which a number of plants and inter-related units provide components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised model have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two deterministic models, called sensitivity analysis and deterministic appraisal, and a third, stochastic model, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy, and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and normality of their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurements and planning models to the British motor vehicle manufacturing companies are presented and discussed.
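The risk-simulation idea described above, Monte Carlo draws over independent, normally distributed component variables of an added-value productivity ratio, can be sketched briefly. All means and spreads below are invented for illustration; they are not the thesis's British Leyland data.

```python
import random

# Illustrative Monte Carlo sketch of the risk-simulation model:
# added-value productivity = (sales - bought-in items) / employment cost,
# with each component drawn from an independent normal distribution
# (the independence and normality assumptions stated in the abstract).
# All figures are invented.

random.seed(0)

def simulate(n=10_000):
    ratios = []
    for _ in range(n):
        sales = random.gauss(100.0, 8.0)     # sales revenue
        bought_in = random.gauss(60.0, 6.0)  # materials, parts, services
        emp_cost = random.gauss(25.0, 2.0)   # employment cost
        added_value = sales - bought_in
        ratios.append(added_value / emp_cost)
    ratios.sort()
    return ratios

r = simulate()
mean = sum(r) / len(r)
p5, p95 = r[len(r) // 20], r[-(len(r) // 20)]  # rough 5th/95th percentiles
```

Rather than a single planned productivity value, the simulation yields a distribution, so planners can read off a central estimate alongside pessimistic and optimistic bounds.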
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
Abstract:
Organizations can use the valuable tool of data envelopment analysis (DEA) to make informed decisions on developing successful strategies, setting specific goals, and identifying underperforming activities to improve the output or outcome of performance measurement. The Handbook of Research on Strategic Performance Management and Measurement Using Data Envelopment Analysis highlights the advantages of using DEA as a tool to improve business performance and identify sources of inefficiency in public and private organizations. These recently developed theories and applications of DEA will be useful for policymakers, managers, and practitioners in the areas of sustainable development of our society including environment, agriculture, finance, and higher education sectors.
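In the simplest single-input, single-output case, DEA (CCR) efficiency reduces to each unit's output/input ratio relative to the best ratio in the sample, which conveys the idea without a linear-programming solver. The unit names and figures below are invented for illustration; the handbook's general multi-input, multi-output models require an LP formulation.

```python
# Minimal DEA sketch for the single-input, single-output CCR case:
# each unit's efficiency is its output/input ratio divided by the best
# ratio observed, so the frontier unit scores exactly 1.0.
# Unit names and figures are invented.

def dea_efficiency(units):
    """units: {name: (input, output)} -> {name: efficiency in (0, 1]}."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

units = {
    "branch_a": (10.0, 30.0),
    "branch_b": (8.0, 16.0),
    "branch_c": (5.0, 20.0),
}
eff = dea_efficiency(units)  # branch_c defines the frontier here
```

A score below 1.0 flags an underperforming unit and quantifies how far it sits from the efficient frontier, which is exactly the diagnostic use the handbook describes.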
Abstract:
Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. This process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions, which resemble the ranges employed in practice under stable and efficient conditions. Data were collected at steady-state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by the addition of stages at the column entrances. Two parameters were incorporated in the model: a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimising the differences between the experimental and the model-predicted concentration profiles at steady-state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and incorporated in the model equations. The model equations comprise a stiff differential-algebraic system, which was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured.
A very good agreement between the two profiles was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) system as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection. For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops rotor speed-raffinate concentration and solvent flowrate-extract concentration showed weak interaction.
Multivariable MPC showed more effective performance than the conventional techniques, since it accounts for loop interactions, time delays, and input-output variable constraints.
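For a 2x2 process like this column, the RGA pairing criterion mentioned in the abstract has a closed form: with steady-state gain matrix K = [[k11, k12], [k21, k22]], the first relative gain is lambda11 = 1 / (1 - (k12*k21)/(k11*k22)), and every row and column of the RGA sums to 1. A minimal sketch follows; the gain values are invented, not identified from the thesis's column.

```python
# Sketch of the 2x2 relative gain array (RGA) used for loop pairing.
# lambda11 near 1 favours the diagonal pairing (loop 1-1 with loop 2-2);
# the gain values below are invented for illustration.

def rga_2x2(k11, k12, k21, k22):
    """Return the RGA of a 2x2 steady-state gain matrix."""
    lam11 = 1.0 / (1.0 - (k12 * k21) / (k11 * k22))
    # rows and columns of an RGA each sum to 1
    return [[lam11, 1.0 - lam11],
            [1.0 - lam11, lam11]]

# Strong diagonal gains with weak cross-coupling suggest pairing
# rotor speed with raffinate concentration and solvent flowrate
# with extract concentration, matching the weak interaction reported.
rga = rga_2x2(2.0, 0.3, 0.4, 1.5)
```

A diagonal relative gain close to 1 indicates the diagonal pairing sees little interaction from the other loop, which is the basis for the decentralised pairing the abstract describes.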