16 results for New career models
in Aston University Research Archive
Abstract:
In this paper, we empirically examine how professional service firms are adapting their promotion and career models to new market and institutional pressures, without losing the benefits of the traditional up-or-out tournament. Based on an in-depth qualitative study of 10 large UK-based law firms, we find that most of these firms do not have a formal up-or-out policy but that the up-or-out rule operates in practice. We also find that most firms have introduced alternative roles and a novel career policy that offers a holistic learning and development deal to associates without any expectation that unsuccessful candidates for promotion to partner should quit the firm. While this policy and the new roles formally contradict the principle of up-or-out by creating permanent non-partner positions, in practice they coexist. We conclude that the motivational power of the up-or-out tournament remains intact, notwithstanding the changes to the internal labour market structure of these professional service firms.
Abstract:
A number of professional sectors have recently moved away from their longstanding career model of up-or-out promotion and embraced innovative alternatives. Professional labor is a critical resource in professional service firms. Therefore, changes to these internal labor markets are likely to trigger other innovations, for example in knowledge management, incentive schemes and team composition. In this chapter we look at how new career models affect the core organizing model of professional firms and, in turn, their capacity for and processes of innovation. We consider how professional firms link the development of human capital and the division of professional labor to distinctive demands for innovation and how novel career systems help them respond to these demands.
Abstract:
How are innovative new business models established if organizations constantly compare themselves against existing criteria and expectations? The objective is to address this question from the perspective of innovators and their ability to redefine established expectations and evaluation criteria. The research questions ask whether there are discernible patterns of discursive action through which innovators theorize institutional change and what role such theorizations play in mobilizing support and realizing change projects. These questions are investigated through a case study on a critical area of enterprise computing software, Java application servers. In the present case, business practices and models were already well established among incumbents, with critical market areas allocated to a few dominant firms. Fringe players started experimenting with a new business approach of selling services around freely available open-source application servers. While most new players struggled, one new entrant succeeded in leading incumbents to adopt and compete on the new model. The case demonstrates that innovative and substantially new models and practices are established in organizational fields when innovators are able to redefine expectations and evaluation criteria within an organizational field. The study addresses the theoretical paradox of embedded agency. Actors who are embedded in prevailing institutional logics and structures find it hard to perceive potentially disruptive opportunities that fall outside existing ways of doing things. Changing prevailing institutional logics and structures requires strategic and institutional work aimed at overcoming barriers to innovation. The study addresses this problem through the lens of (new) institutional theory, using a discourse methodology that traces the process through which innovators were able to establish a new social and business model in the field.
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. In conventional DEA with input and/or output ratios, all data are assumed to take the form of crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems that are difficult or unsuitable to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios, where the traditional indicators are mostly financial ratios, to demonstrate the applicability and efficacy of the proposed models.
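The paper's interval-ratio formulations are not reproduced in this abstract. For orientation only, the sketch below is a minimal Python implementation of the standard input-oriented CCR envelopment model with crisp data, which the proposed interval models extend; the function name and the toy data are illustrative assumptions, not the paper's 20-bank case study.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: (m, n) array of inputs, Y: (s, n) array of outputs for n DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    bounds = [(None, None)] + [(0, None)] * n  # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

# Toy data: 2 inputs, 1 output, 4 DMUs (hypothetical)
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(X.shape[1])])
```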
Abstract:
Technological advancements enable new sourcing models in software development such as cloud computing, software-as-a-service, and crowdsourcing. While the first two are perceived as a re-emergence of older models (e.g., ASP), crowdsourcing is a new model that creates an opportunity for a global workforce to compete with established service providers. Organizations engaging in crowdsourcing need to develop the capabilities to successfully utilize this sourcing model in delivering services to their clients. To explore these capabilities, we collected qualitative data from focus groups with crowdsourcing leaders at a large technology organization. The new capabilities we identified stem from the traditional service provider's need to assume a "client" role in the crowdsourcing context, while still acting as a "vendor" in providing services to the end client. This paper expands the research on vendor capabilities and IS outsourcing and offers important insights to organizations that are experimenting with, or considering, crowdsourcing.
Abstract:
Most prior new product diffusion (NPD) models do not specifically consider the role of the business model in the process. However, the context of NPD in today's market has been changed dramatically by the introduction of new business models. Through reinterpretation and extension, this paper empirically examines the feasibility of applying Bass-type NPD models to products that are commercialized by different business models. More specifically, the results and analysis of this study consider the subscription business model for service products, the freemium business model for digital products, and a pre-paid and post-paid business model that is widely used by mobile network providers. The paper offers new insights derived from implementing the models in real-life cases. It also highlights three themes for future research.
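The abstract does not restate the Bass model itself. As a point of reference, the sketch below evaluates the closed-form Bass diffusion curve that underlies "Bass-type" NPD models; the parameter values (innovation coefficient p, imitation coefficient q, market potential m) are illustrative assumptions, not estimates from the paper's cases.

```python
import numpy as np

def bass_cumulative(t, p, q):
    """Cumulative adoption fraction F(t) of the Bass diffusion model."""
    e = np.exp(-(p + q) * t)
    return (1.0 - e) / (1.0 + (q / p) * e)

def bass_adoptions(t, p, q, m):
    """Non-cumulative adoptions n(t) = m * dF/dt."""
    e = np.exp(-(p + q) * t)
    return m * ((p + q) ** 2 / p) * e / (1.0 + (q / p) * e) ** 2

# Hypothetical parameters: innovation p, imitation q, market potential m
t = np.arange(0, 15)
p, q, m = 0.03, 0.38, 1_000_000
print(np.round(m * bass_cumulative(t, p, q)))  # cumulative adopters per period
print(np.round(bass_adoptions(t, p, q, m)))    # adopters per unit time
```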
Abstract:
A CSSL-type modular FORTRAN package, called ACES, has been developed to assist in the simulation of the dynamic behaviour of chemical plant. ACES can be harnessed, for instance, to simulate the transients in startups or after a throughput change. ACES has benefited from two existing simulators. The structure was adapted from ICL SLAM and most plant models originate in DYFLO. The latter employs sequential modularisation which is not always applicable to chemical engineering problems. A novel device of twice-round execution enables ACES to achieve general simultaneous modularisation. During the FIRST ROUND, STATE-VARIABLES are retrieved from the integrator and local calculations performed. During the SECOND ROUND, fresh derivatives are estimated and stored for simultaneous integration. ACES further includes a version of DIFSUB, a variable-step integrator capable of handling stiff differential systems. ACES is highly formalised. It does not use pseudo steady-state approximations and excludes inconsistent and arbitrary features of DYFLO. Built-in debug traps make ACES robust. ACES shows generality, flexibility, versatility and portability, and is very convenient to use. It undertakes substantial housekeeping behind the scenes and thus minimises the detailed involvement of the user. ACES provides a working set of defaults for simulation to proceed as far as possible. Built-in interfaces allow for reactions and user-supplied algorithms to be incorporated. New plant models can be easily appended. Boundary-value problems and optimisation may be tackled using the RERUN feature. ACES is file oriented; a STATE can be saved in a readable form and reactivated later. Thus piecewise simulation is possible. ACES has been illustrated and verified to a large extent using some literature-based examples. Actual plant tests are desirable however to complete the verification of the library. Interaction and graphics are recommended for future work.
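The abstract describes the twice-round scheme and the DIFSUB stiff integrator but gives no code. As an illustration only, the sketch below shows the same pattern with a modern analogue: a derivative routine that retrieves states from the integrator, performs local module calculations, and returns fresh derivatives for simultaneous integration by a variable-step stiff (BDF) solver. The reactor model and all numbers are hypothetical, not taken from the ACES library.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical non-isothermal CSTR (concentration, temperature), for illustration.
# The derivative routine plays the role of a plant module: states are retrieved
# from the integrator, local calculations performed, and derivatives returned
# for simultaneous integration.
def plant(t, y):
    cA, T = y
    k = 7.2e10 * np.exp(-8750.0 / T)            # Arrhenius rate (fast, stiff term)
    dcA = (1.0 - cA) / 100.0 - k * cA           # mole balance
    dT = (350.0 - T) / 100.0 + 209.0 * k * cA   # energy balance
    return [dcA, dT]

# BDF is a variable-step method for stiff systems, analogous in role to DIFSUB.
sol = solve_ivp(plant, (0.0, 500.0), [0.5, 350.0], method="BDF", rtol=1e-6)
print(sol.y[:, -1])   # state at the end of the startup transient
```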
Abstract:
Risk and knowledge are two concepts and components of business management which have so far been studied almost independently. This is especially true where risk management (RM) is conceived mainly in financial terms, as, for example, in the financial institutions sector. Financial institutions are affected by internal and external changes, with the consequent accommodation to new business models, new regulations and new global competition that includes new big players. These changes induce financial institutions to develop different methodologies for managing risk, such as the enterprise risk management (ERM) approach, in order to adopt a holistic view of risk management and, consequently, to deal with different types of risk, levels of risk appetite, and policies in risk management. However, the methodologies for analysing risk do not explicitly include knowledge management (KM). This research examines the potential relationships between KM and two RM concepts: perceived quality of risk control and perceived value of ERM. To fulfil the objective of identifying how KM concepts can have a positive influence on some RM concepts, a literature review of KM and RM and their respective processes was performed. From this literature review, eight hypotheses were analysed using a classification into people, process and technology variables. The data for this research were gathered from a survey of risk management employees in financial institutions, and 121 responses were analysed. The analysis of the data was based on multivariate techniques, more specifically stepwise regression analysis. The results showed that the perceived quality of risk control is significantly associated with the following variables: perceived quality of risk knowledge sharing, perceived quality of communication among people, web channel functionality, and risk management information system functionality. However, the relationships of the KM variables to the perceived value of ERM could not be identified because of the poor performance of the models describing these relationships. The analysis reveals important insights into the potential KM support to RM, such as: the better the adoption of KM people and technology actions, the better the perceived quality of risk control. Equally, the results suggest that the quality of risk control and the benefits of ERM follow different patterns, given that there is no correlation between the two concepts and that the KM variables have a distinct influence on each concept. The ERM scenario is different from that of risk control because ERM, as an answer to RM failures and adaptation to new regulation in financial institutions, has led organizations to adopt new processes, technologies, and governance models. Thus, the search for factors influencing the perceived value of ERM implementation needs additional analysis, because improvements in individual RM processes do not have the same effect on the perceived value of ERM. Based on these model results and the literature review, the basis of the ERKMAS (Enterprise Risk Knowledge Management System) is presented.
Abstract:
There has been concern in the literature about the adequacy of the traditional model of marketing planning, which focuses on what decisions should be made and not on how to make them. The aim of this article is to offer a new conceptualisation that proposes key management processes about how marketing planning decisions are made in a dynamic context. The motives for this conceptualisation are to contribute to understanding by advancing the traditional model of marketing planning, to stimulate academic and practitioner debate about how marketing planning decisions are made, and to initiate new directions in marketing planning research. Two new competing models of marketing planning are developed, which address key management processes about how marketing planning decisions are made in a dynamic context, and research directions are proposed.
Abstract:
This thesis presents an approach to cutting dynamics during turning based upon the mechanism of deformation of work material around the tool nose known as "ploughing". Starting from the shearing process in the cutting zone and accounting for "ploughing", new mathematical models relating turning force components to cutting conditions, tool geometry and tool vibration are developed. These models are developed separately for steady state and for oscillatory turning with new and worn tools. Experimental results are used to determine mathematical functions expressing the parameters introduced by the steady state model in the case of a new tool. The forms of these functions are of general validity, though their coefficients are dependent on work and tool materials. Good agreement is achieved between experimental and predicted forces. On one hand, the model is extended to include different work materials by introducing a hardness factor. The model provides good predictions when predicted forces are compared to present and published experimental results. On the other hand, the extension of the ploughing model to turning with a worn edge showed the ability of the model to predict machining forces during steady state turning with the worn flank of the tool. In the development of the dynamic models, the dynamic turning force equations define the cutting process as a system for which vibration of the tool tip in the feed direction is the input and measured forces are the output. The model takes into account the shear plane oscillation and the variation of the cutting configuration in response to tool motion. Theoretical expressions of the turning forces are obtained for new and worn cutting edges. The dynamic analysis revealed the interaction between the cutting mechanism and the machine tool structure. The effect of the machine tool and tool post is accounted for by using experimental data on the transfer function of the tool post system. Steady state coefficients are corrected to include the changes in the cutting configuration with tool vibration and are used in the dynamic model. A series of oscillatory cutting tests at various conditions and various tool flank wear levels were carried out, and experimental results are compared with model-predicted forces. Good agreement between predictions and experiments was achieved over a wide range of cutting conditions. This research bridges the gap between the analysis of vibration and of turning forces in turning. It offers an explicit expression of the dynamic turning force generated during machining and highlights the relationships between tool wear, tool vibration and turning force. Spectral analysis of tool acceleration and turning force components led to the definition of an "Inertance Power Ratio" as a flank wear monitoring factor. A formulation of an on-line flank wear monitoring methodology is presented, showing how the results of the present model can be applied to practical in-process tool wear monitoring in turning operations.
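The thesis's exact definition of the "Inertance Power Ratio" is not given in the abstract. The sketch below is one plausible reading, for illustration only: the ratio of acceleration spectral power to force spectral power in a chosen frequency band, estimated from sampled tool acceleration and feed-force signals. The band, sampling rate and synthetic signals are all hypothetical.

```python
import numpy as np
from scipy.signal import welch

def inertance_power_ratio(accel, force, fs, band=(500.0, 2000.0)):
    """Ratio of acceleration spectral power to force spectral power in a band.
    Illustrative interpretation only; the thesis's definition is not stated
    in the abstract."""
    f, p_acc = welch(accel, fs=fs, nperseg=1024)
    _, p_for = welch(force, fs=fs, nperseg=1024)
    sel = (f >= band[0]) & (f <= band[1])
    # Welch bins are equally spaced, so the ratio of sums equals the ratio of
    # band-integrated powers.
    return p_acc[sel].sum() / p_for[sel].sum()

# Synthetic signals (hypothetical): sampled tool acceleration and feed force
fs = 10_000.0
t = np.arange(0, 1.0, 1.0 / fs)
accel = np.sin(2 * np.pi * 800 * t) + 0.1 * np.random.randn(t.size)
force = 50.0 + 5.0 * np.sin(2 * np.pi * 800 * t) + np.random.randn(t.size)
print(inertance_power_ratio(accel, force, fs))
```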
Abstract:
Purpose – This paper aims to explore the antecedents of careerist orientations to work. Hypotheses are drawn from referent cognitions theory. First, it is proposed that trust mediates the relationship between an individual's perceptions of procedural justice and their careerist orientations to work. Second, perceptions of distributive justice, regarding the allocation of career development opportunities, will moderate the relationship between trust and careerist orientations to work. Design/methodology/approach – A total of 325 employees of a large UK financial institution completed a structured questionnaire. Regression analysis (using SPSS version 11) was used to test the presented hypotheses. Findings – All hypotheses were confirmed. However, the interaction effect observed was different from that hypothesised. It appears that trust only matters, in terms of the development of careerist orientations to work, when individuals feel that they are receiving equitable career development opportunities. Research limitations/implications – Much more research is required in different organisational contexts if one is to fully confirm and understand these relationships. However, these findings suggest that employers will only reduce the development of careerist attitudes in their workforce if they ensure the fair distribution of career development opportunities and engender trusting relations through the implementation of fair decision-making procedures. Originality/value – This paper adds much-needed empirical research to the literature on new career realities and careerist orientations to work. Moreover, referent cognitions theory is presented as a new theoretical framework for understanding the cognitive processes involved in an individual's development of careerist attitudes.
Abstract:
Risk and knowledge are two concepts and components of business management which have so far been studied almost independently. This is especially true where risk management is conceived mainly in financial terms, as, for example, in the banking sector. The banking sector has sophisticated methodologies for managing risk, such as mathematical risk modeling. However, the methodologies for analyzing risk do not explicitly include knowledge management for risk knowledge creation and risk knowledge transfer. Banks are affected by internal and external changes, with the consequent accommodation to new business models, new regulations and the competition of big players around the world. Thus, banks have different levels of risk appetite and policies in risk management. This paper takes into consideration that business models are changing and that management is looking across the organization to identify the influence of strategic planning, information systems theory, risk management and knowledge management. These disciplines can handle the risks affecting banking that arise from different areas, but only if they work together. This creates a need to view them in an integrated way. This article sees enterprise risk management as a specific application of knowledge in order to control deviation from strategic objectives, shareholders' values and stakeholders' relationships. Before and after a modeling process, it is necessary to find insights into how the application of knowledge management processes can improve the understanding of risk and the implementation of enterprise risk management. The article presents a proposed methodology that contributes to providing a guide for developing risk modeling knowledge and reducing knowledge silos, in order to improve the quality and quantity of solutions related to risk inquiries across the organization.
Abstract:
Crowdsourcing platforms that attract a large pool of potential workforce allow organizations to reduce permanent staff levels. However, managing this "human cloud" requires new management models and skills. Therefore, Information Technology (IT) service providers engaging in crowdsourcing need to develop new capabilities to successfully utilize crowdsourcing in delivering services to their clients. To explore these capabilities, we collected qualitative data from focus groups with crowdsourcing leaders at a large multinational technology organization. The new capabilities we identified stem from the traditional service provider's need to assume a "client" role in the crowdsourcing context, while still acting as a "vendor" in providing services to the end client. This paper expands the research on vendor capabilities and IT outsourcing and offers important insights to organizations that are experimenting with, or considering, crowdsourcing.
Abstract:
The objective of this study is to demonstrate the use of the weak form partial differential equation (PDE) method for finite-element (FE) modeling of a new constitutive relation without the need for user subroutine programming. Viscoelastic asphalt mixtures were modeled by the weak form PDE-based FE method as examples in this paper. A solid-like generalized Maxwell model was used to represent the deformation mechanism of a viscoelastic material; its constitutive relations were derived and implemented in the weak form PDE module of Comsol Multiphysics, a commercial FE program. The weak form PDE modeling of viscoelasticity was verified by comparing Comsol and Abaqus simulations that employed the same loading configurations and material property inputs in virtual laboratory test simulations. Both produced identical results in terms of axial and radial strain responses. The weak form PDE modeling of viscoelasticity was further validated by comparing the weak form PDE predictions with real laboratory test results for six types of asphalt mixtures with two air void contents and three aging periods. The viscoelastic material properties, such as the coefficients of a Prony series model for the relaxation modulus, were obtained by conversion from the master curves of dynamic modulus and phase angle. Strain responses of compressive creep tests at three temperatures and cyclic load tests were predicted using the weak form PDE modeling and found to be comparable with the measurements from the real laboratory tests. It was demonstrated that weak form PDE-based FE modeling can serve as an efficient method to implement new constitutive models and can free engineers from user subroutine programming.
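The Prony series coefficients used in the paper were converted from dynamic modulus and phase angle master curves and are not listed in the abstract. The sketch below simply evaluates a generalized Maxwell (Prony series) relaxation modulus of the kind referred to; the coefficient values are hypothetical placeholders, not the paper's measured properties.

```python
import numpy as np

def relaxation_modulus(t, e_inf, e_i, tau_i):
    """Generalized Maxwell (Prony series) relaxation modulus:
    E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    t = np.atleast_1d(t)[:, None]
    return e_inf + np.sum(e_i * np.exp(-t / tau_i), axis=1)

# Hypothetical Prony coefficients for an asphalt mixture (not from the paper)
e_inf = 50.0                                  # MPa, long-term equilibrium modulus
e_i = np.array([12000.0, 6000.0, 1500.0])     # MPa, branch moduli
tau_i = np.array([0.01, 1.0, 100.0])          # s, relaxation times
t = np.array([0.0, 0.1, 1.0, 10.0, 100.0])    # s, evaluation times
print(relaxation_modulus(t, e_inf, e_i, tau_i))
```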