176 results for Multinational Experimentation
Abstract:
Climate simulations by 16 atmospheric general circulation models (AGCMs) are compared on an aqua-planet, a water-covered Earth with prescribed sea surface temperature varying only in latitude. The idealised configuration is designed to expose differences in the circulation simulated by different models. Basic features of the aqua-planet climate are characterised by comparison with Earth. The models display a wide range of behaviour. The balanced component of the tropospheric mean flow, and mid-latitude eddy covariances subject to budget constraints, vary relatively little among the models. In contrast, differences in damping in the dynamical core strongly influence transient eddy amplitudes. Historical uncertainty in modelled lower stratospheric temperatures persists in APE. Aspects of the circulation generated more directly by interactions between the resolved fluid dynamics and parameterized moist processes vary greatly. The tropical Hadley circulation forms either a single or double inter-tropical convergence zone (ITCZ) at the equator, with large variations in mean precipitation. The equatorial wave spectrum shows a wide range of precipitation intensity and propagation characteristics. Kelvin mode-like eastward propagation with remarkably constant phase speed dominates in most models. Westward propagation, less dispersive than the equatorial Rossby modes, dominates in a few models or occurs within an eastward propagating envelope in others. The mean structure of the ITCZ is related to precipitation variability, consistent with previous studies. The aqua-planet global energy balance is unknown but the models produce a surprisingly large range of top of atmosphere global net flux, dominated by differences in shortwave reflection by clouds. A number of newly developed models, not optimised for Earth climate, contribute to this. Possible reasons for differences in the optimised models are discussed. The aqua-planet configuration is intended as one component of an experimental hierarchy used to evaluate AGCMs. This comparison does suggest that the range of model behaviour could be better understood and reduced in conjunction with Earth climate simulations. Controlled experimentation is required to explore individual model behaviour and investigate convergence of the aqua-planet climate with increasing resolution.
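As a concrete illustration of the idealised boundary condition described above (SST varying only in latitude), the following minimal Python sketch evaluates the zonally symmetric CONTROL profile of Neale and Hoskins (2000), on which the aqua-planet control experiment is based; the function name and the sample grid are illustrative, not taken from the paper.

```python
import numpy as np

def control_sst(lat_deg):
    """APE CONTROL sea surface temperature (deg C).

    Zonally symmetric profile of Neale and Hoskins (2000):
    27 C at the equator, falling to 0 C poleward of 60 degrees.
    """
    phi = np.deg2rad(np.asarray(lat_deg, dtype=float))
    sst = 27.0 * (1.0 - np.sin(1.5 * phi) ** 2)
    return np.where(np.abs(phi) < np.pi / 3.0, sst, 0.0)

lats = np.array([-90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0])
print(control_sst(lats))   # approx. [0, 0, 13.5, 27, 13.5, 0, 0]
```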
Abstract:
I argue that the initial set of firm-specific assets (FSAs) acts as an envelope for the early stages of internationalization of multinational enterprises (MNEs) of whatever nationality, and that there is a threshold level of FSAs that a firm must possess for such international expansion to be successful. I also argue that the initial FSAs of an MNE tend to be constrained by the location-specific (L) assets of the home country. However, beyond different initial conditions, there are few obvious reasons to insist that infant developing-country MNEs are of a unique character compared with advanced-economy MNEs, and I predict that as they evolve, the observable differences between the two groups will diminish. Successful firms will increasingly explore internationalization, but there is also no reason to believe that this is likely to happen disproportionately from the developing countries.
Abstract:
This research aims to extend our understanding of the duality between global integration and local responsiveness in multinational corporations (MNCs) by exploring the perceptions of corporate HR actors regarding the intra-organisational factors that alter the balance between these pressures. It examines the perceptions and actions of key actors in the context of two Korean MNCs. The study shows the importance attributed by corporate actors to a range of socio-procedural factors which, therefore, inform the practical management of the dual forces, notably: HR expertise, social ties, trustworthy relationships and co-involvement in decision processes.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what resolution, both horizontal and vertical, atmospheric and ocean models require for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
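To give a sense of the scale behind the recommended petaflop-to-exaflop trajectory, here is a rough, purely illustrative back-of-envelope calculation; the grid spacings, level counts and timestep scaling are assumptions for illustration, not figures taken from the paper.

```python
# Illustrative cost of refining a global atmosphere model from a
# 25-km, 50-level grid to a 1-km, 100-level grid.
earth_surface_km2 = 5.1e8                        # ~ Earth's surface area

cells_25km = earth_surface_km2 / 25.0**2 * 50    # ~4e7 grid cells
cells_1km = earth_surface_km2 / 1.0**2 * 100     # ~5e10 grid cells

timestep_factor = 25                             # CFL: dt shrinks with dx
print(cells_1km / cells_25km * timestep_factor)  # ~3e4x more expensive
```

A cost increase of roughly four orders of magnitude of this kind is why routine kilometre-scale climate simulation points toward exaflop-class machines.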
Abstract:
In 'Avalanche', an object is lowered, players staying in contact throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. For more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations; Prisoners' Dilemma and integrated bargaining situations. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
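To make the loop structure concrete, here is a deliberately minimal toy simulation of the one-degree-of-freedom case in Python. It is not the paper's model: the two balancing loops per player (lowering versus contact-keeping), the gains and the noise level are all invented for illustration, but it reproduces the qualitative result that small groups can lower the object while larger groups drift upwards.

```python
import numpy as np

rng = np.random.default_rng(0)

def avalanche(n_players, steps=400, dt=0.05,
              k_contact=4.0, lower_rate=0.4, noise=0.02):
    """Toy one-degree-of-freedom 'Avalanche' model.

    Each player has two balancing loops: one lowers the finger at
    lower_rate, the other closes any gap to the object with gain
    k_contact. The object rests on the highest finger, so motor
    noise lets the contact-keeping loops dominate in large groups.
    """
    fingers = np.zeros(n_players)   # finger heights
    obj = 0.0                       # object height
    for _ in range(steps):
        gap = np.maximum(0.0, obj - fingers)    # loss of contact
        dfdt = k_contact * gap - lower_rate     # two competing loops
        fingers += dt * dfdt + noise * rng.standard_normal(n_players)
        obj = fingers.max()         # object rides on the highest finger
    return obj

for n in (2, 6, 12):
    print(n, round(avalanche(n), 2))  # small groups descend, large rise
```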
Abstract:
An interdisciplinary theoretical framework is proposed for analysing justice in global working conditions. In addition to gender and race as popular criteria to identify disadvantaged groups in organizations, in multinational corporations (MNCs) local employees (i.e. host country nationals (HCNs) working in foreign subsidiaries) deserve special attention. Their working conditions are often substantially worse than those of expatriates (i.e. parent country nationals temporarily assigned to a foreign subsidiary). Although a number of reasons have been put forward to justify such inequalities—usually with efficiency goals in mind—recent studies have used equity theory to question the extent to which they are perceived as fair by HCNs. However, since perceptual equity theory has limitations, this study develops an alternative, non-perceptual framework for analysing such inequalities. Employment discrimination theory and elements of Rawls's 'Theory of Justice' are the theoretical pillars of this framework. This article discusses the advantages of this approach for MNCs and identifies some expatriation practices that are fair according to our non-perceptual justice standards, whilst also being reasonably (if not highly) efficient.
Abstract:
This Atlas presents statistical analyses of the simulations submitted to the Aqua-Planet Experiment (APE) data archive. The simulations are from global Atmospheric General Circulation Models (AGCMs) applied to a water-covered earth. The AGCMs include ones actively used or being developed for numerical weather prediction or climate research. Some are mature application models and others are more novel and thus less well tested in Earth-like applications. The experiment applies AGCMs with their complete parameterization packages to an idealization of the planet Earth which has a greatly simplified lower boundary consisting of an ocean only. It has no land, and hence no orography, and no sea ice. The ocean is represented by Sea Surface Temperatures (SSTs) which are specified everywhere with simple, idealized distributions. Thus in the hierarchy of tests available for AGCMs, APE falls between tests with simplified forcings, such as those proposed by Held and Suarez (1994) and Boer and Denis (1997), and the Earth-like simulations of the Atmospheric Model Intercomparison Project (AMIP, Gates et al., 1999). Blackburn and Hoskins (2013) summarize the APE and its aims. They discuss where the APE fits within a modeling hierarchy which has evolved to evaluate complete models and which provides a link between realistic simulation and conceptual models of atmospheric phenomena. The APE bridges a gap in the existing hierarchy. The goals of APE are to provide a benchmark of current model behaviors and to stimulate research to understand the causes of inter-model differences. APE is sponsored by the joint World Meteorological Organization (WMO) Commission on Atmospheric Science (CAS) and World Climate Research Program (WCRP) Working Group on Numerical Experimentation (WGNE). Chapter 2 of this Atlas provides an overview of the specification of the eight APE experiments and of the data collected. Chapter 3 lists the participating models and includes brief descriptions of each. Chapters 4 through 7 present a wide variety of statistics from the 14 participating models for the eight different experiments. Additional intercomparison figures created by Dr. Yukiko Yamada of the AGU group are available at http://www.gfd-dennou.org/library/ape/comparison/. This Atlas is intended to present and compare the statistics of the APE simulations but does not contain a discussion of interpretive analyses. Such analyses are left for journal papers such as those included in the Special Issue of the Journal of the Meteorological Society of Japan (2013, Vol. 91A) devoted to the APE. Two papers in that collection provide an overview of the simulations. One (Blackburn et al., 2013) concentrates on the CONTROL simulation and the other (Williamson et al., 2013) on the response to changes in the meridional SST profile. Additional papers provide more detailed analysis of the basic simulations, while others describe various sensitivities and applications. The APE database holds a wealth of data that is now publicly available from the APE web site: http://climate.ncas.ac.uk/ape/. We hope that this Atlas will stimulate future analyses and investigations to understand the large variation seen in the model behaviors.
Abstract:
Accelerated climate change affects components of complex biological interactions differentially, often causing changes that are difficult to predict. Crop yield and quality are affected by climate change directly, and indirectly through diseases that themselves will change but remain important. These effects are difficult to dissect and model as their mechanistic bases are generally poorly understood. Nevertheless, a combination of integrated modelling from different disciplines and multi-factorial experimentation will advance our understanding and prioritisation of the challenges. Food security brings in additional socio-economic, geographical and political factors. Enhancing resilience to the effects of climate change is important for all these systems, and functional diversity is one of the most effective targets for improved sustainability.
Abstract:
Purpose – The purpose of this paper is to demonstrate key strategic decisions involved in turning around a large multinational operating in a dynamic market. Design/methodology/approach – The paper is based on analysis of archival documents and a semi-structured interview with the chairman of the company credited with its rescue. Findings – Turnaround is complex and involves both planned and emergent strategies. Progress is non-linear, requiring adjustment and changes in direction of travel. Top management credibility and vision are critical to success. Rescue is only possible if the company has a strong cash-generative business among its businesses. The speed of decision making, decisiveness and the ability to implement strategy are among the key ingredients of success. Originality/value – Turnaround is an under-researched area in strategy. This paper contributes to a better understanding of this important area and bridges the gap between theory and practice. It provides a practical view and demonstrates how a leading executive with significant expertise and a successful turnaround track record deals with the inherent dilemmas of turnaround.
Abstract:
We derive energy-norm a posteriori error bounds, using gradient recovery (ZZ) estimators to control the spatial error, for fully discrete schemes for the linear heat equation. This appears to be the first completely rigorous derivation of ZZ estimators for fully discrete schemes for evolution problems, without any restrictive assumption on the timestep size. An essential tool for the analysis is the elliptic reconstruction technique. Our theoretical results are backed by extensive numerical experimentation aimed at (a) testing the practical sharpness and asymptotic behaviour of the error estimator against the error, and (b) deriving an adaptive method based on our estimators. An extra novelty provided is an implementation of a coarsening error "preindicator", with a complete implementation guide in ALBERTA in the appendix.
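For readers unfamiliar with gradient recovery, the sketch below illustrates the Zienkiewicz-Zhu (ZZ) idea in its simplest setting: piecewise-linear elements in one dimension, where the recovered gradient is the nodal average of the neighbouring element gradients. This is only the spatial building block; the paper's fully discrete estimator additionally carries time-discretisation terms obtained via elliptic reconstruction.

```python
import numpy as np

def zz_indicators(x, u):
    """Element-wise ZZ (gradient recovery) error indicators for a
    P1 finite element function in 1D.

    x : sorted node coordinates, u : nodal values. Returns, per
    element, the L2 norm of (recovered gradient - raw gradient).
    """
    h = np.diff(x)                 # element sizes
    g = np.diff(u) / h             # piecewise-constant P0 gradient
    # Recover a continuous nodal gradient by averaging neighbours.
    gn = np.empty(len(x))
    gn[0], gn[-1] = g[0], g[-1]
    gn[1:-1] = 0.5 * (g[:-1] + g[1:])
    # On each element the recovered gradient varies linearly from
    # gn[i] to gn[i+1]; integrate (linear - constant)^2 exactly.
    a, b = gn[:-1] - g, gn[1:] - g
    return np.sqrt(h * (a * a + a * b + b * b) / 3.0)

x = np.linspace(0.0, 1.0, 11)
print(zz_indicators(x, np.sin(np.pi * x)).round(4))
```

In an adaptive loop, indicators of this kind would drive element-by-element refinement; the paper's contribution is proving that such estimators remain rigorous for fully discrete evolution schemes without timestep restrictions.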
Abstract:
What are the microfoundations of dynamic capabilities that sustain competitive advantage in a highly volatile environment, such as a transition economy? We explore the detailed nature of these dynamic capabilities along with their antecedents by tracing the sequence of their development based on a longitudinal case study of an organization subject to an external context of radical transition — the Russian oil company, Yukos. Our rich qualitative data indicate two distinct types of dynamic capabilities that are pivotal for organizational transformation. Adaptation dynamic capabilities relate to routines of resource exploitation and deployment, which are supported by acquisition, internalization and dissemination of extant knowledge, as well as resource reconfiguration, divestment and integration. Innovation dynamic capabilities relate to the creation of completely new capabilities via exploration and path-creation processes, which are supported by search, experimentation and risk taking, as well as project selection, funding and implementation. We further find that sequencing the two types of dynamic capabilities helped the organization both to secure short-term competitive advantage and to create the basis for long-term competitive advantage. These dynamic capability constructs advance theoretical understanding of what dynamic capabilities are, whilst their sequencing explains how firms create, leverage and enhance them over time.
Abstract:
Practical realisation of Cyborgs opens up significant new opportunities in many fields. In particular, when it comes to space travel, many of the limitations faced by humans in stand-alone form are transcended by the adoption of a cyborg persona. In this article a look is taken at different types of Brain-Computer interface which can be employed to realise Cyborgs, biology-technology hybrids. The approach taken is a practical one with applications in mind, although some wider implications are also considered. In particular, results from experiments are discussed in terms of their meaning and application possibilities. The article is written from the perspective of scientific experimentation opening up realistic possibilities to be faced in the future, rather than giving conclusive comments on the technologies employed. Human implantation and the merger of biology and technology are, though, important elements.