892 results for ENTERPRISE STATISTICS
Abstract:
The effective and efficient management of diversified business firms that supply multiple products and operate in multiple, dynamic markets, especially large multinational enterprises (MNEs), builds upon a number of specific governance principles. These governance principles allow the alignment of environmental characteristics, strategy and organization. Given the rising need to “learn from the world”, Doz et al., in their influential Harvard Business School Press book entitled From Global to Metanational, have proposed a new set of governance principles described under the “metanational” umbrella concept. This paper revisits the metanational, using a comparative institutional perspective; here we contrast multidivisional and metanational governance principles. A comparative institutional analysis suggests that the metanational's application potential in terms of actually improving the effectiveness and efficiency of MNE governance may be subject to more qualification than suggested by Doz et al. Senior MNE management must therefore reflect carefully before substituting metanational governance principles for the more conventional, multidivisional ones with established contributions to managerial effectiveness and efficiency.
Abstract:
This paper extends the resource-based view (RBV) of the firm, as applied to multinational enterprises (MNEs), by distinguishing between two critical resource dimensions, namely relative resource superiority (capabilities) and slack. Both dimensions, in concert with specific environmental conditions, are required to increase entrepreneurial activities. We propose distinct configurations (three-way moderation effects) of capabilities, slack, and environmental factors (i.e. dynamism and hostility) to explain entrepreneurship. Using survey data from 66 Canadian subsidiaries operating in China, we find that higher subsidiary entrepreneurship requires both HR slack and strong downstream capabilities in subsidiaries, subject to the industry environment being dynamic and benign. However, high HR slack alone, in a dynamic and benign environment, but without the presence of strong capabilities, actually triggers the fewest initiatives, with HR slack redirected from entrepreneurial experimentation towards complacency and inefficiency. This paper has major implications for MNEs seeking to increase subsidiary entrepreneurship in fast growing emerging markets.
Abstract:
This note adds caveats to the standard statistics that accompany chess endgame tables (EGTs). It refers to Nalimov's double-counting of pawnless positions with both kings on a long diagonal, and to the inclusion of positions that are not reachable from the initial position.
Abstract:
Mergers of Higher Education Institutions (HEIs) are organisational processes requiring a tremendous amount of resources in terms of time, work, and money. A number of mergers have taken place in recent years, and more are to come. Several studies on mergers have been conducted, revealing crucial factors that affect their success. Based on a literature review of these studies, the factors are: the initiator of the merger, the reason for the merger, the geographical distance between the merging institutions, organisational culture, the extent of overlap in course portfolios, and Quality Assurance Systems (QASs). Usually these kinds of factors are not considered in mergers; the focus is instead on financial matters. In this paper, a framework (HMEF) for evaluating mergers of HEIs is introduced. HMEF is based on Enterprise Architecture (EA) and focuses on the factors found to affect the success of mergers. By using HMEF, HEIs can concentrate on the matters that are crucial for merging.
Abstract:
The initial phase of any Enterprise Architecture (EA) initiative is important. One of the most crucial tasks in that phase is to sell EA to top management by explaining its purpose. In this paper, using a semiotic framework, we show that there is a clear gap between the definition of EA and its purpose. The contribution of this paper is a taxonomy that expands knowledge of the pragmatics of EA and can be used as a tool for explaining the purpose of EA. Grounded theory is used to form the taxonomy, with data collected from a discussion group used by EA practitioners. Results indicate that the purpose of EA is to meet the organisation's stakeholders' goals and to create value for the organisation. These results are in line with the current literature. The most interesting result is that EA practitioners seem to realise that technical solutions are not the purpose of EA, but a means of fulfilling it.
Abstract:
Interest in Enterprise Architecture (EA) has increased during the last few years. EA has been found to be a crucial aspect of business survival, which makes the success of EA implementation equally crucial. The current literature offers no tool for measuring the success of EA implementation. In this paper, a tentative model for measuring success is presented and empirically validated in the EA context. Results show that the success of EA implementation can be measured indirectly, by measuring the achievement of the objectives set for the implementation. Results also imply that achieving individuals' objectives does not necessarily mean that the organisation's objectives are achieved. The presented Success Measurement Model can be used as a basis for developing measurement metrics.
Abstract:
The amount of published Enterprise Architecture (EA) research has increased during the last few years. As a discipline, EA is still young and lacks a theoretical foundation. Recently, some research attempting to ground EA in theory has been published, including linkage to systems theory. Enterprise Architecture can be defined as: (i) a formal description of the current and future state(s) of an organisation, and (ii) a managed change between these states to meet the organisation's stakeholders' goals and to create value for the organisation. Based on this definition, this conceptual paper sheds light on the theoretical underpinnings of EA from three perspectives: EA as a communication medium, EA as an activity, and EA as an information technology system. Our conclusions are that: (i) EA can be categorised as a communication medium and theoretically underpinned by ontology and semiotics, (ii) EA can be explained and theoretically underpinned by Activity Theory, and (iii) EA can be categorised as an information technology system and theoretically underpinned by General Systems Theory and Technology Acceptance Theory.
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations, yet the method has two problems: an individual may guess the correct answer at any concentration step, and may detect correctly at low concentrations but become adapted or fatigued at higher ones. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration; the adjustment reduces the probability that a run of consecutive correct answers is due to chance. Adjusted sequences are then submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold study and a taste threshold study, and it produced group thresholds similar to those from ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were also detected. The approach is more robust than previous estimation methods.
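To make the survival step concrete, here is a minimal sketch with hypothetical panellist data, using a hand-rolled Kaplan-Meier estimator. The sequence-adjustment step is simplified to "first concentration step from which answers stay correct", and the concentrations and responses are invented, not the paper's data:

```python
import numpy as np

# Hypothetical data: for each panellist, the index of the first concentration
# step from which all answers were correct (the "event"), or None if the
# panellist never detected reliably (right-censored at the top step).
concentrations = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])  # mg/L, assumed
first_correct_step = [2, 3, 1, None, 4, 2, 3, None, 1, 2]   # assumed

durations = np.array([s if s is not None else len(concentrations) - 1
                      for s in first_correct_step])
observed = np.array([s is not None for s in first_correct_step])

# Kaplan-Meier estimate of S(step) = P(not yet detecting at this step);
# the group threshold is taken where survival first drops to 0.5 or below.
survival = 1.0
group_threshold = None
for step in range(len(concentrations)):
    at_risk = np.sum(durations >= step)
    events = np.sum((durations == step) & observed)
    if at_risk > 0:
        survival *= 1.0 - events / at_risk
    if group_threshold is None and survival <= 0.5:
        group_threshold = concentrations[step]  # median detection threshold

print(f"Group threshold (median): {group_threshold} mg/L")
```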
Abstract:
For certain observation types, such as those that are remotely sensed, the observation errors are correlated, and these correlations are state- and time-dependent. In this work, we develop a method for diagnosing and incorporating spatially correlated, time-dependent observation error in an ensemble data assimilation system. The method combines an ensemble transform Kalman filter with a technique that uses statistical averages of background and analysis innovations to estimate the observation error covariance matrix. To evaluate its performance, we perform identical twin experiments using the Lorenz '96 and Kuramoto-Sivashinsky models. Using our approach, a good approximation to the true observation error covariance can be recovered in cases where the initial estimate of the error covariance is incorrect. Spatial observation error covariances whose true length scale changes slowly in time can also be captured. We find that using the estimated correlated observation error in the assimilation improves the analysis.
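The covariance estimate described above can be sketched with innovation statistics in the style of Desroziers-type diagnostics, where the cross-statistics of background innovations d_b = y - H(x_b) and analysis innovations d_a = y - H(x_a) recover R. The toy example below fabricates innovations with a known correlated R; all shapes, names, and the synthetic setup are assumptions, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_samples = 8, 5000  # observations per cycle, assimilation cycles (assumed)

# Synthetic "true" observation error covariance with spatial correlation.
dist = np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
R_true = 0.5 * np.exp(-dist / 2.0)

# Fake stored innovations so that E[d_a d_b^T] ~= R_true, as the
# diagnostic assumes in the idealised case.
L = np.linalg.cholesky(R_true)
eps = L @ rng.standard_normal((p, n_samples))          # obs error draws
d_b = eps + 0.8 * rng.standard_normal((p, n_samples))  # plus background error
d_a = eps  # idealised: analysis error uncorrelated with obs error

# Diagnosed observation error covariance: statistical average over cycles.
R_est = (d_a @ d_b.T) / n_samples
R_est = 0.5 * (R_est + R_est.T)  # symmetrise the sample estimate

print(np.round(R_true[:3, :3], 2))
print(np.round(R_est[:3, :3], 2))
```

In an assimilation system, R_est (or a regularised version of it) would then replace the assumed observation error covariance in the ensemble transform Kalman filter update.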
Abstract:
Technology obsolescence in information-intensive businesses (software and hardware no longer being supported and being replaced by improved, different solutions), combined with a cost-constrained market, can severely increase costs, operational risk, and ultimately reputational risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost, structured method to identify obsolete software and the risk posed by its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk that technology obsolescence poses to the business is identified, enabling remedial action based on qualified obsolescence information. The technique rests on a structured modelling approach using enterprise architecture models and a heatmap algorithm that highlights high-risk obsolescent elements. The method has been tested and applied in practice in two consulting studies carried out by Capgemini involving three UK police forces; however, the generic technique could be applied to any industry, and there are plans to improve it using ontology framework methods. This paper includes details of the enterprise architecture meta-models and related modelling.
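As a rough illustration of the heatmap idea, the sketch below scores hypothetical architecture elements by combining time-to-end-of-support with business criticality and buckets them into bands. The elements, weighting, and thresholds are invented for illustration and are not the Capgemini method:

```python
from datetime import date

# Hypothetical EA model elements: (name, end-of-support date, criticality 1-5).
elements = [
    ("Case management system", date(2025, 6, 30), 5),
    ("Duty rostering tool",    date(2028, 1, 31), 3),
    ("Legacy records archive", date(2024, 3, 31), 4),
]

def risk_score(end_of_support, criticality, today=date(2024, 1, 1)):
    """Higher score = more urgent. Assumed weighting, not the paper's formula."""
    years_left = max((end_of_support - today).days / 365.25, 0.0)
    urgency = 1.0 / (1.0 + years_left)  # approaches 1 as support lapses
    return urgency * criticality

def heat_band(score):
    # Illustrative band cut-offs for the heatmap colouring.
    return "RED" if score >= 2.0 else "AMBER" if score >= 1.0 else "GREEN"

for name, eos, crit in sorted(elements, key=lambda e: -risk_score(e[1], e[2])):
    s = risk_score(eos, crit)
    print(f"{heat_band(s):5s}  {s:4.2f}  {name}")
```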
Abstract:
With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events, such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33 year reanalysis data set (MERRA, from NASA-GMAO) is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33 year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data, as well as the underlying model, is freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
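As a rough sketch of the final modelling step, the following converts an hourly wind speed series into a capacity factor via an idealised power curve and counts prolonged low-generation events. The synthetic wind, power-curve parameters, and event thresholds are all assumptions; the study's actual MERRA-based model is available at the URL above:

```python
import numpy as np

rng = np.random.default_rng(1)
hours = 33 * 365 * 24  # a 33 year hourly record, matching the study length

# Synthetic i.i.d. Weibull wind speeds (m/s); real reanalysis winds have
# persistence, so these counts illustrate the method only, not GB climatology.
wind = 8.0 * rng.weibull(2.0, size=hours)

def capacity_factor(v, cut_in=3.0, rated=12.0, cut_out=25.0):
    """Idealised aggregate power curve; parameters are assumptions."""
    cf = np.clip((v - cut_in) / (rated - cut_in), 0.0, 1.0) ** 3
    cf[(v < cut_in) | (v > cut_out)] = 0.0
    return cf

cf = capacity_factor(wind)

# Count prolonged low-generation events: capacity factor below 10% for at
# least 12 consecutive hours (both thresholds are illustrative choices).
low = cf < 0.10
events, run = 0, 0
for flag in low:
    run = run + 1 if flag else 0
    if run == 12:  # count each event once, at the hour it reaches 12 h
        events += 1

print(f"Events: {events} in 33 years ({events / 33:.1f} per year)")
```

Comparing the empirical distribution of yearly event counts against a Poisson fit would be the natural next step, mirroring insight (i) above.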
Abstract:
Using Triad-based multinational enterprises as their empirical setting, influential scholars in international management uncovered key organizational characteristics needed to create globally integrated and locally responsive multinationals. They proposed a “modern” theory of multinationals' organization (Hedlund, 1994). Recently, however, a new generation of multinationals from emerging markets has appeared. Little is known about their organizational choices, and some scholars even doubt that they leverage organizational capabilities at all. Does the “modern” theory still hold in their case? This exploratory study of three emerging-market multinationals (EMNEs) discloses that, for reasons related to their origin in emerging economies and to the competitive specificities of these economies, EMNEs approach the global and local conundrum in ways that are both similar to and vastly different from the recommendations of the “modern” theory. We inductively develop a new theory that accounts for the evolution of organizational capabilities in EMNEs to reconcile global integration and local responsiveness. We discuss its implications for the executives of both emerging-market and Triad-based multinationals.