893 results for ENTERPRISE STATISTICS


Relevance: 20.00%

Abstract:

The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost, structured method for identifying obsolete software and the risk of its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified so that remedial action can be taken using qualified obsolescence information. The technique is based on a structured modelling approach that uses enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in two consulting studies carried out by Capgemini involving three UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of the enterprise architecture meta-models and related modelling.
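
The abstract does not give the scoring details, but the general idea of a heatmap over enterprise architecture elements can be illustrated with a minimal sketch. The element fields (support end date, business criticality), the 36-month horizon and the red/amber/green thresholds below are illustrative assumptions, not the method described in the paper.

```python
# Minimal sketch of a heatmap-style obsolescence scoring pass over EA elements.
# Fields, weights and thresholds are illustrative assumptions only.
from dataclasses import dataclass
from datetime import date

@dataclass
class EAElement:
    name: str
    support_end: date           # vendor end-of-support date
    business_criticality: int   # 1 (low) .. 5 (high), taken from the business model

def obsolescence_score(elem: EAElement, today: date) -> float:
    """Higher score = higher obsolescence risk (0..1)."""
    months_left = (elem.support_end.year - today.year) * 12 + (elem.support_end.month - today.month)
    # Risk grows as support runs out; weight by how critical the element is to the business.
    time_risk = min(1.0, max(0.0, 1.0 - months_left / 36.0))
    return time_risk * elem.business_criticality / 5.0

def heatmap_band(score: float) -> str:
    return "red" if score >= 0.66 else "amber" if score >= 0.33 else "green"

today = date(2024, 1, 1)        # assessment date (illustrative)
portfolio = [
    EAElement("Case management system", date(2025, 6, 30), 5),
    EAElement("Intranet CMS", date(2028, 1, 1), 2),
]
for e in portfolio:
    s = obsolescence_score(e, today)
    print(f"{e.name}: {s:.2f} -> {heatmap_band(s)}")
```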

Relevance: 20.00%

Abstract:

With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events, such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33-year reanalysis data set (MERRA, from NASA-GMAO) is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resulting generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33-year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data, as well as the underlying model, is freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
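
As an illustration of how such an hourly generation series can be mined for extreme events, the sketch below counts prolonged low-generation runs and reports a mean yearly rate, which is the quantity a Poisson-like model of event occurrence would be fitted to. The 10% threshold, the 24-hour minimum duration and the synthetic stand-in series are assumptions, not the paper's definitions; the actual GB series is available from the URL above.

```python
# Sketch: count prolonged low-generation events in an hourly capacity-factor
# series. Threshold, duration and the stand-in data are illustrative only.
import numpy as np

def low_generation_events(cf, threshold=0.10, min_hours=24):
    """Return the lengths (hours) of runs where capacity factor stays below threshold."""
    events, run = [], 0
    for below in cf < threshold:
        if below:
            run += 1
        else:
            if run >= min_hours:
                events.append(run)
            run = 0
    if run >= min_hours:
        events.append(run)
    return events

# Stand-in for 33 years of hourly, autocorrelated capacity factors.
rng = np.random.default_rng(0)
noise = rng.normal(size=33 * 8760)
ar = np.zeros_like(noise)
for t in range(1, noise.size):
    ar[t] = 0.99 * ar[t - 1] + noise[t]      # persistence mimics synoptic time scales
cf = 1.0 / (1.0 + np.exp(-ar / 5.0))         # squash to (0, 1)

events = low_generation_events(cf)
print(f"{len(events)} events in 33 years, ~{len(events) / 33:.1f} per year")
```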

Relevance: 20.00%

Abstract:

Using Triad-based multinational enterprises as their empirical setting, influential scholars in international management uncovered key organizational characteristics needed to create globally integrated and locally responsive multinationals. They proposed a “modern” theory of multinationals' organization (Hedlund, 1994). Recently, however, a new generation of multinationals from emerging markets has appeared. Little is known about their organizational choices, and some scholars even doubt that they leverage organizational capabilities at all. Does the “modern” theory still hold in their case? This exploratory study of three emerging-market multinationals (EMNEs) discloses that, for reasons related to their origin in emerging economies and to the competitive specificities of these economies, EMNEs approach the global and local conundrum in ways that are both similar to and vastly different from the recommendations of the “modern” theory. We inductively develop a new theory that accounts for the evolution of organizational capabilities in EMNEs to reconcile global integration and local responsiveness. We discuss its implications for the executives of both emerging and Triad-based multinationals.

Relevance: 20.00%

Abstract:

During the last few years, Enterprise Architecture (EA) has received increasing attention in industry and academia. By adopting EA, organisations may gain a number of benefits, such as better decision making, increased revenues and cost reduction, and alignment of business and IT. However, EA adoption has been found to be difficult. In this paper, a model to explain resistance during the EA adoption process (REAP) is introduced and validated. The model reveals relationships between the strategic level of EA, the resulting organisational changes, and sources of resistance. By utilising the REAP model, organisations may anticipate and prepare for organisational change resistance during EA adoption.

Relevance: 20.00%

Abstract:

During the last few years, Enterprise Architecture has received increasing attention in industry and academia. Enterprise Architecture (EA) can be defined as (i) a formal description of the current and future state(s) of an organisation, and (ii) a managed change between these states to meet the organisation’s stakeholders’ goals and to create value for the organisation. By adopting EA, organisations may gain a number of benefits, such as better decision making, increased revenues and cost reductions, and alignment of business and IT. To increase the performance of public sector operations, and to improve public services and their availability, the Finnish Parliament ratified the Act on Information Management Governance in Public Administration in 2011. The Act mandates public sector organisations, including Higher Education Institutions (HEIs), to start adopting EA by 2014. Despite the benefits of EA and the Act, the EA adoption level and maturity in Finnish HEIs are low. This is partly because EA adoption has been found to be difficult, so there is a need for a solution that helps organisations adopt EA successfully. This thesis follows a Design Science (DS) approach to improve the traditional EA adoption method in order to increase the likelihood of successful adoption. First, a model is developed to explain change resistance during EA adoption. To find out which problems are associated with EA adoption, an EA pilot conducted in 2010 among 12 Finnish HEIs was analysed using the model. It was found that most of the problems were caused by misunderstood EA concepts, attitudes, and lack of skills, none of which the traditional EA adoption method pays attention to. To overcome these limitations, an improved EA Adoption Method (EAAM) is introduced. By following EAAM, organisations may increase the likelihood of successful EA adoption. EAAM helps in acquiring the mandate for EA adoption from top management, which has been found to be crucial to success. It also helps in supporting individual and organisational learning, which has also been found to be essential to successful adoption.

Relevance: 20.00%

Abstract:

This paper argues that the problems commonly associated with the joint enterprise doctrine might be alleviated by supplementing the cognitive mens rea standard of foresight with a volitional element that looks to how the defendant related to the foreseen risk. A re-examination of the case law suggests that a mens rea conception of foresight plus endorsement might be within interpretative reach. The paper considers possible objections to such a development but ultimately rejects them. It concludes that it is not necessary to wait for Parliament to put in place reforms: joint enterprise is a creature of the common law, and the common law is able to tame it unaided.

Relevance: 20.00%

Abstract:

To improve the quantity and impact of observations used in data assimilation, it is necessary to take into account the full, potentially correlated, observation error statistics. A number of methods for estimating correlated observation errors exist, but a popular method is a diagnostic that makes use of statistical averages of observation-minus-background and observation-minus-analysis residuals. The accuracy of the results it yields is unknown, as the diagnostic is sensitive to the difference between the exact background and observation error covariances and those chosen for use within the assimilation. It has often been stated in the literature that the results of this diagnostic are only valid when the background and observation error correlation length scales are well separated. Here we develop new theory relating to the diagnostic. For observations on a 1D periodic domain, we are able to show the effect of changes in the assumed error statistics used in the assimilation on the estimated observation error covariance matrix. We also provide bounds for the estimated observation error variance and the eigenvalues of the estimated observation error correlation matrix. We demonstrate that it is still possible to obtain useful results from the diagnostic when the background and observation error length scales are similar. In general, our results suggest that when correlated observation errors are treated as uncorrelated in the assimilation, the diagnostic will underestimate the correlation length scale. We support our theoretical results with simple illustrative examples. These results have potential use for interpreting covariances estimated using an operational system.
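
For context, a commonly used form of this residual-based diagnostic estimates the observation error covariance from the sample average of products of observation-minus-analysis and observation-minus-background residuals. The sketch below applies that form to synthetic residuals; whether this exact variant matches the one analysed in the paper is an assumption on my part, and the residual samples are invented.

```python
# Sketch of a residual-based observation-error diagnostic:
#   R_est ~ mean over samples of d_a d_b^T,
# where d_b = y - H(x_b) and d_a = y - H(x_a). Synthetic data only.
import numpy as np

def estimate_R(d_b: np.ndarray, d_a: np.ndarray) -> np.ndarray:
    """d_b, d_a: residual samples of shape (n_samples, n_obs); returns (n_obs, n_obs)."""
    d_b = d_b - d_b.mean(axis=0)        # remove sample means (bias)
    d_a = d_a - d_a.mean(axis=0)
    R = d_a.T @ d_b / d_b.shape[0]      # average outer product over samples
    return 0.5 * (R + R.T)              # symmetrise; the raw estimate need not be symmetric

def correlation_matrix(R: np.ndarray) -> np.ndarray:
    sd = np.sqrt(np.diag(R))
    return R / np.outer(sd, sd)

# Synthetic example: 5 observations, 10,000 residual samples.
rng = np.random.default_rng(1)
d_b = rng.normal(size=(10_000, 5))
d_a = 0.5 * d_b + 0.1 * rng.normal(size=(10_000, 5))
print(np.round(correlation_matrix(estimate_R(d_b, d_a)), 2))
```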

Relevance: 20.00%

Abstract:

This paper empirically tests the effectiveness of information and communications technology (ICT) knowledge transfer and adoption in the multinational enterprise (MNE) as an issue of critical importance to contemporary MNE functioning. In contrast to mainstream thinking on absorptive capacity, but in line with prevailing international business theory, our research supports the proposition that perceptions of procedural justice, rather than absorptive capacity, determine effectiveness, especially in cases of highly tacit knowledge transfers. Data were collected from senior ICT representatives in 86 Canadian subsidiaries of foreign-owned MNEs. Each of these subsidiaries had recently experienced a significant ICT transfer imposed by the parent organization. Support was found for the main propositions: procedural justice significantly predicted successful ICT transfer and adoption, while absorptive capacity was not significant. These findings held even when knowledge tacitness was high. The perceived success of the ICT transfer, as well as its adoption, varied widely across these firms. The potential reasons for this divergence in effectiveness are manifold, but our findings suggest that in situations of substantial knowledge tacitness, a higher level of procedural justice, rather than a higher level of absorptive capacity, is critical to effective transfer and adoption.

Relevance: 20.00%

Abstract:

This paper introduces a pragmatic and practical method for requirements modeling. The method is built using concepts from our goal sketching technique together with techniques from an enterprise architecture modeling language. Our claim is that the method will help project managers who want to establish early control of their projects, and will also give managers confidence in the scope of their project. In particular, we propose the inclusion of assumptions as first-class entities in the ArchiMate enterprise architecture modeling language, and an extension of the ArchiMate Motivation Model principle to allow radical as well as normative analyses. We demonstrate the usefulness of this method using a simple university library system as an example.
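
A rough illustration of what "assumptions as first-class entities" could mean in practice: the sketch below represents a toy goal model as a small graph in which goals may rest on explicit assumption nodes, and flags goals that are supported only by unvalidated assumptions. The element names, fields and the validation check are hypothetical; they are not part of ArchiMate or of the paper's method, and the library example merely echoes the case study mentioned above.

```python
# Toy goal-sketch graph where assumptions are first-class nodes alongside goals.
# All names and the "validated" check are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    kind: str                               # "goal" or "assumption"
    validated: bool = True                  # only meaningful for assumptions
    supported_by: list = field(default_factory=list)

stock_api = Element("The library catalogue exposes a stock API", "assumption", validated=False)
borrow = Element("Members can borrow books online", "goal")
reserve = Element("Members can reserve unavailable books", "goal", supported_by=[stock_api])

def goals_at_risk(elements):
    """Goals whose support chain includes an unvalidated assumption."""
    return [e.name for e in elements
            if e.kind == "goal"
            and any(s.kind == "assumption" and not s.validated for s in e.supported_by)]

print(goals_at_risk([stock_api, borrow, reserve]))
```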

Relevance: 20.00%

Abstract:

Although the sunspot-number series have existed since the mid-19th century, they are still the subject of intense debate, with the largest uncertainty being related to the "calibration" of the visual acuity of individual observers in the past. Daisy-chain regression methods are applied to inter-calibrate the observers, which may lead to significant bias and error accumulation. Here we present a novel method to calibrate the visual acuity of the key observers to the reference data set of Royal Greenwich Observatory sunspot groups for the period 1900-1976, using the statistics of the active-day fraction. For each observer we independently evaluate their observational threshold [S_S], defined such that the observer is assumed to miss all groups with an area smaller than S_S and to report all groups larger than S_S. Next, using a Monte Carlo method, we construct from the reference data set a correction matrix for each observer. The correction matrices are significantly non-linear and cannot be approximated by a linear regression or proportionality; we emphasize that corrections based on a linear proportionality between annually averaged data lead to serious biases and distortions of the data. The correction matrices are applied to the original sunspot group records for each day, and finally the composite corrected series is produced for the period since 1748. The corrected series displays secular minima around 1800 (the Dalton minimum) and 1900 (the Gleissberg minimum), as well as the Modern grand maximum of activity in the second half of the 20th century. The uniqueness of the grand maximum is confirmed for the last 250 years. It is shown that the adoption of a linear relationship between the data of Wolf and Wolfer results in grossly inflated group numbers in the 18th and 19th centuries in some reconstructions.
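
To make the correction-matrix step concrete, the sketch below applies an invented correction matrix to a short daily record of reported group counts, drawing the "true" count for each day by Monte Carlo. The matrix values, the record and the interpretation of C[k, g] as the probability of g true groups given k reported groups are illustrative assumptions; in the paper the matrices are built from the RGO reference data separately for each observer.

```python
# Sketch: apply an observer's correction matrix to daily sunspot group counts.
# C[k, g] is read as P(true count = g | reported count = k); values are invented.
import numpy as np

rng = np.random.default_rng(2)

def correct_daily_counts(reported, C, n_draws=1000):
    """Monte-Carlo corrected counts: sample the true count for each day's report."""
    n_true = C.shape[1]
    draws = np.array([rng.choice(n_true, size=n_draws, p=C[k]) for k in reported])
    return draws.mean(axis=1), draws.std(axis=1)   # corrected mean and spread per day

# Toy 4x6 correction matrix (reported counts 0..3 -> true counts 0..5); rows sum to 1.
C = np.array([
    [0.70, 0.20, 0.07, 0.02, 0.01, 0.00],
    [0.05, 0.55, 0.25, 0.10, 0.04, 0.01],
    [0.01, 0.09, 0.50, 0.25, 0.10, 0.05],
    [0.00, 0.02, 0.08, 0.50, 0.25, 0.15],
])
reported = np.array([0, 1, 3, 2, 1])
mean, spread = correct_daily_counts(reported, C)
print(np.round(mean, 2), np.round(spread, 2))
```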

Relevance: 20.00%

Abstract:

With the development of convection-permitting numerical weather prediction, the efficient use of high resolution observations in data assimilation is becoming increasingly important. The operational assimilation of these observations, such as Doppler radar radial winds, is now common, though to avoid violating the assumption of uncorrelated observation errors the observation density is severely reduced. Improving the quantity of observations used, and the impact that they have on the forecast, will require the introduction of the full, potentially correlated, error statistics. In this work, observation error statistics are calculated for the Doppler radar radial winds that are assimilated into the Met Office high resolution UK model, using a diagnostic that makes use of statistical averages of observation-minus-background and observation-minus-analysis residuals. This is the first in-depth study using the diagnostic to estimate both horizontal and along-beam correlated observation errors. The results show that the Doppler radar radial wind error standard deviations are similar to those used operationally and increase with observation height. Surprisingly, the estimated observation error correlation length scales are longer than the operational thinning distance. They depend both on the height of the observation and on the distance of the observation from the radar. Further tests show that the long correlations cannot be attributed to the use of superobservations or to the background error covariance matrix used in the assimilation. The large horizontal correlation length scales are, however, in part a result of using a simplified observation operator.
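
As a rough illustration of how correlation length scales can be read off such residual-based estimates, the sketch below bins correlation estimates by observation separation so that a decay with distance becomes visible. The synthetic residuals, the binning choices and the implied length-scale definition (e.g. the separation at which correlation falls to 1/e) are assumptions, not the Met Office configuration described in the paper.

```python
# Sketch: bin residual-based correlation estimates by observation separation.
# Data, grid and bin widths are synthetic and illustrative only.
import numpy as np

def binned_correlation(d_b, d_a, positions, bin_edges):
    """d_b, d_a: (n_samples, n_obs) residuals; positions: (n_obs,) coordinates (km)."""
    cov = (d_a - d_a.mean(0)).T @ (d_b - d_b.mean(0)) / d_b.shape[0]
    cov = 0.5 * (cov + cov.T)
    corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))
    sep = np.abs(positions[:, None] - positions[None, :])
    idx = np.digitize(sep.ravel(), bin_edges)
    return np.array([corr.ravel()[idx == b].mean() for b in range(1, len(bin_edges))])

# Synthetic residuals with a built-in ~20 km correlation scale on a 2 km grid.
rng = np.random.default_rng(3)
x = np.arange(0, 100, 2.0)                               # observation positions (km)
cov_true = np.exp(-np.abs(x[:, None] - x[None, :]) / 20.0)
d_b = rng.multivariate_normal(np.zeros(x.size), cov_true, size=5000)
d_a = 0.6 * d_b + 0.2 * rng.normal(size=d_b.shape)
bins = np.arange(0, 60, 10.0)                            # 10 km separation bins
print(np.round(binned_correlation(d_b, d_a, x, bins), 2))
```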