93 results for ENTERPRISE STATISTICS
Abstract:
The probability of a quantum particle being detected in a given solid angle is determined by the S-matrix. The explanation of this fact in time-dependent scattering theory is often linked to the quantum flux, since the quantum flux integrated over a (detector) surface and over a time interval can be viewed as the probability that the particle crosses this surface within that time interval. In many-particle scattering, however, this argument is no longer valid, as each particle arrives at the detector at its own random time. While various treatments of this problem can be envisaged, here we present a straightforward Bohmian analysis of many-particle potential scattering from which the S-matrix probability emerges in the limit of large distances.
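For a single particle, the flux argument mentioned in this abstract can be stated as a formula. In the conventional notation of scattering theory (these symbols are standard, not taken from the paper), the probability of crossing a detector surface Σ during the interval [t₁, t₂] is the integrated quantum flux:

```latex
P_{\Sigma}(t_1, t_2) \;=\; \int_{t_1}^{t_2}\!\!\int_{\Sigma} \mathbf{j}(\mathbf{x},t)\cdot \mathrm{d}\boldsymbol{\sigma}\,\mathrm{d}t,
\qquad
\mathbf{j} \;=\; \frac{\hbar}{m}\,\operatorname{Im}\!\bigl(\psi^{*}\,\nabla\psi\bigr),
```

where ψ is the one-particle wave function and **j** the associated probability current. The abstract's point is that this identification breaks down once several particles reach the detector at distinct random times.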
Abstract:
The doctrine of joint criminal enterprise is in disarray. Despite repeated judicial scrutiny at the highest level, the doctrine's scope, proper doctrinal basis and function in relation to other modes of complicity remain uncertain. This article examines the doctrine's elements and underlying principles. It argues that while joint criminal enterprise is largely used to make individuals liable for offences committed by their associates in excess of the common criminal purpose, its proper function is to police the limits of associate liability and thus to exculpate rather than inculpate. The doctrine governs not only instances of accessorial liability; it also applies where the parties involved are joint principal offenders. As this puts into question the prevalent view that joint criminal enterprise is a form of secondary participation that results in accessorial liability, the article concludes that it is best seen as a doctrine sui generis.
Abstract:
This article investigates the nature of enterprise pedagogy in music. It presents the results of a research project that applied the practices of enterprise learning developed in the post-compulsory music curriculum in England to the teaching of the National Curriculum for music for 11-to-14-year-olds. In doing so, the article explores the nature of enterprise learning and the nature of pedagogy, in order to consider whether enterprise pedagogy offers an effective way to teach the National Curriculum. Enterprise pedagogy was found to have a positive effect on the motivation of students and on the potential to match learning to the needs of students of different abilities. Crucially, it was found that, to be effective, not only did the teacher's practice need to be congruent with the beliefs and theories on which it rests, but the students also needed to share in these underlying assumptions through their learning. The study has implications for the way in which teachers work with multiple pedagogies in the process of developing their pedagogical identity.
Abstract:
Enterprise Resource Planning is often endorsed as a means of gaining strategic advantage for businesses. Some businesses maintain their position through the scarcity of the resources they control. However, the ubiquitous trend towards the adoption of Enterprise Resource Planning systems, coupled with market saturation, makes the promise of advantage less compelling. This paper reports a proposed solution, based upon semiotic theory, that takes a typical Enterprise Resource Planning deployment scenario and shapes it according to the needs of people in post-implementation contexts, in order to leverage strategic advantage in different ways.
Conditioning model output statistics of regional climate model precipitation on circulation patterns
Abstract:
Dynamical downscaling of Global Climate Models (GCMs) through regional climate models (RCMs) potentially improves the usability of the output for hydrological impact studies. However, further downscaling or interpolation of precipitation from RCMs is often needed to match the precipitation characteristics at the local scale. This study analysed three Model Output Statistics (MOS) techniques for adjusting RCM precipitation: (1) a simple direct method (DM), (2) quantile-quantile mapping (QM) and (3) a distribution-based scaling (DBS) approach. The modelled precipitation consisted of daily means from 16 RCMs driven by ERA40 reanalysis data over the period 1961–2000, provided by the ENSEMBLES (ENSEMBLE-based Predictions of Climate Changes and their Impacts) project, for a small catchment located in the Midlands, UK. All methods were conditioned on the entire time series, on separate months, and on an objective classification of Lamb's weather types. The performance of the MOS techniques was assessed with regard to temporal and spatial characteristics of the precipitation fields, as well as runoff modelled with the HBV rainfall-runoff model. The results indicate that DBS conditioned on circulation patterns performed better than the other methods; however, an ensemble approach, in terms of both climate models and downscaling methods, is recommended to account for uncertainties in the MOS methods.
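Of the three MOS techniques, quantile-quantile mapping is the simplest to illustrate. The sketch below is a minimal empirical version (assuming numpy; the arrays and the bias factor are illustrative, not the study's data): each new model value is mapped to the observed value at the same empirical quantile of the calibration period.

```python
import numpy as np

def quantile_map(model_cal, obs_cal, model_new):
    """Empirical quantile-quantile mapping.

    model_cal, obs_cal: calibration-period samples of modelled and
    observed precipitation; model_new: values to bias-adjust.
    Each new value is replaced by the observed quantile matching its
    empirical non-exceedance probability in the model distribution.
    """
    model_cal = np.sort(np.asarray(model_cal, dtype=float))
    obs_cal = np.asarray(obs_cal, dtype=float)
    # empirical non-exceedance probability of each new model value
    q = np.searchsorted(model_cal, model_new, side="right") / len(model_cal)
    q = np.clip(q, 0.0, 1.0)
    # read off the observed quantile function at those probabilities
    return np.quantile(obs_cal, q)

# demo: a model with a dry bias of factor 2 relative to observations
corrected = quantile_map(np.arange(100) * 0.5,   # biased model values
                         np.arange(100) * 1.0,   # observed values
                         np.array([10.0, 20.0]))
```

In this toy setup the mapping roughly doubles the model values, undoing the assumed bias; conditioning on months or weather types, as in the study, amounts to fitting one such mapping per stratum.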
Abstract:
Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data, and we then use these estimates as our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
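The two-stage procedure the abstract describes can be sketched on a toy model. The code below is a minimal illustration, not the paper's implementation: the model, prior range, pilot size and acceptance rate are all assumptions chosen for the example. A pilot regression of the parameter on the simulated data estimates the posterior mean, and that fitted regression is then used as the summary statistic in rejection ABC.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=20):
    # toy model: n i.i.d. N(theta, 1) observations
    return rng.normal(theta, 1.0, n)

# Stage 1: pilot simulations, then a linear regression of theta on the
# data; the fitted values estimate the posterior mean E[theta | x].
thetas = rng.uniform(-5, 5, 2000)
X = np.array([simulate(t) for t in thetas])
A = np.column_stack([np.ones(len(X)), X])          # intercept + raw data
beta, *_ = np.linalg.lstsq(A, thetas, rcond=None)
summary = lambda x: np.concatenate([[1.0], x]) @ beta

# Stage 2: rejection ABC using the fitted summary statistic.
x_obs = simulate(2.0)                              # "observed" data
s_obs = summary(x_obs)
cand = rng.uniform(-5, 5, 20000)
dist = np.array([abs(summary(simulate(t)) - s_obs) for t in cand])
keep = cand[dist <= np.quantile(dist, 0.01)]       # accept closest 1%
posterior_mean = keep.mean()
```

Since the true parameter is 2.0, the accepted sample concentrates near the observed sample mean, which is the behaviour the theoretical result (summary statistic ≈ posterior mean) predicts.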
Abstract:
The effective and efficient management of diversified business firms that supply multiple products and operate in multiple, dynamic markets, especially large multinational enterprises (MNEs), builds upon a number of specific governance principles. These governance principles allow the alignment of environmental characteristics, strategy and organization. Given the rising need to “learn from the world”, Doz et al., in their influential Harvard Business School Press book entitled From Global to Metanational, have proposed a new set of governance principles described under the “metanational” umbrella concept. This paper revisits the metanational, using a comparative institutional perspective; here we contrast multidivisional and metanational governance principles. A comparative institutional analysis suggests that the metanational's application potential in terms of actually improving the effectiveness and efficiency of MNE governance may be subject to more qualification than suggested by Doz et al. Senior MNE management must therefore reflect carefully before substituting metanational governance principles for the more conventional, multidivisional ones with established contributions to managerial effectiveness and efficiency.
Abstract:
This paper extends the resource-based view (RBV) of the firm, as applied to multinational enterprises (MNEs), by distinguishing between two critical resource dimensions, namely relative resource superiority (capabilities) and slack. Both dimensions, in concert with specific environmental conditions, are required to increase entrepreneurial activities. We propose distinct configurations (three-way moderation effects) of capabilities, slack, and environmental factors (i.e. dynamism and hostility) to explain entrepreneurship. Using survey data from 66 Canadian subsidiaries operating in China, we find that higher subsidiary entrepreneurship requires both HR slack and strong downstream capabilities in subsidiaries, subject to the industry environment being dynamic and benign. However, high HR slack alone, in a dynamic and benign environment, but without the presence of strong capabilities, actually triggers the fewest initiatives, with HR slack redirected from entrepreneurial experimentation towards complacency and inefficiency. This paper has major implications for MNEs seeking to increase subsidiary entrepreneurship in fast growing emerging markets.
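The "three-way moderation" design the abstract proposes is, statistically, a regression containing a triple interaction term alongside all lower-order terms. The sketch below shows that structure on synthetic data (the variable names and effect sizes are illustrative assumptions, not the study's 66-subsidiary survey data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# illustrative predictors: downstream capabilities, HR slack, dynamism
cap   = rng.normal(size=n)
slack = rng.normal(size=n)
dyn   = rng.normal(size=n)
# outcome with a genuine three-way moderation effect of 0.5
y = (0.2 * cap + 0.1 * slack + 0.1 * dyn
     + 0.5 * cap * slack * dyn
     + rng.normal(scale=0.1, size=n))

# design matrix: intercept, main effects, two-way and three-way terms
X = np.column_stack([np.ones(n), cap, slack, dyn,
                     cap * slack, cap * dyn, slack * dyn,
                     cap * slack * dyn])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
triple = coef[-1]   # estimated three-way moderation coefficient
```

The key point of such configurations is that `triple` captures how the capability-slack relationship itself shifts with the environment, which no main effect or two-way term can express.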
Abstract:
This note caveats the standard statistics which accompany chess endgame tables (EGTs). It refers to Nalimov's double-counting of pawnless positions with both Kings on a long diagonal, and to the inclusion of positions which are not reachable from the initial position.
Abstract:
Mergers of Higher Education Institutions (HEIs) are organisational processes requiring a tremendous amount of resources in terms of time, work, and money. A number of mergers have taken place in recent years and more are to come. Several studies on mergers have been conducted, revealing some crucial factors that affect their success. Based on a review of these studies, the factors are: the initiator of the merger, the reason for the merger, the geographical distance between the merging institutions, organisational culture, the extent of overlap between course portfolios, and Quality Assurance Systems (QASs). Usually these kinds of factors are not considered in mergers; the focus is on financial matters instead. In this paper, a framework (HMEF) for evaluating the merging of HEIs is introduced. HMEF is based on Enterprise Architecture (EA), focusing on the factors found to affect the success of mergers. By using HMEF, HEIs can focus on the matters that are crucial for merging.
Abstract:
The initial phase of any Enterprise Architecture (EA) initiative is important. One of the most crucial tasks in that phase is to sell EA to top management by explaining its purpose. In this paper, using a semiotic framework, we show that there is a clear gap between the definition of EA and its purpose. The contribution of this paper is a taxonomy that expands knowledge of the pragmatics of EA and that can be used as a tool for explaining the purpose of EA. Grounded theory is used to form the taxonomy. Data were collected from a discussion group used by EA practitioners. The results indicate that the purpose of EA is to meet an organisation's stakeholders' goals and to create value for the organisation. These results are in line with the current literature. The most interesting result is that EA practitioners seem to realise that technical solutions are not the purpose of EA, but a means of fulfilling it.
Abstract:
Interest in Enterprise Architecture (EA) has been increasing during the last few years. EA has been found to be a crucial aspect of business survival, and thus the success of EA implementation is also crucial. The current literature does not offer a tool for measuring the success of EA implementation. In this paper, a tentative model for measuring success is presented and empirically validated in the EA context. The results show that the success of EA implementation can be measured indirectly by measuring the achievement of the objectives set for the implementation. The results also imply that achieving individuals' objectives does not necessarily mean that the organisation's objectives are achieved. The presented Success Measurement Model can be used as a basis for developing measurement metrics.
Abstract:
The amount of published Enterprise Architecture (EA) research has increased during the last few years. As a discipline, EA is still young and lacks a theoretical foundation. Lately, some research attempting to ground EA in theory has been published, including linkage to systems theory. Enterprise Architecture can be defined as: (i) a formal description of the current and future state(s) of an organisation, and (ii) a managed change between these states to meet the organisation's stakeholders' goals and to create value for the organisation. Based on this definition, this conceptual paper tries to shed light on the theoretical underpinnings of EA from three perspectives: EA as a communication medium, EA as an activity, and EA as an information technology system. Our conclusions are that: (i) EA can be categorised as a communication medium and theoretically underpinned by ontology and semiotics, (ii) EA can be explained and theoretically underpinned by Activity Theory, and (iii) EA can be categorised as an information technology system and theoretically underpinned by General Systems Theory and Technology Acceptance Theory.
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet the method has two problems: an individual may guess the correct answer at any concentration step, and may detect correctly at low concentrations but become adapted or fatigued at higher concentrations. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration. The technique reduces the chance probability where there are consecutive correct answers. Adjusted sequences are then submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold study and a taste threshold study. It resulted in group thresholds similar to those from ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were determined. The approach provides a more robust technique than previous estimation methods.
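The survival-analysis step can be sketched generically: treat the concentration at which each panellist first reliably detects as an "event time", and treat panellists who never detect within the tested range as right-censored. The code below is a plain Kaplan-Meier estimator on invented data; it shows the survival framing only, not the paper's specific chance-correction of answer sequences.

```python
import numpy as np

def kaplan_meier(times, detected):
    """Kaplan-Meier survival curve for detection thresholds.

    times: concentration step at which each panellist first detects
           (or the last step tested, if they never detect);
    detected: False marks a right-censored panellist (never detected).
    Returns [(concentration, fraction still not detecting), ...].
    """
    times = np.asarray(times, dtype=float)
    detected = np.asarray(detected, dtype=bool)
    order = np.argsort(times)
    times, detected = times[order], detected[order]
    n_at_risk = len(times)
    surv, curve = 1.0, []
    for t in np.unique(times):
        d = np.sum((times == t) & detected)    # detections at this step
        if d:
            surv *= 1.0 - d / n_at_risk
        curve.append((float(t), surv))
        n_at_risk -= np.sum(times == t)        # drop detected and censored
    return curve

# eight panellists; two never detect within the tested range
times = [1, 1, 2, 2, 2, 4, 4, 8]
detected = [True] * 6 + [False] * 2
curve = kaplan_meier(times, detected)
# group threshold: first concentration at which half the group detects
threshold = min(t for t, s in curve if s <= 0.5)
```

Censoring is the point of using survival analysis here: panellists who never detect still inform the estimate instead of being discarded or assigned an arbitrary value.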