892 results for ENTERPRISE STATISTICS
Abstract:
Social Networking Sites have recently become a mainstream communications technology for many people around the world. Major IT vendors are releasing social software designed for use in a business/commercial context. These Enterprise 2.0 technologies have impressive collaboration and information-sharing functionality, but so far they lack organizational network analysis (ONA) features that reveal patterns of connectivity within business units. This paper shows the impact of organizational network analysis techniques and social networks on organizational performance; we also give an overview of current enterprise social software and, most importantly, highlight how Enterprise 2.0 can help automate an organizational network analysis.
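As an illustration of what automating such an analysis could look like, the following minimal sketch (not taken from the paper) builds a graph from a hypothetical interaction log exported from an Enterprise 2.0 platform and computes two standard ONA centrality measures with networkx; all names and weights are illustrative.

```python
import networkx as nx

# Hypothetical interaction log exported from an Enterprise 2.0 platform:
# (sender, recipient, number of messages/comments). Names and weights are illustrative.
interactions = [
    ("alice", "bob", 14), ("bob", "carol", 3), ("carol", "alice", 7),
    ("dave", "alice", 9), ("dave", "erin", 2), ("erin", "bob", 5),
]

G = nx.DiGraph()
G.add_weighted_edges_from(interactions)

# Two standard ONA measures of connectivity patterns:
degree = nx.degree_centrality(G)        # how widely connected each person is
brokers = nx.betweenness_centrality(G)  # who sits on paths between others (brokerage)

for person in G.nodes:
    print(f"{person}: degree={degree[person]:.2f}, betweenness={brokers[person]:.2f}")
```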
Abstract:
Multiple versions of information, and the problems associated with them, are well documented in both academic research and industry best practices. Many solutions have proposed a single version of the truth, with business intelligence being adopted by many organizations. Business Intelligence (BI), however, is largely based on the collection of data and the processing and presentation of information to meet different stakeholders' requirements. This paper reviews Enterprise Intelligence, which promises to support decision-making based on a defined strategic understanding of the organization's goals and a unified version of the truth.
Abstract:
The article presents an essay that deals with the study conducted by Donald MacKenzie and the case studies comparing the use of population statistics in France and Great Britain in the period from 1825 to 1885. It analyzes Donald MacKenzie's study of the ways professional and political commitments informed the choice of statistical indexes in the British statistical community. Furthermore, the author is interested in how this influenced the development of mathematical statistics in Great Britain. The author concludes that the differences in the debates over population statistics can be attributed to differences in the social and epistemological logics of population statistics.
Abstract:
This paper examines how innovation-related capabilities for production, design and marketing develop at the subsidiary level within multinational enterprises (MNEs). We focus on how subsidiary autonomy and changing opportunities to access internal (MNE) and external (host country) sources of capability contribute in a combined way to the accumulation of specialist capabilities in five Taiwan-based MNE subsidiaries in the semiconductor industry. Longitudinal analysis shows how the accumulation process is subject to discontinuities, as functional divisions are (re)opened and closed during the lifetime of the subsidiary. A composite set of innovation output measures also shows significant variations in within-function levels of capability across our sample. We conclude that subsidiary specialisation and unique subsidiary-specific advantages have evolved in a way that is strongly influenced by the above factors.
Abstract:
An evaluation is undertaken of the statistics of daily precipitation as simulated by five regional climate models using comprehensive observations in the region of the European Alps. Four limited-area models and one variable-resolution global model are considered, all with a grid spacing of 50 km. The 15-year integrations were forced by reanalyses and observed sea surface temperature and sea ice (the global model by sea surface conditions only). The observational reference is based on 6400 rain-gauge records (10–50 stations per grid box). Evaluation statistics encompass mean precipitation, wet-day frequency, precipitation intensity, and quantiles of the frequency distribution. For mean precipitation, the models reproduce the characteristics of the annual cycle and the spatial distribution. The domain-mean bias varies between −23% and +3% in winter and between −27% and −5% in summer. Larger errors are found for other statistics. In summer, all models underestimate precipitation intensity (by 16–42%) and the frequency of heavy events is too low. This bias reflects overly dry summer-mean conditions in three of the models, while it is partly compensated by too many low-intensity events in the other two models. Similar intermodel differences are found for other European subregions. Interestingly, the model errors are very similar between the two models that share a dynamical core (but differ in parameterizations), and they differ considerably between the two models with similar parameterizations (but different dynamics). Despite considerable biases, the models reproduce prominent mesoscale features of heavy precipitation, which is a promising result for their use in climate change downscaling over complex topography.
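For concreteness, here is a minimal sketch of the kind of evaluation statistics listed above (mean precipitation, wet-day frequency, intensity, and a heavy-precipitation quantile), computed for a single daily series; the 1 mm/day wet-day threshold and the synthetic gamma-distributed sample are assumptions made for illustration, not values taken from the study.

```python
import numpy as np

def precipitation_statistics(daily_precip, wet_threshold=1.0):
    """Summary statistics for a series of daily precipitation totals (mm/day).

    wet_threshold: minimum daily total counted as a wet day (assumed 1 mm/day here).
    """
    p = np.asarray(daily_precip, dtype=float)
    wet = p >= wet_threshold
    return {
        "mean_precip": p.mean(),                          # mean daily precipitation
        "wet_day_frequency": wet.mean(),                  # fraction of wet days
        "intensity": p[wet].mean() if wet.any() else 0.0, # mean precipitation on wet days
        "q90_wet_days": np.quantile(p[wet], 0.9) if wet.any() else 0.0,  # heavy-event quantile
    }

# Example with synthetic data standing in for a rain-gauge or model grid-box series
rng = np.random.default_rng(0)
sample = rng.gamma(shape=0.4, scale=6.0, size=365)  # skewed, rain-like values
print(precipitation_statistics(sample))
```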
Abstract:
The probability of a quantum particle being detected in a given solid angle is determined by the S-matrix. The explanation of this fact in time-dependent scattering theory is often linked to the quantum flux, since the quantum flux integrated against a (detector) surface and over a time interval can be viewed as the probability that the particle crosses this surface within the given time interval. Regarding many-particle scattering, however, this argument is no longer valid, as each particle arrives at the detector at its own random time. While various treatments of this problem can be envisaged, here we present a straightforward Bohmian analysis of many-particle potential scattering from which the S-matrix probability emerges in the limit of large distances.
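The standard single-particle statement behind this flux argument can be sketched as follows; the notation is assumed for illustration and is not taken from the paper: the quantum flux (probability current) integrated over a distant detector surface and over time tends, in the limit of large distances, to the S-matrix probability of detection in the corresponding solid angle.

```latex
% Probability current (quantum flux) of a single-particle wave function \psi:
\[
  j^{\psi}(x,t) \;=\; \frac{\hbar}{m}\,\operatorname{Im}\!\big(\psi^{*}(x,t)\,\nabla\psi(x,t)\big).
\]
% Flux-across-surfaces statement: the flux integrated over the detector surface
% R\Sigma (the part of the sphere of radius R subtending the solid angle \Sigma)
% and over time tends to the S-matrix probability of detection in \Sigma:
\[
  \int_{0}^{\infty}\!\mathrm{d}t \int_{R\Sigma} j^{\psi}(x,t)\cdot \mathrm{d}\sigma(x)
  \;\xrightarrow[\,R\to\infty\,]{}\;
  \int_{C_{\Sigma}} \big|\widehat{S\psi_{\mathrm{in}}}(k)\big|^{2}\,\mathrm{d}^{3}k,
\]
% where C_{\Sigma} is the cone of momenta pointing into \Sigma and S is the S-matrix.
```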
Abstract:
The doctrine of joint criminal enterprise is in disarray. Despite repeated judicial scrutiny at the highest level, the doctrine's scope, proper doctrinal basis and function in relation to other modes of complicity remain uncertain. This article examines the doctrine's elements and underlying principles. It argues that while joint criminal enterprise is largely used to make individuals liable for offences committed by their associates in excess of the common criminal purpose, its proper function is to police the limits of associate liability and thus to exculpate rather than inculpate. The doctrine governs not only instances of accessorial liability; it also applies where the parties involved are joint principal offenders. As this puts into question the prevalent view that joint criminal enterprise is a form of secondary participation that results in accessorial liability, the article concludes that it is best seen as a doctrine sui generis.
Abstract:
This article investigates the nature of enterprise pedagogy in music. It presents the results of a research project that applied the practices of enterprise learning developed in the post-compulsory music curriculum in England to the teaching of the National Curriculum for music for 11-to-14-year-olds. In doing so, the article explores the nature of enterprise learning and the nature of pedagogy, in order to consider whether enterprise pedagogy offers an effective way to teach the National Curriculum. Enterprise pedagogy was found to have a positive effect on the motivation of students and on the potential to match learning to the needs of students of different abilities. Crucially, it was found that, to be effective, not only did the teacher's practice need to be congruent with the beliefs and theories on which it rests, but the students also needed to share in these underlying assumptions through their learning. The study has implications for the way in which teachers work with multiple pedagogies in the process of developing their pedagogical identity.
Abstract:
Enterprise Resource Planning is often endorsed as a means to facilitate strategic advantage for businesses. The scarcity of resources is one means by which some businesses maintain their position. However, the ubiquitous trend towards the adoption of Enterprise Resource Planning systems, coupled with market saturation, makes the promise of advantage less compelling. Reported in this paper is a proposed solution, based upon semiotic theory, that takes a typical Enterprise Resource Planning deployment scenario and shapes it according to the needs of people in post-implementation contexts, to leverage strategic advantage in different ways.
Conditioning model output statistics of regional climate model precipitation on circulation patterns
Abstract:
Dynamical downscaling of Global Climate Models (GCMs) through regional climate models (RCMs) potentially improves the usability of the output for hydrological impact studies. However, a further downscaling or interpolation of precipitation from RCMs is often needed to match the precipitation characteristics at the local scale. This study analysed three Model Output Statistics (MOS) techniques to adjust RCM precipitation: (1) a simple direct method (DM), (2) quantile-quantile mapping (QM) and (3) a distribution-based scaling (DBS) approach. The modelled precipitation consisted of daily means from 16 RCMs driven by ERA40 reanalysis data over the period 1961–2000, provided by the ENSEMBLES (ENSEMBLE-based Predictions of Climate Changes and their Impacts) project, for a small catchment located in the Midlands, UK. All methods were conditioned on the entire time series, on separate months, and on an objective classification of Lamb's weather types. The performance of the MOS techniques was assessed with respect to temporal and spatial characteristics of the precipitation fields, as well as runoff modelled with the HBV rainfall-runoff model. The results indicate that DBS conditioned on classification patterns performed better than the other methods; however, an ensemble approach in terms of both climate models and downscaling methods is recommended to account for uncertainties in the MOS methods.
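As a concrete illustration of the second technique, a minimal empirical quantile-quantile mapping sketch follows; the function name and the simple empirical-CDF implementation are assumptions made for illustration, not the ENSEMBLES or DBS code. Conditioning on circulation patterns then amounts to calibrating and applying the mapping separately for each weather type (or month), as indicated in the final comment.

```python
import numpy as np

def quantile_map(rcm_series, rcm_ref, obs_ref):
    """Empirical quantile-quantile mapping of RCM daily precipitation.

    rcm_series: RCM values to correct
    rcm_ref, obs_ref: RCM and observed values for a common calibration period
    """
    rcm_ref = np.sort(np.asarray(rcm_ref, dtype=float))
    obs_ref = np.sort(np.asarray(obs_ref, dtype=float))
    # Non-exceedance probability of each RCM value within its own reference climatology
    probs = np.searchsorted(rcm_ref, rcm_series, side="right") / len(rcm_ref)
    probs = np.clip(probs, 0.0, 1.0)
    # Replace each value by the observed quantile at the same probability
    return np.quantile(obs_ref, probs)

# Conditioning on circulation patterns (e.g. Lamb weather types) amounts to
# fitting and applying the mapping separately for each pattern or month:
# corrected[wt == k] = quantile_map(rcm[wt == k], rcm_cal[wt_cal == k], obs_cal[wt_cal == k])
```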
Abstract:
Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
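A minimal sketch in the spirit of the semi-automatic construction described above: a pilot stage regresses the parameter on functions of simulated data to approximate its posterior mean, and that fitted value is then used as the summary statistic in rejection ABC. The toy normal-mean model, the choice of regressors and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n=50):
    """Toy model standing in for an intractable simulator: data ~ N(theta, 1)."""
    return rng.normal(theta, 1.0, size=n)

# Stage 1: pilot simulations to learn a summary statistic.
# Regress the parameter on functions of the simulated data; the fitted values
# approximate the posterior mean E[theta | data], the optimal summary statistic.
pilot_theta = rng.uniform(-5, 5, size=2000)
pilot_data = np.array([simulate(t) for t in pilot_theta])
X = np.column_stack([np.ones(len(pilot_theta)), pilot_data.mean(axis=1), pilot_data.var(axis=1)])
beta, *_ = np.linalg.lstsq(X, pilot_theta, rcond=None)

def summary(data):
    return np.array([1.0, data.mean(), data.var()]) @ beta

# Stage 2: rejection ABC using the learned summary statistic.
observed = simulate(1.5)
s_obs = summary(observed)
proposals = rng.uniform(-5, 5, size=20000)
distances = np.array([abs(summary(simulate(t)) - s_obs) for t in proposals])
accepted = proposals[distances <= np.quantile(distances, 0.01)]  # keep the closest 1%
print(accepted.mean(), accepted.std())  # approximate posterior mean and spread
```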