63 results for dependency
in CentAUR: Central Archive University of Reading - UK
Abstract:
Stakeholder analysis plays a critical role in business analysis. However, most stakeholder identification and analysis methods focus on activities and processes and ignore the artefacts processed by human beings. By focusing on the outputs of the organisation, an artefact-centric view helps create a network of artefacts and a component-based structure of the organisation and its supply chain participants. Because the relationships are based on these components, the interdependency between stakeholders and the focal organisation can be measured once the stakeholders are identified. Each stakeholder is associated with two types of dependency, namely the stakeholder's dependency on the focal organisation and the focal organisation's dependency on the stakeholder. We identify three factors for each type of dependency and propose equations that calculate the dependency indexes. Once both dependency indexes are calculated, each stakeholder can be placed and categorised into one of four groups: critical stakeholder, mutual benefits stakeholder, replaceable stakeholder, and easy care stakeholder. The mutual dependency grid and the dependency gap analysis, which further prioritises each stakeholder by calculating the weighted dependency gap between the focal organisation and the stakeholder, subsequently help the focal organisation to better understand its stakeholders and manage its stakeholder relationships.
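A minimal sketch of the mutual dependency grid in Python. The abstract does not reproduce the actual equations, so the index below is an illustrative unweighted mean of three factor scores in [0, 1], the 0.5 threshold and the mapping of quadrants to group names are assumed, and the gap function is a plain weighted difference.

```python
def dependency_index(factor_a, factor_b, factor_c):
    """Combine three factor scores (each in [0, 1]) into one index.
    The unweighted mean is an illustrative placeholder, not the paper's equation."""
    return (factor_a + factor_b + factor_c) / 3.0

def categorise(org_on_stakeholder, stakeholder_on_org, threshold=0.5):
    """Place a stakeholder into one of the four groups of the grid.
    The quadrant-to-name assignment here is an assumption."""
    high_org = org_on_stakeholder >= threshold
    high_stk = stakeholder_on_org >= threshold
    if high_org and high_stk:
        return "mutual benefits stakeholder"
    if high_org:
        return "critical stakeholder"      # the organisation depends on them
    if high_stk:
        return "easy care stakeholder"     # they depend on the organisation
    return "replaceable stakeholder"

def weighted_dependency_gap(org_on_stakeholder, stakeholder_on_org, weight=1.0):
    """Weighted gap between the two indexes, used to prioritise stakeholders."""
    return weight * abs(org_on_stakeholder - stakeholder_on_org)

d_org = dependency_index(0.9, 0.7, 0.8)   # focal organisation's dependency
d_stk = dependency_index(0.2, 0.4, 0.3)   # stakeholder's dependency
print(categorise(d_org, d_stk), round(weighted_dependency_gap(d_org, d_stk), 2))
```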
Abstract:
Business process modelling can help an organisation better understand and improve its business processes. Most business process modelling methods adopt a task- or activity-based approach to identifying business processes. Within our work, we use activity theory to categorise elements within organisations as human beings, activities, or artefacts. Because of the direct relationship between these three elements, an artefact-oriented approach to organisation analysis emerges. Organisational semiotics highlights the ontological dependency between affordances within an organisation. We analyse the ontological dependency between organisational elements and produce the ontology chart for artefact-oriented business process modelling in order to clarify the relationships between the elements of an organisation. Furthermore, we adopt the techniques of semantic analysis and norm analysis, from organisational semiotics, to develop the artefact-oriented method for business process modelling. The proposed method provides a novel perspective for identifying and analysing business processes, as well as agents and artefacts, since the artefact-oriented perspective reveals the fundamental flow of an organisation. The modelling results enable an organisation to understand and model its processes from an artefact perspective, viewing the organisation as a network of artefacts. The information and practice captured and stored in artefacts can also be shared and reused between organisations that produce similar artefacts.
Abstract:
This article describes a novel algorithmic development extending the contour advective semi-Lagrangian model to include nonconservative effects. The Lagrangian contour representation of finescale tracer fields, such as potential vorticity, allows for conservative, nondiffusive treatment of sharp gradients, permitting very high numerical Reynolds numbers. It has been widely employed in accurate geostrophic turbulence and tracer advection simulations. In the present, diabatic version of the model, the constraint of conservative dynamics is overcome by including a parallel Eulerian field that absorbs the nonconservative (diabatic) tendencies. The diabatic buildup in this Eulerian field is limited through regular, controlled transfers of this field to the contour representation. This transfer is done with a fast, newly developed contouring algorithm. The model has been implemented for several idealized geometries. In this paper a single-layer doubly periodic geometry is used to demonstrate the validity of the model. The present model converges faster than analogous semi-Lagrangian models as resolution increases. At the same nominal spatial resolution the new model is 40 times faster than the analogous semi-Lagrangian model. Results of an orographically forced idealized storm track show a nontrivial dependency of storm-track statistics on resolution and on the numerical model employed. If this result is more generally applicable, it may have important consequences for future high-resolution climate modeling.
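The structure of the scheme can be illustrated with a deliberately simplified one-dimensional analogue: a field quantised to discrete levels stands in for the contour representation (the real model advects actual contours nondiffusively), a parallel Eulerian field absorbs the diabatic tendency, and the accumulated build-up is periodically transferred back by re-quantising. Grid size, velocity, forcing, and transfer interval below are arbitrary illustrative choices, not the paper's configuration.

```python
import numpy as np

nx, dq, n_transfer = 256, 0.1, 20          # grid points, contour interval, transfer period
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u, dt = 0.3, 0.5 / nx                      # uniform advection speed, CFL-safe step

q_c = dq * np.round(np.sin(2 * np.pi * x) / dq)   # quantised stand-in for the contour part
q_e = np.zeros(nx)                                # parallel Eulerian (diabatic) field

def advect(q):
    """Semi-Lagrangian advection on a periodic grid (linear interpolation)."""
    departure = (x - u * dt) % 1.0
    return np.interp(departure, x, q, period=1.0)

for step in range(1, 201):
    q_c, q_e = advect(q_c), advect(q_e)
    q_e += dt * np.cos(2 * np.pi * x)      # illustrative diabatic tendency
    if step % n_transfer == 0:             # controlled transfer to the 'contour' field
        total = q_c + q_e
        q_c = dq * np.round(total / dq)    # re-quantise (the analogue of recontouring)
        q_e = total - q_c                  # residual stays in the Eulerian field

print(f"max residual diabatic field after transfers: {np.abs(q_e).max():.3f}")
```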
Abstract:
In this paper it is argued that rotational wind is not the best choice of leading control variable for variational data assimilation, and an alternative is suggested and tested. A rotational wind parameter is used in most global variational assimilation systems as a pragmatic way of approximately representing the balanced component of the assimilation increments. In effect, rotational wind is treated as a proxy for potential vorticity, but one that is potentially not a good choice in flow regimes characterised by small Burger number. This paper reports on an alternative set of control variables based around potential vorticity. This gives rise to a new formulation of the background error covariances for the Met Office's variational assimilation system, one that introduces flow dependency. It uses balance relationships similar to those of traditional schemes, but recognises the existence of unbalanced rotational wind, which is used with a new anti-balance relationship. The new scheme is described, and its performance is evaluated and compared to a traditional scheme using a sample of diagnostics.
Abstract:
Diffuse reflectance spectroscopy (DRS) is increasingly being used to predict numerous soil physical, chemical and biochemical properties. However, soil properties and processes vary at different scales and, as a result, relationships between soil properties often depend on scale. In this paper we report on how the relationship between one such property, cation exchange capacity (CEC), and the DRS of the soil depends on spatial scale. We show this by means of a nested analysis of covariance of soils sampled on a balanced nested design in a 16 km × 16 km area in eastern England. We used principal components analysis on the DRS to obtain a reduced number of variables while retaining the key variation. The first principal component accounted for 99.8% of the total variance, the second for 0.14%. Nested analysis of the variation in the CEC and the two principal components showed that the substantial variance components are at the > 2000-m scale. This is probably the result of differences in soil composition due to parent material. We then developed a model to predict CEC from the DRS, using partial least squares (PLS) regression to do so. Leave-one-out cross-validation results suggested a reasonable predictive capability (R2 = 0.71 and RMSE = 0.048 molc kg−1). However, the results from the independent validation were not as good, with R2 = 0.27, RMSE = 0.056 molc kg−1 and an overall correlation of 0.52. This would indicate that DRS may not be useful for predictions of CEC. When we applied the analysis of covariance between predicted and observed values, we found significant scale-dependent correlations at scales of 50 and 500 m (0.82 and 0.73 respectively). DRS measurements can therefore be useful for predicting CEC if predictions are required, for example, at the field scale (50 m). This study illustrates that the relationship between DRS and soil properties is scale-dependent and that this scale dependency has important consequences for the prediction of soil properties from DRS data.
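A sketch of the PLS prediction step with leave-one-out cross-validation, using scikit-learn on synthetic stand-in data; the spectra, the 'CEC' values and the choice of five components are placeholders, not the study's data or settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 200
X = rng.normal(size=(n_samples, n_wavelengths))                # stand-in DRS spectra
y = X[:, :10].mean(axis=1) + 0.1 * rng.normal(size=n_samples)  # synthetic 'CEC'

pls = PLSRegression(n_components=5)        # component count is arbitrary here
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
rmse = np.sqrt(np.mean((y - y_cv) ** 2))
r2 = np.corrcoef(y, y_cv)[0, 1] ** 2
print(f"LOO-CV: R2 = {r2:.2f}, RMSE = {rmse:.3f}")
```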
Abstract:
While the standard models of concentration addition and independent action predict the overall toxicity of multicomponent mixtures reasonably well, interactions may limit their predictive capability when a few compounds dominate a mixture. This study was conducted to test whether statistically significant systematic deviations from concentration addition (i.e. synergism/antagonism, dose ratio- or dose level-dependency) occur when two taxonomically unrelated species, the earthworm Eisenia fetida and the nematode Caenorhabditis elegans, were exposed to a full range of mixtures of the similarly acting neonicotinoid pesticides imidacloprid and thiacloprid. The effect of the mixtures on C. elegans was described significantly better (p<0.01) by a dose level-dependent deviation from the concentration addition model than by the reference model alone, while the reference model description of the effects on E. fetida could not be significantly improved. These results highlight that deviations from concentration addition are possible even with similarly acting compounds, but that the nature of such deviations is species-dependent. For improving ecological risk assessment of simple mixtures, this implies that the concentration addition model may need to be used in a probabilistic context, rather than in its traditional deterministic manner.
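For reference, the two standard mixture models can be sketched as follows; the log-logistic dose-response form, EC50s, slopes and concentrations are hypothetical placeholders, not values from the study.

```python
import numpy as np

def effect(c, ec50, slope):
    """Fraction affected under an assumed log-logistic dose-response curve."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

ec50  = {"imidacloprid": 4.0, "thiacloprid": 2.5}   # hypothetical EC50s
slope = {"imidacloprid": 1.8, "thiacloprid": 2.1}   # hypothetical slopes
conc  = {"imidacloprid": 2.0, "thiacloprid": 1.0}   # mixture to evaluate

# Concentration addition: sum of toxic units; TU = 1 marks the predicted EC50.
tu = sum(conc[k] / ec50[k] for k in conc)

# Independent action: single-compound effects combined probabilistically.
e_ia = 1.0 - np.prod([1.0 - effect(conc[k], ec50[k], slope[k]) for k in conc])

print(f"toxic units (concentration addition) = {tu:.2f}")
print(f"predicted effect (independent action) = {e_ia:.2f}")
```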
Abstract:
In a recent investigation, Landsat TM and ETM+ data were used to simulate different resolutions of remotely sensed images (from 30 to 1100 m) and to analyze the effect of resolution on a range of landscape metrics associated with spatial patterns of forest fragmentation in Chapare, Bolivia since the mid-1980s. Whereas most metrics were found to be highly dependent on pixel size, several fractal metrics (DLFD, MPFD, and AWMPFD) were apparently independent of image resolution, in contradiction with a sizeable body of literature indicating that fractal dimensions of natural objects depend strongly on image characteristics. The present re-analysis of the Chapare images, using two alternative algorithms routinely used for the evaluation of fractal dimensions, shows that the values of the box-counting and information fractal dimensions are systematically larger, sometimes by as much as 85%, than the "fractal" indices DLFD, MPFD, and AWMPFD for the same images. In addition, the geometrical fractal features of the forest and non-forest patches in the Chapare region strongly depend on the resolution of the images used in the analysis. The largest dependency on resolution occurs for the box-counting fractal dimension in the case of the non-forest patches in 1993, where the difference between the 30 and 1100 m-resolution images corresponds to 24% of the full theoretical range (1.0 to 2.0) of the mass fractal dimension. The observation that the indices DLFD, MPFD, and AWMPFD, unlike the classical fractal dimensions, appear relatively unaffected by resolution in the case of the Chapare images seems due essentially to the fact that these indices are based on a heuristic, "non-geometric" approach to fractals. Because of their lack of a foundation in fractal geometry, nothing guarantees that these indices will be resolution-independent in general.
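A box-counting estimate of the kind used in this re-analysis can be sketched in a few lines; the tiling sizes and the test image are illustrative.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary image."""
    counts = []
    for s in sizes:
        # Trim so the image tiles exactly into s-by-s boxes.
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        boxes = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())  # boxes touching the pattern
    # Dimension = slope of log(count) against log(1/size).
    return np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)[0]

# Sanity check: a filled square should give a dimension near 2.
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(f"estimated dimension: {box_counting_dimension(img):.2f}")
```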
Abstract:
Development geography has long sought to understand why inequalities exist and the best ways to address them. Dependency theory sets out an historical rationale for underdevelopment based on colonialism and a legacy of developed core and under-developed periphery. Race is relevant in this theory only insofar as Europeans are white and the places they colonised were occupied by people with darker skin colour. There are no innate biological reasons why it happened in that order. However, a new theory for national inequalities proposed by Lynn and Vanhanen in a series of publications makes the case that poorer countries have that status because of a poorer genetic stock rather than an accident of history. They argue that IQ has a genetic basis and that IQ is linked to ability. Thus races with a lower IQ have less ability, and national IQ can therefore be positively correlated with performance as measured by an indicator like GDP/capita. Their thesis is one of despair, as little can be done to improve genetic stock significantly other than a programme of eugenics. This paper summarises and critiques the Lynn and Vanhanen hypothesis and the assumptions upon which it is based, and uses this analysis to show how a human desire to simplify in order to manage can be dangerous in development geography. While attention may naturally be focused on the 'national IQ' variables as a proxy measure of 'innate ability', the assumption of GDP per capita as an indicator of 'success' and 'achievement' is far more readily accepted without criticism. The paper makes the case that the current vogue for indicators, indices and cause-effect can be tyrannical.
Abstract:
Improvements in the resolution of satellite imagery have enabled the extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means of calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR-guided snake algorithm is used to determine the outline of a 2006 flood event on the River Dee, North Wales, UK, from a 12.5 m ERS-1 image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation is recorded as the LiDAR DEM elevation at each point. Approximating the water surface as a plane running from the gauged upstream to the gauged downstream water elevation, the water surface elevations at points along the flood extent are compared with their 'expected' values. The errors between the two are roughly normally distributed, but when plotted against coordinates they show obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing the errors to the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed-data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a t-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this t-test are selected on the basis of their error, and the selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error. The identification of significant error (RMSE = 0.8 m) between approximate expected and actual observed elevations from the remotely sensed data emphasises the limitations of using these data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
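The planar-surface error calculation can be sketched as follows, with hypothetical gauge elevations and synthetic shoreline points standing in for the LiDAR and ERS-1 data; the lag-1 autocorrelation at the end is one simple way to glimpse the spatial dependency described above.

```python
import numpy as np

z_up, z_down, reach_length = 12.4, 10.1, 5000.0   # hypothetical gauge data (m)

rng = np.random.default_rng(1)
chainage = np.sort(rng.uniform(0, reach_length, 50))   # distance downstream (m)
expected = z_up + (z_down - z_up) * chainage / reach_length   # planar surface
observed = expected + rng.normal(scale=0.8, size=chainage.size)  # synthetic 'LiDAR' points

errors = observed - expected                # observed minus 'expected' elevation
print(f"RMSE = {np.sqrt(np.mean(errors ** 2)):.2f} m")

# Lag-1 autocorrelation of errors ordered by chainage: a crude indicator of
# the kind of spatial autocorrelation the abstract reports.
print(f"lag-1 autocorrelation = {np.corrcoef(errors[:-1], errors[1:])[0, 1]:.2f}")
```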
Abstract:
This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated, along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests that the results depend substantially on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to obtain more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to help researchers overcome the computational burden of conducting DEA (with bootstrapping).
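An input-oriented DEA point estimate (without the Simar and Wilson bootstrap) can be sketched as a linear programme; the farm inputs and outputs below are randomly generated placeholders, and the vrs flag switches between the two returns-to-scale specifications mentioned above.

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, j0, vrs=False):
    """Input-oriented DEA efficiency of unit j0.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                # variables [theta, lambdas]; minimise theta
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]    # sum_j l_j x_ij <= theta * x_i,j0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]      # sum_j l_j y_rj >= y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    A_eq = np.c_[0.0, np.ones((1, n))] if vrs else None   # sum_j l_j = 1 under VRS
    b_eq = [1.0] if vrs else None
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

rng = np.random.default_rng(2)
X = rng.uniform(1, 10, size=(8, 2))   # two inputs for eight hypothetical farms
Y = rng.uniform(1, 10, size=(8, 1))   # one output
print(np.round([dea_input_efficiency(X, Y, j, vrs=True) for j in range(8)], 3))
```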
Abstract:
There is consensus worldwide that the artisanal and small-scale mining (ASM) sector comprises individuals who are trapped in a vicious cycle of poverty, lacking the necessary financial and technological means to improve their standards of living. Minimal work, however, has been undertaken to identify the factors behind miners' plight, which inevitably vary from country to country. This paper uses a case study of Ghana to argue that an increased dependence upon mercury for amalgamation in artisanal gold-mining communities is one such, albeit overlooked, "agent of poverty". There is mounting empirical evidence suggesting that dealings with the monopsonistic middlemen who supply mercury, purchases of costly medicines to remedy ailments caused by mercury poisoning, and a lack of appropriate safeguards and alternatives to amalgamation are preventing gold miners from improving their practices and livelihoods. The solution to the problem lies in breaking this cycle of dependency, which can be achieved by providing miners with robust support services, mercury-free technologies and education.