944 results for Business Process Modelling


Relevance: 30.00%

Abstract:

We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed-parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses when considering emission-driven rather than concentration-driven simulations (with 10–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business-as-usual scenario). A small minority of simulations, resulting from combinations of strong atmospheric feedbacks and carbon cycle responses, show temperature increases in excess of 9 K under the high-end scenario (RCP8.5) and in excess of 4 K even under aggressive mitigation (RCP2.6). While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) about the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. For both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenario used to drive GCM ensembles lies towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high-end responses which lie above the CMIP5 carbon cycle range. These high-end simulations can be linked to sampling a number of stronger carbon cycle feedbacks and to sampling climate sensitivities above 4.5 K. This latter aspect highlights the priority of identifying real-world climate-sensitivity constraints which, if achieved, would reduce the upper bound of projected global mean temperature change. The ensemble of simulations presented here provides a framework to explore relationships between present-day observables and future changes, while the large spread of projected future changes highlights the ongoing need for such work.
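The quoted 10–90th percentile ranges are simple ensemble statistics. A minimal sketch, using invented warming values rather than the ensemble's data:

```python
import numpy as np

# Hypothetical end-of-century warming responses (K) from a small ensemble;
# the real perturbed-parameter ensemble is far larger.
warming = np.array([2.1, 2.8, 3.0, 3.3, 3.6, 4.0, 4.4, 5.0, 5.9, 7.2])

p10, p90 = np.percentile(warming, [10, 90])
spread = p90 - p10  # the "10-90th percentile range" quoted for each scenario
print(f"10th: {p10:.2f} K, 90th: {p90:.2f} K, range: {spread:.2f} K")
```

A long-tailed ensemble like this illustrates how a few strong-feedback members stretch the upper percentile without moving the median much.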

Relevance: 30.00%

Abstract:

Four CO2 concentration inversions and the Global Fire Emissions Database (GFED) versions 2.1 and 3 are used to provide benchmarks for climate-driven modeling of the global land–atmosphere CO2 flux and the contribution of wildfire to this flux. The Land surface Processes and eXchanges (LPX) model is introduced. LPX is based on the Lund–Potsdam–Jena Spread and Intensity of FIRE (LPJ-SPITFIRE) model with amended fire probability calculations. LPX omits human ignition sources yet simulates many aspects of global fire adequately. It captures the major features of the observed geographic pattern in burnt area and its seasonal timing, and the unimodal relationship of burnt area to precipitation. It simulates features of the geographic variation in the sign of the interannual correlations of burnt area with antecedent dryness and precipitation. It simulates well the interannual variability of the global total land–atmosphere CO2 flux. There are differences among the global burnt area time series from GFED2.1, GFED3 and LPX, but some features are common to all. GFED3 fire CO2 fluxes account for only about 1/3 of the variation in total CO2 flux during 1997–2005. This relationship appears to be dominated by the strong climatic dependence of deforestation fires. The relationship of LPX-modeled fire CO2 fluxes to total CO2 fluxes is weak. Observed and modeled total CO2 fluxes track the El Niño–Southern Oscillation (ENSO) closely; GFED3 burnt area and global fire CO2 flux track the ENSO much less closely. The GFED3 fire CO2 flux–ENSO connection is most prominent for the El Niño of 1997–1998, which produced exceptional burning conditions in several regions, especially equatorial Asia. The sign of the observed relationship between ENSO and fire varies regionally, and LPX captures the broad features of this variation. These complexities underscore the need for process-based modeling to assess the consequences of global change for fire and its implications for the carbon cycle.
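The "about 1/3 of the variation" figure corresponds to a squared correlation between the fire flux and the total flux. A sketch with synthetic fluxes (invented values, not GFED or inversion data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual fluxes (PgC/yr, invented): a fire component plus an
# independent non-fire component make up the total land-atmosphere flux.
n_years = 9  # 1997-2005
fire_flux = rng.normal(2.0, 0.5, size=n_years)
other_flux = rng.normal(0.0, 0.7, size=n_years)
total_flux = fire_flux + other_flux

# Squared Pearson correlation = fraction of total-flux variance
# associated with the fire component.
r = np.corrcoef(fire_flux, total_flux)[0, 1]
print(f"fire flux explains {r**2:.0%} of total-flux variance")
```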

Relevance: 30.00%

Abstract:

Existing capability models lack qualitative and quantitative means to compare business capabilities. This paper extends previous work and uses affordance theories to consistently model and analyse capabilities. We use the concept of objective and subjective affordances to model capability as a tuple of a set of resource affordance system mechanisms and action paths, dependent on one or more critical affordance factors. We identify an affordance chain of subjective affordances, by which affordances work together to enable an action, and an affordance path that links action affordances to create a capability system. We define the mechanism and path underlying capability. We show how the affordance modelling notation (AMN) can represent the affordances comprising a capability. We propose a method to quantitatively and qualitatively compare capabilities using efficiency, effectiveness and quality metrics. The method is demonstrated by a medical example comparing the capability of syringe and needleless anaesthetic systems.
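A comparison on efficiency, effectiveness and quality metrics can be sketched as below; the field meanings and all scores are hypothetical, not values from the study:

```python
from dataclasses import dataclass

# Hypothetical sketch of a three-metric capability comparison; the
# interpretations and example values are illustrative only.
@dataclass
class Capability:
    name: str
    efficiency: float     # e.g. dose delivered per unit time/cost, in [0, 1]
    effectiveness: float  # fraction of attempts achieving the outcome
    quality: float        # e.g. patient-comfort score, in [0, 1]

def compare(a: Capability, b: Capability) -> str:
    """Return the name of the capability with the higher mean score."""
    score = lambda c: (c.efficiency + c.effectiveness + c.quality) / 3
    return a.name if score(a) >= score(b) else b.name

syringe = Capability("syringe", efficiency=0.70, effectiveness=0.95, quality=0.60)
needleless = Capability("needleless", efficiency=0.80, effectiveness=0.90, quality=0.85)
print(compare(syringe, needleless))  # → needleless
```

An unweighted mean is the simplest aggregation; a real comparison would weight the three metrics by their importance to the business context.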

Relevance: 30.00%

Abstract:

Over the last decade, issues related to the financial viability of development have become increasingly important to the English planning system. As part of a wider shift towards the compartmentalisation of planning tasks, expert consultants are required to quantify, and thereby attempt to rationalise, planning decisions in terms of economic ‘viability’. Often with a particular focus on planning obligations, the results of development viability modelling have emerged as a key part of the evidence base used in site-specific negotiations and in planning policy formation. Focussing on the role of clients and other stakeholders, this paper investigates how development viability is tested in practice. It draws together literature on the role of calculative practices in policy formation, on client feedback and influence in real estate appraisals, and on stakeholder engagement and consultation in the planning literature to critically evaluate the role of clients and other interest groups in influencing the production and use of development viability appraisal models. The paper draws upon semi-structured interviews with the main producers of development viability appraisals to conclude that, whilst appraisals have the potential to be biased by client and stakeholder interests, there are important controlling influences on potential opportunistic behaviour. One such control is local authorities’ weak understanding of development viability appraisal techniques, which limits their capacity to question the outputs of appraisal models. However, this is also of concern given that viability is now a central feature of the town planning system.
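Development viability appraisals are commonly residual valuations; the sketch below is a generic illustration of that technique with invented figures, not a model taken from the paper (real appraisals add finance costs, phasing, inflation and sensitivity testing):

```python
# Minimal residual-valuation sketch: the value left for land once costs,
# developer's profit and planning obligations are deducted from the
# gross development value. All figures are hypothetical.

def residual_land_value(gdv, build_cost, fees_pct, profit_pct_gdv, planning_obligations):
    """Residual value available for land acquisition."""
    fees = build_cost * fees_pct          # professional fees on build cost
    profit = gdv * profit_pct_gdv         # developer's return on GDV
    return gdv - build_cost - fees - profit - planning_obligations

rlv = residual_land_value(
    gdv=10_000_000,             # gross development value
    build_cost=5_500_000,
    fees_pct=0.10,
    profit_pct_gdv=0.20,
    planning_obligations=500_000,
)
print(f"residual land value: £{rlv:,.0f}")  # → £1,450,000
```

Because the residual is a small difference between large numbers, modest changes in any input can swing the outcome, which is one reason appraisal outputs are open to the stakeholder influence the paper examines.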

Relevance: 30.00%

Abstract:

Tests for business cycle asymmetries are developed for Markov-switching autoregressive models. The tests of deepness, steepness, and sharpness are Wald statistics, which have standard asymptotics. For the standard two-regime model of expansions and contractions, deepness is shown to imply sharpness (and vice versa), whereas the process is always nonsteep. Two- and three-state models of U.S. GNP growth are used to illustrate the approach, along with models of U.S. investment and consumption growth. The robustness of the tests to model misspecification, and the effects of regime-dependent heteroscedasticity, are investigated.
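A Wald statistic with standard asymptotics has the generic scalar form W = (θ̂ − θ₀)² / var(θ̂), compared against a χ²(1) distribution; the paper applies statistics of this form to the deepness, steepness and sharpness coefficients of the regime model. A sketch with invented estimates:

```python
import math

# Generic scalar Wald test; theta_hat and var_hat are hypothetical
# estimates, not values from the paper.

def wald_stat(theta_hat, theta0, var_hat):
    return (theta_hat - theta0) ** 2 / var_hat

def chi2_1_pvalue(w):
    # Survival function of chi-square with 1 d.o.f.:
    # P(Z^2 > w) = P(|Z| > sqrt(w)) = erfc(sqrt(w / 2)).
    return math.erfc(math.sqrt(w / 2.0))

w = wald_stat(theta_hat=-0.8, theta0=0.0, var_hat=0.1)
print(f"W = {w:.2f}, p = {chi2_1_pvalue(w):.4f}")
```

With these invented numbers the null of symmetry (θ = 0) would be rejected at the 5% level; the multivariate case replaces the scalar ratio with a quadratic form in the covariance matrix.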

Relevance: 30.00%

Abstract:

Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters depend on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed, landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the river Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
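The Monte Carlo calibration idea can be illustrated with a toy one-bucket rainfall-runoff model; this is a minimal sketch of the kind of analysis described, not PERSiST itself (which is semi-distributed and far richer), and all values are invented:

```python
import random

def run_bucket(precip, k, storage=10.0):
    """Linear reservoir: fraction k of storage becomes runoff each step."""
    runoff = []
    for p in precip:
        storage += p
        q = k * storage
        storage -= q
        runoff.append(q)
    return runoff

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

precip = [5.0, 0.0, 12.0, 3.0, 0.0, 8.0]
observed = run_bucket(precip, k=0.3)   # synthetic "observations"

random.seed(1)
best_k, best_score = None, float("-inf")
for _ in range(500):                   # Monte Carlo parameter search
    k = random.uniform(0.05, 0.95)
    score = nse(observed, run_bucket(precip, k))
    if score > best_score:
        best_k, best_score = k, score
print(f"best k = {best_k:.3f}, NSE = {best_score:.4f}")
```

The spread of parameter sets achieving near-best scores is what a sensitivity and uncertainty analysis would examine; here the search simply recovers a value close to the k used to generate the synthetic observations.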

Relevance: 30.00%

Abstract:

Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purposes for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.
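A socio-technical evaluation of this kind can be sketched as a weighted scoring exercise; the application names, criteria, weights and threshold below are all invented for illustration and are not the paper's method:

```python
# Hypothetical scoring sketch: rate each IT application on a social
# (user-fit) and a technical (capability) criterion, then flag those
# falling below a review threshold.

apps = {
    # name: scores on a 0-5 scale (invented)
    "planning-portal": {"social": 4, "technical": 2},
    "payments-engine": {"social": 3, "technical": 4},
    "legacy-casework": {"social": 2, "technical": 1},
}

WEIGHTS = {"social": 0.4, "technical": 0.6}
THRESHOLD = 2.5

def fitness(scores):
    """Weighted mean of the socio-technical scores."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

review = sorted(name for name, s in apps.items() if fitness(s) < THRESHOLD)
print("flag for review:", review)  # → ['legacy-casework']
```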

Relevance: 30.00%

Abstract:

Understanding how and why the capability of one set of business resources, with its structural arrangements and mechanisms, works better than another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language to qualitatively and quantitatively compare capabilities. Gibson’s theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the ‘how’ and ‘why’ of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model and the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the resource capabilities that enable the capability to inject a drug and anaesthetise a patient.
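The transition-firing idea behind Petri net simulation can be shown with a plain place/transition token game; this is a heavily simplified sketch (real Coloured Petri Nets attach typed "colours" and guards to tokens, whereas here tokens are bare counts, and the place and transition names are invented for the injection example):

```python
# Minimal token game over the drug-injection scenario.
places = {"syringe_loaded": 1, "patient_ready": 1,
          "drug_injected": 0, "patient_anaesthetised": 0}

transitions = [
    # (name, places consumed, places produced)
    ("inject", ["syringe_loaded", "patient_ready"], ["drug_injected"]),
    ("anaesthetise", ["drug_injected"], ["patient_anaesthetised"]),
]

def enabled(consumed):
    return all(places[p] > 0 for p in consumed)

fired = []
progress = True
while progress:                      # fire transitions until none are enabled
    progress = False
    for name, consumed, produced in transitions:
        if enabled(consumed):
            for p in consumed:
                places[p] -= 1
            for p in produced:
                places[p] += 1
            fired.append(name)
            progress = True

print(fired, places["patient_anaesthetised"])  # → ['inject', 'anaesthetise'] 1
```

The reachable firing sequence is the "action path" in capability terms: the capability exists only if a token can reach the goal place from the initial marking.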

Relevance: 30.00%

Abstract:

There is ongoing work on conceptual modelling of such business notions as Affordance and Capability. We have found that these notions are constructively defined using elements and properties of executable behaviour models. In this paper, we clarify the definitions of Affordance and Capability using Coloured Petri Nets and Protocol models. The illustrating case is the process of drug injection. We show that different behaviour modelling techniques provide different precision for the definition of Affordance and Capability, and clarify the conceptual models of these notions. We generalise that behaviour models can be used to improve the precision of conceptualisation.

Relevance: 30.00%

Abstract:

The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported, and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational risk, and ultimately reputation risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk of its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified, enabling remedial action based on qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. However, the generic technique could be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
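The heatmap idea can be sketched as scoring each application on obsolescence likelihood and business criticality, then bucketing into heat bands; the application names, scores and band thresholds below are invented, not the method's actual algorithm or data:

```python
apps = {
    # name: (obsolescence likelihood 0-1, business criticality 0-1) - invented
    "crime-recording": (0.9, 0.9),
    "custody-system": (0.6, 0.8),
    "intranet-cms": (0.8, 0.2),
    "hr-portal": (0.2, 0.3),
}

def heat(likelihood, criticality):
    """Combine the two scores into a red/amber/green heat band."""
    risk = likelihood * criticality
    if risk >= 0.5:
        return "red"
    if risk >= 0.2:
        return "amber"
    return "green"

heatmap = {name: heat(*scores) for name, scores in apps.items()}
for name, band in sorted(heatmap.items()):
    print(f"{name:16s} {band}")
```

Multiplying the two scores means an obsolete but non-critical application (like the invented intranet-cms) stays green, while obsolescence in a critical system surfaces as red for remedial action.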

Relevance: 30.00%

Abstract:

Earthworms are significant ecosystem engineers and an important component of the diet of many vertebrates and invertebrates, so the ability to predict their distribution and abundance would have wide application in ecology, conservation and land management. Earthworm viability is known to be affected by the availability and quality of food resources, soil water conditions and temperature, but has not yet been modelled mechanistically to link effects on individuals to field population responses. Here we present a novel model capable of predicting the effects of land management and environmental conditions on the distribution and abundance of Aporrectodea caliginosa, the dominant earthworm species in agroecosystems. Our process-based approach uses individual-based modelling (IBM), in which each individual has its own energy budget. Individual earthworm energy budgets follow established principles of physiological ecology and are parameterised for A. caliginosa from experimental measurements under optimal conditions. Under suboptimal conditions (e.g. food limitation, low soil temperatures and water contents) reproduction is prioritised over growth. Good model agreement with independent laboratory data on individual cocoon production and growth of body mass, under variable feeding and temperature conditions, supports our representation of A. caliginosa physiology through energy budgets. Our mechanistic model is able to accurately predict A. caliginosa distribution and abundance in spatially heterogeneous soil profiles representative of field study conditions. Essential here is the explicit modelling of earthworm behaviour in the soil profile. Local earthworm movement responds to a trade-off between food availability and soil water conditions, and this determines the spatiotemporal distribution of the population in the soil profile. Importantly, multiple environmental variables can be manipulated simultaneously in the model to explore population exposure and effects under combinations of stressors. Potential applications include prediction of the population-level effects of pesticides and of changes in soil management (e.g. conservation tillage) and climate change.
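The per-individual energy-budget logic, including prioritising reproduction over growth when energy is limited, can be sketched in a single daily step; every parameter value below is illustrative and not the paper's A. caliginosa calibration:

```python
# Minimal daily energy-budget step for one individual (invented parameters).

def daily_step(mass, food, temp):
    """Return (new_mass, cocoons) after one day."""
    t_factor = max(0.0, min(1.0, temp / 15.0))    # activity: 0 at 0 C, 1 at 15 C
    intake = min(food, 0.1 * mass) * t_factor     # assimilated energy
    maintenance = 0.02 * mass                     # paid before anything else
    surplus = intake - maintenance
    if surplus <= 0:
        return mass + surplus, 0                  # starvation: shrink, no cocoons
    repro = min(surplus, 0.03 * mass)             # reproduction prioritised
    growth = surplus - repro                      # growth gets the remainder
    return mass + growth, round(repro / 0.01)     # cocoons per unit repro energy

mass, cocoons = daily_step(mass=1.0, food=0.5, temp=15.0)
print(mass, cocoons)
```

A population model would iterate this step over many individuals and couple it to movement rules trading off food against soil water, as the abstract describes.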

Relevance: 30.00%

Abstract:

This article reviews the experiences of a practising business consultancy division. It discusses the reasons for the failure of the traditional, expert consultancy approach and states the requirements for a more suitable consultancy methodology. An approach called ‘Modelling as Learning’ is introduced, its three defining aspects being: client ownership of all analytical work performed, consultant acting as facilitator and sensitivity to soft issues within and surrounding a problem. The goal of such an approach is set as the acceleration of the client's learning about the business. The tools that are used within this methodological framework are discussed and some case studies of the methodology are presented. It is argued that a learning experience was necessary before arriving at the new methodology but that it is now a valuable and significant component of the division's work.

Relevance: 30.00%

Abstract:

Home-based online business ventures are an increasingly pervasive yet under-researched phenomenon. The experiences and mindset of entrepreneurs setting up and running such enterprises require better understanding. Using data from a qualitative study of 23 online home-based business entrepreneurs, we propose the augmented concept of ‘mental mobility’ to encapsulate how they approach their business activities. Drawing on Howard P. Becker's early theorising of mobility, together with Victor Turner's later notion of liminality, we conceptualise mental mobility as the process through which individuals navigate the liminal spaces between the physical and digital spheres of work and the overlapping home/workplace, enabling them to manipulate and partially reconcile the spatial, temporal and emotional tensions that are present in such work environments. Our research also holds important applications for alternative employment contexts and broader social orderings because of the increasingly pervasive and disruptive influence of technology on experiences of remunerated work.

Relevance: 30.00%

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on each core's location within the system. Heterogeneity further increases with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, with interpolation between results as necessary.
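A benchmark-driven model of this shape measures the two work types at a few problem sizes and interpolates to predict unmeasured sizes; the timings below are invented placeholders, not measurements from the Cray XE6 study:

```python
from bisect import bisect_left

# (grid size N for an N x N domain, measured seconds per timestep) - invented
compute_bench = [(256, 0.004), (512, 0.017), (1024, 0.070)]  # array updates
halo_bench = [(256, 0.0008), (512, 0.0015), (1024, 0.0031)]  # halo exchanges

def interp(bench, n):
    """Piecewise-linear interpolation over the benchmark table."""
    xs = [x for x, _ in bench]
    ys = [y for _, y in bench]
    i = bisect_left(xs, n)
    if i == 0:
        return ys[0]
    if i == len(xs):
        return ys[-1]
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (n - x0) / (x1 - x0)

def predicted_step_time(n):
    # Sum of the two separately-modelled work types for one timestep.
    return interp(compute_bench, n) + interp(halo_bench, n)

print(f"predicted time at N=768: {predicted_step_time(768):.4f} s")
```

Modelling the compute and communication parts separately lets a deployment scenario (decomposition, task placement) scale each component independently before summing, which is the structure the abstract describes.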