905 results for Process Modelling, Viewpoint Modelling, Process Management


Relevance: 60.00%

Abstract:

Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long-term modelled trends indicate that there has been a large increase in nitrogen transport into the river system, driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed instream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short-term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.
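
As a minimal sketch of the kind of daily mass-balance step a process-based model such as INCA performs, consider the Python fragment below; the leaching rule and all coefficients are illustrative placeholders, not the INCA parameterisation.

# Minimal daily nitrogen mass-balance step, illustrating the kind of
# calculation a process-based catchment model performs. All coefficients
# are illustrative placeholders, not INCA parameters.

def daily_n_step(soil_n_kg, fertiliser_kg_day, plant_uptake_kg_day,
                 flow_m3_day, leach_coeff=1e-4):
    """Advance the soil nitrogen store by one day; return (store, leached N)."""
    leached = leach_coeff * flow_m3_day * soil_n_kg   # flow-driven leaching
    soil_n = soil_n_kg + fertiliser_kg_day - plant_uptake_kg_day - leached
    return max(soil_n, 0.0), leached

store = 500.0  # kg N in the soil store (illustrative starting value)
for day_flow in [1200.0, 3400.0, 800.0]:  # daily runoff volumes, m3
    store, export = daily_n_step(store, fertiliser_kg_day=2.0,
                                 plant_uptake_kg_day=1.5, flow_m3_day=day_flow)
    print(f"store={store:.1f} kg, leached={export:.2f} kg")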

Relevance: 60.00%

Abstract:

The contribution that non-point P sources make to the total P loading on water bodies in agricultural catchments has not been fully appreciated. Using data derived from plot-scale experimental studies, and modelling approaches developed to simulate system behaviour under differing management scenarios, a fuller understanding of the processes controlling P export and transformations along non-point transport pathways can be achieved. One modelling approach which has been successfully applied to large UK catchments (50–350 km² in area) is applied here to a small, 1.5 km² experimental catchment. The importance of scaling is discussed in the context of how such approaches can extrapolate the results from plot-scale experimental studies to full catchment scale. However, the scope of such models is limited, since they do not at present directly simulate the processes controlling P transport and transformation dynamics. As such, they can only simulate total P export on an annual basis, and are not capable of prediction over shorter time scales. The need for development of process-based models to help answer these questions, and for more comprehensive UK experimental studies, is highlighted as a prerequisite for the development of suitable and sustainable management strategies to reduce non-point P loading on water bodies in agricultural catchments.
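
The annual-only nature of such models can be illustrated with a simple export-coefficient calculation; the land-use areas and coefficients below are invented for illustration and are not taken from the study.

# Hypothetical annual export-coefficient calculation of the kind such
# catchment-scale P models perform; all numbers are illustrative only.
land_use_ha = {"arable": 90.0, "grassland": 45.0, "woodland": 15.0}  # 1.5 km2
export_kg_per_ha_yr = {"arable": 0.8, "grassland": 0.3, "woodland": 0.05}

total_p_kg_yr = sum(area * export_kg_per_ha_yr[lu]
                    for lu, area in land_use_ha.items())
print(f"Annual total P export: {total_p_kg_yr:.1f} kg/yr")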

Relevance: 60.00%

Abstract:

The evidence provided by modelled assessments of future climate impact on flooding is fundamental to water resources and flood risk decision making. Impact models usually rely on climate projections from global and regional climate models (GCM/RCMs). However, challenges in representing precipitation events at catchment-scale resolution mean that decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs. Here, the impacts on projected high flows of differing ensemble approaches and of applying Model Output Statistics to RCM precipitation are evaluated while assessing climate change impact on flood hazard in the Upper Severn catchment in the UK. Various ensemble projections are used together with the HBV hydrological model with direct forcing, and also compared to a response surface technique. We consider an ensemble of single-model RCM projections from the current UK Climate Projections (UKCP09); multi-model ensemble RCM projections from the European Union's FP6 ‘ENSEMBLES’ project; and a joint probability distribution of precipitation and temperature from a GCM-based perturbed physics ensemble. The ensemble distribution of results shows that flood hazard in the Upper Severn is likely to increase compared to present conditions, but the study highlights the differences between the results from different ensemble methods and the strong assumptions made in using Model Output Statistics to produce the estimates of future river discharge. The results underline the challenges in using the current generation of RCMs for local climate impact studies on flooding.
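
One common Model Output Statistics technique for RCM precipitation is empirical quantile mapping; the Python sketch below, using synthetic data, illustrates the idea and the strong assumption involved (that a control-period correction transfers unchanged to the future).

# Empirical quantile mapping: map future RCM values through the
# control-vs-observed quantile relationship. Data here are synthetic.
import numpy as np

def quantile_map(rcm_control, observed, rcm_future):
    """Correct future RCM values via control-period quantiles."""
    quantiles = np.linspace(0.0, 1.0, 101)
    rcm_q = np.quantile(rcm_control, quantiles)
    obs_q = np.quantile(observed, quantiles)
    # place each future value on the control quantile scale, then map to obs
    probs = np.interp(rcm_future, rcm_q, quantiles)
    return np.interp(probs, quantiles, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 3.0, 5000)       # observed daily precipitation (mm)
ctl = rng.gamma(2.0, 4.0, 5000)       # biased RCM control run
fut = rng.gamma(2.2, 4.0, 5000)       # RCM future run
corrected = quantile_map(ctl, obs, fut)
print(f"raw future mean {fut.mean():.2f} mm, corrected {corrected.mean():.2f} mm")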

Relevance: 60.00%

Abstract:

Communities are increasingly empowered with the ability and responsibility to work with national governments to make decisions about marine resources in decentralised co-management arrangements. This transition toward decentralised management represents a changing governance landscape. This paper explores the transition to decentralisation in marine resource management systems in three East African countries. The paper draws upon expert opinion and literature from both political science and linked social-ecological systems fields to guide exploration of five key governance transition concepts in each country: (1) drivers of change; (2) institutional arrangements; (3) institutional fit; (4) actor interactions; and (5) adaptive management. Key findings are that decentralised management in the region was largely donor-driven and only partly transferred power to local stakeholders. However, increased accountability created a degree of democracy with regard to natural resource governance that was not previously present. Additionally, increased local-level adaptive management has emerged in most systems and, to date, this experimental management has helped to change resource users' views from metaphysical to more scientific cause-and-effect attribution of changes to resource conditions.

Relevance: 60.00%

Abstract:

This paper introduces the BALTEX research programme and summarises associated numerical modelling work undertaken during the last five years. The research has broadly managed to clarify the main mechanisms determining the water and energy cycle in the Baltic region, such as the strong dependence upon the large-scale atmospheric circulation. It has further been shown that the Baltic Sea has a positive water balance, albeit with large interannual variations. The focus of the modelling studies has been the use of limited-area models at ultra-high resolution, driven by boundary conditions from global models or from reanalysis data sets. The programme has further initiated a comprehensive integration of atmospheric, land surface and hydrological modelling incorporating snow, sea ice and special lake models. Other aspects of the programme include process studies, such as the role of deep convection, air-sea interaction and the handling of land surface moisture. Studies have also been undertaken to investigate synoptic and sub-synoptic events over the Baltic region, thus exploring the role of transient weather systems for the hydrological cycle. A special aspect has been the strong interest and commitment of the meteorological and hydrological services because of the potentially large societal interest in operational applications of the research. As a result of this interest, special attention has been paid to data-assimilation aspects and the use of new types of data such as SSM/I, GPS measurements and digital radar. A series of high-resolution data sets are being produced; one of those, a 1/6-degree daily precipitation climatology for the years 1996–1999, is a unique contribution. The specific research achievements presented in this volume of Meteorology and Atmospheric Physics are the result of a cooperative venture between 11 European research groups supported under the EU Framework programmes.
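
The positive water balance referred to can be illustrated with a back-of-the-envelope annual budget; all figures below are illustrative placeholders, not BALTEX results.

# Illustrative annual freshwater budget for an enclosed sea such as the
# Baltic; the numbers are invented round figures, not programme results.
runoff_km3 = 440.0   # annual river inflow
precip_km3 = 230.0   # precipitation over the sea surface
evap_km3 = 180.0     # evaporation from the sea surface

freshwater_surplus = runoff_km3 + precip_km3 - evap_km3
print(f"Freshwater surplus: {freshwater_surplus:+.0f} km3/yr")
# A positive surplus must, on average, leave as net outflow through the
# Danish straits; interannual variation in runoff shifts this number.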

Relevance: 60.00%

Abstract:

This paper introduces an architecture for identification and modelling in real time at a copper mine, using new technologies such as M2M and cloud computing, with a server in the cloud and an Android client inside the mine. The proposed design enables pervasive mining: a system with wider coverage, higher communication efficiency, better fault tolerance, and anytime-anywhere availability. This solution was designed for a plant inside the mine which cannot tolerate interruption and whose identification in situ, in real time, is an essential part of the system to control aspects such as instability by adjusting the corresponding parameters without stopping the process.
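
As a hedged sketch of the cloud side of such an architecture, the standard-library Python server below accepts JSON sensor readings from an in-mine client and returns adjusted control parameters; the endpoint, field names and control rule are all hypothetical.

# Minimal cloud-side endpoint: receive a sensor reading, return updated
# control parameters. Field names and the stability rule are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class MineHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        reading = json.loads(self.rfile.read(length))  # e.g. {"vibration": 3.2}
        # hypothetical rule: damp the process if vibration is high
        params = {"feed_rate": 0.8 if reading.get("vibration", 0) > 3.0 else 1.0}
        body = json.dumps(params).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MineHandler).serve_forever()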

Relevance: 60.00%

Abstract:

Atmospheric CO2 concentration is hypothesized to influence vegetation distribution via tree–grass competition, with higher CO2 concentrations favouring trees. The stable carbon isotope (δ13C) signature of vegetation is influenced by the relative importance of C4 plants (including most tropical grasses) and C3 plants (including nearly all trees), and the degree of stomatal closure – a response to aridity – in C3 plants. Compound-specific δ13C analyses of leaf-wax biomarkers in sediment cores of an offshore South Atlantic transect are used here as a record of vegetation changes in subequatorial Africa. These data suggest a large increase in C3 relative to C4 plant dominance after the Last Glacial Maximum. Using a process-based biogeography model that explicitly simulates 13C discrimination, it is shown that precipitation and temperature changes cannot explain the observed shift in δ13C values. The physiological effect of increasing CO2 concentration is decisive, altering the C3/C4 balance and bringing the simulated and observed δ13C values into line. It is concluded that CO2 concentration itself was a key agent of vegetation change in tropical southern Africa during the last glacial–interglacial transition. Two additional inferences follow. First, long-term variations in terrestrial δ13C values are not simply a proxy for regional rainfall, as has sometimes been assumed. Although precipitation and temperature changes have had major effects on vegetation in many regions of the world during the period between the Last Glacial Maximum and recent times, CO2 effects must also be taken into account, especially when reconstructing changes in climate between glacial and interglacial states. Second, rising CO2 concentration today is likely to be influencing tree–grass competition in a similar way, and thus contributing to the "woody thickening" observed in savannas worldwide. This second inference points to the importance of experiments to determine how vegetation composition in savannas is likely to be influenced by the continuing rise of CO2 concentration.
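
The link between δ13C and the C3/C4 balance is often expressed with a two-end-member linear mixing model, sketched below in Python; the end-member values are typical literature figures used purely for illustration, not those of the study.

# Two-end-member mixing: infer the C4 fraction from a d13C measurement.
# End-member values are illustrative placeholders.

def c4_fraction(d13c_sample, d13c_c3=-32.0, d13c_c4=-21.0):
    """Fraction of C4-derived carbon under a linear mixing model."""
    return (d13c_sample - d13c_c3) / (d13c_c4 - d13c_c3)

for sample in (-30.0, -26.0, -23.0):
    print(f"d13C = {sample} permil -> C4 fraction = {c4_fraction(sample):.2f}")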

Relevance: 60.00%

Abstract:

The construction industry is widely criticised as fragmented, and there are mounting calls for the industry to change. The espoused change calls for collaboration, as well as for embracing innovation in the process of design and construction and across the supply chain. Innovation and the application of emerging technologies, such as building information modelling (BIM), are seen as enablers for integrating processes and ‘integrating the team’. A questionnaire survey was conducted to ascertain change in construction with regard to design management, innovation and the application of BIM as cutting-edge pathways for collaboration. The respondents to the survey held an array of designations across the construction industry, including construction managers, designers, engineers, design coordinators, design managers, architects, architectural technologists and surveyors. There was general agreement among most respondents that the design team was responsible for design management in their organisation. There is a perception that the design manager and the client are the catalysts for advancing innovation. The current state of the industry in terms of incorporating BIM technologies poses a challenge as well as providing an opportunity for accomplishment. BIM technologies provide a paradigm shift in the way buildings are designed, constructed and maintained. This paradigm shift calls for collectively rethinking the curriculum for educating building professionals.

Relevance: 60.00%

Abstract:

Four CO2 concentration inversions and the Global Fire Emissions Database (GFED) versions 2.1 and 3 are used to provide benchmarks for climate-driven modeling of the global land-atmosphere CO2 flux and the contribution of wildfire to this flux. The Land surface Processes and eXchanges (LPX) model is introduced. LPX is based on the Lund-Potsdam-Jena Spread and Intensity of FIRE (LPJ-SPITFIRE) model with amended fire-probability calculations. LPX omits human ignition sources yet simulates many aspects of global fire adequately. It captures the major features of the observed geographic pattern in burnt area and its seasonal timing, and the unimodal relationship of burnt area to precipitation. It simulates features of the geographic variation in the sign of the interannual correlations of burnt area with antecedent dryness and precipitation. It simulates well the interannual variability of the global total land-atmosphere CO2 flux. There are differences among the global burnt-area time series from GFED2.1, GFED3 and LPX, but some features are common to all. GFED3 fire CO2 fluxes account for only about one third of the variation in total CO2 flux during 1997–2005. This relationship appears to be dominated by the strong climatic dependence of deforestation fires. The relationship of LPX-modeled fire CO2 fluxes to total CO2 fluxes is weak. Observed and modeled total CO2 fluxes track the El Niño–Southern Oscillation (ENSO) closely; GFED3 burnt area and global fire CO2 flux track ENSO much less closely. The GFED3 fire CO2 flux-ENSO connection is most prominent for the El Niño of 1997–1998, which produced exceptional burning conditions in several regions, especially equatorial Asia. The sign of the observed relationship between ENSO and fire varies regionally, and LPX captures the broad features of this variation. These complexities underscore the need for process-based modeling to assess the consequences of global change for fire and its implications for the carbon cycle.
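
The kind of interannual correlation analysis described can be sketched in a few lines of Python; the series below are synthetic stand-ins for GFED burnt area and an ENSO index, purely for illustration.

# Correlate an annual burnt-area series with an ENSO index.
# Both series here are synthetic; real analyses use GFED and observed ENSO.
import numpy as np

rng = np.random.default_rng(0)
enso_index = rng.normal(0.0, 1.0, 9)                # e.g. annual index, 1997-2005
burnt_area = 2.0 + 0.6 * enso_index + rng.normal(0.0, 0.4, 9)

r = np.corrcoef(enso_index, burnt_area)[0, 1]
print(f"ENSO vs burnt-area correlation: r = {r:.2f}")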

Relevance: 60.00%

Abstract:

Over the last decade, issues related to the financial viability of development have become increasingly important to the English planning system. As part of a wider shift towards the compartmentalisation of planning tasks, expert consultants are required to quantify, in an attempt to rationalise, planning decisions in terms of economic ‘viability’. Often with a particular focus on planning obligations, the results of development viability modelling have emerged as a key part of the evidence base used in site-specific negotiations and in planning policy formation. Focussing on the role of clients and other stakeholders, this paper investigates how development viability is tested in practice. It draws together literature on the role of calculative practices in policy formation, on client feedback and influence in real estate appraisals, and on stakeholder engagement and consultation in planning to critically evaluate the role of clients and other interest groups in influencing the production and use of development viability appraisal models. The paper draws upon semi-structured interviews with the main producers of development viability appraisals to conclude that, whilst appraisals have the potential to be biased by client and stakeholder interests, there are important controls on potential opportunistic behaviour. One such control is local authorities' weak understanding of development viability appraisal techniques, which limits their capacity to question the outputs of appraisal models. However, this is also of concern given that viability is now a central feature of the town planning system.
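
Although the paper does not prescribe a model, development viability appraisals are typically residual valuations; the Python sketch below shows the basic arithmetic with purely illustrative figures.

# Minimal residual appraisal: residual land value = gross development
# value minus costs, fees, obligations and developer profit.
# All figures are illustrative, not drawn from the study.

gdv = 10_000_000.0                 # gross development value
build_costs = 5_500_000.0
fees = 0.10 * build_costs          # professional fees
planning_obligations = 750_000.0   # e.g. affordable housing contribution
profit = 0.20 * gdv                # target developer return

residual_land_value = gdv - build_costs - fees - planning_obligations - profit
print(f"Residual land value: {residual_land_value:,.0f}")
# A scheme is typically deemed 'viable' if this residual exceeds a
# benchmark land value; the appraisal inputs are what stakeholders contest.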

Relevance: 60.00%

Abstract:

Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters depend on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed, landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the River Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
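
A Monte Carlo calibration loop of the kind described can be sketched as follows; the one-parameter linear-store "model" stands in for PERSiST, and Nash-Sutcliffe efficiency is used as one common score (the paper's exact objective function is not specified here).

# Monte Carlo calibration sketch: sample parameters at random, run the
# model, score simulated against observed flows. The toy model below is
# a stand-in for PERSiST, not the real toolkit.
import numpy as np

def toy_runoff_model(precip, k):
    """Linear store: each day, a fraction k of storage becomes runoff."""
    store, out = 0.0, []
    for p in precip:
        store += p
        q = k * store
        store -= q
        out.append(q)
    return np.array(out)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(42)
precip = rng.gamma(2.0, 3.0, 365)                      # synthetic daily rainfall
observed = toy_runoff_model(precip, k=0.3) + rng.normal(0.0, 0.3, 365)

best = max((nse(observed, toy_runoff_model(precip, k)), k)
           for k in rng.uniform(0.05, 0.95, 1000))
print(f"best NSE = {best[0]:.3f} at k = {best[1]:.3f}")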

Relevance: 60.00%

Abstract:

Facility management (FM), from a service-oriented approach, addresses the functions and requirements of different services such as energy management, space planning and security. Different services require different information to meet the needs arising from each service. Object-based Building Information Modelling (BIM) offers limited support for FM services, even though this technology is able to generate 3D models that semantically represent a facility's information dynamically over the lifecycle of a building. This paper presents a semiotics-inspired framework to extend BIM from a service-oriented perspective. The extended BIM, which specifies FM services and the information they require, will be able to express building service information in the right format for the right purposes. The service-oriented approach concerns pragmatic aspects of building information beyond the semantic level; the pragmatics define and provide context for the utilisation of building information. The semiotics theory adopted in this paper addresses pragmatic issues in the utilisation of BIM for FM services.
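
The service-oriented idea, that each FM service declares the building information it requires, can be illustrated with a simple mapping; the service names and information items below are hypothetical examples, not the paper's specification.

# Illustrative service-to-information mapping of the kind the framework
# implies, so an extended BIM could be queried per FM service.
fm_service_requirements = {
    "energy management": ["meter readings", "HVAC zones", "equipment schedules"],
    "space planning": ["room areas", "occupancy", "department assignments"],
    "security service": ["access points", "camera locations", "door schedules"],
}

def information_for(service):
    """Return the information items a given FM service requires."""
    return fm_service_requirements.get(service, [])

print(information_for("space planning"))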

Relevance: 60.00%

Abstract:

This article reviews the experiences of a practising business consultancy division. It discusses the reasons for the failure of the traditional, expert consultancy approach and states the requirements for a more suitable consultancy methodology. An approach called ‘Modelling as Learning’ is introduced, its three defining aspects being client ownership of all analytical work performed, the consultant acting as facilitator, and sensitivity to soft issues within and surrounding a problem. The goal of such an approach is the acceleration of the client's learning about the business. The tools used within this methodological framework are discussed and some case studies of the methodology are presented. It is argued that a learning experience was necessary before arriving at the new methodology, but that it is now a valuable and significant component of the division's work.

Relevance: 60.00%

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work, a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
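
A minimal version of the two-part, benchmark-driven model described might look as follows in Python; the benchmark timings, interpolation scheme and halo-size formula are invented for illustration.

# Two-part performance model: predicted step time = interpolated compute
# time + interpolated halo-exchange time for a given local domain size.
# All benchmark numbers below are invented placeholders.
import numpy as np

# (local grid points, measured seconds) from hypothetical benchmark runs
compute_bench = {1_000_000: 0.80, 4_000_000: 3.30, 16_000_000: 13.50}
halo_bench = {2_000: 0.0010, 8_000: 0.0036, 32_000: 0.0150}  # halo points

def interp(bench, x):
    xs, ys = zip(*sorted(bench.items()))
    return float(np.interp(x, xs, ys))

def predict_step_time(nx_local, ny_local):
    points = nx_local * ny_local
    halo = 2 * (nx_local + ny_local)     # nearest-neighbour exchange size
    return interp(compute_bench, points) + interp(halo_bench, halo)

# compare two decompositions with equal work but different halo sizes
for nx, ny in [(1000, 1000), (4000, 250)]:
    print(f"{nx}x{ny} local domain: {predict_step_time(nx, ny):.4f} s/step")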