813 results for Constraint based modelling
Abstract:
Context: Variation in photosynthetic activity of trees induced by climatic stress can be effectively evaluated using remote sensing data. Although adverse effects of climate on temperate forests have been subjected to increased scrutiny, the suitability of remote sensing imagery for identification of drought stress in such forests has not been explored fully. Aim: To evaluate the sensitivity of a MODIS-based vegetation index to heat and drought stress in temperate forests, and to explore the differences in the stress response of oaks and beech. Methods: We identified 8 oak and 13 beech pure and mature stands, each covering between 4 and 13 MODIS pixels. For each pixel, we extracted a time series of MODIS NDVI from 2000 to 2010. We identified all sequences of continuous unseasonal NDVI decline to be used as the response variable indicative of environmental stress. Neural-network-based regression modelling was then applied to identify the climatic variables that best explain observed NDVI declines. Results: Tested variables explained 84–97% of the variation in NDVI, whilst air temperature-related climate extremes were found to be the most influential. Beech showed a linear response to the most influential climatic predictors, while oak responded in a unimodal pattern suggesting a better coping mechanism. Conclusions: MODIS NDVI has proved sufficiently sensitive as a stand-level indicator of climatic stress acting upon temperate broadleaf forests, leading to its potential use in predicting drought stress from meteorological observations and improving the parameterisation of forest stress indices.
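The decline-detection step described in the Methods can be sketched as a simple scan over each pixel's NDVI time series (a minimal sketch: the function name, the minimum run length and the example values are assumptions, and the deseasonalising step the abstract implies is omitted):

```python
import numpy as np

def decline_sequences(ndvi, min_len=2):
    """Return (start, end) index pairs of runs where NDVI declines
    continuously for at least `min_len` consecutive steps."""
    diffs = np.diff(ndvi)
    runs, start = [], None
    for i, d in enumerate(diffs):
        if d < 0:
            if start is None:
                start = i          # a decline run begins
        else:
            if start is not None and i - start >= min_len:
                runs.append((start, i))
            start = None
    if start is not None and len(diffs) - start >= min_len:
        runs.append((start, len(diffs)))
    return runs

# Toy series with two decline runs (indices 1-4 and 5-8)
series = [0.8, 0.82, 0.79, 0.75, 0.70, 0.72, 0.71, 0.69, 0.68]
print(decline_sequences(series))
```

The lengths (or depths) of such runs would then serve as the response variable fed to the regression model.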
Abstract:
The Maritime Continent archipelago, situated on the equator at 95-165E, has the strongest land-based precipitation on Earth. The latent heat release associated with the rainfall affects the atmospheric circulation throughout the tropics and into the extra-tropics. The greatest source of variability in precipitation is the diurnal cycle. The archipelago is within the convective region of the Madden-Julian Oscillation (MJO), which provides the greatest variability on intra-seasonal time scales: large-scale (∼10^7 km^2) active and suppressed convective envelopes propagate slowly (∼5 m s^-1) eastwards between the Indian and Pacific Oceans. High-resolution satellite data show that a strong diurnal cycle is triggered to the east of the advancing MJO envelope, leading the active MJO by one-eighth of an MJO cycle (∼6 days). Where the diurnal cycle is strong its modulation accounts for 81% of the variability in MJO precipitation. Over land this determines the structure of the diagnosed MJO. This is consistent with the equatorial wave dynamics in existing theories of MJO propagation. The MJO also affects the speed of gravity waves propagating offshore from the Maritime Continent islands. This is largely consistent with changes in static stability during the MJO cycle. The MJO and its interaction with the diurnal cycle are investigated in HiGEM, a high-resolution coupled model. Unlike many models, HiGEM represents the MJO well with eastward-propagating variability on intra-seasonal time scales at the correct zonal wavenumber, although the inter-tropical convergence zone's precipitation peaks strongly at the wrong time, interrupting the MJO's spatial structure. However, the modelled diurnal cycle is too weak and its phase is too early over land. The modulation of the diurnal amplitude by the MJO is also too weak and accounts for only 51% of the variability in MJO precipitation. 
Implications for forecasting and possible causes of the model errors are discussed, and further modelling studies are proposed.
Abstract:
We discuss a floating mechanism based on a quasi-magnetic levitation method that can be attached to the endpoint of a robot arm in order to construct a novel redundant robot arm for producing compliant motions. The floating mechanism is composed of magnets and a constraint mechanism such that the repelling force of the magnets keeps the endpoint part of the mechanism floating stably along the guided motions. Analytical and experimental results show that the proposed floating mechanism can produce stable floating motions with small inertia and viscosity. The results also show that the proposed mechanism can detect small forces applied to the endpoint part because the friction force of the mechanism is very small.
Abstract:
Understanding how and why the capability of one set of business resources, its structural arrangements and mechanisms, works compared to another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson’s theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the ‘how’ and ‘why’ of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the capabilities of resources to enable the capability to inject a drug and anaesthetise a patient.
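The Coloured-Petri-Net formalism the paper builds on can be illustrated with a toy net for the drug-injection capability mentioned in the final sentence (a minimal sketch with made-up place and colour names; real CPN models carry typed tokens, guards and timed transitions):

```python
from collections import Counter

# Places hold multisets of coloured tokens; a transition fires only when
# every required (place, colour) pair is available, consuming those tokens
# and producing a token in its output place.
places = {
    "syringe_ready": Counter({"syringe": 1}),
    "patient_ready": Counter({"patient": 1}),
    "injected": Counter(),
}

def fire(transition):
    inputs, output = transition
    # guard: all input colours must be present
    if all(places[p][c] >= 1 for p, c in inputs):
        for p, c in inputs:
            places[p][c] -= 1
        places[output]["injected_patient"] += 1
        return True
    return False

inject = ([("syringe_ready", "syringe"), ("patient_ready", "patient")],
          "injected")
print(fire(inject))   # the affordance is realised once, then blocked
```

Firing succeeds only while both resources hold the required colours, which is the CPN analogue of an affordance being available.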
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk of its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified, enabling remedial action based on qualified obsolescence information. The technique rests on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in two consulting studies carried out by Capgemini involving three UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and the related modelling.
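The heatmap idea — score each IT element by obsolescence risk and band it for display — can be sketched as follows (the scoring rules, thresholds and estate entries are illustrative assumptions; the actual method works over enterprise architecture models, not a flat dictionary):

```python
from datetime import date

def obsolescence_risk(end_of_support, criticality, today=date(2024, 1, 1)):
    """Combine time-to-end-of-support with business criticality (1-3)."""
    months_left = ((end_of_support.year - today.year) * 12
                   + (end_of_support.month - today.month))
    time_score = 3 if months_left <= 0 else 2 if months_left <= 12 else 1
    return time_score * criticality   # 1 (low) .. 9 (high)

def heatmap_band(score):
    return "red" if score >= 6 else "amber" if score >= 3 else "green"

# Hypothetical IT estate: name -> (end-of-support date, criticality)
estate = {
    "CaseMgmtDB v9": (date(2023, 6, 30), 3),
    "HR portal v2":  (date(2025, 3, 31), 2),
    "Intranet CMS":  (date(2026, 1, 31), 1),
}
for name, (eos, crit) in estate.items():
    print(name, heatmap_band(obsolescence_risk(eos, crit)))
```

Colouring elements of the architecture model by band then yields the heatmap that directs remedial action to the red items first.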
Abstract:
Facility management (FM), from a service-oriented approach, addresses the functions and requirements of different services such as energy management, space planning and security. Different services require different information to meet the needs arising from each service. Object-based Building Information Modelling (BIM) offers only limited support for FM services, even though this technology can generate 3D models that semantically represent a facility’s information dynamically over the lifecycle of a building. This paper presents a semiotics-inspired framework to extend BIM from a service-oriented perspective. The extended BIM, which specifies FM services and the information they require, will be able to express building service information in the right format for the right purposes. The service-oriented approach concerns the pragmatic aspect of building information, beyond the semantic level. Pragmatics defines and provides the context for the utilisation of building information. Semiotic theory is adopted in this paper to address the pragmatic issues of utilising BIM for FM services.
Abstract:
We consider the forecasting of macroeconomic variables that are subject to revisions, using Bayesian vintage-based vector autoregressions. The prior incorporates the belief that, after the first few data releases, subsequent ones are likely to consist of revisions that are largely unpredictable. The Bayesian approach allows the joint modelling of the data revisions of more than one variable, while keeping the concomitant increase in parameter estimation uncertainty manageable. Our model provides markedly more accurate forecasts of post-revision values of inflation than do other models in the literature.
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is had in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. The experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow model as an example.
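The core of such prediction — fit a cost model to measured execution times, then extrapolate — can be sketched in a few lines (grid sizes and timings below are invented for illustration, and a cost linear in grid points is an assumption):

```python
import numpy as np

# Hypothetical benchmark data: wall-clock time per step (s) for the
# shallow water kernel at several grid sizes on one node.
grid_points = np.array([64**2, 128**2, 256**2, 512**2], dtype=float)
measured_s  = np.array([0.004, 0.016, 0.065, 0.26])

# Least-squares fit of t = a + b*n (b = cost per grid point).
b, a = np.polyfit(grid_points, measured_s, 1)

def predict(n):
    """Predicted time per step for a grid with n points."""
    return a + b * n

print(f"predicted 1024^2 step: {predict(1024**2):.3f} s")
```

A fit like this relates problem size to achieved performance; the abstract's point is that on newer processors a single such model breaks down, motivating the benchmarking approach instead.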
Abstract:
Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving distributions of individuals in the field, but it may not adequately predict spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons’ excellent memory, ability to fly long distances, and distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons that use one of six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without flocking mechanisms. We used pattern-oriented modelling to determine which of the foraging strategies is best able to reproduce observed data patterns. Data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distributions of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce these three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but was not able to guarantee population persistence.
We conclude that with the memory-based foraging strategy including a flocking mechanism our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for risk assessment of pesticides, for example predicting exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
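The winning memory-plus-flocking decision rule can be caricatured in a few lines (an assumed, much-simplified form; the published IBM tracks vegetation types, travel costs and daily schedules):

```python
import random

random.seed(1)
FIELDS = list(range(20))

class Pigeon:
    def __init__(self):
        self.memory = None  # last known profitable field

    def choose(self, occupancy):
        # 1) revisit a remembered profitable field,
        # 2) else join the most occupied field (flocking),
        # 3) else pick a field at random.
        if self.memory is not None:
            return self.memory
        if any(occupancy):
            return max(FIELDS, key=lambda f: occupancy[f])
        return random.choice(FIELDS)

flock = [Pigeon() for _ in range(30)]
occupancy = [0] * len(FIELDS)
for p in flock:
    f = p.choose(occupancy)
    occupancy[f] += 1
    if random.random() < 0.5:   # the field turned out profitable
        p.memory = f            # remember it for the next day
print(sum(occupancy))           # all 30 birds placed
```

Run over many days, a rule of this shape produces the field revisits and flock aggregations that pattern-oriented modelling tests against the observed data.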
Abstract:
Earthworms are important organisms in soil communities and so are used as model organisms in environmental risk assessments of chemicals. However, current risk assessments of soil invertebrates are based on short-term laboratory studies, of limited ecological relevance, supplemented if necessary by site-specific field trials, which sometimes are challenging to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm Eisenia fetida in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform we developed methods to identify which of the physiological parameters in the energy budget model (rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation.
Although we only incorporate toxic effects at the individual level we show how ABMs can readily extrapolate to larger scales by providing good model fits to field population data. The ability of the presented model to fit the available field and laboratory data for E. fetida demonstrates the promise of the agent-based approach in ecology, by showing how biological knowledge can be used to make ecological inferences. Further work is required to extend the approach to populations of more ecologically relevant species studied at the field scale. Such a model could help extrapolate from laboratory to field conditions and from one set of field conditions to another or from species to species.
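The energy-allocation algorithm described above — assimilated energy pays maintenance first, with any surplus driving growth and, after maturity, reproduction — can be sketched as a daily update (parameter values and the exact allocation rule are illustrative assumptions, not those of the published model):

```python
def daily_step(mass_mg, food, temp_factor,
               ingestion_rate=0.1, maintenance_rate=0.02,
               kappa=0.7, maturity_mg=250.0):
    """One day for one worm: returns (new mass, reproductive output)."""
    # intake scales with surface area (mass^(2/3)), maintenance with mass
    intake = ingestion_rate * mass_mg ** (2 / 3) * food * temp_factor
    maintenance = maintenance_rate * mass_mg * temp_factor
    surplus = intake - maintenance
    if surplus <= 0:
        return mass_mg + surplus, 0.0   # starvation: shrink to pay maintenance
    if mass_mg < maturity_mg:
        return mass_mg + surplus, 0.0   # juveniles grow only
    return mass_mg + kappa * surplus, (1 - kappa) * surplus

mass, eggs = 100.0, 0.0
for day in range(30):
    mass, repro = daily_step(mass, food=1.0, temp_factor=1.0)
    eggs += repro
print(round(mass, 1))
```

Toxicity hypotheses of the kind the authors test then correspond to multiplying one of these rates (ingestion, maintenance, growth or reproduction) by a dose-dependent stress factor.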
Abstract:
The potential risk of agricultural pesticides to mammals typically depends on internal concentrations within individuals, and these are determined by the amount ingested and by absorption, distribution, metabolism, and excretion (ADME). Pesticide residues ingested depend, amongst other things, on individual spatial choices which determine how much and when feeding sites and areas of pesticide application overlap, and can be calculated using individual-based models (IBMs). Internal concentrations can be calculated using toxicokinetic (TK) models, which are quantitative representations of ADME processes. Here we provide a population model for the wood mouse (Apodemus sylvaticus) in which TK submodels were incorporated into an IBM representation of individuals making choices about where to feed. This allows us to estimate the contribution of individual spatial choice and TK processes to risk. We compared the risk predicted by four IBMs: (i) “AllExposed-NonTK”: assuming no spatial choice so all mice have 100% exposure, no TK, (ii) “AllExposed-TK”: identical to (i) except that the TK processes are included where individuals vary because they have different temporal patterns of ingestion in the IBM, (iii) “Spatial-NonTK”: individual spatial choice, no TK, and (iv) “Spatial-TK”: individual spatial choice and with TK. The TK parameters for hypothetical pesticides used in this study were selected such that a conventional risk assessment would fail. Exposures were standardised using risk quotients (RQ; exposure divided by LD50 or LC50). We found that, for the exposed sub-population, including either spatial choice or TK reduced the RQ by 37–85%, and for the total population the reduction was 37–94%. However, spatial choice and TK together had little further effect in reducing the RQ.
The reasons for this are that when the proportion of time spent in treated crop (PT) approaches 1, TK processes dominate and spatial choice has very little effect, and conversely if PT is small spatial choice dominates and TK makes little contribution to exposure reduction. The latter situation means that a short time spent in the pesticide-treated field mimics exposure from a small gavage dose, but TK only makes a substantial difference when the dose was consumed over a longer period. We concluded that a combined TK-IBM is most likely to bring added value to the risk assessment process when the temporal pattern of feeding, time spent in exposed area and TK parameters are at an intermediate level; for instance wood mice in foliar spray scenarios spending more time in crop fields because of better plant cover.
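The interplay the authors describe between dose timing and toxicokinetics can be reproduced with a one-compartment TK sketch and the risk-quotient definition RQ = exposure / LD50 (first-order elimination and all parameter values are assumptions for illustration):

```python
import math

def internal_conc(doses, k_elim=0.5):
    """Body burden over time for a sequence of per-step ingested doses,
    with first-order elimination between steps."""
    burden, series = 0.0, []
    for d in doses:
        burden = burden * math.exp(-k_elim) + d
        series.append(burden)
    return series

LD50 = 10.0
bolus   = internal_conc([6.0, 0, 0, 0, 0, 0])   # gavage-like single dose
trickle = internal_conc([1.0] * 6)              # same total, spread out

print(max(bolus) / LD50, max(trickle) / LD50)   # peak-burden risk quotients
```

The same total dose gives a much lower peak burden when spread over time, which is the mechanism behind the abstract's conclusion that TK only adds value when consumption extends over a longer period.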
Abstract:
The aim of this paper is to develop a comprehensive taxonomy of green supply chain management (GSCM) practices and develop a structural equation modelling-driven decision support system following GSCM taxonomy for managers to provide better understanding of the complex relationship between the external and internal factors and GSCM operational practices. Typology and/or taxonomy play a key role in the development of social science theories. The current taxonomies focus on a single or limited component of the supply chain. Furthermore, they have not been tested using different sample compositions and contexts, yet replication is a prerequisite for developing robust concepts and theories. In this paper, we empirically replicate one such taxonomy extending the original study by (a) developing broad (containing the key components of supply chain) taxonomy; (b) broadening the sample by including a wider range of sectors and organisational size; and (c) broadening the geographic scope of the previous studies. Moreover, we include both objective measures and subjective attitudinal measurements. We use a robust two-stage cluster analysis to develop our GSCM taxonomy. The main finding validates the taxonomy previously proposed and identifies size, attitude and level of environmental risk and impact as key mediators between internal drivers, external drivers and GSCM operational practices.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity further increases with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend for shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo-exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, with interpolation between results as necessary.
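The benchmark-then-interpolate step at the end of the abstract can be sketched as a lookup table per deployment scenario with linear interpolation between measured sizes (scenario names and timings are invented for illustration):

```python
import numpy as np

# Hypothetical benchmark table: scenario -> (grid sizes measured,
# time per step in seconds at each size).
bench = {
    "fully-populated": ([128, 256, 512, 1024], [0.02, 0.07, 0.30, 1.30]),
    "under-populated": ([128, 256, 512, 1024], [0.015, 0.05, 0.21, 0.90]),
}

def predict(scenario, size):
    """Interpolate the benchmarked timings to an unmeasured size."""
    sizes, times = bench[scenario]
    return float(np.interp(size, sizes, times))

for s in bench:
    print(s, predict(s, 768))
```

Comparing the interpolated predictions across scenarios is what lets a user choose a deployment (population, decomposition, affinity) without trial-and-error runs at every size.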
Abstract:
The classic vertical advection-diffusion (VAD) balance is a central concept in studying the ocean heat budget, in particular in simple climate models (SCMs). Here we present a new framework to calibrate the parameters of the VAD equation to the vertical ocean heat balance of two fully-coupled climate models that is traceable to the models’ circulation as well as to vertical mixing and diffusion processes. Based on temperature diagnostics, we derive an effective vertical velocity w∗ and turbulent diffusivity k∗ for each individual physical process. In steady-state, we find that the residual vertical velocity and diffusivity change sign in mid-depth, highlighting the different regional contributions of isopycnal and diapycnal diffusion in balancing the models’ residual advection and vertical mixing. We quantify the impacts of the time-evolution of the effective quantities under a transient 1%CO2 simulation and make the link to the parameters of currently employed SCMs.
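The VAD balance referred to here is commonly written, in the effective-parameter form the abstract describes, as (standard notation; the paper's exact formulation may differ):

```latex
w^{*}\,\frac{\partial \overline{T}}{\partial z}
  \;=\;
  \frac{\partial}{\partial z}\!\left(k^{*}\,\frac{\partial \overline{T}}{\partial z}\right)
```

so that in steady state effective vertical advection of the horizontally averaged temperature \(\overline{T}\) balances effective turbulent diffusion; the mid-depth sign change of \(w^{*}\) and \(k^{*}\) noted in the abstract marks where each process switches sides of this balance.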
Abstract:
An evidence-based review of the potential impact that the introduction of genetically-modified (GM) cereal and oilseed crops could have for the UK was carried out. The inter-disciplinary research project addressed the key research questions using scenarios for the uptake, or not, of GM technologies. This was followed by an extensive literature review, stakeholder consultation and financial modelling. The world area of canola, oilseed rape (OSR) low in both erucic acid in the oil and glucosinolates in the meal, was 34M ha in 2012, of which 27% was GM; Canada is the lead producer but it is also grown in the USA, Australia and Chile. Farm level effects of adopting GM OSR include: lower production costs; higher yields and profits; and ease of farm management. Growing GM OSR instead of conventional OSR reduces both herbicide usage and environmental impact. Some 170M ha of maize was grown in the world in 2011, of which 28% was GM; the main producers are the USA, China and Brazil. Spain is the main EU producer of GM maize although it is also grown widely in Portugal. Insect resistant (IR) and herbicide tolerant (HT) are the GM maize traits currently available commercially. Farm level benefits of adopting GM maize are lower costs of production through reduced use of pesticides and higher profits. GM maize adoption results in less pesticide usage than on conventional counterpart crops, leading to fewer residues in food and animal feed and allowing increased diversity of bees and other pollinators. In the EU, well-tried coexistence measures for growing GM crops in the proximity of conventional crops have avoided gene flow issues. Scientific evidence so far seems to indicate that there has been no environmental damage from growing GM crops. They may possibly even be beneficial to the environment, as they result in fewer pesticides and herbicides being applied and improved carbon sequestration from less tillage.
A review of work on GM cereals relevant for the UK found input trait work on: herbicide and pathogen tolerance; abiotic stress such as from drought or salinity; and yield traits under different field conditions. For output traits, work has mainly focussed on modifying the nutritional components of cereals and in connection with various enzymes, diagnostics and vaccines. Scrutiny of applications submitted for field trial testing of GM cereals found around 9000 applications in the USA, 15 in Australia and 10 in the EU since 1996. There have also been many patent applications and granted patents for GM cereals in the USA for both input and output traits; an indication of the scale of such work is the fact that in a 6-week period in the spring of 2013, 12 patents were granted relating to GM cereals. A dynamic financial model has enabled us to better understand and examine the likely performance of Bt maize and HT OSR for the south of the UK, if cultivation is permitted in the future. It was found that for continuous growing of Bt maize and HT OSR, unless there was pest pressure for the former and weed pressure for the latter, the seed premia and likely coexistence costs for a buffer zone between other crops would reduce the financial returns for the GM crops compared with their conventional counterparts. When modelling HT OSR in a four-crop rotation, it was found that gross margins increased significantly at the higher levels of such pest or weed pressure, particularly for farm businesses with larger fields where coexistence costs would be scaled down. The impact of the supply of UK-produced GM crops on the wider supply chain was examined through an extensive literature review and widespread stakeholder consultation with the feed supply chain.
The animal feed sector would benefit from cheaper supplies of raw materials if GM crops were grown and, in the future, they might also benefit from crops with enhanced nutritional profile (such as having higher protein levels) becoming available. This would also be beneficial to livestock producers enabling lower production costs and higher margins. Whilst coexistence measures would result in increased costs, it is unlikely that these would cause substantial changes in the feed chain structure. Retailers were not concerned about a future increase in the amount of animal feed coming from GM crops. To conclude, we (the project team) feel that the adoption of currently available and appropriate GM crops in the UK in the years ahead would benefit farmers, consumers and the feed chain without causing environmental damage. Furthermore, unless British farmers are allowed to grow GM crops in the future, the competitiveness of farming in the UK is likely to decline relative to that globally.