928 results for DISTRIBUTION MODELS


Relevance: 60.00%

Abstract:

There is still much discussion on the most appropriate location, size and shape of marine protected areas (MPAs). These three factors were analyzed for a small coastal MPA, the Luiz Saldanha Marine Park (LSMP), for which very limited local ecological information was available when it was implemented in 1998. Marxan was used to generate a number of near-optimal solutions under different levels of protection for the various conservation features and different costs, and these solutions were compared with the existing no-take area of the LSMP. Information on 11 habitat types and distribution models for 3 of the species most important to the local artisanal fisheries was considered. The human activities with the highest economic and ecological impact in the study area (commercial and recreational fishing and scuba diving) were used as costs. The results show that the existing no-take area is located in the best area; however, it offers limited protection to vagile fish and covers a very small proportion of some of the available habitats. Increasing the conservation targets increased the number of no-take areas. The comparative framework used in this study can be applied elsewhere, providing relevant information to local stakeholders and managers so that adaptive management can proceed.
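
The Marxan run described above is not reproducible from the abstract alone, but the general shape of the problem it solves can be sketched: minimise the cost of the selected planning units (here, fishing and diving pressure) plus a penalty for every conservation feature, habitat or modelled species distribution, whose target is not met. The following Python sketch uses illustrative names and data, and a crude random search in place of Marxan's simulated annealing.

    import numpy as np

    def reserve_objective(selected, cost, rep, targets, spf=10.0):
        """Marxan-style score for one candidate reserve (illustrative only).

        selected : boolean array, one entry per planning unit
        cost     : cost of including each planning unit (e.g. fishing effort)
        rep      : matrix [units x features] of how much of each conservation
                   feature (habitat, modelled species distribution) each unit holds
        targets  : amount of each feature that must be represented
        spf      : penalty weight applied to any unmet target
        """
        held = rep[selected].sum(axis=0)              # feature amounts captured
        shortfall = np.clip(targets - held, 0, None)  # unmet portion of each target
        return cost[selected].sum() + spf * shortfall.sum()

    # toy example: 50 planning units, 5 conservation features
    rng = np.random.default_rng(0)
    cost = rng.uniform(1, 5, 50)
    rep = rng.uniform(0, 1, (50, 5))
    targets = 0.3 * rep.sum(axis=0)                   # protect 30% of each feature

    # crude random search standing in for Marxan's simulated annealing
    best = rng.random(50) < 0.3
    best_score = reserve_objective(best, cost, rep, targets)
    for _ in range(5000):
        cand = best.copy()
        cand[rng.integers(50)] ^= True                # flip one planning unit in or out
        score = reserve_objective(cand, cost, rep, targets)
        if score <= best_score:
            best, best_score = cand, score
    print("units selected:", int(best.sum()), " score:", round(float(best_score), 2))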

Relevance: 60.00%

Abstract:

This study aimed to estimate the probability of climatological water deficit in an experimental watershed in the Cerrado biome, located on the central plateau of Brazil, using a 31-year time series (1982–2012). The probable climatological water deficit was calculated, on a ten-day (dekad) scale, as the difference between rainfall and the probable reference evapotranspiration. Reference evapotranspiration (ET0) was estimated by the standard FAO-56 Penman-Monteith method. The gamma distribution was fitted to the rainfall and reference evapotranspiration time series to estimate the water deficit. The adherence of the estimated probabilities to the observed data was verified with the nonparametric Kolmogorov-Smirnov test at a significance level of α = 0.05, which indicated a good fit of the distribution models. A climatological water deficit of greater or lesser intensity was observed between dekads 2 and 32 of the year.
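
A minimal sketch of this kind of calculation, assuming dekad rainfall and FAO-56 ET0 series are already in hand; scipy fits the gamma distribution and returns the rainfall that can be relied upon at a chosen probability level (75% here, purely as an illustrative choice), from which a probable deficit against ET0 follows. The mixed treatment of zero-rainfall dekads and the use of the mean ET0 are simplifying assumptions of this sketch, not details taken from the study.

    import numpy as np
    from scipy import stats

    def probable_water_deficit(rain_dekad, et0_dekad, prob=0.75):
        """Probable climatological water deficit (mm) for one dekad of the year.

        rain_dekad : rainfall totals for this dekad across the multi-year series
        et0_dekad  : FAO-56 Penman-Monteith ET0 totals for the same dekad
        prob       : probability level of the 'dependable' rainfall
        """
        rain = np.asarray(rain_dekad, dtype=float)
        wet = rain[rain > 0]                       # gamma is defined for positive values only
        p_wet = wet.size / rain.size
        a, loc, scale = stats.gamma.fit(wet, floc=0)
        if prob >= p_wet:                          # it rains too rarely: dependable rain is zero
            dependable_rain = 0.0
        else:                                      # solve P(R > r) = p_wet * (1 - F_gamma(r)) = prob
            dependable_rain = stats.gamma.ppf(1 - prob / p_wet, a, loc=loc, scale=scale)
        return float(np.mean(et0_dekad) - dependable_rain)   # positive value = deficit

    # toy 31-year series for one dekad: rain in most years, ET0 around 45 mm
    rng = np.random.default_rng(0)
    rain = np.where(rng.random(31) < 0.9, rng.gamma(2.0, 12.0, 31), 0.0)
    et0 = rng.normal(45.0, 4.0, 31)
    print(round(probable_water_deficit(rain, et0), 1), "mm")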

Relevance: 40.00%

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that would otherwise be expensive or impractical to investigate. Its recent gain in popularity can be attributed in part to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail are used, and computer-intensive, as it requires many interactions between agents, which can learn and pursue goals. With the growing availability of data and the increase in computing power, these concerns are fading. Nonetheless, being able to update or extend a model as more information becomes available can be problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind.

What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model. Two well-known modularity principles in the software-engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This diverges from the traditional approach, in which both aspects are often conflated, and has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on the simulation to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels) or to describe the assets and their relations to one another (e.g. the network assets).

Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, either sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets. Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport, which is part of future work with the addition of electric vehicles.
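
MODAM itself is built in Java on OSGi and Eclipse plugins; the separation it makes between an asset's physical description and the agent behaviour wrapped around it can nonetheless be illustrated with a short, hypothetical sketch (Python here, with invented class names; none of this is MODAM's actual API).

    from dataclasses import dataclass

    @dataclass
    class BatteryAsset:
        """Physical description only: identical for every battery of this model."""
        capacity_kwh: float
        depth_of_discharge: float
        lifetime_cycles: int

    class PeakShavingAgent:
        """One behaviour: discharge when local demand exceeds a threshold."""
        def __init__(self, asset, peak_kw):
            self.asset, self.peak_kw = asset, peak_kw
            self.stored_kwh = asset.capacity_kwh

        def step(self, demand_kw, hours=1.0):
            floor = self.asset.capacity_kwh * (1 - self.asset.depth_of_discharge)
            if demand_kw > self.peak_kw and self.stored_kwh > floor:
                delivered = min(demand_kw - self.peak_kw, (self.stored_kwh - floor) / hours)
                self.stored_kwh -= delivered * hours
                return delivered                  # kW shaved off the grid peak
            return 0.0

    class SelfConsumptionAgent:
        """A different behaviour around the same asset description: store solar surplus."""
        def __init__(self, asset):
            self.asset, self.stored_kwh = asset, 0.0

        def step(self, surplus_kw, hours=1.0):
            stored = min(surplus_kw * hours, self.asset.capacity_kwh - self.stored_kwh)
            self.stored_kwh += stored
            return stored                         # kWh absorbed from the panels

    # the same physical battery model composed with two different behaviours
    battery = BatteryAsset(capacity_kwh=13.5, depth_of_discharge=0.9, lifetime_cycles=5000)
    agents = [PeakShavingAgent(battery, peak_kw=4.0), SelfConsumptionAgent(battery)]
    print(agents[0].step(demand_kw=6.0), agents[1].step(surplus_kw=2.0))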

Relevance: 40.00%

Abstract:

This thesis presents a novel approach to building large-scale agent-based models of networked physical systems, using composition to provide extensibility and flexibility when building the models and simulations. A software framework (MODAM - MODular Agent-based Model) was implemented for this purpose and validated through simulations. These simulations allow the impact of technological change on the electricity distribution network to be assessed by examining the trajectories of electricity consumption at key locations over many years.

Relevance: 40.00%

Abstract:

Deterministic models have been widely used to predict water quality in distribution systems, but their calibration requires extensive and accurate data sets for numerous parameters. In this study, alternative data-driven modeling approaches based on artificial neural networks (ANNs) were used to predict temporal variations of two important characteristics of water quality: chlorine residual and biomass concentrations. The authors considered three types of ANN algorithms. Of these, the Levenberg-Marquardt algorithm provided the best results in predicting residual chlorine and biomass with error-free and "noisy" data. The ANN models developed here can generate water quality scenarios of piped systems in real time to help utilities determine weak points of low chlorine residual and high biomass concentration and select optimum remedial strategies.
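
Levenberg-Marquardt training is a nonlinear least-squares method, so a small single-hidden-layer network can be fitted with scipy's 'lm' solver; the network size, input features, and synthetic data below are illustrative placeholders, not the configuration or data used in the study.

    import numpy as np
    from scipy.optimize import least_squares

    def net(params, X, hidden=6):
        """One-hidden-layer tanh network with a linear output unit."""
        n_in = X.shape[1]
        W1 = params[:n_in * hidden].reshape(n_in, hidden)
        b1 = params[n_in * hidden:n_in * hidden + hidden]
        W2 = params[n_in * hidden + hidden:n_in * hidden + 2 * hidden]
        b2 = params[-1]
        return np.tanh(X @ W1 + b1) @ W2 + b2

    # synthetic example: predict chlorine residual (mg/L) from water age (h),
    # temperature (deg C) and pH -- purely illustrative inputs
    rng = np.random.default_rng(1)
    X = rng.uniform([0, 10, 6.5], [48, 30, 8.5], size=(200, 3))
    y = 1.2 * np.exp(-0.05 * X[:, 0]) + 0.02 * rng.normal(size=200)

    Xs = (X - X.mean(axis=0)) / X.std(axis=0)          # standardise the inputs
    hidden = 6
    n_params = X.shape[1] * hidden + hidden + hidden + 1
    p0 = 0.1 * rng.normal(size=n_params)

    # Levenberg-Marquardt minimises the sum of squared prediction residuals
    fit = least_squares(lambda p: net(p, Xs, hidden) - y, p0, method="lm")
    rmse = np.sqrt(np.mean((net(fit.x, Xs, hidden) - y) ** 2))
    print("training RMSE (mg/L):", round(float(rmse), 4))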

Relevance: 40.00%

Abstract:

Recently, probability models on rankings have been proposed in the field of estimation of distribution algorithms (EDAs) in order to solve permutation-based combinatorial optimisation problems. In particular, distance-based ranking models, such as the Mallows and Generalized Mallows models under the Kendall's-τ distance, have demonstrated their validity for solving this type of problem. Nevertheless, there are still many aspects that deserve further study. In this paper, we extend the use of distance-based ranking models in the framework of EDAs by introducing new distance metrics, namely Cayley and Ulam. In order to analyse the performance of the Mallows and Generalized Mallows EDAs under the Kendall, Cayley and Ulam distances, we ran them on a benchmark of 120 instances from four well-known permutation problems. The experiments showed that no single metric performs best on all the problems. However, the statistical test pointed out that the Mallows-Ulam EDA is the most stable algorithm among the studied proposals.
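
The three metrics compared above are straightforward to compute for a pair of permutations; the helpers below are a generic sketch of the distances themselves (the Mallows sampling and parameter-estimation machinery of the EDAs is not reproduced).

    from bisect import bisect_left

    def _compose(sigma, pi):
        """sigma composed with pi^-1: the permutation left once pi is 'undone'."""
        inv = [0] * len(pi)
        for i, v in enumerate(pi):
            inv[v] = i
        return [sigma[inv[j]] for j in range(len(pi))]

    def kendall(sigma, pi):
        """Number of pairwise disagreements (minimum adjacent transpositions)."""
        p = _compose(sigma, pi)
        return sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])

    def cayley(sigma, pi):
        """Minimum number of arbitrary transpositions: n minus the number of cycles."""
        p = _compose(sigma, pi)
        seen, cycles = [False] * len(p), 0
        for i in range(len(p)):
            if not seen[i]:
                cycles += 1
                j = i
                while not seen[j]:
                    seen[j] = True
                    j = p[j]
        return len(p) - cycles

    def ulam(sigma, pi):
        """Minimum insert-and-shift moves: n minus the longest increasing subsequence."""
        p, tails = _compose(sigma, pi), []
        for v in p:
            k = bisect_left(tails, v)
            if k == len(tails):
                tails.append(v)
            else:
                tails[k] = v
        return len(p) - len(tails)

    sigma, pi = [2, 0, 3, 1, 4], [0, 1, 2, 3, 4]   # pi is the identity here
    print(kendall(sigma, pi), cayley(sigma, pi), ulam(sigma, pi))   # -> 3 3 2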

Relevance: 40.00%

Abstract:

The features of two popular models used to describe the observed response characteristics of typical optical oxygen sensors based on luminescence quenching are examined critically. The models are the 'two-site' and 'Gaussian distribution in natural lifetime, τ₀' models. These models are used to characterise the response features of typical optical oxygen sensors, features which include downward-curving Stern-Volmer plots and increasingly non-first-order luminescence decay kinetics with increasing partial pressure of oxygen, pO₂. Neither model appears able to unite these features, let alone the disparate array of response features exhibited by the myriad optical oxygen sensors reported in the literature, and still maintain any level of physical plausibility. A model based on a Gaussian distribution in the quenching rate constant, k_q, is developed and, although flawed by a limited breadth in distribution, ρ, does produce Stern-Volmer plots which would cover the range in curvature seen with real optical oxygen sensors. A new 'log-Gaussian distribution in τ₀ or k_q' model is introduced which has the advantage over a Gaussian distribution model of placing no limitation on the value of ρ. Work on the 'log-Gaussian distribution in τ₀' model reveals that the Stern-Volmer quenching plots would show little curvature, even at large ρ values, and that the luminescence decays would become increasingly first order with increasing pO₂. In fact, with real optical oxygen sensors the opposite is observed, and thus the model appears of little value. In contrast, a 'log-Gaussian distribution in k_q' model does produce the trends observed with real optical oxygen sensors, although it is technically restricted to sensors in which the kinetics of luminescence decay are good first order in the absence of oxygen. The latter model gives a good fit to the major response features of sensors which show this feature, most notably the [Ru(dpp)₃²⁺(Ph₄B⁻)₂] in cellulose optical oxygen sensors. The scope for further expansion of the log-Gaussian model, and therefore for its application to optical oxygen sensors, by combining a log-Gaussian distribution in k_q with one in τ₀, is briefly discussed.
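
The curvature such distributed-site models produce can be illustrated numerically. For a normalised distribution f(k_q) of quenching rate constants, the steady-state intensity ratio is I₀/I = [ ∫ f(k_q) / (1 + k_q·τ₀·pO₂) dk_q ]⁻¹, which reduces to the linear Stern-Volmer law when f collapses to a single value. The sketch below assumes a log-Gaussian (log-normal) f(k_q) and uses illustrative parameter values, not those fitted in the paper.

    import numpy as np

    def stern_volmer_ratio(pO2, tau0, kq_median, rho, n=4001):
        """I0/I for a log-Gaussian distribution of quenching rate constants k_q.

        pO2       : oxygen partial pressure (atm)
        tau0      : unquenched excited-state lifetime (s)
        kq_median : median quenching rate constant (atm^-1 s^-1)
        rho       : width of the log-Gaussian; rho -> 0 recovers a single-site sensor
        """
        x = np.linspace(-6 * rho, 6 * rho, n)          # integrate over ln(k_q / kq_median)
        weight = np.exp(-x ** 2 / (2 * rho ** 2))
        weight /= weight.sum()                         # normalised distribution f(k_q)
        kq = kq_median * np.exp(x)
        i_over_i0 = np.sum(weight / (1.0 + kq * tau0 * pO2))
        return 1.0 / i_over_i0

    # a narrow distribution gives a straight Stern-Volmer plot, a broad one curves downward
    for pO2 in (0.0, 0.05, 0.1, 0.2):
        narrow = stern_volmer_ratio(pO2, tau0=5e-6, kq_median=1e6, rho=0.01)
        broad = stern_volmer_ratio(pO2, tau0=5e-6, kq_median=1e6, rho=1.5)
        print(f"pO2={pO2:4.2f}  narrow I0/I={narrow:5.2f}  broad I0/I={broad:5.2f}")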

Relevance: 40.00%

Abstract:

A conceptual model is described for generating distributions of grazing animals, according to their searching behavior, to investigate the mechanisms animals may use to achieve their distributions. The model simulates behaviors ranging from random diffusion, through taxis and cognitively aided navigation (i.e., using memory), to the optimization extreme of the Ideal Free Distribution. These behaviors are generated from simulation of biased diffusion that operates at multiple scales simultaneously, formalizing ideas of multiple-scale foraging behavior. It uses probabilistic bias to represent decisions, allowing multiple search goals to be combined (e.g., foraging and social goals) and the representation of suboptimal behavior. By allowing bias to arise at multiple scales within the environment, each weighted relative to the others, the model can represent different scales of simultaneous decision-making and scale-dependent behavior. The model also allows different constraints to be applied to the animal's ability (e.g., applying food-patch accessibility and information limits). Simulations show that foraging-decision randomness and spatial scale of decision bias have potentially profound effects on both animal intake rate and the distribution of resources in the environment. Spatial variograms show that foraging strategies can differentially change the spatial pattern of resource abundance in the environment to one characteristic of the foraging strategy.
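
A minimal sketch of the core mechanism, probabilistic movement biased by resource information sensed at more than one spatial scale, on a one-dimensional ring of food patches; the two-scale weighting, the parameter values, and the depletion rule are illustrative choices, not those of the model described above.

    import numpy as np

    rng = np.random.default_rng(2)

    def biased_step(pos, resource, beta_local=2.0, beta_regional=1.0, window=5):
        """Pick the next cell: random diffusion biased by local and regional resource."""
        n = len(resource)
        candidates = [(pos - 1) % n, pos, (pos + 1) % n]
        scores = []
        for c in candidates:
            local = resource[c]                                    # fine-scale information
            idx = [(c + k) % n for k in range(-window, window + 1)]
            regional = resource[idx].mean()                        # coarse-scale information
            scores.append(beta_local * local + beta_regional * regional)
        weights = np.exp(np.array(scores))                         # probabilistic, not greedy
        return int(rng.choice(candidates, p=weights / weights.sum()))

    resource = rng.random(100)          # resource abundance on a ring of 100 patches
    pos, visits = 40, np.zeros(100)
    for _ in range(20000):
        pos = biased_step(pos, resource)
        visits[pos] += 1
        resource[pos] = max(resource[pos] - 0.001, 0.0)            # grazing depletes the patch
    print("most-visited patch:", int(visits.argmax()), "visits:", int(visits.max()))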

Relevance: 40.00%

Abstract:

The ideal free distribution model, which relates the spatial distribution of mobile consumers to that of their resource, is shown to be a limiting case of a more general model that we develop using simple concepts of diffusion. We show how the ideal free distribution model can be derived from this more general model and extended by incorporating simple models of social influences on predator spacing. First, a free distribution model based on patch-switching rules with a power-law interference term, representing instantaneous biased diffusion, is derived. A social bias term is then introduced to represent the effect of predator aggregation on predator fitness, separate from any effects that act through intake rate. The social bias term is expanded to express an optimum spacing for predators, and example solutions of the resulting biased diffusion models are shown. The model demonstrates how an empirical interference coefficient, derived from measurements of predator and prey densities, may include factors expressing the impact of social spacing behaviour on fitness. We conclude that empirical values of the log predator/log prey ratio may contain information about more than the relationship between consumer and resource densities. Unlike many previous models, the model shown here applies to conditions without continual input.
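
The standard interference formulation behind the empirical log predator/log prey regression can be sketched directly: if intake on patch i is W_i = Q_i / P_i^m (resource Q_i, predator density P_i, interference coefficient m), equalising intake across patches gives P_i proportional to Q_i^(1/m), so the slope of log P against log Q estimates 1/m. The sketch below simply verifies this numerically with illustrative numbers; the social-bias extension developed in the paper is not reproduced.

    import numpy as np

    def ideal_free_distribution(Q, m, total_predators):
        """Allocate predators so intake Q_i / P_i^m is equal on every occupied patch."""
        share = Q ** (1.0 / m)                  # equal intake => P_i proportional to Q_i^(1/m)
        return total_predators * share / share.sum()

    rng = np.random.default_rng(3)
    Q = rng.uniform(1.0, 10.0, size=8)          # resource input on 8 patches
    m = 0.6                                     # power-law interference coefficient
    P = ideal_free_distribution(Q, m, total_predators=100)

    # intake is indeed equalised across patches ...
    print(np.round(Q / P ** m, 4))
    # ... and regressing log P on log Q recovers 1/m
    slope = np.polyfit(np.log(Q), np.log(P), 1)[0]
    print("fitted slope:", round(slope, 3), " expected 1/m =", round(1 / m, 3))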