788 results for agent-based simulation
Abstract:
The behaviour of single installations of solar energy systems is well understood; however, what happens at an aggregated location, such as a distribution substation, when the outputs of groups of installations accumulate is not so well understood. This paper considers groups of installations attached to distribution substations on which the load is primarily commercial and industrial. Agent-based modelling has been used to model the physical electrical distribution system and the behaviour of equipment towards the consumer end of the network. The paper reports the approach used to simulate both the electricity consumption of groups of consumers and the output of solar systems subject to weather variability, with the inclusion of cloud data from the Bureau of Meteorology (BOM). The data sets currently used are for Townsville, North Queensland. The initial characteristics that indicate whether solar installations are cost effective from an electricity distribution perspective are discussed.
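For illustration, a minimal sketch of the aggregation step this describes, assuming a toy clear-sky profile, a random per-site cloud attenuation factor standing in for the BOM cloud data, and an invented commercial load shape (none of this is the paper's actual data):

```python
import numpy as np

rng = np.random.default_rng(0)
HOURS = np.arange(24)

# Illustrative clear-sky PV output per installation (kW), peaking at noon.
clear_sky = 5.0 * np.clip(np.sin(np.pi * (HOURS - 6) / 12), 0, None)

# Per-site cloud attenuation drawn independently; in the paper this role is
# played by BOM cloud observations, here it is random for illustration.
n_sites = 40
cloud_factor = rng.uniform(0.3, 1.0, size=(n_sites, 24))

pv_total = (cloud_factor * clear_sky).sum(axis=0)        # aggregated PV (kW)

# Illustrative commercial/industrial load profile at the substation (kW).
load = 150 + 60 * np.exp(-((HOURS - 14) ** 2) / 18)

net_load = load - pv_total                               # seen by substation
print(f"peak net load: {net_load.max():.0f} kW at hour {net_load.argmax()}")
```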
Abstract:
The contextuality of changing attitudes makes them extremely difficult to model. This paper scales up Quantum Decision Theory (QDT) to a social setting, using it to model the manner in which social contexts can interact with the process of low-elaboration attitude change. The elements of this extended theory are presented, along with a proof-of-concept computational implementation in a low-dimensional subspace. This model suggests that a society's understanding of social issues will settle into a static or frozen configuration unless that society consists of a range of individuals with varying personality types and norms.
Abstract:
Global awareness of cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being increasingly integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is ‘how can we manage this constantly evolving grid?’ The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with some Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM) to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimization (PSO) to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks (three-phase, and Single Wire Earth Return or SWER) have been modelled; three-phase networks are usually used in dense networks such as urban areas, while SWER networks are widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: a) what is already in place and how it performs under current and future loads, b) what can be done to manage it and plan the future grid, and c) how these upgrades/new installations will perform over time. The number of small-scale distributed generators, e.g. PV and battery, is now sufficient (and expected to increase) to impact the operation of the grid, which in turn needs to be considered by the distribution network manager when planning upgrades and/or installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, in place as well as expected, so that a large number of options can be assessed (Step a). Once the location, sizing and timing of asset upgrades and/or installations are found using optimisation techniques (Step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (Step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once, while still having a tailored solution for each of the sub-areas. To illustrate this, considering the impact battery and PV can have on the two types of networks mentioned above, three design conditions can be identified (amongst others):
- Urban conditions:
  - Feeders that have a low take-up of solar generators may benefit from adding solar panels.
  - Feeders that need voltage support at specific times may be assisted by installing batteries.
- Rural conditions (SWER network):
  - Feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations.
This small example demonstrates that no single solution can be applied across all three conditions, and there is a need to be selective in which one is applied to each branch of the network.
This is currently the function of the engineer who can define various scenarios against a configuration, test them and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
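For a flavour of Step b, here is a minimal, generic PSO sketch on a toy planning objective; the cost function, penalty, bounds and swarm parameters are invented placeholders, not the platform's actual objective or Ergon Energy data:

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    # Toy stand-in for the planning objective: capital cost of a (PV, battery)
    # sizing x in kW, plus a penalty for uncovered peak demand.
    capex = np.array([400.0, 900.0]) @ x            # $/kW for PV, battery
    shortfall = np.maximum(0.0, 120.0 - x.sum())    # kW of peak not covered
    return capex + 5000.0 * shortfall

dim, n_particles, iters = 2, 20, 100
lo, hi = 0.0, 200.0
pos = rng.uniform(lo, hi, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Standard velocity update: inertia + cognitive + social terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("cheapest mix (kW PV, kW battery):", gbest.round(1))
```

In the platform described above, the particle would encode the location, sizing and timing of candidate upgrades rather than a two-number toy vector.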
Abstract:
We consider a discrete agent-based model on a one-dimensional lattice, where each agent occupies L sites and attempts movements over a distance of d lattice sites. Agents obey a strict simple exclusion rule. A discrete-time master equation is derived using a mean-field approximation and careful probability arguments. In the continuum limit, nonlinear diffusion equations that describe the average agent occupancy are obtained. Averaged discrete simulation data are generated and shown to compare very well with the solution to the derived nonlinear diffusion equations. This framework allows us to approach a lattice-free result using all the advantages of lattice methods. Since different cell types have different shapes and speeds of movement, this work offers insight into population-level behavior of collective cellular motion.
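A minimal sketch of such a simulation, assuming agents of length L move as rigid blocks a distance d left or right on a periodic lattice under strict exclusion (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N, L, d = 200, 3, 2          # lattice sites, agent length, step distance
n_agents, steps = 20, 10_000

# Left-edge positions; agent i occupies sites [x_i, x_i + L) on a ring.
x = rng.choice(np.arange(N // L) * L, size=n_agents, replace=False)
occ = np.zeros(N, dtype=bool)
for xi in x:
    occ[(xi + np.arange(L)) % N] = True

for _ in range(steps):
    i = rng.integers(n_agents)
    step = d if rng.random() < 0.5 else -d      # unbiased attempted move
    old = (x[i] + np.arange(L)) % N
    new = (x[i] + step + np.arange(L)) % N
    occ[old] = False                            # lift the agent off the lattice
    if occ[new].any():                          # strict simple exclusion
        occ[old] = True                         # blocked: put it back
    else:
        occ[new] = True
        x[i] = (x[i] + step) % N

# Averaging occ over many realisations gives the occupancy profile that the
# derived nonlinear diffusion equation approximates.
print("occupied fraction:", occ.mean())
```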
Abstract:
We consider a discrete agent-based model on a one-dimensional lattice and a two-dimensional square lattice, where each agent is a dimer occupying two sites. Agents move by vacating one occupied site in favor of a nearest-neighbor site and obey either a strict simple exclusion rule or a weaker constraint that permits partial overlaps between dimers. Using indicator variables and careful probability arguments, a discrete-time master equation for these processes is derived systematically within a mean-field approximation. In the continuum limit, nonlinear diffusion equations that describe the average agent occupancy of the dimer population are obtained. In addition, we show that multiple species of interacting subpopulations give rise to advection-diffusion equations. Averaged discrete simulation data compares very well with the solution to the continuum partial differential equation models. Since many cell types are elongated rather than circular, this work offers insight into population-level behavior of collective cellular motion.
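A minimal sketch of the 1D dimer move rule under the strict exclusion variant, where a dimer vacates one occupied end in favour of a nearest-neighbour site (lattice size and counts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
N, n_dimers, steps = 100, 20, 10_000

# Each dimer occupies sites (s, s+1) on a ring of N sites; store left site s.
s = rng.choice(np.arange(N // 2) * 2, size=n_dimers, replace=False)
occ = np.zeros(N, dtype=bool)
occ[s] = True
occ[(s + 1) % N] = True

for _ in range(steps):
    i = rng.integers(n_dimers)
    if rng.random() < 0.5:   # step right: vacate the left end, claim s+2
        target, vacate, new_s = (s[i] + 2) % N, s[i], (s[i] + 1) % N
    else:                    # step left: vacate the right end, claim s-1
        target, vacate, new_s = (s[i] - 1) % N, (s[i] + 1) % N, (s[i] - 1) % N
    if not occ[target]:      # strict simple exclusion (no partial overlaps)
        occ[vacate], occ[target] = False, True
        s[i] = new_s

# Averaged occupancy over many runs is what the continuum PDE models describe.
print("occupied fraction:", occ.mean())
```

The weaker constraint mentioned in the abstract would relax the `occ[target]` check to permit certain overlapping configurations; that variant is not sketched here.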
Abstract:
Observations by researchers have revealed that group interaction within crowds is a common phenomenon with great influence on pedestrian behaviour. However, most research undertaken to date has failed to consider group dynamics when developing pedestrian flow models. This paper presents a critical review of pedestrian models that incorporate group behaviour. The models reviewed are mainly created with microscopic modelling approaches such as social force, cellular automata, and agent-based methods. The purpose of this literature review is to improve the understanding of group dynamics among pedestrians and to highlight the need to consider group dynamics when developing pedestrian simulation models.
Abstract:
This paper presents simulation results for future electricity grids using an agent-based model developed with MODAM (MODular Agent-based Model). MODAM is introduced and its use demonstrated through four simulations based on a scenario that expects a rise in on-site renewable generation and electric vehicle (EV) usage. The simulations were run over many years, for two areas in Townsville, Australia, capturing spatial variability in technology uptake, and for two EV charging methods, capturing people's behaviours and their impact on the time of the peak load. Impact analyses of these technologies were performed over the areas, down to the distribution transformer level, where greater variability in their contribution to asset peak loads was observed. The MODAM models can be used for different purposes, such as the impact of renewables on grid sizing or on greenhouse gas emissions. The insights gained from using MODAM for technology assessment are discussed.
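To make the charging comparison concrete, here is a toy sketch contrasting two possible EV charging behaviours (charge on arrival vs. delayed off-peak, both assumptions for illustration rather than the MODAM scenario definitions) and their effect on a transformer's peak:

```python
import numpy as np

HOURS = np.arange(24)
base = 80 + 40 * np.exp(-((HOURS - 18) ** 2) / 8)  # kW, evening-peaking feeder

n_evs, ev_kw, charge_hours = 15, 7.0, 3            # illustrative fleet

def ev_load(start_hour):
    """Aggregate EV charging load if all vehicles start at start_hour."""
    load = np.zeros(24)
    for h in range(charge_hours):
        load[(start_hour + h) % 24] += n_evs * ev_kw
    return load

for name, start in [("on-arrival (18:00)", 18), ("off-peak (23:00)", 23)]:
    total = base + ev_load(start)
    print(f"{name}: peak {total.max():.0f} kW at hour {total.argmax()}")
```

Even this crude example shows the mechanism the abstract points at: the charging method shifts both the size and the timing of the peak seen at the transformer.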
Abstract:
The aim of this dissertation is to provide conceptual tools for the social scientist for clarifying, evaluating and comparing explanations of social phenomena based on formal mathematical models. The focus is on relatively simple theoretical models and simulations, not statistical models. These studies apply a theory of explanation according to which explanation is about tracing objective relations of dependence, knowledge of which enables answers to contrastive why- and how-questions. This theory is developed further by delineating criteria for evaluating competing explanations and by applying the theory to social scientific modelling practices and to the key concepts of equilibrium and mechanism. The dissertation comprises an introductory essay and six published original research articles. The main theses about model-based explanations in the social sciences argued for in the articles are the following. 1) The concept of explanatory power, often used to argue for the superiority of one explanation over another, encompasses five dimensions which are partially independent and involve some systematic trade-offs. 2) Not all equilibrium explanations causally explain the obtaining of the end equilibrium state from the multiple possible initial states. Instead, they often constitutively explain the macro property of the system with the micro properties of the parts (together with their organization). 3) There is an important ambivalence in the concept of mechanism used in many model-based explanations, and this difference corresponds to a difference between two alternative research heuristics. 4) Whether unrealistic assumptions in a model (such as a rational choice model) are detrimental to an explanation provided by the model depends on whether the representation of the explanatory dependency in the model is itself dependent on the particular unrealistic assumptions. Thus evaluating whether a literally false assumption in a model is problematic requires specifying exactly what is supposed to be explained and by what. 5) The question of whether an explanatory relationship depends on particular false assumptions can be explored with the process of derivational robustness analysis, and the importance of robustness analysis accounts for some of the puzzling features of the tradition of model-building in economics. 6) The fact that economists have been relatively reluctant to use true agent-based simulations to formulate explanations can be partially explained by the specific ideal of scientific understanding implicit in the practice of orthodox economics.
Abstract:
Prickly acacia (Vachellia nilotica subsp. indica), a native multipurpose tree in India, is a weed of national significance and a target for biological control in Australia. Based on plant genetic and climatic similarities, native-range surveys to identify potential biological control agents for prickly acacia were conducted in India during 2008-2011. In these surveys, the leaf-feeding geometrid Isturgia disputaria Guenee (syn. Tephrina pulinda), widespread in Tamil Nadu and Karnataka States, was prioritized as a potential biological control agent based on field host range, damage potential and no-choice tests on non-target plant species. Although the field host-range study showed that V. nilotica subsp. indica and V. nilotica subsp. tomentosa were the primary hosts for successful development of the insect I. disputaria, replicated no-choice larval feeding and development tests conducted on cut foliage and live plants of nine non-target acacia test plant species in India revealed larval feeding and development on three of the nine non-target acacia species, V. tortilis, V. planifrons and V. leucophloea, in addition to V. nilotica subsp. indica and V. nilotica subsp. tomentosa. However, the proportion of larvae developing into adults was higher on V. nilotica subsp. indica and V. nilotica subsp. tomentosa, with 90% and 80% of the larvae completing development, respectively. In contrast, larval mortality was higher on V. tortilis (70%), V. leucophloea (90%) and V. planifrons (70%). The no-choice test results support the earlier host-specificity test results for I. disputaria from Pakistan, Kenya and under quarantine in Australia. The contrasting results between field host range and host use pattern under no-choice conditions are discussed.
Abstract:
Multi-agent systems (MAS) advocate an agent-based approach to software engineering based on decomposing problems in terms of decentralized, autonomous agents that can engage in flexible, high-level interactions. This chapter introduces the Scalable fault-tolerant Agent Grooming Environment (SAGE), a second-generation Foundation for Intelligent Physical Agents (FIPA)-compliant multi-agent system developed at NIIT-Comtec, which provides an environment for creating distributed, intelligent, and autonomous entities that are encapsulated as agents. The chapter focuses on the highlight of SAGE: its decentralized fault-tolerant architecture, which can be used to develop applications in a number of areas such as e-health, e-government, and e-science. In addition, the SAGE architecture provides tools for runtime agent management, directory facilitation, monitoring, and editing of message exchange between agents. SAGE also provides a built-in mechanism to program agent behavior and capabilities with the help of its autonomous agent architecture, which is the other major highlight of this chapter. The authors believe that the market for agent-based applications is growing rapidly and that SAGE can play a crucial role in future intelligent application development. © 2007, IGI Global.
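As a flavour of the directory-facilitation and messaging pattern described, a toy Python sketch follows; the class and method names are invented for illustration and are not SAGE's actual API or FIPA ACL syntax:

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    receiver: str
    performative: str  # e.g. "request" or "inform", in the spirit of FIPA ACL
    content: str

class DirectoryFacilitator:
    """Toy yellow-pages service: agents register the services they offer."""
    def __init__(self):
        self.services = {}

    def register(self, agent_name, service):
        self.services.setdefault(service, []).append(agent_name)

    def search(self, service):
        return self.services.get(service, [])

class Agent:
    def __init__(self, name, df):
        self.name, self.df, self.inbox = name, df, []

    def offer(self, service):
        self.df.register(self.name, service)

    def send(self, registry, msg):
        registry[msg.receiver].inbox.append(msg)

# A client locates a provider through the directory and sends it a request.
df = DirectoryFacilitator()
registry = {a.name: a for a in (Agent("client", df), Agent("diagnoser", df))}
registry["diagnoser"].offer("e-health/diagnosis")
provider = df.search("e-health/diagnosis")[0]
registry["client"].send(registry, Message("client", provider, "request", "triage"))
print(registry["diagnoser"].inbox[0])
```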
Abstract:
The NK model, proposed by Kauffman (1993), is a powerful simulation framework for studying competitive dynamics. It has been applied in several social science fields, for instance organization science. However, like many other simulation methods, the NK model has not received much attention in the Management Information Systems (MIS) discipline. This tutorial therefore introduces the NK model in a simple way and encourages related studies. To demonstrate how the NK model works, the tutorial reproduces several of Levinthal's (1997) experiments. It also attempts to clarify the relationship between the NK model and agent-based modeling (ABM). This relationship can serve as a theoretical basis for further developing the NK model framework for other research scenarios. For example, the tutorial provides an NK model solution for studying the IT value co-creation process by extending network structure and agent interactions.
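A minimal sketch of the NK construction the tutorial describes: N loci, each contributing a random fitness value that depends on its own state and the states of K randomly chosen others, searched here by one-bit hill climbing in the spirit of Levinthal-style adaptation runs (all details illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
N, K = 10, 3

# Each locus i depends on itself and K other randomly chosen loci.
deps = [np.concatenate(([i], rng.choice(np.delete(np.arange(N), i),
                                        K, replace=False)))
        for i in range(N)]

# One random contribution per locus per configuration of its K+1 inputs,
# cached lazily so the full 2**(K+1) table is never built up front.
tables = [{} for _ in range(N)]

def fitness(genome):
    total = 0.0
    for i in range(N):
        key = tuple(genome[deps[i]])
        if key not in tables[i]:
            tables[i][key] = rng.random()
        total += tables[i][key]
    return total / N

# Local (one-bit) hill climbing until no single mutation improves fitness.
g = rng.integers(0, 2, N)
improved = True
while improved:
    improved = False
    for i in range(N):
        trial = g.copy()
        trial[i] ^= 1
        if fitness(trial) > fitness(g):
            g, improved = trial, True
print("local optimum fitness:", round(fitness(g), 3))
```

Raising K increases the ruggedness of the landscape, which is what makes the framework useful for studying competitive search dynamics.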
Abstract:
The paper presents an adaptive Fourier filtering technique and a relaying scheme based on a combination of a digital band-pass filter and a three-sample algorithm, for applications in high-speed numerical distance protection. To enhance the performance of the above-mentioned technique, a high-speed fault detector has been used. MATLAB-based simulation studies show that the adaptive Fourier filtering technique provides fast tripping for near faults and security for farther faults. The digital relaying scheme based on the combination of a digital band-pass filter and a three-sample data window algorithm also provides accurate and high-speed detection of faults. The paper also proposes a high-performance 16-bit fixed-point DSP (Texas Instruments TMS320LF2407A) processor-based hardware scheme suitable for implementing the above techniques. To evaluate the performance of the proposed relaying scheme under steady-state and transient conditions, PC-based, menu-driven relay test procedures were developed using National Instruments LabVIEW software. The test signals are generated in real time using LabVIEW-compatible analog output modules. The results obtained from the simulation studies as well as the hardware implementation are also presented.
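For context, a minimal sketch of the full-cycle Fourier (DFT) phasor estimate on which such filtering builds; the sampling rate and test waveform are illustrative, and the paper's adaptive variant refines this basic filter rather than using it as-is:

```python
import numpy as np

F0, SAMPLES_PER_CYCLE = 50.0, 16        # system frequency, sampling density
FS = F0 * SAMPLES_PER_CYCLE

def fourier_phasor(window):
    """Fundamental-frequency phasor from one cycle of samples."""
    n = np.arange(len(window))
    return (2 / len(window)) * np.sum(
        window * np.exp(-2j * np.pi * n / len(window)))
    # abs() of the result is the peak magnitude, angle() the phase.

# Illustrative fault waveform: 50 Hz fundamental plus a decaying DC offset,
# the component that band-pass pre-filtering helps reject.
t = np.arange(0, 0.04, 1 / FS)
x = 100 * np.cos(2 * np.pi * F0 * t - 0.5) + 30 * np.exp(-t / 0.02)

ph = fourier_phasor(x[:SAMPLES_PER_CYCLE])
print(f"|V| = {abs(ph):.1f}, angle = {np.angle(ph):.2f} rad")
```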
Abstract:
We present a new Hessian estimator based on the simultaneous perturbation procedure that requires three system simulations regardless of the parameter dimension. We then present two Newton-based simulation optimization algorithms that incorporate this Hessian estimator. The two algorithms differ primarily in the manner in which the Hessian estimate is used. Neither algorithm computes the inverse Hessian explicitly, thereby saving computational effort. While our first algorithm directly obtains the product of the inverse Hessian with the gradient of the objective, our second algorithm makes use of the Sherman-Morrison matrix inversion lemma to recursively estimate the inverse Hessian. We provide proofs of convergence for both algorithms. Next, we consider an interesting application of our algorithms to a problem of road traffic control. Our algorithms are seen to exhibit better performance than two Newton algorithms from recent prior work.
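A hedged sketch of a three-simulation simultaneous-perturbation Hessian estimate in this spirit; the paper's exact estimator may differ, and the step sizes, toy objective and symmetrization here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def f(theta):
    """Toy simulation output: noisy quadratic with true Hessian diag(2, 6)."""
    return theta[0] ** 2 + 3 * theta[1] ** 2 + 0.01 * rng.standard_normal()

def sp_hessian(theta, delta=0.05):
    d = len(theta)
    D1 = rng.choice([-1.0, 1.0], d)           # simultaneous perturbations
    D2 = rng.choice([-1.0, 1.0], d)
    y0 = f(theta)                             # simulation 1
    y1 = f(theta + delta * D1)                # simulation 2
    y2 = f(theta + delta * D1 + delta * D2)   # simulation 3
    grad = (y1 - y0) / (delta * D1)           # SP gradient estimate
    H = (y2 - y1) / (delta ** 2 * np.outer(D1, D2))
    return grad, (H + H.T) / 2                # symmetrized Hessian estimate

# Averaging many estimates approximately recovers the true Hessian diag(2, 6);
# the Newton algorithms in the paper instead smooth the estimate across
# iterations rather than averaging at a fixed point.
theta = np.array([1.0, -1.0])
H_avg = np.mean([sp_hessian(theta)[1] for _ in range(20000)], axis=0)
print(H_avg.round(2))
```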
Abstract:
Social scientists have used agent-based models (ABMs) to explore the interactions and feedbacks among social agents and their environments. The bottom-up structure of ABMs enables simulation and investigation of complex systems and their emergent behaviour with a high level of detail; however, the stochastic nature and the potential parameter combinations of such models create large, non-linear, multidimensional “big data,” which are difficult to analyze using traditional statistical methods. Our proposed project seeks to address this challenge by developing algorithms and web-based analysis and visualization tools that provide automated means of discovering complex relationships among variables. The tools will enable modellers to easily manage, analyze, visualize, and compare their output data, and will provide stakeholders, policy makers and the general public with intuitive web interfaces to explore, interact with and provide feedback on otherwise difficult-to-understand models.
Abstract:
This is an Author's Accepted Manuscript of an article published in “Emergence: Complexity and Organization”, 15 (2), pp. 14-22 (2013), copyright Taylor & Francis.