870 results for Agent-Based Models
Abstract:
Applications currently exist that simulate the behaviour of bacteria in different habitats, and the processes that take place in them, so that they can be studied and experimented with without the need for a laboratory. One of the most widely used open-source applications for simulating bacterial populations is iDynoMiCS (individual-based Dynamics of Microbial Communities Simulator), an agent-based simulator that works with several computational models of bacteria in 2D and 3D. The simulator offers great freedom through a large number of configurable variables covering the environment, chemical reactions and other important details. One notable feature is its straightforward simulation of plasmid conjugation between bacteria. Plasmids are DNA molecules, usually circular, that are distinct from the cell's chromosome and that replicate, transcribe and conjugate independently of chromosomal DNA. They are normally present in prokaryotes and occasionally in eukaryotes, where they are called episomes. Given the complex behaviour of plasmids and the range of possibilities they offer as mechanisms external to the basic functioning of the cell, in most cases conferring evolutionary advantages such as antibiotic resistance, their study and subsequent manipulation is important. However, the operational framework of iDynoMiCS for plasmid simulation is too simple, allowing nothing more complex than analysing the spread of a plasmid through the community. The present work was conceived to address this deficiency of iDynoMiCS.
This work analyses, develops and implements the modifications needed for iDynoMiCS to simulate plasmid conjugation satisfactorily and more realistically, and thereby to solve various plasmid-based logic operations, such as genetic circuits. The results obtained are analysed in light of relevant studies and compared with those produced by the original iDynoMiCS code. An additional study comparing the efficiency with which two different genetic circuits detect a substance is also analysed. This work may likewise be of interest to the LIA group of the Faculty of Informatics of the Universidad Politécnica de Madrid, which is participating in the European project BACTOCOM, focused on the study of plasmid conjugation and genetic circuits. --ABSTRACT-- Currently there are applications that simulate the behavior of bacteria in different habitats and the ongoing processes inside them to facilitate their study and experimentation without the need for an actual laboratory. One of the most used open source applications to simulate bacterial populations is iDynoMiCS (individual-based Dynamics of Microbial Communities Simulator), an agent-based simulator that allows working with several computer models of 2D and 3D bacteria in biofilms. This simulator allows great freedom by means of a large number of configurable variables regarding environment, chemical reactions and other important details of the simulation. Within these features there exists a very basic framework to simulate plasmid conjugation. Plasmids are DNA molecules physically different from the cell’s chromosome, commonly found as small circular, double-stranded DNA molecules that are replicated, conjugated and transcribed independently of chromosomal DNA.
Plasmids are normally present in prokaryotes and sometimes in eukaryotes, where they are called episomes. Plasmids are mechanisms external to the cell's basic operations and, as such, in the majority of cases confer various evolutionary advantages on the host cell, such as antibiotic resistance. It is imperative to further study plasmids and the possibilities they present. However, the operational framework of the iDynoMiCS plasmid simulation is too simple, and does not allow operations more complex than the analysis of the spread of a plasmid in the community. This project was conceived to resolve this particular deficiency in iDynoMiCS. This paper discusses, develops and implements the necessary changes to the iDynoMiCS simulation software so that it can satisfactorily and realistically simulate plasmid conjugation and solve various logic operations, such as plasmid-based genetic circuits. The results obtained are analysed and compared with other relevant studies and with those obtained with the original iDynoMiCS code. In addition, a study detailing the sensing of a substance with two different genetic circuits is presented. This work may also be relevant to the LIA group of the Faculty of Informatics of the Polytechnic University of Madrid, which is participating in the European project BACTOCOM, focused on the study of plasmid conjugation and genetic circuits.
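The conjugation mechanism this abstract describes can be illustrated with a deliberately tiny agent-based sketch (pure Python, not iDynoMiCS code; the 1-D lattice, the single transfer probability P_CONJ and all values are illustrative assumptions):

```python
import random

# Minimal sketch (not iDynoMiCS code): bacteria on a 1-D lattice, where a
# plasmid-bearing donor may conjugate with a plasmid-free neighbour each
# step. P_CONJ is a hypothetical per-contact transfer probability.
P_CONJ = 0.3

def step(cells, rng):
    """One synchronous update: donors (True) may infect adjacent recipients."""
    nxt = list(cells)
    for i, has_plasmid in enumerate(cells):
        if not has_plasmid:
            continue
        for j in (i - 1, i + 1):
            if 0 <= j < len(cells) and not cells[j] and rng.random() < P_CONJ:
                nxt[j] = True  # plasmid transferred by conjugation
    return nxt

rng = random.Random(42)
cells = [False] * 20
cells[10] = True  # a single initial donor
for _ in range(30):
    cells = step(cells, rng)
print(sum(cells))  # number of plasmid-bearing cells after 30 steps
```

iDynoMiCS couples this kind of per-agent rule to spatial growth and solute fields in 2D/3D; the sketch keeps only the transfer rule itself.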
Abstract:
This document contains a detailed description of the design and implementation of a multi-agent application that controls traffic lights in a city, together with a system for traffic simulation and testing. The goal of this thesis is to design and build a simplified, intelligent and distributed solution to the traffic problem in big cities, following good practices so that the model of the real world can be refined later. Traffic in big cities remains an unsolved problem. Not only is the increasing number of cars a cause of traffic jams, but also the way the traffic is organized. Usually, intersections with traffic lights are replaced by roundabouts or interchanges to increase the number of cars that can cross the intersection in a given time. But there are still places where the infrastructure cannot be changed and traffic-light semaphores are the only way to control car flows. In real life, traffic lights either follow a predefined switching plan or receive information from a centralized system about when and how to change. But what if the traffic lights could cooperate and decide on their own when and how to change? Using this problem, the purpose of the thesis is to explore different agent-based software engineering approaches to design and build a non-conventional distributed system. From the software engineering point of view, the goal is to apply the knowledge and skills acquired during the courses of the master's program in Software Engineering while solving a practical and complex problem such as city traffic.
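The "decide on their own" idea can be sketched as a purely local agent rule (an illustrative toy, not the thesis implementation; the queue inputs and SWITCH_MARGIN hysteresis parameter are assumptions):

```python
# Minimal sketch: each traffic-light agent observes only the queues at its
# own intersection and gives green to the clearly longer queue. There is no
# central controller; SWITCH_MARGIN is an illustrative hysteresis value
# that prevents flapping on small differences.
SWITCH_MARGIN = 2

class TrafficLightAgent:
    def __init__(self):
        self.green = "NS"  # current green direction: north-south or east-west

    def decide(self, queue_ns, queue_ew):
        """Local rule: switch only when the other queue is clearly longer."""
        if self.green == "NS" and queue_ew - queue_ns > SWITCH_MARGIN:
            self.green = "EW"
        elif self.green == "EW" and queue_ns - queue_ew > SWITCH_MARGIN:
            self.green = "NS"
        return self.green

agent = TrafficLightAgent()
print(agent.decide(queue_ns=1, queue_ew=6))  # → EW (long EW queue)
print(agent.decide(queue_ns=4, queue_ew=3))  # → EW (within margin, keep)
```

Cooperation between intersections would then be a matter of agents exchanging such queue observations with their neighbours rather than reporting to a central system.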
Abstract:
In recent years, the computer vision community has shown great interest in depth-based applications thanks to the performance and flexibility of the new generation of RGB-D imagery. In this paper, we present an efficient background subtraction algorithm based on the fusion of multiple region-based classifiers that processes depth and color data provided by RGB-D cameras. Foreground objects are detected by combining a region-based foreground prediction (based on depth data) with different background models (based on a Mixture of Gaussians algorithm) providing color and depth descriptions of the scene at pixel and region level. The information given by these modules is fused in a mixture-of-experts fashion to improve the foreground detection accuracy. The main contributions of the paper are the region-based models of both background and foreground, built from the depth and color data. The obtained results using different database sequences demonstrate that the proposed approach leads to a higher detection accuracy with respect to existing state-of-the-art techniques.
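The mixture-of-experts fusion step can be sketched at its simplest: a weighted per-pixel combination of the two experts' foreground scores (the weights, threshold and score values below are illustrative assumptions, not the paper's learned parameters):

```python
# Minimal sketch (not the paper's algorithm): fuse per-pixel foreground
# scores from a colour expert and a depth expert into one binary mask.
# W_COLOR, W_DEPTH and THRESHOLD are hypothetical illustration values.
W_COLOR, W_DEPTH, THRESHOLD = 0.5, 0.5, 0.5

def fuse(color_scores, depth_scores):
    """Weighted combination of expert scores -> binary foreground mask."""
    mask = []
    for c, d in zip(color_scores, depth_scores):
        mask.append(W_COLOR * c + W_DEPTH * d > THRESHOLD)
    return mask

color = [0.9, 0.2, 0.6, 0.1]  # e.g. Mixture-of-Gaussians colour scores
depth = [0.8, 0.1, 0.7, 0.9]  # region-based depth foreground prediction
print(fuse(color, depth))  # → [True, False, True, False]
```

In the paper the weighting is region-dependent rather than fixed, which is what lets a reliable depth region override a noisy colour response (and vice versa).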
Abstract:
Introduction: Most actions to promote leisure-time physical activity in populations have shown small or null effect sizes, or inconsistent results. Approaching the problem from a systems perspective may be one way to overcome this mismatch. Objective: To develop an agent-based model to investigate the formation and evolution of population patterns of leisure-time physical activity in adults, based on the interaction between individuals' psychological attributes and attributes of the built and social environments in which they live. Methods: The modelling process comprised three stages: elaboration of a conceptual map, based on a literature review and consultation with experts; creation and verification of the model algorithm; and parameterisation and analysis of consistency and sensitivity. The results of the literature review were consolidated and reported according to the search domains (psychological aspects, social environment and built environment). The quantitative results of the expert consultation were described using frequencies, and the content of the answers to open questions was analysed and compiled by the author of this thesis. The model algorithm was created in NetLogo, version 5.2.1, following a verification protocol to ensure that the algorithm was implemented accurately. The consistency and sensitivity analyses used the Vargha-Delaney A test, partial rank correlation coefficients, boxplots, and line and scatter plots. Results: The elements of the conceptual map were defined as the person's intention, the behaviour of nearby people and of the community, and the perceived quality of, access to, and activities available at the places where leisure-time physical activity can be practised. The model represents a hypothetical community containing two types of agents: people and places where leisure-time physical activity can be practised.
People interact with each other and with the built environment, generating population-level temporal trends in leisure-time physical activity and in intention. The sensitivity analyses indicated that the temporal trends of leisure-time physical activity and intention are highly sensitive to the influence of a person's current behaviour on their future intention, to the size of the person's perception radius, and to the proportion of places where leisure-time physical activity can be practised. Final considerations: The conceptual map and the agent-based model proved adequate for investigating the formation and evolution of population patterns of leisure-time physical activity in adults. The influence of a person's behaviour on their intention, the size of the person's perception radius, and the proportion of places where leisure-time physical activity can be practised are important determinants of the formation and evolution of population patterns of leisure-time physical activity among adults in the model.
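The Vargha-Delaney A measure used in the sensitivity analyses is small enough to state in full: it estimates the probability that a value drawn from one sample exceeds a value drawn from the other, with ties counting half (a standard definition, not code from the thesis):

```python
# Vargha-Delaney A measure: P(x > y) + 0.5 * P(x == y) over all pairs.
# A = 0.5 means no difference between the samples; values near 0 or 1
# indicate a large effect.

def vargha_delaney_a(x, y):
    greater = sum(1 for xi in x for yj in y if xi > yj)
    ties = sum(1 for xi in x for yj in y if xi == yj)
    return (greater + 0.5 * ties) / (len(x) * len(y))

print(vargha_delaney_a([1, 2, 3], [1, 2, 3]))  # → 0.5 (identical samples)
print(vargha_delaney_a([4, 5, 6], [1, 2, 3]))  # → 1.0 (complete separation)
```

Because it is rank-based, the measure is well suited to comparing stochastic simulation outputs whose distributions are unknown.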
Abstract:
Proper management of supply chains is fundamental to the overall system performance of forest-based activities. Efficient management techniques usually rely on decision support software, which needs to generate fast and effective outputs from the set of possibilities. This in turn requires accurate models of the systems' dynamic interactions. Given the nature of forest-based supply chains, event-based models are well suited to describing their behaviour. This work proposes the modelling and simulation of a forest-based supply chain, in particular the biomass supply chain, through the SimPy framework. This Python-based tool allows the modelling of discrete-event systems using constructs such as events, processes and resources. The developed model was used to assess the impact of changes in the daily working plan in three situations. First, as a control case, the deterministic behaviour was simulated. Second, a machine delay was introduced and its implications for plan accomplishment were analysed. Finally, to better address real operating conditions, stochastic processing and driving times were simulated. The obtained results validate the SimPy simulation environment as a framework for modelling supply chains in general and for the biomass problem in particular.
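The discrete-event idea that SimPy packages as processes and resources can be sketched with the standard library alone; this toy single-machine model, its job names, durations and delay are illustrative assumptions, not the thesis's model:

```python
import heapq

# Minimal sketch of a discrete-event simulation: completion events are
# (time, name) pairs on a priority queue, popped in chronological order.
# One machine processes jobs sequentially; `delay` adds a fixed machine
# delay per job, mirroring the abstract's second scenario.

def simulate(jobs, delay=0.0):
    """Process jobs sequentially on one machine; return the makespan."""
    events, clock = [], 0.0
    for name, duration in jobs:
        clock += duration + delay          # machine finishes one job at a time
        heapq.heappush(events, (clock, name))
    finish_times = [heapq.heappop(events) for _ in range(len(events))]
    return finish_times[-1][0]             # time of the last completion

jobs = [("chip", 2.0), ("haul", 3.0), ("unload", 1.0)]
print(simulate(jobs))             # deterministic control case → 6.0
print(simulate(jobs, delay=0.5))  # each job delayed by 0.5 → 7.5
```

The two runs mirror the control case and the machine-delay scenario: with a single machine the delay propagates directly into the makespan; SimPy adds the process interleaving and resource contention that make the stochastic third scenario tractable.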
Abstract:
Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing pedestrian injury in children 0-14 years of age. A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population was children under 14 years; outcome measure is either pedestrian injury rates or observed child pedestrian or vehicle driver behaviour; and use of a community control or a historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, with meta-analysis not being possible due to the discrepancy in methods and measures between the studies.
Abstract:
We present unified, systematic derivations of schemes in the two known measurement-based models of quantum computation. The first model (introduced by Raussendorf and Briegel, [Phys. Rev. Lett. 86, 5188 (2001)]) uses a fixed entangled state, adaptive measurements on single qubits, and feedforward of the measurement results. The second model (proposed by Nielsen, [Phys. Lett. A 308, 96 (2003)] and further simplified by Leung, [Int. J. Quant. Inf. 2, 33 (2004)]) uses adaptive two-qubit measurements that can be applied to arbitrary pairs of qubits, and feedforward of the measurement results. The underlying principle of our derivations is a variant of teleportation introduced by Zhou, Leung, and Chuang, [Phys. Rev. A 62, 052316 (2000)]. Our derivations unify these two measurement-based models of quantum computation and provide significantly simpler schemes.
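The teleportation variant underlying these derivations can be sketched in its simplest one-qubit form (a standard textbook calculation in the spirit of the Zhou-Leung-Chuang primitive, not reproduced from the paper):

```latex
% One-bit teleportation: prepare an ancilla in |+>, entangle with a CZ
% gate, then measure the input qubit in the X basis (outcome m = 0, 1).
\begin{align*}
|\psi\rangle|+\rangle
  &= (a|0\rangle + b|1\rangle)\,\tfrac{1}{\sqrt{2}}\,(|0\rangle + |1\rangle)\\
\xrightarrow{\;\mathrm{CZ}\;}\;
  &\; a\,|0\rangle|+\rangle + b\,|1\rangle|-\rangle\\
\xrightarrow{\;\text{measure } X,\ \text{outcome } m\;}\;
  &\; X^{m} H\,|\psi\rangle \quad \text{on the second qubit.}
\end{align*}
```

The known byproduct operator \(X^{m}\) is undone by feedforward of the measurement result, which is exactly the mechanism both measurement-based models rely on: composing such steps implements arbitrary gates using only entanglement and adaptive measurements.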
Abstract:
Study Objective: Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing injury due to inadequate car seat restraint use in children 0-16 years of age. Methods: A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population was children aged 0-16 years; outcome measure was either injury rates due to motor vehicle crashes or observed changes in child restraint use; and use of a community control or historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, with meta-analysis not being possible due to the discrepancy in methods and measures between the studies. Results: This review found eight studies that met all the inclusion criteria. In the studies that measured injury outcomes, significant reductions in risk of motor vehicle occupant injury (33-55%) were reported in the study communities. For those studies reporting observed car seat restraint use, the community-based programs were successful in increasing toddler restraint use in 1-5 year old children by up to 11%; child booster seat use in 4-8 year old children by up to 13%; rear restraint use in children aged 0-15 years by 8%; restraint use in pre-school aged children in a high-risk community by 50%; and restraint use in children aged 5-11 years by 44%. Conclusion: While this review highlights that there is some evidence to support the effectiveness of community-based programs to promote car restraint use and/or reduce motor vehicle occupant injury, limitations in the evaluation methodologies of the studies require the results to be interpreted with caution.
There is clearly a need for further high quality program evaluation research to develop an evidence base. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to the understanding of the temporal behaviour of faults and single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of parameter spaces which influence microscopic and macroscopic physics of processes which may lead towards an answer to those questions. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in the power of computers and the ability to use the power of parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law without any rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of event sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between the events. In some of the larger events highly complex slip patterns are observed.
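The stick-slip dynamic at the heart of such simulations can be illustrated with the classic single spring-block slider (a toy reduction, not the Lattice Solid Model; the stiffness, driving velocity and friction thresholds are illustrative assumptions):

```python
# Minimal stick-slip sketch: a block dragged by a spring from a plate
# moving at constant velocity, with rate-independent static/dynamic
# friction, as in the paper's simple friction law. All parameter values
# (K, V_DRIVE, F_STATIC, F_DYNAMIC) are illustrative.
K, V_DRIVE = 1.0, 0.1            # spring stiffness, driving-plate velocity
F_STATIC, F_DYNAMIC = 1.0, 0.5   # friction thresholds

def run(steps):
    plate, block, events = 0.0, 0.0, []
    for t in range(steps):
        plate += V_DRIVE
        stress = K * (plate - block)
        if stress > F_STATIC:                 # frictional strength exceeded
            slip = (stress - F_DYNAMIC) / K   # slip until stress drops
            block += slip
            events.append((t, slip))          # record an "earthquake"
    return events

events = run(100)
print(len(events))  # number of slip events in 100 driving steps
```

With a homogeneous, rate-independent friction law this one-block system produces periodic, equal-sized events; the wide range of event sizes and the slip complexity reported in the paper emerge only once many interacting particles and a rough fault surface are included.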
Abstract:
The design, development, and use of complex systems models raises a unique class of challenges and potential pitfalls, many of which are commonly recurring problems. Over time, researchers gain experience in this form of modelling, choosing algorithms, techniques, and frameworks that improve the quality, confidence level, and speed of development of their models. This increasing collective experience of complex systems modellers is a resource that should be captured. Fields such as software engineering and architecture have benefited from the development of generic solutions to recurring problems, called patterns. Using pattern development techniques from these fields, insights from communities such as learning and information processing, data mining, bioinformatics, and agent-based modelling can be identified and captured. Collections of such 'pattern languages' would allow knowledge gained through experience to be readily accessible to less-experienced practitioners and to other domains. This paper proposes a methodology for capturing the wisdom of computational modellers by introducing example visualization patterns, and a pattern classification system for analysing the relationship between micro and macro behaviour in complex systems models. We anticipate that a new field of complex systems patterns will provide an invaluable resource for both practising and future generations of modellers.
Abstract:
We investigate the policies of (1) restricting social influence and (2) imposing curfews upon interacting citizens in a community. We compare and contrast their effects on the social order and the emerging levels of civil violence. Influence models have been used in the past in the context of decision making in a variety of application domains. The policy of curfews has been utilised with the aim of curbing social violence but little research has been done on its effectiveness. We develop a multi-agent-based model that is used to simulate a community of citizens and the police force that guards it. We find that restricting social influence does indeed pacify rebellious societies, but has the opposite effect on peaceful ones. On the other hand, our simple model indicates that restricting mobility through curfews has a pacifying effect across all types of society.
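The two policies can be contrasted in a deliberately small Epstein-style sketch (not the paper's model; the grievance threshold, influence weight, and the modelling of a curfew as a zero interaction radius are all illustrative assumptions):

```python
import random

# Minimal sketch: citizens on a line rebel when grievance plus social
# influence from visible rebels exceeds a threshold. A "curfew" is
# modelled by shrinking the interaction radius to zero. All parameters
# are illustrative, loosely following Epstein-style civil violence models.

def count_rebels(n, radius, seed=7):
    rng = random.Random(seed)
    grievance = [rng.random() for _ in range(n)]
    active = [g > 0.8 for g in grievance]       # initially rebellious
    for _ in range(20):
        nxt = list(active)
        for i in range(n):
            lo, hi = max(0, i - radius), min(n, i + radius + 1)
            neighbours = sum(active[lo:hi])     # visible rebels nearby
            # social influence: seeing rebels lowers the bar to rebel
            nxt[i] = grievance[i] + 0.1 * neighbours > 0.8
        active = nxt
    return sum(active)

print(count_rebels(100, radius=3))  # normal mixing
print(count_rebels(100, radius=0))  # curfew: no one sees anyone else
```

In this toy, cutting visibility can only remove the amplification term, so the curfew run never has more rebels than the mixed run; the paper's richer model is needed to capture the asymmetric effect of restricting influence on peaceful versus rebellious societies.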
Abstract:
In today's market, global competition has put manufacturing businesses under great pressure to respond rapidly to dynamic variations in demand patterns across products and changing product mixes. To achieve substantial responsiveness, the manufacturing activities associated with production planning and control must be integrated dynamically, efficiently and cost-effectively. This paper presents an iterative agent bidding mechanism, which performs dynamic integration of process planning and production scheduling to generate optimised process plans and schedules in response to dynamic changes in the market and production environment. The iterative bidding procedure is carried out based on currency-like metrics in which all operations (e.g. machining processes) to be performed are assigned virtual currency values, and resource agents bid for the operations if the costs incurred for performing them are lower than the currency values. The currency values are adjusted iteratively and resource agents re-bid for the operations based on the new set of currency values until the total production cost is minimised. A simulated annealing optimisation technique is employed to optimise the currency values iteratively. The feasibility of the proposed methodology has been validated using a test case, and the results obtained have proven that the method outperforms non-agent-based methods.
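The bid-then-adjust loop can be sketched for a single operation. Note one simplification: where the paper adjusts currency values by simulated annealing, this sketch uses a plain fixed-step decrement, and the agent costs and starting value are illustrative assumptions:

```python
# Minimal sketch (not the paper's mechanism): resource agents bid for an
# operation when their cost is below its virtual currency value; the value
# is lowered step by step until only the cheapest capable agent still
# bids. The paper drives this adjustment with simulated annealing instead.

def auction(costs, start_value=100.0, step=1.0):
    """Return (winning cost, final currency value) for one operation."""
    value = start_value
    while True:
        bidders = [c for c in costs if c < value]  # agents willing to bid
        if len(bidders) <= 1:
            return (bidders[0] if bidders else None), value
        value -= step  # tighten the budget and re-bid

cost, value = auction([40.0, 55.0, 70.0])
print(cost, value)  # → 40.0 55.0
```

Lowering the currency value squeezes out expensive bidders, so the surviving bid is the cheapest feasible assignment; annealing generalises this to many coupled operations where greedy decrements would get stuck in local optima.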
Abstract:
Multi-agent systems are complex systems comprised of multiple intelligent agents that act either independently or in cooperation with one another. Agent-based modelling is a method for studying complex systems like economies, societies, ecologies etc. Due to their complexity, very often mathematical analysis is limited in its ability to analyse such systems. In this case, agent-based modelling offers a practical, constructive method of analysis. The objective of this book is to shed light on some emergent properties of multi-agent systems. The authors focus their investigation on the effect of knowledge exchange on the convergence of complex, multi-agent systems.
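The effect of knowledge exchange on convergence can be shown with a DeGroot-style averaging toy (an illustrative rule, not the book's specific model): each agent repeatedly pulls its state toward the group average, and disagreement shrinks geometrically.

```python
# DeGroot-style sketch: each round, every agent replaces its "knowledge"
# state with the midpoint of its own state and the group average, so the
# spread between the most extreme agents halves every round.

def spread(states, rounds):
    for _ in range(rounds):
        avg = sum(states) / len(states)
        states = [(s + avg) / 2 for s in states]  # pull toward the mean
    return states

states = spread([0.0, 4.0, 8.0], rounds=10)
print(max(states) - min(states))  # → 0.0078125 (initial spread 8 halved 10x)
```

The emergent property is exactly the kind the book investigates: no agent computes the consensus value, yet repeated local exchange drives the whole system to it.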
Abstract:
This paper compares the UK/US exchange rate forecasting performance of linear and nonlinear models based on monetary fundamentals, to a random walk (RW) model. Structural breaks are identified and taken into account. The exchange rate forecasting framework is also used for assessing the relative merits of the official Simple Sum and the weighted Divisia measures of money. Overall, there are four main findings. First, the majority of the models with fundamentals are able to beat the RW model in forecasting the UK/US exchange rate. Second, the most accurate forecasts of the UK/US exchange rate are obtained with a nonlinear model. Third, taking into account structural breaks reveals that the Divisia aggregate performs better than its Simple Sum counterpart. Finally, Divisia-based models provide more accurate forecasts than Simple Sum-based models provided they are constructed within a nonlinear framework.
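The benchmark at the heart of this comparison is simple to state: a model "beats" the random walk if its out-of-sample error is lower. A sketch on made-up numbers (the rate series and the fundamentals-style forecasts below are entirely hypothetical, not the paper's data):

```python
import math

# Minimal sketch: compare one-step random-walk forecasts (tomorrow = today)
# with a fundamentals-style forecast on a toy series, by root-mean-square
# error. Both series are hypothetical illustration values.

rates = [1.50, 1.52, 1.51, 1.55, 1.58, 1.57, 1.60]  # toy UK/US rates
fundamental = [1.51, 1.52, 1.54, 1.56, 1.58, 1.59]  # toy model forecasts

def rmse(forecasts, actuals):
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecasts, actuals))
                     / len(actuals))

rw_forecasts = rates[:-1]  # random walk: forecast = last observed value
actuals = rates[1:]
print(rmse(rw_forecasts, actuals))
print(rmse(fundamental, actuals))  # lower RMSE means the model beats RW
```

In practice the comparison also requires accounting for structural breaks, as the abstract notes, since a break can make either model look spuriously good over a fixed sample.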
Abstract:
Simulation is an effective method for improving supply chain performance. However, there is limited advice available to assist practitioners in selecting the most appropriate method for a given problem. Much of the advice that does exist relies on custom and practice rather than a rigorous conceptual or empirical analysis. An analysis of the different modelling techniques applied in the supply chain domain was conducted, and the three main approaches to simulation used were identified; these are System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). This research has examined these approaches in two stages. Firstly, a first principles analysis was carried out in order to challenge the received wisdom about their strengths and weaknesses and a series of propositions were developed from this initial analysis. The second stage was to use the case study approach to test these propositions and to provide further empirical evidence to support their comparison. The contributions of this research are both in terms of knowledge and practice. In terms of knowledge, this research is the first holistic cross paradigm comparison of the three main approaches in the supply chain domain. Case studies have involved building ‘back to back’ models of the same supply chain problem using SD and a discrete approach (either DES or ABM). This has led to contributions concerning the limitations of applying SD to operational problem types. SD has also been found to have risks when applied to strategic and policy problems. Discrete methods have been found to have potential for exploring strategic problem types. It has been found that discrete simulation methods can model material and information feedback successfully. Further insights have been gained into the relationship between modelling purpose and modelling approach. 
In terms of practice, the findings have been summarised in the form of a framework linking modelling purpose, problem characteristics and simulation approach.