871 results for agent-based modeling simulation
Abstract:
While distributed intelligence has been spreading incrementally through the design of a great number of intelligent systems, Artificial Intelligence in Real-Time Strategy (RTS) games has remained a mostly centralized affair. Although turn-based games have attained AIs of world-class level, the fast-paced nature of RTS games has proven to be a significant obstacle to the quality of their AIs. Chapter 1 introduces RTS games, describing their characteristics, mechanics and elements. Chapter 2 introduces Multi-Agent Systems and the Beliefs-Desires-Intentions abstraction, analysing the possibilities offered by self-computing properties. Chapter 3 analyses the current state of AI development in RTS games, highlighting the struggles of the gaming industry to produce valuable AIs. The focus on improving the multiplayer experience has gravely impacted the quality of the AIs, leaving them with serious flaws that impair their ability to challenge and entertain players. Chapter 4 explores different aspects of AI development for RTS games, evaluating the potential strengths and weaknesses of an agent-based approach and analysing which aspects can benefit the most compared with centralized AIs. Chapter 5 describes a generic agent-based framework for RTS games in which every game entity becomes an agent, each with its own knowledge and set of goals. Different aspects of the game, such as economy, exploration and warfare, are also analysed, and agent-based solutions for them are outlined. The possible exploitation of self-computing properties to organize the agents' activity efficiently is then examined. Chapter 6 presents the design and implementation of an AI for an existing open-source game in beta development: 0 A.D., a historical RTS game about ancient warfare featuring a modern graphical engine and evolved mechanics. The entities of the conceptual framework are implemented in a new agent-based platform, called ABot, seamlessly nested inside the existing game engine and described extensively in Chapters 7, 8 and 9. Chapters 10 and 11 cover the design and realization of a new agent-based language for defining behavioural modules for the agents in ABot, paving the way for a wider spectrum of contributors. Chapter 12 concludes the work by analysing the outcome of tests meant to evaluate strategies, realism and raw performance, and conclusions and future work are drawn in Chapter 13.
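Chapter 5's idea of one BDI-style agent per game entity can be illustrated with a small, purely hypothetical Python sketch; the class and method names below are invented for illustration and are not ABot's API.

# Hypothetical sketch of a per-entity BDI-style agent for an RTS game.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UnitAgent:
    unit_id: int
    beliefs: dict = field(default_factory=dict)   # local knowledge of the game state
    desires: list = field(default_factory=list)   # candidate goals, e.g. "gather", "explore"
    intention: Optional[str] = None               # goal currently committed to

    def perceive(self, visible_state: dict) -> None:
        # Update local beliefs from what this unit can actually observe.
        self.beliefs.update(visible_state)

    def deliberate(self) -> None:
        # Commit to the first desire whose precondition holds in the beliefs.
        for goal in self.desires:
            if self.beliefs.get(f"can_{goal}", False):
                self.intention = goal
                return
        self.intention = None

    def act(self) -> str:
        # Translate the committed intention into a game command.
        return f"unit {self.unit_id}: {self.intention or 'idle'}"

# One deliberation cycle for a worker unit.
worker = UnitAgent(1, desires=["gather", "explore"])
worker.perceive({"can_gather": True})
worker.deliberate()
print(worker.act())   # unit 1: gather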
Abstract:
Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed. This trend is expected to continue in the future. The development of information and protocol management systems that can meet this challenge is therefore becoming an essential enabling factor for all actors in the field. Custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise for the development of the laboratory infrastructure are not fully satisfactory, because they create undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables domain experts to express laboratory protocols using proper domain knowledge, free from the influence and mediation of software implementation artefacts. In the proposed system this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent execution platform specialized for biomedical laboratories.
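The model-to-execution idea can be caricatured in a few lines of Python; the protocol vocabulary, step names and transformation below are invented for illustration and are not the proposed system's ontology or MDA toolchain.

# Toy model-to-execution transformation: a protocol written in domain terms is
# turned into executable task objects for a lab agent. Vocabulary is invented.
PROTOCOL_MODEL = [
    {"step": "centrifuge", "sample": "S1", "minutes": 10},
    {"step": "incubate",   "sample": "S1", "minutes": 30},
]

def centrifuge(sample, minutes): return f"centrifuging {sample} for {minutes} min"
def incubate(sample, minutes):   return f"incubating {sample} for {minutes} min"

ACTIONS = {"centrifuge": centrifuge, "incubate": incubate}

def compile_protocol(model):
    # Transform each declarative step into a callable task for the execution agent.
    return [lambda s=step: ACTIONS[s["step"]](s["sample"], s["minutes"]) for step in model]

for task in compile_protocol(PROTOCOL_MODEL):
    print(task())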
Abstract:
Co-culture systems consisting of outgrowth endothelial cells (OEC) and primary osteoblasts (pOB) represent a promising instrument to mimic the natural conditions in bone repair processes and provide a new concept for developing constructs for bone replacement. Furthermore, co-culture of OEC and pOB could provide new insights into the molecular and cellular mechanisms that control essential processes during bone repair. The present study described several advantages of the co-culture of pOB and OEC for bone tissue engineering applications, including beneficial effects on the angiogenic activation of OEC, as well as on the assembly of basement membrane matrix molecules and factors involved in vessel maturation and stabilization. The ongoing angiogenic process in the co-culture system proceeded during the course of co-cultivation and correlated with the upregulation of essential angiogenic factors, such as VEGF, angiopoietins, basement membrane molecules and mural cell-specific markers. Furthermore, the co-culture system appeared to maintain osteogenic differentiation capacity.

Additional treatment of co-cultures with growth factors or morphogens might accelerate and improve bone formation and could furthermore be useful for potential clinical applications. In this context, the present study highlights the central role of the morphogen sonic hedgehog (Shh), which has been shown to affect angiogenic activation as well as osteogenic differentiation in the co-culture model of OEC and pOB. Treatment of co-cultures with sonic hedgehog resulted in an increased formation of microvessel-like structures as early as after 24 hours. This proangiogenic effect was induced by the upregulation of the proangiogenic factors VEGF, angiopoietin 1 and angiopoietin 2. In contrast to treatment with a commonly used proangiogenic agent, VEGF, Shh stimulation induced an increased expression of factors associated with vessel maturation and stabilization, mediated through the upregulation of growth factors that are strongly involved in pericyte differentiation and recruitment, including PDGF-BB and TGF-beta. In addition, Shh treatment of co-cultures also resulted in an upregulation of osteogenic differentiation markers such as alkaline phosphatase, osteocalcin, osteonectin and osteopontin, as well as increased matrix calcification. This was a result of the upregulation of the osteogenic differentiation-regulating factors BMP2 and RUNX2, which could be assessed in response to Shh treatment.
Abstract:
Computer-aided modelling approaches that design and generate logistics systems are a highly complex task. The planning and control models for intralogistics systems that exist in practice so far show weaknesses with respect to current and future requirements such as mastering complexity, responsiveness and adaptability. Multi-agent systems are an innovative approach to meeting these demands. With their decentralized and modular character, they are well suited to a complex problem with a low degree of structure. In addition, these computer-based intelligent systems offer users simple, low-effort handling.
Abstract:
The high complexity of cellular intralogistics systems and their control architecture suggests the use of modern simulation and visualization techniques so that statements about the performance and future viability of a planned system can be made in advance. This work presents a concept for a simulation system for VR-based control verification of cellular intralogistics systems. The creation of a simulation model for an existing real-world facility is described, and an overview of the components of the simulation is given, in particular the connection to the control system of the real agent-based installation.
Abstract:
To ensure fast and flexible adaptation to changing requirements, in-house material supply concepts must be made increasingly flexible. This improves the achievement of logistics objectives in a dynamic production environment. This paper presents a concept for adaptive material supply in flexible production systems based on agent-based transport planning and control. The focus is on planning and controlling the in-house transports triggered by material demand messages. In addition to shuttle tours for supplying the production system, the dynamic pickup-and-delivery problem is also taken into account. The presented concept is aligned with the requirements of self-organizing production processes.
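Purely as an illustration (the paper does not specify its allocation mechanism), a common agent-based way to assign such demand-triggered transport orders is a contract-net style auction in which vehicle agents bid on material demand messages; a minimal Python sketch under that assumption follows.

# Illustrative contract-net style assignment of a pickup-and-delivery order.
# The allocation rule and cost function are assumptions, not the paper's method.
def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def assign_order(order, vehicles):
    # Give the order to the vehicle with the cheapest bid (empty-travel distance).
    bids = {vid: manhattan(pos, order["pickup"]) for vid, pos in vehicles.items()}
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

vehicles = {"AGV-1": (0, 0), "AGV-2": (5, 2)}      # current vehicle positions
order = {"pickup": (4, 2), "delivery": (9, 7)}     # triggered by a material demand message
print(assign_order(order, vehicles))               # ('AGV-2', 1)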
Abstract:
The shift from host-centric to information-centric networking (ICN) promises seamless communication in mobile networks. However, most existing works either consider well-connected networks with high node density or introduce modifications to ICN message processing for delay-tolerant networking (DTN). In this work, we present agent-based content retrieval, which provides information-centric DTN support as an application module without modifications to ICN message processing. This enables flexible interoperability in changing environments. If no content source can be found via wireless multi-hop routing, requesters may exploit the mobility of neighbor nodes (called agents) by delegating content retrieval to them. Agents that receive a delegation and move closer to content sources can retrieve the data and return it to requesters. We show that agent-based content retrieval may be even more efficient in scenarios where multi-hop communication is possible. Furthermore, we show that broadcast communication is not necessarily the best option, since dynamic unicast requests have little overhead and can better exploit short contact times between nodes (no broadcast delays are required for duplicate suppression).
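A toy Python sketch of the delegation idea; the class, message flow and names are simplified assumptions rather than the paper's actual protocol.

# Toy sketch of agent-based content retrieval: a requester delegates a content
# name to a mobile neighbor ("agent"), which fetches it when it meets a source.
class Agent:
    def __init__(self, name):
        self.name = name
        self.delegations = set()   # content names accepted from requesters
        self.carried = {}          # name -> data retrieved on behalf of requesters

    def accept(self, content_name):
        self.delegations.add(content_name)

    def meet_source(self, source):
        # When moving near a content source, fetch any delegated names it holds.
        for name in list(self.delegations):
            if name in source:
                self.carried[name] = source[name]
                self.delegations.discard(name)

    def return_to(self, requester_wants):
        # Back in range of the requester, hand over whatever was retrieved.
        return {n: d for n, d in self.carried.items() if n in requester_wants}

agent = Agent("neighbor-7")
agent.accept("/videos/clip1")
agent.meet_source({"/videos/clip1": b"...data..."})
print(agent.return_to({"/videos/clip1"}))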
Abstract:
Information-centric networking (ICN) is a new communication paradigm that has been proposed to cope with drawbacks of host-based communication protocols, namely scalability and security. In this thesis, we base our work on Named Data Networking (NDN), a popular ICN architecture, and investigate NDN in the context of wireless and mobile ad hoc networks. In a first part, we focus on NDN efficiency (and potential improvements) in wireless environments by investigating NDN in wireless one-hop communication, i.e., without any routing protocols. A basic requirement to initiate information-centric communication is knowledge of existing and available content names. Therefore, we develop three opportunistic content discovery algorithms and evaluate them in diverse scenarios for different node densities and content distributions. After content names are known, requesters can retrieve content opportunistically from any neighbor node that provides the content. However, in case of short contact times to content sources, content retrieval may be disrupted. Therefore, we develop a requester application that keeps meta information about disrupted content retrievals and enables resume operations when a new content source has been found. Besides message efficiency, we also evaluate the power consumption of information-centric broadcast and unicast communication. Based on our findings, we develop two mechanisms to increase the efficiency of information-centric wireless one-hop communication. The first approach, called Dynamic Unicast (DU), avoids broadcast communication whenever possible, since broadcast transmissions result in more duplicate Data transmissions, lower data rates and higher energy consumption on mobile nodes that are not interested in overheard Data, compared to unicast communication. Hence, DU uses broadcast communication only until a content source has been found and then retrieves content directly via unicast from the same source. The second approach, called RC-NDN, targets the efficiency of wireless broadcast communication by reducing the number of duplicate Data transmissions. In particular, RC-NDN is a Data encoding scheme for content sources that increases diversity in wireless broadcast transmissions such that multiple concurrent requesters can profit from each other's (overheard) message transmissions. If requesters and content sources are not within one-hop distance of each other, requests need to be forwarded via multi-hop routing. Therefore, in a second part of this thesis, we investigate information-centric wireless multi-hop communication. First, we consider multi-hop broadcast communication in the context of rather static community networks. We introduce the concept of preferred forwarders, which relay Interest messages slightly faster than non-preferred forwarders to reduce redundant duplicate message transmissions. While this approach works well in static networks, the performance may degrade in mobile networks where preferred forwarders may regularly move away. Thus, to enable routing in mobile ad hoc networks, we extend DU for multi-hop communication. Compared to one-hop communication, multi-hop DU requires efficient path update mechanisms (since multi-hop paths may expire quickly) and new forwarding strategies to maintain NDN benefits (request aggregation and caching) such that only a few messages need to be transmitted over the entire end-to-end path even in case of multiple concurrent requesters.
To perform quick retransmission in case of collisions or other transmission errors, we implement and evaluate retransmission timers from related work and compare them to CCNTimer, which is a new algorithm that enables shorter content retrieval times in information-centric wireless multi-hop communication. Yet, in case of intermittent connectivity between requesters and content sources, multi-hop routing protocols may not work because they require continuous end-to-end paths. Therefore, we present agent-based content retrieval (ACR) for delay-tolerant networks. In ACR, requester nodes can delegate content retrieval to mobile agent nodes, which move closer to content sources, can retrieve content and return it to requesters. Thus, ACR exploits the mobility of agent nodes to retrieve content from remote locations. To enable delay-tolerant communication via agents, retrieved content needs to be stored persistently such that requesters can verify its authenticity via original publisher signatures. To achieve this, we develop a persistent caching concept that maintains received popular content in repositories and deletes unpopular content if free space is required. Since our persistent caching concept can complement regular short-term caching in the content store, it can also be used for network caching to store popular delay-tolerant content at edge routers (to reduce network traffic and improve network performance) while real-time traffic can still be maintained and served from the content store.
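A minimal Python sketch of the popularity-based persistent-cache idea described above; the least-requested-first eviction rule shown here is an assumption for illustration, not necessarily the thesis's exact policy.

# Sketch of a persistent repository that keeps popular content and evicts the
# least-requested items when space is needed (assumed policy, for illustration).
class PersistentCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}    # content name -> data
        self.hits = {}     # content name -> request count

    def request(self, name):
        if name in self.store:
            self.hits[name] += 1
            return self.store[name]
        return None

    def insert(self, name, data):
        if len(self.store) >= self.capacity:
            victim = min(self.hits, key=self.hits.get)   # least popular entry
            del self.store[victim], self.hits[victim]
        self.store[name] = data
        self.hits[name] = 1

cache = PersistentCache(capacity=2)
cache.insert("/a", b"A"); cache.insert("/b", b"B")
cache.request("/a")                      # /a becomes more popular than /b
cache.insert("/c", b"C")                 # evicts /b
print(sorted(cache.store))               # ['/a', '/c']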
Abstract:
The social processes that lead to destructive behavior in celebratory crowds can be studied through an agent-based computer simulation. Riots are an increasingly common outcome of sports celebrations, and pose the potential for harm to participants, bystanders, property, and the reputation of the groups with whom participants are associated. Rioting cannot necessarily be attributed to the negative emotions of individuals, such as anger, rage, frustration and despair. For instance, the celebratory behavior (e.g., chanting, cheering, singing) during UConn’s “Spring Weekend” and after the 2004 NCAA Championships resulted in several small fires and overturned cars. Further, not every individual in the area of a riot engages in violence, and those who do, do not do so continuously. Instead, small groups carry out the majority of violent acts in relatively short-lived episodes. Agent-based computer simulations are an ideal method for modeling complex group-level social phenomena, such as celebratory gatherings and riots, which emerge from the interaction of relatively “simple” individuals. By making simple assumptions about individuals’ decision-making and behaviors and allowing actors to affect one another, behavioral patterns emerge that cannot be predicted by the characteristics of individuals. The computer simulation developed here models celebratory riot behavior by repeatedly evaluating a single algorithm for each individual, the inputs of which are affected by the characteristics of nearby actors. Specifically, the simulation assumes that (a) actors possess 1 of 5 distinct social identities (group memberships), (b) actors will congregate with actors who possess the same identity, (c) the degree of social cohesion generated in the social context determines the stability of relationships within groups, and (d) actors’ level of aggression is affected by the aggression of other group members. Not only does this simulation provide a systematic investigation of the effects of the initial distribution of aggression, social identification, and cohesiveness on riot outcomes, but also an analytic tool others may use to investigate, visualize and predict how various individual characteristics affect emergent crowd behavior.
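The per-actor update idea behind assumptions (a) through (d) can be illustrated with a toy Python fragment; the weights and neighborhood definition are assumptions, and the actual simulation's algorithm is more elaborate.

# Toy version of the per-actor aggression update: an actor's aggression drifts
# toward the mean aggression of nearby same-identity actors. Weights are assumed.
import random

def update_aggression(actor, neighbors, social_influence=0.3):
    same_group = [n for n in neighbors if n["identity"] == actor["identity"]]
    if not same_group:
        return actor["aggression"]
    group_mean = sum(n["aggression"] for n in same_group) / len(same_group)
    return (1 - social_influence) * actor["aggression"] + social_influence * group_mean

random.seed(1)
crowd = [{"identity": random.randrange(5), "aggression": random.random()} for _ in range(50)]
actor = crowd[0]
print(round(update_aggression(actor, crowd[1:]), 3))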
Abstract:
- Summary: The hypothesis driving this doctoral thesis is that some characteristics of the urban environment, in particular those describing the accessibility of its public space network, could be related to the proportion of walking trips, or modal split, of each zone or neighborhood of Madrid. One of the starting points of this hypothesis is that the urban environment has a greater influence on walking trips than on other transport modes, for example bicycle or public transport trips; it seems reasonable to assume that the latter are more conditioned by, for example, the availability of cycle paths in the first case, or by the existence of a reliable, high-quality service in the second. Another motivation of the work is that research in this field of public space accessibility, specifically the so-called "Space Syntax", has repeatedly proven the influence of the public space network on how the intensity of pedestrian traffic is distributed across the urban fabric, but no references have been found on the influence of this element on modal split. In accordance with the hypothesis and with previous work, a methodology based on empirical, quantitative analysis is proposed. Its aim is to check whether the public space network, independently of other variables such as land uses, and even of variables outside the built environment, such as socioeconomic ones, is or is not statistically related to the proportion of walking trips in urban areas. Statistical techniques are used to systematically check the association of the urban environment variables, termed independent variables, with the percentage of walking trips, the dependent variable. In general terms, the methodology is similar to that used in other works in this field such as those of CERVERO and KOCKLEMAN (1997), CERVERO and DUNCAN (2003), those mainly used in the general review of TRB (2005) or, more recently, ZEGRAS (2006) or CHATMAN (2009). Other methodological options, such as stated preference methods (see LOUVIERE, HENSHER and SWAIT, 2000) or agent-based analysis (PENN & TURNER, 2004), were discarded for a number of reasons too long to describe here. The case study is the metropolitan area of Madrid up to the M-50 ring road, that is, most of it, with an approximate size of 31x34 km and a population of 4,132,820 inhabitants (approximately 80% of the region's population). The main data sources are the 2004 Household Mobility Survey (EDM04) of the Consorcio Regional de Transportes de Madrid, the latest available (sample: >35,000 households, >95,000 individuals), and a spatial model of the metropolitan area integrating the model used to compute the Space Syntax indices and a Geographic Information System (GIS). The units of analysis, in this case spatial units, are the transport zones (mean population 7,063) and the neighborhoods (mean population 26,466). The urban environment variables are clearly the core of the study. A total of 20 indices (out of 21) are selected from among the most relevant ones found in the review of the scientific literature in this field that were, at the same time, obtainable. Nine of them are used to describe land use characteristics, while another eleven describe the public space network. The latter include the configurational accessibility variables which are, as the title suggests, the core of the proposed study. Configurational accessibility is a special type of accessibility based on the configuration of the urban fabric, as defined by HILLIER (1996), the reference author within the Space Syntax line of research. Other public space network variables that are more common in mobility studies are also included, here called geometric characteristics of the network elements, such as their length, intersection type, connectivity, etc. Finally, a socioeconomic variable, i.e. one external to the urban environment, is also included in order to assess the influence of external factors, since several of them can have an impact on the decision to walk (age, gender, educational level, income, motorization rate, etc.). The association between variables is established using (bivariate) correlation analysis and multivariate models. The former are calculated pairwise between each of the 21 independent variables and the dependent one, the percentage of walking trips. As for the latter, three types of studies are carried out: a general linear multivariate model, a general curvilinear multivariate model and discriminant analysis. All of them are able to generate association models among several variables, making it possible to assess fairly precisely to what extent each model reproduces the behavior of the dependent variable and, in addition, the weight or influence of each variable in the model relative to the others. The main results of the study are expressed in two alternative final models, which show a significant association with the percentage of walking trips (R2 = 0.6789, p < 0.0001), explaining two thirds of its variability. In these models, and in general throughout the study, three indices in particular show a constant influence and emerge as the main ones. Two of them, in agreement with many previous studies, correspond to density and land use mix. But the most novel aspect of the results is that the third is a measure of the accessibility of the public space network, something for which there were no references until now. But what is the precise definition and relative weight of each of them in the model, i.e. on the dependent variable? The one with the greatest weight in most of the analyses is the total density index (no. of residents + no. of jobs + no. of students / ha). That is, a density not only of population but also including some of the most important activities that an area can host to generate walking mobility. The second largest weight, which even becomes the largest in some of the statistical analyses performed, belongs to the configurational accessibility index called radius 5 integration. It is a measure of the accessibility of the area, of its centrality, at the scale of, roughly, a district or county. As for the third, it is considerably less important than the previous two; it represents the mix of uses. Specifically, it is a measure of the balance between specialized retail shops and the number of residents (no. of shops specializing in food, drink and tobacco / no. of inhabitants). These results therefore confirm a good part of those of previous studies, especially those concerning land uses, but at the same time they suggest that the public space network could have a greater influence than demonstrated so far on the proportion of pedestrians relative to the other transport modes. The reasons why this may be so are discussed at length in the conclusions. Finally, it should be specified that this main conclusion refers to single-stage (non-multimodal) trips taking place in the neighborhoods and zones of the Madrid metropolitan area. Of course, this conclusion currently has limited validity, since it is the result of a single case — Abstract The research hypothesis for this Ph.D. thesis is that some characteristics of the built environment, particularly those describing the accessibility of the public space network, could be associated with the proportion of pedestrians in all trips (modal split) found in the different parts of a city. The underlying idea is that walking trips are more sensitive to the built environment than those by other transport modes, such as bicycle or public transport, which may be more conditioned by, e.g., infrastructure availability or service frequency and quality. On the other hand, it has to be noted that previous research in this field, in particular within Space Syntax, to which this study can be referred, has tested similar hypotheses using pedestrian volumes as the dependent variable, but never against modal split. According to this hypothesis, the research methodology is based primarily on empirical quantitative analysis, and it is meant to assess whether the public space network, regardless of other built environment and non-built environment variables, could have a relationship with the proportion of pedestrian trips in urban areas. Statistical techniques are used to check the association of the independent variables with the percentage of walking in all trips, the dependent one. Broadly speaking, this methodology is similar to that of previous studies in the field such as CERVERO & KOCKLEMAN (1997), CERVERO & DUNCAN (2003), or to those used mainly in the general review of TRB (2005) or, more recently, in ZEGRAS (2006) or CHATMAN (2009). Other methodological options, such as stated choice methods (see LOUVIERE, HENSHER & SWAIT, 2000) or agent-based analysis (PENN & TURNER, 2004), were discarded due to a number of reasons too long to be described here. The case study is not the entire Madrid metropolitan area, but almost (4,132,820 inhabitants, about 80% of the region's population). The main data sources are the Regional Mobility Home-Based Survey 2004 (EDM04), which is the latest available (sample: >35,000 families, >95,000 individuals), and a spatial model of the metropolitan area developed using Space Syntax and GIS techniques. The analysis units, in this case spatial units, are both transport zones (mean population = 7,063) and neighborhoods (mean population = 26,466). The variables of the built environment are clearly the core of the study. A total of 20 (out of 21) are selected from among those found in the literature while, at the same time, being accessible.
Nine of them are used to describe land use characteristics, while another eleven describe the network of public spaces. The latter include configurational accessibility, or Space Syntax, variables. This is a particular sort of accessibility related to the concept of configuration defined by HILLIER (1996), one of the main authors of Space Syntax. But they also include more customary variables used in mobility research to describe urban design or spatial structure (here, the public space network), which are here called geometric characteristics of the network elements, such as length, type of intersection, connectivity, density, etc. Finally, a single socioeconomic variable was included in order to assess the influence of non-built-environment factors that may also have an impact on walking (age, income, motorization rate, etc.). The association among variables is worked out using bivariate correlation analysis and multivariate analysis. Correlations are calculated between the 21 independent variables and the dependent one, the percentage of walking trips. Then, three types of multivariate studies are run: general linear, curvilinear and discriminant multivariate analysis. These are fully capable of generating complex association models among several variables, assessing quite precisely to what extent each model reproduces the behavior of the dependent variable, and also the weight or influence of each variable in the model. This study's results show a consistent influence of three particular indices in the two final alternative models of the multivariate study (best: R2 = 0.6789, p < 0.0001). Not surprisingly, two of them correspond to density and mix of land uses. But perhaps more interesting is that the third one is a measure of the accessibility of the public space network, a variable less prominent in the literature up to now. Some additional precision about them and their relative weight may also be of interest. The density index is not only about population but includes the most important activities in an area (no. of residents + no. of jobs + no. of students / ha). The configurational index (radius 5 integration) is a measure of the accessibility of the area, i.e. its centrality, at the scale of, more or less, a district. The mix of land uses index is a measure of the balance between retail, in fact local basic retail, and the number of residents (no. of convenience shops / no. of residents). Regarding their weights, the configurational index (radius 5 integration) gets the highest standardized coefficient of the final equation. However, in the final equations there are more indices coming from the density and land use mix categories than from the public space network. Therefore, these findings seem to support part of the field's knowledge, especially the findings concerning land uses, but at the same time they seem to bring in the idea that the configuration of the urban grid could have an influence on the proportion of walkers (as a share of total trips by any transport mode) making single-stage trips in the neighborhoods of Madrid, Spain. Of course, this conclusion has, at present, limited validity, since it is the result of a single case. The reasons why this may be so are discussed in the last part of the thesis.
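A schematic Python example of the kind of multivariate linear model used; the variable names, synthetic data and coefficients below are invented and do not correspond to the thesis's actual indices or estimates.

# Schematic OLS fit of walking share against density, land-use mix and a
# configurational accessibility index. Data and names are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
density = rng.uniform(50, 600, n)          # residents + jobs + students per ha
land_use_mix = rng.uniform(0, 1, n)        # balance of local retail vs residents
integration_r5 = rng.uniform(0, 3, n)      # radius-5 integration (Space Syntax)
walk_share = (0.03 * density / 100 + 0.1 * land_use_mix
              + 0.08 * integration_r5 + rng.normal(0, 0.05, n))

X = np.column_stack([np.ones(n), density, land_use_mix, integration_r5])
beta, *_ = np.linalg.lstsq(X, walk_share, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((walk_share - pred) ** 2) / np.sum((walk_share - walk_share.mean()) ** 2)
print("coefficients:", np.round(beta, 4), "R2:", round(r2, 3))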
Abstract:
This paper presents the results obtained with a new agent-based computer model that can simulate the evacuation of narrow-body transport airplanes under the conditions prescribed by the airworthiness regulations for certification. The model, described in detail in an earlier paper, has been verified with real data from narrow-body certification demonstrations. Numerical simulations of around 20 narrow-body aircraft, representative of current designs in various market segments, show the capabilities of the model and provide relevant information on the relationship between cabin features and emergency evacuation. The longitudinal location of emergency exits appears to be even more important than their size or the overall margin with respect to the prescribed number and type of exits indicated by the airworthiness requirements.
Abstract:
Learning analytics is the analysis of static and dynamic data extracted from virtual learning environments in order to understand and optimize the learning process. Generally, this dynamic data is generated by the interactions that take place in the virtual learning environment. At present, many implementations for grouping this data have been proposed, but there is no consensus yet on which interactions and groups must be measured and analyzed. There is also no agreement on what the influence of these interactions, if any, is on learning outcomes, academic performance or student success. This study presents three different extant interaction typologies in e-learning and analyzes the relation of their components with students' academic performance. The three classifications are based on the agents involved in the learning process, the frequency of use and the participation mode, respectively. The main findings from the research are: a) that agent-based classifications offer a better explanation of student academic performance; b) that at least one component in each typology predicts academic performance; and c) that student-teacher and student-student, evaluating students, and active interactions, respectively, have a significant impact on academic performance, while the other interaction types are not significantly related to academic performance.
Abstract:
We investigate optimal strategies to defend valuable goods against the attacks of a thief. Given the value of the goods and the probability of success for the thief, we look for the strategy that assures the largest benefit to each player irrespective of the strategy of his opponent. Two complementary approaches are used: agent-based modeling and game theory. It is shown that the compromise between the value of the goods and the probability of success defines the mixed Nash equilibrium of the game, which is compared with the results of the agent-based simulations and discussed in terms of the system parameters.
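For a 2x2 defend/attack game of this general form, the mixed Nash equilibrium can be computed by making each player indifferent between their pure strategies; a small worked Python sketch with invented payoffs (not the paper's parameterization) follows.

# Mixed Nash equilibrium of a 2x2 defender/thief game via indifference conditions.
# Payoff numbers are invented for illustration; they are not taken from the paper.
# Rows: defender guards good A / good B.  Columns: thief attacks A / B.
D = [[4, -2],    # defender's payoffs
     [-3, 5]]
T = [[-4, 2],    # thief's payoffs
     [3, -5]]

# Defender guards A with probability q, chosen so the thief is indifferent
# between attacking A and attacking B:
#   q*T[0][0] + (1-q)*T[1][0] = q*T[0][1] + (1-q)*T[1][1]
q = (T[1][1] - T[1][0]) / (T[0][0] - T[0][1] - T[1][0] + T[1][1])

# Thief attacks A with probability p, chosen so the defender is indifferent
# between guarding A and guarding B.
p = (D[1][1] - D[0][1]) / (D[0][0] - D[0][1] - D[1][0] + D[1][1])

print(f"defender guards A with prob {q:.3f}, thief attacks A with prob {p:.3f}")
# -> defender guards A with prob 0.571, thief attacks A with prob 0.500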
Abstract:
There are currently applications that make it possible to simulate the behavior of bacteria in different habitats, and the processes that take place in them, in order to facilitate their study and experimentation without the need for a laboratory. One of the most widely used open-source applications for simulating bacterial populations is iDynoMiCS (individual-based Dynamics of Microbial Communities Simulator), an agent-based simulator that allows working with several computational models of bacteria in 2D and 3D. This simulator offers great freedom in configuring a large number of variables regarding the environment, chemical reactions and other important details. One important feature is the ability to simulate, in a simple way, plasmid conjugation between bacteria. Plasmids are DNA molecules distinct from the cell's chromosome, generally circular, that replicate, transcribe and conjugate independently of chromosomal DNA. They are normally present in prokaryotic bacteria and occasionally in eukaryotes; in the latter type of cell, however, they are called episomes. Given the complex behavior of plasmids and the range of possibilities they present as mechanisms external to the cell's basic functioning, in most cases conferring various evolutionary advantages such as antibiotic resistance, their study and subsequent manipulation is important. However, the operational framework of iDynoMiCS, as far as plasmid simulation is concerned, is too simple and does not allow operations more complex than the analysis of the spread of a plasmid in the community. The present work arises to resolve this deficiency of iDynoMiCS. Here, the modifications required for iDynoMiCS to simulate plasmid conjugation satisfactorily and more realistically are analyzed, developed and implemented, also enabling the resolution of various plasmid-based logic operations, such as genetic circuits. The results obtained are also analyzed against several relevant studies and compared with the results obtained with the original iDynoMiCS code. Additionally, a study comparing the detection efficiency of a substance by two different genetic circuits is analyzed. This work may also be of interest to the LIA group of the Facultad de Informática of the Universidad Politécnica de Madrid, which is participating in the European project BACTOCOM, focused on the study of plasmid conjugation and genetic circuits. --ABSTRACT-- Currently there are applications that simulate the behavior of bacteria in different habitats and the ongoing processes inside them to facilitate their study and experimentation without the need for an actual laboratory. One of the most used open-source applications to simulate bacterial populations is iDynoMiCS (individual-based Dynamics of Microbial Communities Simulator), an agent-based simulator that allows working with several computer models of 2D and 3D bacteria in biofilms. This simulator allows great freedom by means of a large number of configurable variables regarding the environment, chemical reactions and other important details of the simulation. Within these capabilities there exists a very basic framework to simulate plasmid conjugation.
Plasmids are DNA molecules physically distinct from the cell's chromosome, commonly found as small circular, double-stranded DNA molecules that are replicated, conjugated and transcribed independently of chromosomal DNA. They are normally present in prokaryotes and sometimes in eukaryotes, in which case they are called episomes. Plasmids are mechanisms external to the cell's basic operations and, as such, in the majority of cases confer various evolutionary advantages to the host cell, such as antibiotic resistance. It is imperative to further study plasmids and the possibilities they present. However, the operational framework of the iDynoMiCS plasmid simulation is too simple and does not allow operations more complex than the analysis of the spread of a plasmid in the community. This project was conceived to resolve this particular deficiency in iDynoMiCS; this paper discusses, develops and implements the necessary changes to the iDynoMiCS simulation software so that it can satisfactorily and realistically simulate plasmid conjugation and make it possible to solve various logic operations, such as plasmid-based genetic circuits. Moreover, the results obtained are analyzed and compared with other relevant studies and with those obtained with the original iDynoMiCS code. In addition, a study detailing the sensing of a substance with two different genetic circuits is presented. This work may also be relevant to the LIA group of the Faculty of Informatics of the Polytechnic University of Madrid, which is participating in the European project BACTOCOM, focusing on the study of plasmid conjugation and genetic circuits.
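A toy Python fragment showing the kind of per-contact conjugation rule such an extension implements; the transfer probability and attribute names are invented and are not iDynoMiCS parameters.

# Toy per-contact plasmid transfer rule for an agent-based bacterial model.
# The transfer probability and attribute names are illustrative assumptions.
import random

def conjugate(donor, recipient, transfer_prob=0.1):
    # Transfer plasmids the recipient lacks, with a fixed per-contact probability.
    for plasmid in donor["plasmids"]:
        if plasmid not in recipient["plasmids"] and random.random() < transfer_prob:
            recipient["plasmids"].add(plasmid)

random.seed(42)
donor = {"id": "cellA", "plasmids": {"pAMP"}}        # carries a resistance plasmid
recipient = {"id": "cellB", "plasmids": set()}
for _ in range(20):                                   # repeated contacts over time steps
    conjugate(donor, recipient)
print(recipient["plasmids"])                          # likely {'pAMP'} after 20 contacts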
Abstract:
Introduction: Most actions to promote leisure-time physical activity in populations have shown small or nonexistent effect sizes, or inconsistent results. Approaching the problem from a systems perspective may be one way to overcome this mismatch. Objective: To develop an agent-based model to investigate how population-level patterns of leisure-time physical activity in adults take shape and evolve from the interaction between psychological attributes of individuals and attributes of the built and social environments in which they live. Methods: The modelling process comprised three stages: elaboration of a conceptual map based on a literature review and consultation with experts; creation and verification of the model algorithm; and parameterization and consistency and sensitivity analysis. The results of the literature review were consolidated and reported according to the search domains (psychological aspects, social environment and built physical environment). The quantitative results of the expert consultation were described by means of frequencies, and the content of the answers to open questions was analyzed and compiled by the author of this thesis. The model algorithm was created in the NetLogo software, version 5.2.1, following a verification protocol to ensure that the algorithm was implemented accurately. In the consistency and sensitivity analyses, the Vargha-Delaney A test, partial rank correlation coefficient, boxplots, and line and scatter plots were used. Results: The elements of the conceptual map were defined as the person's intention, the behavior of nearby people and of the community, and the perception of the quality of, access to, and activities available at places where leisure-time physical activity can be practiced. The model represents a hypothetical community containing two types of agents: people and places where leisure-time physical activity can be practiced. People interact with each other and with the built environment, generating population-level temporal trends in leisure-time physical activity and intention. The sensitivity analyses indicated that the temporal trends in leisure-time physical activity and intention are highly sensitive to the influence of the person's current behavior on their future intention, to the size of the person's perception radius, and to the proportion of places where leisure-time physical activity can be practiced. Final considerations: The conceptual map and the agent-based model proved adequate for investigating how population-level patterns of leisure-time physical activity in adults take shape and evolve. The influence of the person's behavior on their intention, the size of the person's perception radius, and the proportion of places where leisure-time physical activity can be practiced are important determinants of how population-level patterns of leisure-time physical activity among adults take shape and evolve in the model.
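A toy Python fragment of the intention-update idea sketched above; the functional form and weights are assumptions for illustration, and the actual NetLogo model's rules differ.

# Toy intention update for one person: intention is pulled toward own recent
# behaviour, neighbours' behaviour, and perceived availability of venues nearby.
# All weights are invented for illustration.
def update_intention(intention, was_active, neighbor_activity_rate, venues_in_radius,
                     w_self=0.4, w_social=0.3, w_env=0.3):
    target = (w_self * (1.0 if was_active else 0.0)
              + w_social * neighbor_activity_rate
              + w_env * min(venues_in_radius / 5.0, 1.0))
    return 0.8 * intention + 0.2 * target   # gradual adjustment toward the target

print(round(update_intention(0.5, True, 0.6, 3), 3))   # -> 0.552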