900 results for Stochastic agent-based models
Abstract:
This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, it first provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for Simple Object Access Protocol (SOAP), Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. This thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods, by providing a Web-based user interface and backend to ease the process of building and integrating emulators. These tools, plus the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated through the implementation of a Web-based workflow for predicting future crop yields in the UK, which also shows the capabilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
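The processing service framework itself is not reproduced in the abstract. As a rough illustration of the general pattern of exposing a model through a JSON interface, the sketch below wraps a trivial stand-in model behind an HTTP endpoint using Flask; the endpoint name, parameter names and the model function are all hypothetical and are not the framework described in the thesis.

    # Minimal sketch of exposing a model as a JSON web service (hypothetical endpoint
    # and parameters; not the WPS/processing-service framework described above).
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def crop_yield_model(rainfall_mm, temperature_c):
        """Stand-in model: any callable that maps inputs to outputs."""
        return 0.01 * rainfall_mm + 0.5 * temperature_c

    @app.route("/process", methods=["POST"])
    def process():
        params = request.get_json(force=True)        # e.g. {"rainfall_mm": 600, "temperature_c": 14}
        result = crop_yield_model(params["rainfall_mm"], params["temperature_c"])
        return jsonify({"yield_t_per_ha": result})   # JSON response consumable by workflow tools

    if __name__ == "__main__":
        app.run(port=5000)

A workflow engine or browser client would then POST a JSON document of inputs and receive a JSON document of outputs, which is the style of interaction the SOAP/WSDL/JSON support in the framework is meant to enable.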
Abstract:
Large-scale evacuations are a recurring theme on news channels, whether in response to major natural or manmade disasters. Warning dissemination plays a key part in the success of such large-scale evacuations, and its inadequacy in certain cases has been a 'primary contribution to deaths and injuries' (Hayden et al., 2007). Along with technology-driven 'official warning channels' (e.g. sirens, mass media), unofficial channels (e.g. neighbours, personal contacts, volunteer wardens) have proven to be significant in warning the public of the need to evacuate. Although post-evacuation studies identify the behaviours of evacuees as disseminators of the warning message, there has not been a detailed study that quantifies the effects of such behaviour on warning message dissemination. This paper develops an Agent-Based Simulation (ABS) model of multiple agents (evacuee households) in a hypothetical community to investigate the impact of behaviour as an unofficial channel on the overall warning dissemination. Parameters studied include the percentage of people who warn their neighbours, the efficiency of different official warning channels, and the delay time to warn neighbours. Even with a low proportion of people willing to warn their neighbours, the results showed a considerable impact on the overall warning dissemination. © 2012 Elsevier B.V. All rights reserved.
Abstract:
Timely warning of the public during large-scale emergencies is essential to ensure safety and save lives. This ongoing study proposes an agent-based simulation model to simulate warning message dissemination among the public, considering both official and unofficial channels. The proposed model was developed in NetLogo software for a hypothetical area, and requires input parameters such as the effectiveness of each official source (%), the estimated time to begin informing others, the estimated time to inform others, and the estimated percentage of people who do not relay the message. This paper demonstrates a means of factoring the behaviour of the public as informants into estimating the effectiveness of warning dissemination during large-scale emergencies. The model provides a tool for practitioners to test the potential impact of informal channels on the overall warning time and the sensitivity of the modelling parameters. The tool would also help practitioners persuade evacuees to disseminate the warning message by informing others, similar to the 'Run to thy neighbour' campaign conducted by the Red Cross.
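The NetLogo model itself is not available here; the following is a minimal Python sketch of the same idea, with invented parameter values, in which an official channel reaches a fraction of households each time step and households that are willing to warn relay the message to others after a delay.

    # Minimal sketch of warning dissemination with official and unofficial channels.
    # Parameter values are illustrative only, not those used in the studies above.
    import random

    random.seed(1)
    N = 1000                     # households
    P_OFFICIAL = 0.05            # chance per minute of being reached by sirens/media
    P_WILL_WARN = 0.30           # proportion of households willing to warn neighbours
    DELAY = 5                    # minutes before a warned household starts informing
    CONTACTS_PER_STEP = 1        # neighbours contacted per informant per minute

    willing = [random.random() < P_WILL_WARN for _ in range(N)]
    warned_at = {}               # household -> minute at which it received the warning

    for t in range(120):         # simulate two hours
        # Official channel: each unwarned household may be reached directly.
        for h in range(N):
            if h not in warned_at and random.random() < P_OFFICIAL:
                warned_at[h] = t
        # Unofficial channel: willing, already-warned households relay the message.
        informants = [h for h, t0 in warned_at.items() if willing[h] and t - t0 >= DELAY]
        for h in informants:
            for _ in range(CONTACTS_PER_STEP):
                target = random.randrange(N)   # a fixed neighbourhood graph would be used in practice
                warned_at.setdefault(target, t)

    print(f"warned after 2 hours: {len(warned_at)} of {N}")

Sweeping P_WILL_WARN, P_OFFICIAL and DELAY and recording how quickly warned_at fills up reproduces, in miniature, the kind of sensitivity experiment the papers describe.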
Abstract:
In-Motes Bins is an agent-based, real-time In-Motes application developed for sensing light and temperature variations in an environment. In-Motes is a mobile agent middleware that facilitates the rapid deployment of adaptive applications in Wireless Sensor Networks (WSNs). In-Motes Bins is based on the injection of mobile agents into the WSN that can migrate or clone following specific rules and performing application-specific tasks. Using In-Motes we were able to create and rapidly deploy our application on a WSN consisting of 10 MICA2 motes. Our application was tested in a wine store for a period of four months. In this paper we present the In-Motes Bins application and provide a detailed evaluation of its implementation. © 2007 IEEE.
Abstract:
Cleavage by the proteasome is responsible for generating the C terminus of T-cell epitopes. Modeling the process of proteasome cleavage as part of a multi-step algorithm for T-cell epitope prediction will reduce the number of non-binders and increase the overall accuracy of the predictive algorithm. Quantitative matrix-based models for prediction of the proteasome cleavage sites in a protein were developed using a training set of 489 naturally processed T-cell epitopes (nonamer peptides) associated with HLA-A and HLA-B molecules. The models were validated using an external test set of 227 T-cell epitopes. The performance of the models was good, identifying 76% of the C-termini correctly. The best model of proteasome cleavage was incorporated as the first step in a three-step algorithm for T-cell epitope prediction, where subsequent steps predicted TAP affinity and MHC binding using previously derived models.
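The quantitative matrices themselves are not given in the abstract. As an illustration of how a matrix-based cleavage model is typically applied, the sketch below scores every candidate C-terminal position in a protein with a position-specific weight matrix and keeps positions above a threshold; the matrix values and threshold are placeholders, not the published model.

    # Sketch of scoring candidate proteasome cleavage sites with a quantitative matrix.
    # Matrix weights and threshold are placeholders, not the trained model in the paper.
    import random

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    WINDOW = 9            # nonamer window ending at the candidate cleavage position
    random.seed(0)

    # Placeholder matrix: one weight for each (window position, residue) pair.
    matrix = {(pos, aa): random.uniform(-1.0, 1.0)
              for pos in range(WINDOW) for aa in AMINO_ACIDS}

    def cleavage_score(peptide):
        """Additive score of a nonamer whose last residue is the putative C terminus."""
        assert len(peptide) == WINDOW
        return sum(matrix[(pos, aa)] for pos, aa in enumerate(peptide))

    def predicted_cleavage_sites(protein, threshold=2.0):
        """Return 0-based positions predicted to be proteasome-generated C termini."""
        return [i for i in range(WINDOW - 1, len(protein))
                if cleavage_score(protein[i - WINDOW + 1:i + 1]) > threshold]

    protein = "".join(random.choice(AMINO_ACIDS) for _ in range(60))
    print(predicted_cleavage_sites(protein))

In the three-step algorithm described above, only peptides whose C terminus passes this kind of cleavage filter would be passed on to the TAP-affinity and MHC-binding prediction steps.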
Abstract:
In recent years, there has been an increasing interest in learning distributed representations of word senses. Traditional context-clustering-based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context-clustering-based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task, and 6 out of 13 subtasks in the analogical reasoning task.
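As a rough illustration of the two stages described (initialising a vector per sense from its gloss, then assigning contexts to the nearest sense), the toy sketch below uses randomly initialised word vectors in place of the convolutional gloss encoder; the vocabulary, glosses and dimensions are invented and much smaller than anything used in the paper.

    # Toy sketch of the two-stage idea: initialise each sense vector from its gloss,
    # then assign a context to the most similar sense. Word vectors here are random
    # stand-ins for the CNN sentence-level gloss embeddings used in the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 50
    vocab = {w: rng.normal(size=dim) for w in
             "financial river money water institution bank deposit flow".split()}

    glosses = {  # two senses of "bank", glosses heavily abridged
        "bank_1": ["financial", "institution", "money", "deposit"],
        "bank_2": ["river", "water", "flow"],
    }
    sense_vecs = {s: np.mean([vocab[w] for w in words], axis=0)
                  for s, words in glosses.items()}

    def nearest_sense(context_words):
        """Assign a context to the sense whose vector is most similar (cosine)."""
        ctx = np.mean([vocab[w] for w in context_words if w in vocab], axis=0)
        sims = {s: ctx @ v / (np.linalg.norm(ctx) * np.linalg.norm(v))
                for s, v in sense_vecs.items()}
        return max(sims, key=sims.get)

    print(nearest_sense(["money", "deposit", "institution"]))   # expected: bank_1
    print(nearest_sense(["river", "water"]))                    # expected: bank_2

In the full model the sense vectors would then be updated from the contexts assigned to them, which is the context-clustering refinement step the abstract refers to.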
Abstract:
Last mile relief distribution is the final stage of humanitarian logistics. It refers to the supply of relief items from local distribution centers to the disaster affected people (Balcik et al., 2008). In the last mile relief distribution literature, researchers have focused on the use of optimisation techniques for determining the exact optimal solution (Liberatore et al., 2014), but there is a need to include behavioural factors with those optimisation techniques in order to obtain better predictive results. This paper will explain how improving the coordination factor increases the effectiveness of the last mile relief distribution process. There are two stages of methodology used to achieve the goal: Interviews: The authors conducted interviews with the Indian Government and with South Asian NGOs to identify the critical factors for final relief distribution. After thematic and content analysis of the interviews and the reports, the authors found some behavioural factors which affect the final relief distribution. Model building: Last mile relief distribution in India follows a specific framework described in the Indian Government disaster management handbook. We modelled this framework using agent based simulation and investigated the impact of coordination on effectiveness. We define effectiveness as the speed and accuracy with which aid is delivered to affected people. We tested through simulation modelling whether coordination improves effectiveness.
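The agent-based model built from the Indian Government handbook framework is not reproduced in the abstract. The toy sketch below only illustrates the kind of experiment described, comparing delivery effectiveness with and without coordination among distribution agents; the village counts, agent counts and coordination rule are invented.

    # Toy comparison of last-mile relief distribution with and without coordination.
    # Without coordination, agents pick villages independently and may duplicate effort;
    # with coordination, trips are assigned centrally so each serves an unserved village.
    import random

    def simulate(coordinated, n_villages=30, n_agents=5, n_rounds=10, seed=1):
        random.seed(seed)
        served = set()
        for _ in range(n_rounds):
            if coordinated:
                unserved = [v for v in range(n_villages) if v not in served]
                targets = unserved[:n_agents]                  # assigned centrally, no overlap
            else:
                targets = [random.randrange(n_villages) for _ in range(n_agents)]
            served.update(targets)
        return len(served) / n_villages                        # effectiveness proxy: coverage

    print("uncoordinated coverage:", round(simulate(False), 2))
    print("coordinated coverage:  ", round(simulate(True), 2))

A fuller model would measure both speed and accuracy of delivery, as the paper defines effectiveness, rather than coverage alone.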
Abstract:
Astrocytes are now increasingly acknowledged as having fundamental and sophisticated roles in brain function and dysfunction. Unravelling the complex mechanisms that underlie human brain astrocyte-neuron interactions is therefore an essential step on the way to understanding how the brain operates. Insights into astrocyte function to date have almost exclusively been derived from studies conducted using murine or rodent models. Whilst these have led to significant discoveries, preliminary work with human astrocytes has revealed a hitherto unknown range of astrocyte types with potentially greater functional complexity and increased neuronal interaction with respect to animal astrocytes. It is becoming apparent, therefore, that many important functions of astrocytes will only be discovered by direct physiological interrogation of human astrocytes. Recent advancements in the field of stem cell biology have provided a source of human-based models. These will provide a platform to facilitate our understanding of normal astrocyte functions as well as their role in CNS pathology. A number of recent studies have demonstrated that stem cell derived astrocytes exhibit a range of properties, suggesting that they may be functionally equivalent to their in vivo counterparts. Further validation against in vivo models will ultimately confirm the future utility of these stem-cell based approaches in fulfilling the need for human-based cellular models for basic and clinical research. In this review we discuss the roles of astrocytes in the brain and highlight the extent to which human stem cell derived astrocytes have demonstrated functional activities equivalent to those observed in vivo.
Abstract:
Industry practitioners are seeking to create optimal logistics networks through more efficient decision-making, leading to a shift of power from a centralized position to a more decentralized approach. This has led researchers to explore, with vigor, the application of agent-based modeling (ABM) in supply chains and, more recently, its impact on decision-making. This paper investigates the reasons for the shift to decentralized decision-making and its impact on supply chains. Effective decentralization of decision-making with ABM and hybrid modeling is investigated, examining the methods and the potential for achieving optimality.
Abstract:
Contemporary models of contrast integration across space assume that pooling operates uniformly over the target region. For sparse stimuli, where high-contrast regions are separated by areas containing no signal, this strategy may be sub-optimal because it pools more noise than signal as area increases. Little is known about the behaviour of human observers for detecting such stimuli. We performed an experiment in which three observers detected regular textures of various areas and six levels of sparseness. Stimuli were regular grids of horizontal grating micropatches, each 1 cycle wide. We varied the ratio of signals (marks) to gaps (spaces), with mark:space ratios ranging from 1:0 (a dense texture with no spaces) to 1:24. To compensate for the decline in sensitivity with increasing distance from fixation, we adjusted the stimulus contrast as a function of eccentricity based on previous measurements (Baldwin, Meese & Baker, 2012, J Vis, 12(11):23). We used the resulting area summation functions and psychometric slopes to test several filter-based models of signal combination. A MAX model failed to predict the thresholds, but did a good job on the slopes. Blanket summation of stimulus energy improved the threshold fit, but did not predict an observed slope increase with mark:space ratio. Our best model used a template matched to the sparseness of the stimulus, and pooled the squared contrast signal over space. Templates for regular patterns have also recently been proposed to explain the regular appearance of slightly irregular textures (Morgan et al., 2012, Proc R Soc B, 279, 2754–2760).
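The fitted models cannot be reconstructed from the abstract alone. The sketch below merely illustrates the difference between the three signal-combination rules being compared (a MAX over local responses, blanket summation of squared responses, and energy summation restricted to a template matched to the sparse layout), using made-up response arrays rather than the authors' stimuli or parameters.

    # Illustration of three signal-combination rules compared in the study.
    # Numbers are arbitrary; this is not the authors' fitted model.
    import numpy as np

    rng = np.random.default_rng(0)
    grid = np.zeros((10, 10))
    grid[::5, ::5] = 1.0                   # sparse texture: signal in 4 of 100 cells
    noise = rng.normal(0.0, 0.2, grid.shape)
    response = grid + noise                # noisy local contrast responses

    max_rule = np.max(response)                         # MAX model
    blanket_energy = np.sum(response ** 2)              # pools noise from empty regions too
    template = grid > 0                                 # template matched to the sparseness
    template_energy = np.sum(response[template] ** 2)   # pools squared contrast over signal cells only

    print(f"MAX: {max_rule:.2f}  blanket energy: {blanket_energy:.2f}  "
          f"template energy: {template_energy:.2f}")

The contrast between blanket_energy and template_energy shows why a matched template helps for sparse stimuli: the blanket rule accumulates squared noise from the empty cells, whereas the template pools only where signal can occur.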
Abstract:
A tanulmány a kockázatnak és a kockázatok felmérésének az éves beszámolók (pénzügyi kimutatások) könyvvizsgálatban betöltött szerepével foglalkozik. A modern könyvvizsgálat – belső és külső korlátainál fogva – nem létezhet a vizsgált vállalkozás üzleti kockázatainak felmérése nélkül. Olyannyira igaz ez, hogy a szakma alapvető szabályait lefektető nemzeti és nemzetközi standardok is kötelező jelleggel előírják az ügyfelek üzleti kockázatainak megismerését. Mindez nem öncélú tevékenység, hanem éppen ez jelenti a könyvvizsgálat kiinduló magját: a kockázatbecslés – a tervezés részeként – az audit végrehajtásának alapja, és egyben vezérfonala. A szerző először bemutatja a könyvvizsgálat és a kockázat kapcsolatának alapvonásait, azt, hogy miként jelenik meg egyáltalán a kockázat problémája a könyvvizsgálatban. Ezt követően a különféle kockázatalapú megközelítéseket tárgyalja, majd néhány főbb elem kiragadásával ábrázolja a kockázatkoncepció beágyazódását a szakmai szabályozásba. Végül – mintegy az elmélet tesztjeként – bemutatja a kockázatmodell gyakorlati alkalmazásának néhány aspektusát. ______ The study examines the role of risk and the assessment of risks in the external audit of financial statements. A modern audit – due to its internal and external limitations – cannot exist without the assessment of the business risk of the entity being audited. This is not a l’art pour l’art activity but rather the very core of the audit. It is – as part of the planning of the audit – a guideline to the whole auditing process. This study has three main sections. The first one explains the connection between audit and risk, the second discusses the different risk based approaches to auditing and the embeddedness of the risk concept into professional regulation. Finally – as a test of theory – some practical aspects of the risk model are discussed through the lens of former empirical research carried out mostly in the US. The conclusion of the study is that though risk based models of auditing have many weaknesses they still result in the most effective and efficient high quality audits.
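The "risk model" referred to is presumably the standard audit risk model of the auditing standards, in which audit risk (AR) is decomposed into inherent risk (IR), control risk (CR) and detection risk (DR); a compact statement of it, not taken from the study itself, is

    AR = IR \times CR \times DR \qquad\Longrightarrow\qquad DR = \frac{AR}{IR \times CR}

so that, for a fixed acceptable level of audit risk, a higher assessed inherent or control risk forces the auditor to accept a lower detection risk, i.e. to gather more persuasive evidence.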
Abstract:
A közgazdaságtanban az ágensalapú modellezés egyik alkalmazási területe a makro ökonómia. Ebben a tanulmányban néhány népszerű megtakarítási szabály létét feltételezve adaptív-evolúciós megközelítésben endogén módon próbálunk következtetni e szabályok relatív életképességére. Három különböző típusú ágenst vezetünk be: egy prudens, egy rövidlátó és egy, a permanensjövedelem-elméletnek megfelelően működőt. Rendkívül erős szelekciós nyomás mellett a prudens típus egyértelműen kiszorítja a másik kettőt. A második legéletképesebbnek a rövidlátó típus tűnik, de már közepes szelekciós nyomásnál sem tűnik el egyik típus sem. Szokásos tőkehatékonyság mellett a prudens típus túlzott beruházási tendenciát visz a gazdaságba, és a gazdaság az aranykori megtakarítási rátánál magasabbat ér el. A hitelkorlátok oldása még nagyobb mértékű túlzott beruházáshoz vezethet, a hitelek mennyiségének növekedése mellett a tőketulajdonosok mintegy "kizsákmányoltatják" magukat azokkal, akiknek nincs tőkejövedelmük. A hosszú távú átlagos fogyasztás szempontjából a három típus kiegyensúlyozott aránya adja a legjobb eredményt, ugyanakkor ez jóval nagyobb ingadozással jár, mint amikor csak prudens típusú háztartások léteznek. ____ Agent-based modelling techniques have been employed for some time in macroeconomics. This paper tests some popular saving rules in an adaptive-evolutionary context of looking at their relative survival values. The three types are prudent, short-sighted, and responsive to the permanent-income hypothesis. It is found that where selection pressure is very high, only the prudent type persists. The second most resilient seems to be the short-sighted type, but all three coexist even at medium levels of selection pressure. When the efficiency of capital approaches the level usually assumed in macroeconomics, the prudent type drives the economy towards excessive accumulation of capital, i. e. a long-term savings rate that exceeds the golden rule. If credit constraints are relaxed, this tendency strengthens as credit grows and capital-owners seem to allow themselves to be exploited" by workers. From the angle of average consumption, the best outcome is obtained from a random distribution of types, although this is accompanied by higher volatility.
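The model itself is not given in the abstract. As a schematic illustration of the three saving rules compared, the sketch below shows how the household types might map income into saving; the functional forms, parameter values and the permanent-income proxy are invented, not the authors' specification.

    # Schematic saving rules for the three household types compared in the study.
    # Functional forms and parameters are illustrative only.
    def prudent_saving(income, wealth, target_wealth=5.0, rate=0.3):
        """Saves aggressively until a buffer stock of wealth is reached."""
        return rate * income if wealth < target_wealth else 0.1 * income

    def short_sighted_saving(income, wealth):
        """Consumes current income almost entirely."""
        return 0.02 * income

    def permanent_income_saving(income, income_history):
        """Saves the gap between current income and (estimated) permanent income."""
        permanent = sum(income_history) / len(income_history)
        return max(income - permanent, 0.0)

    incomes = [1.0, 1.4, 0.8, 1.2]
    print(prudent_saving(incomes[-1], wealth=2.0))
    print(short_sighted_saving(incomes[-1], wealth=2.0))
    print(permanent_income_saving(incomes[-1], incomes))

In the adaptive-evolutionary setting described above, households would periodically switch to the rule that has recently delivered higher consumption, with the strength of that switching pressure corresponding to the selection pressure parameter.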
Abstract:
Research on the adoption of innovations by individuals has been criticized for focusing on various factors that lead to the adoption or rejection of an innovation while ignoring important aspects of the dynamic process that takes place. Theoretical process-based models hypothesize that individuals go through consecutive stages of information gathering and decision making but do not clearly explain the mechanisms that cause an individual to leave one stage and enter the next one. Research on the dynamics of the adoption process has lacked a structurally formal and quantitative description of the process. This dissertation addresses the adoption process of technological innovations from a Systems Theory perspective and assumes that individuals roam through different, not necessarily consecutive, states, determined by the levels of quantifiable state variables. It is proposed that different levels of these state variables determine the state in which potential adopters are. Various events that alter the levels of these variables can cause individuals to migrate into different states. It was believed that Systems Theory could provide the required infrastructure to model the innovation adoption process, particularly applied to information technologies, in a formal, structured fashion. This dissertation assumed that an individual progressing through an adoption process could be considered a system, where the occurrence of different events affects the system's overall behavior and ultimately the adoption outcome. The research effort aimed at identifying the various states of such a system and the significant events that could lead the system from one state to another. By mapping these attributes onto an "innovation adoption state space", the adoption process could be fully modeled and used to assess the status, history, and possible outcomes of a specific adoption process. A group of Executive MBA students were observed as they adopted Internet-based technological innovations. The data collected were used to identify clusters in the values of the state variables and consequently define significant system states. Additionally, events were identified across the student sample that systematically moved the system from one state to another. The compilation of identified states and change-related events enabled the definition of an innovation adoption state-space model.
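No formal specification is given in the abstract. The sketch below only illustrates the general idea of an adoption state space, in which events shift the values of state variables and the variable levels determine the current state; the states, variables, events and thresholds are all hypothetical.

    # Hypothetical illustration of an "adoption state space": events change state
    # variables, and the variable levels determine which state the adopter is in.
    def classify(knowledge, interest):
        if knowledge < 0.3:
            return "unaware"
        if interest < 0.5:
            return "aware"
        if knowledge < 0.8:
            return "evaluating"
        return "adopting"

    state_vars = {"knowledge": 0.0, "interest": 0.0}
    events = [("reads article", {"knowledge": 0.4}),
              ("peer recommendation", {"interest": 0.6}),
              ("hands-on trial", {"knowledge": 0.5})]

    for name, effect in events:
        for var, delta in effect.items():
            state_vars[var] = min(1.0, state_vars[var] + delta)
        print(f"after '{name}': {classify(**state_vars)}")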
Abstract:
Limited literature regarding parameter estimation of dynamic systems has been identified as the central-most reason for not having parametric bounds in chaotic time series. However, the literature suggests that a chaotic system displays a sensitive dependence on initial conditions, and our study reveals that the behavior of a chaotic system is also sensitive to changes in parameter values. Therefore, a parameter estimation technique could make it possible to establish parametric bounds on a nonlinear dynamic system underlying a given time series, which in turn can improve predictability. By extracting the relationship between parametric bounds and predictability, we implemented chaos-based models for improving prediction in time series. This study describes work done to establish bounds on a set of unknown parameters. Our research results reveal that by establishing parametric bounds, it is possible to improve the predictability of any time series, even when the dynamics or the mathematical model of that series is not known a priori. In our attempt to improve the predictability of various time series, we have established the bounds for a set of unknown parameters. These are: (i) the embedding dimension to unfold a set of observations in the phase space, (ii) the time delay to use for a series, (iii) the number of neighborhood points to use to avoid detection of false neighbors, and (iv) the local polynomial to build numerical interpolation functions from one region to another. Using these bounds, we are able to obtain better predictability in chaotic time series than previously reported. In addition, the developments of this dissertation can establish a theoretical framework to investigate predictability in time series from the system-dynamics point of view. In closing, our procedure significantly reduces computer resource usage, as the search method is refined and efficient. Finally, the uniqueness of our method lies in its ability to extract the chaotic dynamics inherent in non-linear time series by observing its values.
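The abstract names four quantities to be bounded (embedding dimension, time delay, number of neighborhood points, and the order of the local polynomial). The sketch below shows the standard delay-embedding, nearest-neighbor prediction scheme that those quantities parameterize, with arbitrary parameter choices rather than the estimated bounds from the dissertation, and a simple nearest-neighbor average in place of a local polynomial model.

    # Standard delay-embedding / local-neighbor prediction controlled by the
    # parameters discussed above. Values below are arbitrary, not estimated bounds.
    import numpy as np

    def delay_embed(x, dim, tau):
        """Stack delayed copies of the series into phase-space vectors."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    def predict_next(x, dim=3, tau=2, k=5):
        """Average the one-step futures of the k nearest phase-space neighbors."""
        emb = delay_embed(x, dim, tau)
        query = emb[-1]
        dists = np.linalg.norm(emb[:-1] - query, axis=1)   # exclude the query point itself
        nearest = np.argsort(dists)[:k]
        futures = [x[i + (dim - 1) * tau + 1] for i in nearest]
        return float(np.mean(futures))

    # Logistic map as a simple chaotic test series.
    x = np.empty(500); x[0] = 0.4
    for i in range(499):
        x[i + 1] = 3.9 * x[i] * (1 - x[i])
    print("predicted:", predict_next(x), " actual:", 3.9 * x[-1] * (1 - x[-1]))

Tightening or loosening dim, tau and k is exactly the kind of parametric-bound question the dissertation investigates: poor choices degrade the phase-space reconstruction and hence the prediction.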
Abstract:
Leadership is a socially constructed concept shaped by the context, values and experiences of society (Klenke, 1996); the historical context of gender and ethnicity in society affects views about leadership and who merits a leadership role. Therefore, developing an understanding of Hispanic women students' leadership identity development is critical in broadening how we define leadership and develop leadership education. The purpose of this qualitative case study was to explore and describe the leadership identity development of a select group of women leaders at a Hispanic Serving Institution (HSI) in the southeast. A psychosocial approach to the study was utilized. In-depth interviews and focus groups were conducted with 11 self-identified Hispanic women students of sophomore, junior or senior standing with varying degrees of involvement in leadership activities at Florida International University. Participants were asked questions related to four topics: (a) leadership, (b) gender, (c) ethnic identity, and (d) influences that contributed to their understanding of self as leader. Five themes emerged from the data presented by the participants: (a) encouraging relationships, (b) meaningful experiences, (c) self-development, (d) the role of gender, and (e) the impact of ethnicity. These themes contributed to the leadership identity development of the participants. Findings indicate that leadership identity development for Hispanic women college students at this HSI is complex. The concept of leadership identity development presented in the literature was challenged, as findings indicate that the participants' experiences living in and attending a school in a majority-minority city influenced their development of a leadership identity. The data indicate that leadership is not gender or ethnicity neutral, as differences exist in expectations of men and women in leadership roles. Gender expectations posed particular challenges for these women student leaders. The prescriptive nature of stage-based models was problematic, as findings indicated that leadership identity development is a complicated and continuing process influenced strongly by relationships and experiences. This study enhanced knowledge of the ways that Hispanic women students become leaders and the influences that shape their leadership experiences, which can assist higher education professionals in developing leadership programs and courses that address gender, multiculturalism and awareness of self as leader.