870 results for Agent-Based Models


Relevance: 80.00%

Abstract:

The behaviour of self-adaptive systems can be emergent. The difficulty in predicting the system's behaviour means that there is scope for the system to surprise its customers and its developers. Because its behaviour is emergent, a self-adaptive system needs to garner the confidence of its customers, and it needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, a means needs to be found to explain the current behaviour of the system and the reasons that brought that behaviour about. We propose the use of goal-based models at runtime to offer self-explanation of how a system is meeting its requirements, and why the means of meeting these were chosen. We discuss the results of early experiments in self-explanation, and set out future work. © 2012 C.E.S.A.M.E.S.
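As an illustration of the idea (not the authors' implementation), the following minimal Python sketch shows how a goal model kept alive at runtime could record which alternative was chosen to satisfy a requirement and why, so the system can later answer questions about its behaviour in requirements terms. All class, goal and context names are hypothetical.

# Minimal sketch of a runtime goal model that records which alternative was
# chosen to satisfy each goal and why, so the system can explain itself.
# All names are hypothetical illustrations, not the paper's API.
class Goal:
    def __init__(self, name, alternatives):
        self.name = name
        self.alternatives = alternatives   # {means_name: utility_function(context)}
        self.history = []                  # (context, chosen_means, reason)

    def adapt(self, context):
        # Pick the alternative with the highest utility in the current context.
        scored = {m: score(context) for m, score in self.alternatives.items()}
        chosen = max(scored, key=scored.get)
        reason = (f"utility {scored[chosen]:.2f} beat {sorted(scored.values())[-2]:.2f}"
                  if len(scored) > 1 else "only option")
        self.history.append((dict(context), chosen, reason))
        return chosen

    def explain(self):
        # Self-explanation: current behaviour traced back to the requirement.
        ctx, chosen, reason = self.history[-1]
        return (f"Goal '{self.name}' is currently satisfied by '{chosen}' "
                f"because {reason} under context {ctx}.")

# Example: a self-adaptive service choosing how to meet a response-time goal.
goal = Goal("respond within 2s", {
    "use_cache":   lambda c: 0.9 if c["load"] > 0.7 else 0.4,
    "full_lookup": lambda c: 0.8 if c["load"] <= 0.7 else 0.2,
})
goal.adapt({"load": 0.85})
print(goal.explain())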

Relevance: 80.00%

Abstract:

Guest editorial. Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, as well as data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and co-founder of Performance Improvement Management Software.

William Ho is a Senior Lecturer at the Aston University Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal, among others. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector. This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates, and to provide bias-corrected estimations and confidence intervals for the point estimates. The author finds that, within the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) methods, to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models.

After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria analysis weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated into the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified into four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence for efficiency improvements caused by the new shareholder. The author also finds that parent companies do not appear to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company.

To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
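Since several of the papers above rest on the basic DEA building block, a small worked example may help readers unfamiliar with it. The sketch below solves the input-oriented CCR envelopment model as a linear programme, assuming NumPy and SciPy are available; the five units and their inputs and outputs are invented toy data, not taken from any paper in the issue.

# A worked toy example of the input-oriented CCR DEA model solved as an LP.
# Data are made up: 5 units, 2 inputs (labour, capital), 1 output (energy sold).
import numpy as np
from scipy.optimize import linprog

X = np.array([[4, 2], [6, 3], [5, 5], [8, 4], [7, 6]], dtype=float)  # inputs, one row per unit
Y = np.array([[6], [8], [7], [9], [7]], dtype=float)                 # outputs, one row per unit
n, m = X.shape            # units, inputs
s = Y.shape[1]            # outputs

def ccr_efficiency(o):
    # Decision variables: [theta, lambda_1 ... lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]       # theta = 1 means efficient, < 1 means inefficient

for o in range(n):
    print(f"unit {o}: efficiency {ccr_efficiency(o):.3f}")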

Relevance: 80.00%

Abstract:

The supply chain can be a source of competitive advantage for the firm. Simulation is an effective tool for investigating supply chain problems. The three main simulation approaches in the supply chain context are System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). A sample from the literature suggests that whilst SD and ABM have been used to address strategic and planning problems, DES has mainly been used on planning and operational problems. A review of received wisdom suggests that historically, driven by custom and practice, certain simulation techniques have been focused on certain problem types. A theoretical review of the techniques, however, suggests that the scope of their application should be much wider and that supply chain practitioners could benefit from applying them in this broader way.

Relevance: 80.00%

Abstract:

This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, this thesis firstly provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for Simple Object Access Protocol (SOAP), Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. This thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods, by providing a Web-based user interface and backend to ease the process of building and integrating emulators. These tools, plus the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated with the implementation of a Web-based workflow for predicting future crop yields in the UK, also demonstrating the abilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
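As a rough illustration of the exposure pattern the thesis describes (not the WPS-based framework itself), the sketch below wraps a toy model behind a JSON-over-HTTP endpoint, assuming Flask is available; the route name, parameters and "crop yield" stand-in model are invented.

# Minimal sketch of exposing a model through a JSON-over-HTTP interface,
# in the spirit of the processing-service framework described above.
from flask import Flask, request, jsonify

app = Flask(__name__)

def crop_yield_model(rainfall_mm, temperature_c):
    # Stand-in for a real environmental model.
    return 2.0 + 0.004 * rainfall_mm - 0.05 * abs(temperature_c - 18.0)

@app.route("/processes/crop-yield", methods=["POST"])
def run_model():
    inputs = request.get_json()
    result = crop_yield_model(inputs["rainfall_mm"], inputs["temperature_c"])
    # Returning JSON keeps the model consumable from browsers, workflow
    # engines and scripting tools alike.
    return jsonify({"yield_t_per_ha": result})

if __name__ == "__main__":
    app.run(port=5000)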

Relevance: 80.00%

Abstract:

Large-scale evacuations are a recurring theme on news channels, whether in response to major natural or man-made disasters. The role of warning dissemination is a key part in the success of such large-scale evacuations, and its inadequacy in certain cases has been a 'primary contribution to deaths and injuries' (Hayden et al., 2007). Along with technology-driven 'official warning channels' (e.g. sirens, mass media), the role of unofficial channels (e.g. neighbours, personal contacts, volunteer wardens) has proven to be significant in warning the public of the need to evacuate. Although post-evacuation studies identify the behaviours of evacuees as disseminators of the warning message, there has not been a detailed study that quantifies the effects of such behaviour on the warning message dissemination. This paper develops an Agent-Based Simulation (ABS) model of multiple agents (evacuee households) in a hypothetical community to investigate the impact of behaviour as an unofficial channel on the overall warning dissemination. Parameters studied include the percentage of people who warn their neighbours, the efficiency of different official warning channels, and the delay time to warn neighbours. Even with a low proportion of people willing to warn their neighbours, the results showed a considerable impact on the overall warning dissemination. © 2012 Elsevier B.V. All rights reserved.
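A minimal Python sketch of the kind of mechanism the model captures, much simplified relative to the paper's ABS: households on a grid can be warned by an official channel each time step, and a fraction of warned households pass the warning to their neighbours after a delay. All parameter values are illustrative.

# Toy warning-dissemination simulation: official channel plus neighbour warning.
import random

def simulate(n=50, p_official=0.02, p_warn_neighbour=0.3, delay=2, steps=120, seed=1):
    random.seed(seed)
    warned_at = [[None] * n for _ in range(n)]          # time step each household was warned
    for t in range(steps):
        for i in range(n):
            for j in range(n):
                if warned_at[i][j] is None and random.random() < p_official:
                    warned_at[i][j] = t                  # warned by the official channel
        for i in range(n):
            for j in range(n):
                t0 = warned_at[i][j]
                if t0 is not None and t == t0 + delay and random.random() < p_warn_neighbour:
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < n and 0 <= nj < n and warned_at[ni][nj] is None:
                            warned_at[ni][nj] = t        # warned by a neighbour
    return sum(v is not None for row in warned_at for v in row) / (n * n)

print("fraction warned:", simulate())
print("fraction warned, no neighbour warning:", simulate(p_warn_neighbour=0.0))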

Relevance: 80.00%

Abstract:

Timely warning of the public during large scale emergencies is essential to ensure safety and save lives. This ongoing study proposes an agent-based simulation model to simulate the warning message dissemination among the public considering both official channels and unofficial channels The proposed model was developed in NetLogo software for a hypothetical area, and requires input parameters such as effectiveness of each official source (%), estimated time to begin informing others, estimated time to inform others and estimated percentage of people (who do not relay the message). This paper demonstrates a means of factoring the behaviour of the public as informants into estimating the effectiveness of warningdissemination during large scale emergencies. The model provides a tool for the practitioner to test the potential impact of the informal channels on the overall warning time and sensitivity of the modelling parameters. The tool would help the practitioners to persuade evacuees to disseminate the warning message informing others similar to the ’Run to thy neighbour campaign conducted by the Red cross.
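Assuming the toy simulate() function sketched after the previous abstract, the following lines illustrate the kind of sensitivity test described here: sweeping the share of people willing to warn their neighbours and observing the effect on overall coverage. This is an illustration of the idea, not the NetLogo model itself.

# Sketch of a sensitivity sweep over the share of households that warn neighbours,
# reusing the toy simulate() function defined in the previous sketch.
for share in (0.0, 0.1, 0.25, 0.5):
    coverage = simulate(p_warn_neighbour=share)
    print(f"warn-neighbour share {share:.0%}: {coverage:.0%} of households warned")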

Relevance: 80.00%

Abstract:

In-Motes Bins is an agent-based, real-time In-Motes application developed for sensing light and temperature variations in an environment. In-Motes is a mobile agent middleware that facilitates the rapid deployment of adaptive applications in Wireless Sensor Networks (WSNs). In-Motes Bins is based on the injection of mobile agents into the WSN that can migrate or clone following specific rules and performing application-specific tasks. Using In-Motes we were able to create and rapidly deploy our application on a WSN consisting of 10 MICA2 motes. Our application was tested in a wine store for a period of four months. In this paper we present the In-Motes Bins application and provide a detailed evaluation of its implementation. © 2007 IEEE.
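A highly simplified Python sketch of the clone-or-migrate behaviour described above; the ring of motes, the temperature threshold and the rules are invented for illustration and are not the In-Motes middleware API.

# Toy mobile-agent rules on a simulated ring of motes: clone to neighbours when a
# reading crosses a threshold, otherwise migrate onward.
import random

motes = {i: {"temp": random.uniform(10, 20), "neighbours": [(i - 1) % 10, (i + 1) % 10]}
         for i in range(10)}                     # a ring of 10 motes
agents = [0]                                     # mote ids currently hosting an agent

for step in range(5):
    next_agents = []
    for host in agents:
        reading = motes[host]["temp"]
        if reading > 18:                         # anomaly: clone to neighbours to cover the area
            next_agents.append(host)
            next_agents.extend(motes[host]["neighbours"])
        else:                                    # nothing interesting: migrate to one neighbour
            next_agents.append(random.choice(motes[host]["neighbours"]))
    agents = sorted(set(next_agents))
    print(f"step {step}: agents on motes {agents}")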

Relevance: 80.00%

Abstract:

Cleavage by the proteasome is responsible for generating the C terminus of T-cell epitopes. Modeling the process of proteasome cleavage as part of a multi-step algorithm for T-cell epitope prediction will reduce the number of non-binders and increase the overall accuracy of the predictive algorithm. Quantitative matrix-based models for prediction of the proteasome cleavage sites in a protein were developed using a training set of 489 naturally processed T-cell epitopes (nonamer peptides) associated with HLA-A and HLA-B molecules. The models were validated using an external test set of 227 T-cell epitopes. The performance of the models was good, identifying 76% of the C-termini correctly. The best model of proteasome cleavage was incorporated as the first step in a three-step algorithm for T-cell epitope prediction, where subsequent steps predicted TAP affinity and MHC binding using previously derived models.
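To make the quantitative-matrix idea concrete, the sketch below scores candidate C-terminal positions with a position-specific scoring matrix: the score of a cleavage point is the sum of per-position, per-residue weights over a short upstream window. The weights, window length, threshold and example sequence are invented, not the trained model from the paper.

# Sketch of a matrix-based cleavage-site predictor with made-up weights.
import random
random.seed(0)

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
WINDOW = 6                                        # residues preceding the candidate cleavage point
weights = [{aa: random.gauss(0.0, 0.5) for aa in AMINO_ACIDS} for _ in range(WINDOW)]

def cleavage_score(protein, pos):
    # Score the bond after residue `pos` (0-based); higher means more likely cleaved.
    window = protein[pos - WINDOW + 1: pos + 1]
    return sum(weights[k][aa] for k, aa in enumerate(window))

protein = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
threshold = 1.0
sites = [p for p in range(WINDOW - 1, len(protein)) if cleavage_score(protein, p) > threshold]
print("predicted C-terminus positions:", sites)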

Relevance: 80.00%

Abstract:

In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context-clustering-based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context-clustering-based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task, and 6 out of 13 subtasks in the analogical reasoning task.
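A much-simplified numpy sketch of the two-stage idea: sense vectors are first initialised from dictionary glosses (here by simply averaging toy word vectors, in place of the paper's convolutional sentence encoder), then refined by context clustering, with each occurrence assigned to the nearest sense vector, which is nudged towards it. Glosses, vectors and contexts are toy data rather than WordNet or the paper's corpus.

# Toy two-stage pipeline: gloss-based initialisation, then context clustering.
import numpy as np
rng = np.random.default_rng(0)

vocab = {w: rng.normal(size=8) for w in
         "money deposit finance river water land slope institution side cash flow".split()}

def embed(words):
    vecs = [vocab[w] for w in words if w in vocab]
    return np.mean(vecs, axis=0)

# Step 1: sense vectors initialised from glosses (toy glosses, averaged vectors).
senses = {
    "bank#finance": embed("institution money deposit cash finance".split()),
    "bank#river":   embed("land slope side river water".split()),
}

# Step 2: context clustering refinement.
contexts = ["deposit cash money", "river water flow", "money finance institution", "slope river side"]
for _ in range(5):
    for ctx in contexts:
        c = embed(ctx.split())
        best = max(senses, key=lambda s: float(senses[s] @ c))   # nearest sense by dot product
        senses[best] = 0.9 * senses[best] + 0.1 * c              # nudge sense vector towards context

for name, vec in senses.items():
    print(name, np.round(vec[:4], 2))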

Relevance: 80.00%

Abstract:

Last mile relief distribution is the final stage of humanitarian logistics. It refers to the supply of relief items from local distribution centers to the disaster-affected people (Balcik et al., 2008). In the last mile relief distribution literature, researchers have focused on the use of optimisation techniques for determining the exact optimal solution (Liberatore et al., 2014), but there is a need to include behavioural factors within those optimisation techniques in order to obtain better predictive results. This paper explains how improving the coordination factor increases the effectiveness of the last mile relief distribution process. The methodology has two stages. Interviews: the authors conducted interviews with the Indian Government and with South Asian NGOs to identify the critical factors for final relief distribution. After thematic and content analysis of the interviews and the reports, the authors found some behavioural factors which affect the final relief distribution. Model building: last mile relief distribution in India follows a specific framework described in the Indian Government disaster management handbook. We modelled this framework using agent-based simulation and investigated the impact of coordination on effectiveness. We define effectiveness as the speed and accuracy with which aid is delivered to affected people. We tested through simulation modelling whether coordination improves effectiveness.
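A toy agent-based sketch of the question posed above, not the Indian Government framework itself: with coordination, distribution centres share a picture of remaining need and send trucks to the largest shortfall; without it, each centre dispatches independently. Demands, loads and the number of villages are invented.

# Toy comparison of coordinated vs uncoordinated last-mile dispatch.
import random

def simulate(coordination, days=6, seed=3):
    random.seed(seed)
    villages = 5
    demand = [random.randint(60, 160) for _ in range(villages)]
    delivered = [0] * villages
    centres, load = 4, 30                        # 4 centres, 30 units per truck per day
    for _ in range(days):
        for _ in range(centres):
            if coordination:
                # Shared picture of need: send the truck to the largest remaining shortfall.
                target = max(range(villages), key=lambda v: demand[v] - delivered[v])
            else:
                # No shared picture: each centre picks a village independently.
                target = random.randrange(villages)
            delivered[target] += load
    unmet = sum(max(0, d - x) for d, x in zip(demand, delivered))
    wasted = sum(max(0, x - d) for d, x in zip(demand, delivered))
    return unmet, wasted

for mode in (True, False):
    unmet, wasted = simulate(mode)
    label = "with" if mode else "without"
    print(f"{label} coordination: unmet demand {unmet}, oversupplied {wasted}")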

Relevance: 80.00%

Abstract:

Astrocytes are now increasingly acknowledged as having fundamental and sophisticated roles in brain function and dysfunction. Unravelling the complex mechanisms that underlie human brain astrocyte-neuron interactions is therefore an essential step on the way to understanding how the brain operates. Insights into astrocyte function to date have almost exclusively been derived from studies conducted using murine or rodent models. Whilst these have led to significant discoveries, preliminary work with human astrocytes has revealed a hitherto unknown range of astrocyte types with potentially greater functional complexity and increased neuronal interaction with respect to animal astrocytes. It is becoming apparent, therefore, that many important functions of astrocytes will only be discovered by direct physiological interrogation of human astrocytes. Recent advancements in the field of stem cell biology have provided a source of human-based models. These will provide a platform to facilitate our understanding of normal astrocyte functions as well as their role in CNS pathology. A number of recent studies have demonstrated that stem cell-derived astrocytes exhibit a range of properties, suggesting that they may be functionally equivalent to their in vivo counterparts. Further validation against in vivo models will ultimately confirm the future utility of these stem cell-based approaches in fulfilling the need for human-based cellular models for basic and clinical research. In this review we discuss the roles of astrocytes in the brain and highlight the extent to which human stem cell-derived astrocytes have demonstrated functional activities that are equivalent to those observed in vivo.

Relevance: 80.00%

Abstract:

Industry practitioners are seeking to create optimal logistics networks through more efficient decision-making, leading to a shift of power from a centralized position to a more decentralized approach. This has led researchers to explore, with vigor, the application of agent-based modeling (ABM) in supply chains and, more recently, its impact on decision-making. This paper investigates the reasons for the shift to decentralized decision-making and its impact on supply chains. Effective decentralization of decision-making with ABM and hybrid modeling is investigated, examining the methods and the potential for achieving optimality.

Relevance: 80.00%

Abstract:

Contemporary models of contrast integration across space assume that pooling operates uniformly over the target region. For sparse stimuli, where high-contrast regions are separated by areas containing no signal, this strategy may be sub-optimal because it pools more noise than signal as area increases. Little is known about the behaviour of human observers for detecting such stimuli. We performed an experiment in which three observers detected regular textures of various areas and six levels of sparseness. Stimuli were regular grids of horizontal grating micropatches, each 1 cycle wide. We varied the ratio of signals (marks) to gaps (spaces), with mark:space ratios ranging from 1:0 (a dense texture with no spaces) to 1:24. To compensate for the decline in sensitivity with increasing distance from fixation, we adjusted the stimulus contrast as a function of eccentricity based on previous measurements (Baldwin, Meese & Baker, 2012, J Vis, 12(11):23). We used the resulting area summation functions and psychometric slopes to test several filter-based models of signal combination. A MAX model failed to predict the thresholds, but did a good job on the slopes. Blanket summation of stimulus energy improved the threshold fit, but did not predict an observed slope increase with mark:space ratio. Our best model used a template matched to the sparseness of the stimulus, and pooled the squared contrast signal over space. Templates for regular patterns have also recently been proposed to explain the regular appearance of slightly irregular textures (Morgan et al., 2012, Proc R Soc B, 279, 2754–2760).
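A toy numpy sketch of the three pooling rules compared above, applied to a one-dimensional row of micropatch locations with a given mark:space ratio: a MAX rule, blanket summation of squared contrast, and a template matched to the sparse layout. The numbers are illustrative; the actual models operate on filter responses to two-dimensional textures.

# Toy comparison of MAX, blanket energy, and sparseness-matched template pooling.
import numpy as np

def stimulus(n_locations, space, contrast, noise_sd=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    signal = np.zeros(n_locations)
    signal[::space + 1] = contrast                 # marks separated by `space` gaps
    return signal + rng.normal(0.0, noise_sd, n_locations), signal > 0

resp, mask = stimulus(25, space=4, contrast=0.5)

max_model      = np.max(resp)                      # MAX over local responses
blanket_energy = np.sum(resp ** 2)                 # pools noise from the gaps too
template_model = np.sum((resp * mask) ** 2)        # template matched to the sparse layout

print(f"MAX: {max_model:.3f}  blanket energy: {blanket_energy:.3f}  template: {template_model:.3f}")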

Relevance: 80.00%

Abstract:

The study examines the role of risk and the assessment of risks in the external audit of annual reports (financial statements). A modern audit, owing to its internal and external limitations, cannot exist without an assessment of the business risks of the entity being audited. This is so much so that the national and international standards laying down the profession's fundamental rules make obtaining an understanding of the client's business risks mandatory. This is not a l'art pour l'art activity but rather the very core of the audit: risk assessment, as part of planning, is the basis and the guiding thread of the whole auditing process. The study first presents the fundamentals of the relationship between auditing and risk, showing how the problem of risk appears in auditing at all. It then discusses the different risk-based approaches to auditing and, by highlighting a few key elements, shows how the risk concept is embedded in professional regulation. Finally, as a test of the theory, some practical aspects of applying the risk model are presented, drawing on earlier empirical research carried out mostly in the US. The conclusion of the study is that, although risk-based models of auditing have many weaknesses, they still result in the most effective and efficient high-quality audits.
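For readers outside the profession, the standard audit risk model that this literature builds on is usually written as audit risk = inherent risk × control risk × detection risk; below is a short worked example with invented figures, a sketch rather than anything taken from the study.

# Worked example of the standard audit risk model (figures invented):
# for a target level of audit risk, the auditor backs out the acceptable detection risk.
inherent_risk, control_risk, target_audit_risk = 0.8, 0.5, 0.05
detection_risk = target_audit_risk / (inherent_risk * control_risk)
print(f"acceptable detection risk: {detection_risk:.3f}")   # lower values call for more substantive testing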

Relevance: 80.00%

Abstract:

In economics, one area where agent-based modelling is applied is macroeconomics. This paper assumes the existence of a few popular saving rules and, in an adaptive-evolutionary setting, infers the relative survival value of these rules endogenously. Three types of agents are introduced: a prudent one, a short-sighted one, and one behaving according to the permanent-income hypothesis. Where selection pressure is extremely high, the prudent type clearly crowds out the other two. The second most resilient type appears to be the short-sighted one, but at medium levels of selection pressure none of the types disappears. When the efficiency of capital approaches the level usually assumed in macroeconomics, the prudent type drives the economy towards excessive accumulation of capital, i.e. a long-term savings rate that exceeds the golden rule. If credit constraints are relaxed, this tendency towards over-investment strengthens; as the volume of credit grows, capital owners effectively allow themselves to be "exploited" by those with no capital income. In terms of long-run average consumption, the best outcome is obtained from a balanced mix of the three types, although this is accompanied by considerably higher volatility than when only prudent households exist.
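A stylised Python sketch of the adaptive-evolutionary mechanism described above: households follow one of three saving rules, and each period they imitate a better performer with a probability set by a selection-pressure parameter. The rules, payoff definition and parameters are invented, and the sketch illustrates the mechanism only; it does not reproduce the paper's findings.

# Toy imitation dynamics over three saving rules under a selection-pressure parameter.
import random
random.seed(42)

RULES = {
    "prudent":       lambda income, wealth: 0.30 * income,                    # always save a lot
    "short_sighted": lambda income, wealth: 0.05 * income,                    # save almost nothing
    "permanent":     lambda income, wealth: 0.15 * income + 0.02 * wealth,    # smooth consumption
}

def step(population, wage=1.0, r=0.04, pressure=0.5):
    fitness = []
    for h, (rule, wealth) in enumerate(population):
        income = wage + r * wealth                 # labour income plus capital income
        saving = min(RULES[rule](income, wealth), income)
        fitness.append(income - saving)            # one-period consumption as a crude payoff
        population[h] = (rule, wealth + saving)
    best = max(range(len(population)), key=fitness.__getitem__)
    for h in range(len(population)):               # imitation under selection pressure
        if random.random() < pressure and fitness[h] < fitness[best]:
            population[h] = (population[best][0], population[h][1])
    return population

population = [(rule, 1.0) for rule in RULES for _ in range(20)]   # 60 households
for _ in range(300):
    population = step(population)
print({rule: sum(1 for r_, _ in population if r_ == rule) for rule in RULES})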