47 results for agent-based modelling


Relevance: 100.00%

Abstract:

The supply chain can be a source of competitive advantage for the firm. Simulation is an effective tool for investigating supply chain problems. The three main simulation approaches in the supply chain context are System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). A sample from the literature suggests that whilst SD and ABM have been used to address strategic and planning problems, DES has mainly been used on planning and operational problems. A review of received wisdom suggests that historically, driven by custom and practice, certain simulation techniques have been focused on certain problem types. A theoretical review of the techniques, however, suggests that the scope of their application should be much wider and that supply chain practitioners could benefit from applying them in this broader way.

Relevance: 100.00%

Abstract:

Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.

Relevance: 90.00%

Abstract:

Manufacturing planning and control systems are fundamental to the successful operation of a manufacturing organisation. In order to improve their business performance, companies invest significantly in planning and control systems; however, not all companies realise the benefits sought. Many companies continue to suffer from high levels of inventory, shortages, obsolete parts, poor resource utilisation and poor delivery performance. This thesis argues that the fit between the planning and control system and the manufacturing organisation is a crucial element of success. The design of appropriate control systems is, therefore, important. The different approaches to the design of manufacturing planning and control systems are investigated. It is concluded that there is no provision within these design methodologies to properly assess the impact of a proposed design on the manufacturing facility. Consequently, an understanding of how a new (or modified) planning and control system will perform in the context of the complete manufacturing system is unlikely to be gained until after the system has been implemented and is running. There are many modelling techniques available; however, discrete-event simulation is unique in its ability to model the complex dynamics inherent in manufacturing systems, of which the planning and control system is an integral component. The existing application of simulation to manufacturing control system issues is limited: although operational issues are addressed, application to the more fundamental design of control systems is rarely, if at all, considered. The lack of a suitable simulation-based modelling tool does not help matters. The requirements of a simulation tool capable of modelling a host of different planning and control systems are presented. It is argued that only through the application of object-oriented principles can these extensive requirements be achieved.
This thesis reports on the development of an extensible class library called WBS/Control, which is based on object-oriented principles and discrete-event simulation. The functionality, both current and future, offered by WBS/Control means that different planning and control systems can be modelled: not only the more standard implementations but also hybrid systems and new designs. The flexibility implicit in the development of WBS/Control supports its application to design and operational issues. WBS/Control wholly integrates with an existing manufacturing simulator to provide a more complete modelling environment.
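As an illustration of the object-oriented, discrete-event approach the thesis builds on, the sketch below shows a minimal event-scheduling core in Python. WBS/Control itself is not reproduced here; the class and the order-release example are hypothetical.

```python
import heapq

class Simulator:
    """Minimal discrete-event core: a clock plus a time-ordered event queue."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []  # heap of (time, seq, callback)
        self._seq = 0     # tie-breaker so callbacks are never compared

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._queue:
            self.clock, _, callback = heapq.heappop(self._queue)
            callback(self)

# Hypothetical usage: a planning rule that releases an order every 5 time units.
completed = []

def release_order(sim, remaining):
    completed.append(sim.clock)
    if remaining > 1:
        sim.schedule(5.0, lambda s: release_order(s, remaining - 1))

sim = Simulator()
sim.schedule(5.0, lambda s: release_order(s, 3))
sim.run()
```

Different planning and control policies would then be modelled as subclasses or callbacks plugged into the same core, which is the flexibility the object-oriented design aims at.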

Relevance: 90.00%

Abstract:

In today's market, global competition has put manufacturing businesses under great pressure to respond rapidly to dynamic variations in demand patterns across products and changing product mixes. To achieve substantial responsiveness, the manufacturing activities associated with production planning and control must be integrated dynamically, efficiently and cost-effectively. This paper presents an iterative agent bidding mechanism, which performs dynamic integration of process planning and production scheduling to generate optimised process plans and schedules in response to dynamic changes in the market and production environment. The iterative bidding procedure is carried out based on currency-like metrics: all operations (e.g. machining processes) to be performed are assigned virtual currency values, and resource agents bid for the operations if the costs incurred in performing them are lower than the currency values. The currency values are adjusted iteratively, and resource agents re-bid for the operations based on the new set of currency values, until the total production cost is minimised. A simulated annealing optimisation technique is employed to optimise the currency values iteratively. The feasibility of the proposed methodology has been validated on a test case, and the results show that the method outperforms non-agent-based methods.
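A minimal sketch of the bidding-plus-annealing loop described above. The cost matrix, penalty, and cooling schedule are hypothetical; the paper's actual cost model is not given here.

```python
import math
import random

random.seed(1)

# Hypothetical data: costs[i][j] = cost for resource agent i to perform operation j.
costs = [[4.0, 9.0, 3.0],
         [6.0, 5.0, 8.0]]
PENALTY = 20.0  # charged when no agent bids for an operation

def total_cost(currency):
    """One bidding round: agent i bids on operation j iff its cost is no more
    than the operation's currency value; each operation goes to the cheapest
    bidder, and unassigned operations incur a penalty."""
    total = 0.0
    for j, value in enumerate(currency):
        bids = [costs[i][j] for i in range(len(costs)) if costs[i][j] <= value]
        total += min(bids) if bids else PENALTY
    return total

def anneal(currency, temp=10.0, cooling=0.95, steps=300):
    """Iteratively perturb the currency values, accepting worse rounds with a
    probability that shrinks as the temperature cools (simulated annealing)."""
    best, best_cost = list(currency), total_cost(currency)
    current, current_cost = list(best), best_cost
    for _ in range(steps):
        cand = list(current)
        j = random.randrange(len(cand))
        cand[j] = max(0.0, cand[j] + random.uniform(-2.0, 2.0))
        cand_cost = total_cost(cand)
        if cand_cost <= current_cost or random.random() < math.exp((current_cost - cand_cost) / temp):
            current, current_cost = cand, cand_cost
            if current_cost < best_cost:
                best, best_cost = list(current), current_cost
        temp *= cooling
    return best, best_cost

values, cost = anneal([1.0, 1.0, 1.0])
```

Starting from currency values too low for any agent to bid (total cost 60.0 in penalties), the annealer raises them until operations are assigned to cheap bidders.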

Relevance: 90.00%

Abstract:

Discrete event simulation is a popular aid for manufacturing system design; however, in application this technique can sometimes be unnecessarily complex. This paper is concerned with applying an alternative technique to manufacturing system design which may well provide an efficient form of rough-cut analysis. This technique is System Dynamics, and the work described in this paper has incorporated the principles of this technique into a computer-based modelling tool tailored to manufacturing system design. The paper first explores the principles of System Dynamics and how they differ from Discrete Event Simulation. The opportunity for System Dynamics is then explored, leading to a definition of the capabilities that a suitable tool would need. This specification is then transformed into a computer modelling tool, which is assessed by applying it to model an engine production facility. Read more: http://www.worldscientific.com/doi/abs/10.1142/S0219686703000228
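In System Dynamics terms, a manufacturing model is built from stocks and flows integrated over time rather than from individual events. A minimal, hypothetical stock-flow sketch (inventory adjusted toward a target by a production flow, integrated with Euler steps; all parameter values are illustrative):

```python
def simulate(weeks=40.0, dt=0.25, target=100.0, adjust_time=4.0, demand=20.0):
    """Stock: finished-goods inventory. Flows: production (demand plus an
    adjustment toward the target stock) in, shipments (demand) out."""
    inventory = 50.0
    history = []
    for _ in range(int(weeks / dt)):
        production = demand + (target - inventory) / adjust_time  # flow in
        shipments = demand                                        # flow out
        inventory += (production - shipments) * dt  # Euler integration step
        history.append(inventory)
    return history

traj = simulate()
```

The continuous, aggregate view is what makes this style of model a quick "rough-cut" alternative to event-by-event simulation of the same facility.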

Relevance: 90.00%

Abstract:

Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools to build holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web-based solutions such as ready publication, discoverability and easy access. In this article we describe the developments within the UncertWeb project which are designed to provide uncertainty support in the context of the proposed ‘Model Web’. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
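The core idea of propagating uncertainty through chained model components can be sketched with plain Monte Carlo sampling. The two component models and the input distribution below are hypothetical stand-ins, not UncertWeb services:

```python
import random
import statistics

random.seed(0)

# Two chained component models (hypothetical): rainfall -> runoff -> river stage.
def runoff(rain_mm):
    return 0.6 * rain_mm            # simple runoff-coefficient model

def stage(runoff_mm):
    return 0.02 * runoff_mm ** 1.5  # rating-curve-style stage model (metres)

# Uncertain input: rainfall ~ Normal(50, 5) mm. Propagate the distribution
# through the whole chain by sampling, instead of running one crisp value.
samples = [stage(runoff(random.gauss(50.0, 5.0))) for _ in range(10_000)]
mean = statistics.mean(samples)
spread = statistics.stdev(samples)
```

Because the second model is nonlinear, the output spread and mean cannot be read off from a single deterministic run, which is why chained frameworks need explicit uncertainty support.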

Relevance: 90.00%

Abstract:

Agent-based technology is playing an increasingly important role in today’s economy. Usually a multi-agent system is needed to model an economic system such as a market system, in which heterogeneous trading agents interact with each other autonomously. Two questions often need to be answered regarding such systems: 1) how to design an interaction mechanism that facilitates efficient resource allocation among usually self-interested trading agents; and 2) how to design an effective strategy for an agent to maximise its economic returns under a specific market mechanism. For automated market systems, the auction is the most popular mechanism for solving resource allocation problems among participants. However, auctions come in hundreds of different formats, some better than others in terms not only of allocative efficiency but also of other properties, e.g. whether they generate high revenue for the auctioneer or induce stable bidder behaviour. In addition, different strategies result in very different performance under the same auction rules. Against this background, we investigate auction mechanism and strategy design for agent-based economics. The international Trading Agent Competition (TAC) Ad Auction (AA) competition provides a very useful platform to develop and test agent strategies in the Generalised Second Price auction (GSP). AstonTAC, the runner-up of TAC AA 2009, is a successful advertiser agent designed for GSP-based keyword auctions. In particular, AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and selects the set of keyword queries with the highest expected profit to bid on, maximising its expected profit under the limit of conversion capacity. Through evaluation experiments, we show that AstonTAC performs well and stably, not only in the competition but also across a broad range of environments.
The TAC CAT tournament provides an environment for investigating the optimal design of mechanisms for double auction markets. AstonCAT-Plus is the post-tournament version of the specialist developed for CAT 2010. In our experiments, AstonCAT-Plus not only outperforms most specialist agents designed by other institutions but also achieves high allocative efficiency, transaction success rates and average trader profits. Moreover, we reveal some insights into the CAT game: 1) successful markets should maintain a stable and high market share of intra-marginal traders; 2) a specialist’s performance depends on the distribution of trading strategies. However, typical double auction models assume that trading agents have a fixed trading direction, either buy or sell. With this limitation they cannot directly reflect the fact that traders in financial markets (the most popular application of the double auction) decide their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Experiments are conducted under both dynamic and static settings of the continuous BDA market. We find that the allocative efficiency of a continuous BDA market comes mainly from rational selection of trading directions. Furthermore, we introduce a high-performance Kernel trading strategy for the BDA market, which uses a kernel probability density estimator built on historical transaction data to decide optimal order prices. The Kernel trading strategy outperforms popular intelligent double auction trading strategies, including ZIP, GD and RE, in the continuous BDA market, making the highest profit in static games and accumulating the most wealth in dynamic games.
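For reference, the Generalised Second Price allocation rule that keyword-auction agents such as AstonTAC bid into can be stated compactly: slots go to the highest bidders in order, and each winner pays the bid of the advertiser ranked immediately below it. A minimal sketch (bid values hypothetical):

```python
def gsp(bids, slots):
    """Generalised Second Price: rank bidders by bid; the winner of slot k
    pays the bid of the bidder ranked k+1 (0.0 for the lowest-ranked winner)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for rank in range(min(slots, len(ranked))):
        name, _ = ranked[rank]
        pay = ranked[rank + 1][1] if rank + 1 < len(ranked) else 0.0
        results.append((name, pay))
    return results

outcome = gsp({"A": 3.0, "B": 5.0, "C": 1.5}, slots=2)
# B wins slot 1 paying A's bid (3.0); A wins slot 2 paying C's bid (1.5)
```

Since each winner's payment depends on the bid below, not its own, bid-price selection (as in AstonTAC) is a genuine strategic problem rather than simple truthful bidding.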

Relevance: 90.00%

Abstract:

Timely warning of the public during large-scale emergencies is essential to ensure safety and save lives. This ongoing study proposes an agent-based simulation model to simulate warning message dissemination among the public, considering both official and unofficial channels. The proposed model was developed in NetLogo software for a hypothetical area, and requires input parameters such as the effectiveness of each official source (%), the estimated time to begin informing others, the estimated time to inform others and the estimated percentage of people who do not relay the message. This paper demonstrates a means of factoring the behaviour of the public as informants into estimating the effectiveness of warning dissemination during large-scale emergencies. The model provides a tool for practitioners to test the potential impact of the informal channels on the overall warning time and the sensitivity of the modelling parameters. The tool would help practitioners to persuade evacuees to disseminate the warning message by informing others, similar to the ‘Run to thy neighbour’ campaign conducted by the Red Cross.
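A minimal Python analogue of this kind of warning-dissemination model. The NetLogo model itself is not reproduced here; population size, channel effectiveness and relay parameters are all hypothetical:

```python
import random

random.seed(42)

def simulate(pop=1000, official_eff=0.4, relay_prob=0.7, delay=2, steps=30):
    """informed_at[i] is the step at which agent i learned of the warning
    (None = still uninformed). Official channels reach a fraction at t=0;
    each informed agent may relay to one uninformed person `delay` steps
    after learning, with probability `relay_prob`."""
    informed_at = [0 if random.random() < official_eff else None for _ in range(pop)]
    counts = []
    for t in range(1, steps + 1):
        uninformed = [i for i, v in enumerate(informed_at) if v is None]
        relayers = [i for i, v in enumerate(informed_at)
                    if v is not None and t - v == delay and random.random() < relay_prob]
        for _ in relayers:
            if not uninformed:
                break
            j = uninformed.pop(random.randrange(len(uninformed)))
            informed_at[j] = t
        counts.append(pop - sum(v is None for v in informed_at))
    return counts

coverage = simulate()  # informed count after each time step
```

Varying `relay_prob` and `delay` is the kind of sensitivity test the paper describes: informal relaying substantially shortens the time to near-complete coverage compared with official channels alone.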

Relevance: 90.00%

Abstract:

Last mile relief distribution is the final stage of humanitarian logistics. It refers to the supply of relief items from local distribution centers to the disaster-affected people (Balcik et al., 2008). In the last mile relief distribution literature, researchers have focused on the use of optimisation techniques for determining the exact optimal solution (Liberatore et al., 2014), but behavioural factors need to be included alongside those optimisation techniques in order to obtain better predictive results. This paper explains how improving the coordination factor increases the effectiveness of the last mile relief distribution process. The methodology has two stages. Interviews: the authors conducted interviews with the Indian Government and with South Asian NGOs to identify the critical factors for final relief distribution; after thematic and content analysis of the interviews and the reports, the authors found some behavioural factors which affect the final relief distribution. Model building: last mile relief distribution in India follows a specific framework described in the Indian Government disaster management handbook; we modelled this framework using agent-based simulation and investigated the impact of coordination on effectiveness. We define effectiveness as the speed and accuracy with which aid is delivered to affected people, and we tested through simulation modelling whether coordination improves effectiveness.

Relevance: 80.00%

Abstract:

Knowledge maintenance is a major challenge for both knowledge management and the Semantic Web. Operating over the Semantic Web, there will be a network of collaborating agents, each with their own ontologies or knowledge bases. Change in the knowledge state of one agent may need to be propagated across a number of agents and their associated ontologies. The challenge is to decide how to propagate a change of knowledge state. The effects of a change in knowledge state cannot be known in advance, and so an agent cannot know who should be informed unless it adopts a simple ‘tell everyone – everything’ strategy. This situation is highly reminiscent of the classic Frame Problem in AI. We argue that for agent-based technologies to succeed, far greater attention must be given to creating an appropriate model for knowledge update. In a closed system, simple strategies are possible (e.g. ‘sleeping dog’ or ‘cheap test’ or even complete checking). However, in an open system where cause and effect are unpredictable, a coherent cost-benefit based model of agent interaction is essential. Otherwise, the effectiveness of every act of knowledge update/maintenance is brought into question.

Relevance: 80.00%

Abstract:

The challenges of returnable transport equipment (RTE) management continue to grow as its usage becomes more widespread. Logistics companies are investigating the implementation of radio-frequency identification (RFID) technology to alleviate problems such as loss prevention and stock reduction. However, research in this field is limited and fails to explore in depth the wider network improvements that can be made to optimize the supply chain through efficient RTE management. This paper investigates the nature of RTE network management, building on current research and practice and filling a gap in the literature, through the investigation of a product-centric approach in which the paradigms of “intelligent products” and “autonomous objects” are explored. A network-optimizing approach to RTE management is explored, encouraging advanced research development of the RTE paradigm to align academic research with problematic areas in industry. Further research continues with the development of an agent-based software system, ready for application to a real-case-study distribution network, producing quantitative results for further analysis. This is pivotal in the endeavor to develop agile support systems, fully utilizing an information-centric environment and encouraging RTE to be viewed as critical network-optimizing tools rather than costly waste.

Relevance: 80.00%

Abstract:

We develop a multi-agent based model to simulate a population which comprises two ethnic groups and a peacekeeping force. We investigate the effects of different strategies for civilian movement on the resulting violence in this bi-communal population. Specifically, we compare and contrast random and race-based migration strategies. Race-based migration leads to the formation of clusters. Previous work in this area has shown that same-race clustering instigates violent behavior in otherwise passive segments of the population. Our findings confirm this. Furthermore, we show that in settings where only one of the two races adopts race-based migration, it is a winning strategy, especially in violently predisposed populations. On the other hand, in relatively peaceful settings clustering is a restricting factor which causes the race that adopts it to drift into annihilation. Finally, we show that when race-based migration is adopted as a strategy by both ethnic groups, it results in peaceful co-existence even in the most violently predisposed populations.
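A minimal sketch of the two migration strategies on a toroidal grid. Grid size, occupancy and the same-group-majority rule are hypothetical simplifications of the model described above:

```python
import random

random.seed(3)

SIZE = 20  # toroidal SIZE x SIZE grid; 0 = empty, 1/2 = the two groups

def neighbours(grid, x, y):
    cells = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx or dy:
                cells.append(grid[(x + dx) % SIZE][(y + dy) % SIZE])
    return cells

def step(grid, race_based):
    """One migration round: each agent considers one random empty cell;
    race-based movers only accept destinations whose occupied neighbours
    have a same-group majority, random movers accept any empty cell."""
    empties = [(x, y) for x in range(SIZE) for y in range(SIZE) if grid[x][y] == 0]
    agents = [(x, y) for x in range(SIZE) for y in range(SIZE) if grid[x][y] != 0]
    random.shuffle(agents)
    for x, y in agents:
        if not empties:
            break
        tx, ty = random.choice(empties)
        group = grid[x][y]
        if race_based:
            near = neighbours(grid, tx, ty)
            same = sum(c == group for c in near)
            other = sum(c not in (0, group) for c in near)
            if same <= other:
                continue  # reject destinations without a same-group majority
        grid[tx][ty], grid[x][y] = group, 0
        empties.remove((tx, ty))
        empties.append((x, y))

def clustering(grid):
    """Average share of same-group agents among each agent's occupied neighbours."""
    scores = []
    for x in range(SIZE):
        for y in range(SIZE):
            if grid[x][y]:
                near = [c for c in neighbours(grid, x, y) if c]
                if near:
                    scores.append(sum(c == grid[x][y] for c in near) / len(near))
    return sum(scores) / len(scores)

def make_grid():
    return [[random.choice([0, 1, 2]) for _ in range(SIZE)] for _ in range(SIZE)]

random_grid, biased_grid = make_grid(), make_grid()
for _ in range(30):
    step(random_grid, race_based=False)
    step(biased_grid, race_based=True)
```

The `clustering` metric is the hook for the paper's question: race-based movement drives it up, random movement leaves it near the population's group mix.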

Relevance: 80.00%

Abstract:

This thesis challenges the consensual scholarly expectation of low EU impact in Central Asia. In particular, it claims that by focusing predominantly on narrow, micro-level factors, the prevailing theoretical perspectives risk overlooking less obvious aspects of the EU’s power, including structural aspects, and thus tend to underestimate the EU’s leverage in the region. Therefore, the thesis argues that a more structurally integrative and holistic approach is needed to understand the EU’s power in the region. In responding to this need, the thesis introduces a conceptual tool, which it terms ‘transnational power over’ (TNPO). Inspired by debates in IPE, in particular new realist and critical IPE perspectives, and combining these views with insights from neorealist, neo-institutionalist and constructivist approaches to EU external relations, the concept of TNPO is an analytically eclectic notion, which helps to assess the degree to which, in today’s globalised and interdependent world, the EU’s power over third countries derives from its control over a combination of material, institutional and ideational structures, making it difficult for the EU’s partners to resist the EU’s initiatives or to reject its offers. In order to trace and assess the mechanisms of EU impact across these three structures, the thesis constructs a toolbox, which centres on four analytical distinctions: (i) EU-driven versus domestically driven mechanisms, (ii) mechanisms based on rationalist logics of action versus mechanisms following constructivist logics of action, (iii) agent-based versus purely structural mechanisms of TNPO, and (iv) transnational and intergovernmental mechanisms of EU impact. Using qualitative research methodology, the thesis then applies the conceptual model to the case of EU-Central Asia.
It finds that the EU’s power over Central Asia effectively derives from its control over a combination of material, institutional and ideational structures, including its position as a leader in trade and investment in the region, its (geo)strategic and security-related capabilities vis-à-vis Central Asia, as well as the relatively dense level of institutionalisation of its relations with the five countries and the positive image of the EU in Central Asia as a more neutral actor.

Relevance: 80.00%

Abstract:

The introduction of agent technology raises several security issues that are beyond the capability and considerations of conventional security mechanisms, but research in protecting the agent from malicious host attack is evolving. This research proposes two approaches to protecting an agent from being attacked by a malicious host. The first approach consists of an obfuscation algorithm that is able to protect the confidentiality of an agent and make it more difficult for a malicious host to spy on the agent. The algorithm uses multiple polynomial functions with multiple random inputs to convert an agent's critical data to a value that is meaningless to the malicious host. The effectiveness of the obfuscation algorithm is enhanced by the addition of noise code. The second approach consists of a mechanism that is able to protect the integrity of the agent using state information, recorded during the agent execution process in a remote host environment, to detect a manipulation attack by a malicious host. Both approaches are implemented using a master-slave agent architecture that operates on a distributed migration pattern. Two sets of experimental tests were conducted. The first set measures the migration and migration+computation overheads of the itinerary and distributed migration patterns. The second set measures the security overhead of the proposed approaches. The protection of the agent is assessed by analysis of its effectiveness under known attacks. Finally, an agent-based application, known as the Secure Flight Finder Agent-based System (SecureFAS), is developed in order to demonstrate the function of the proposed approaches.
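The flavour of the first approach, hiding critical data behind polynomial functions with random inputs, can be sketched as follows. The thesis's actual algorithm is more elaborate (multiple polynomials plus noise code); every name and value here is hypothetical:

```python
import random

random.seed(7)

def obfuscate(secret, coeffs, inputs):
    """Hide `secret` inside the value of a polynomial evaluated at random
    inputs: v = secret + a1*x1 + a2*x2**2 + a3*x3**3. A host that sees only
    v and the inputs, but not the coefficients, cannot read the secret."""
    value = secret
    for power, (a, x) in enumerate(zip(coeffs, inputs), start=1):
        value += a * x ** power
    return value

def recover(value, coeffs, inputs):
    """Inverse operation, available only to the agent's owner."""
    for power, (a, x) in enumerate(zip(coeffs, inputs), start=1):
        value -= a * x ** power
    return value

coeffs = [random.randint(2, 9) for _ in range(3)]    # secret, kept by the owner
inputs = [random.randint(1, 100) for _ in range(3)]  # fresh random inputs per trip
blob = obfuscate(1250, coeffs, inputs)               # e.g. the agent's price limit
```

Fresh random inputs on each migration mean the same secret produces a different meaningless value at every host, which is what frustrates a spying host.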

Relevance: 80.00%

Abstract:

Physically based distributed models of catchment hydrology are likely to be made available as engineering tools in the near future. Although these models are based on theoretically acceptable equations of continuity, there are still limitations in the present modelling strategy. Of interest to this thesis are the current modelling assumptions made concerning the effects of soil spatial variability, including formations producing distinct zones of preferential flow. The thesis contains a review of current physically based modelling strategies and a field-based assessment of soil spatial variability. In order to investigate the effects of soil nonuniformity, a fully three-dimensional model of variably saturated flow in porous media is developed. The model is based on a Galerkin finite element approximation to Richards' equation. Access to a vector processor permits numerical solutions on grids containing several thousand node points. The model is applied to a single hillslope segment under various degrees of soil spatial variability. Such variability is introduced by generating random fields of saturated hydraulic conductivity using the turning bands method. Similar experiments are performed under conditions of preferred soil moisture movement. The results show that the influence of soil variability on subsurface flow may be less significant than suggested in the literature, due to the integrating effects of three-dimensional flow. Under conditions of widespread infiltration excess runoff, the results indicate a greater significance of soil nonuniformity. The recognition of zones of preferential flow is also shown to be an important factor in accurate rainfall-runoff modelling. Using the results of various fields of soil variability, experiments are carried out to assess the validity of the commonly used concept of 'effective parameters'. The results of these experiments suggest that such a concept may be valid in modelling subsurface flow; however, the effective parameter is observed to be event-dependent when the dominating mechanism is infiltration excess runoff.
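The 'effective parameter' idea can be illustrated with a simple stand-in for the turning-bands generator: sample a lognormal conductivity field and compare candidate effective values. (The field below is uncorrelated, unlike a true turning-bands field; the grid size and log-variance are hypothetical.)

```python
import math
import random
import statistics

random.seed(11)

# Hypothetical stand-in for a turning-bands generator: an uncorrelated
# lognormal field of saturated hydraulic conductivity K (m/day) on a 50x50 grid.
log_mean, log_sd = math.log(1.0), 0.8
field = [[math.exp(random.gauss(log_mean, log_sd)) for _ in range(50)]
         for _ in range(50)]

flat = [k for row in field for k in row]
arithmetic = statistics.mean(flat)
geometric = math.exp(statistics.mean(math.log(k) for k in flat))
# For 2-D steady uniform flow in a lognormal field, the classical effective
# conductivity is the geometric mean, which lies strictly below the
# arithmetic mean of the point values.
```

Which single value best stands in for the variable field depends on the flow regime, mirroring the thesis's finding that the effective parameter is event-dependent under infiltration excess runoff.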