914 results for Markov Decision Process
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
The focus of the present work is the well-known feature of the probability density function (PDF) transport equations in turbulent flows: the inverse parabolicity of the equations. While it is quite common in fluid mechanics to interpret equations with direct (forward-time) parabolicity as diffusive (or as a combination of diffusion, convection and reaction), the possibility of a similar interpretation for equations with inverse parabolicity is not clear. According to Einstein's point of view, a diffusion process is associated with the random walk of some physical or imaginary particles, which can be modelled by a Markov diffusion process. In the present paper it is shown that the Markov diffusion process directly associated with the PDF equation represents a reasonable model for dealing with the PDFs of scalars, but it significantly underestimates the diffusion rate required to simulate turbulent dispersion when the velocity components are considered.
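As a toy illustration of the Einstein picture invoked in this abstract (and not of the paper's actual PDF formulation), the sketch below simulates a one-dimensional Markov diffusion process as a random walk of notional particles via Euler-Maruyama integration of dX = U dt + sqrt(2D) dW. The drift and diffusivity values are arbitrary placeholders; the point is only that the ensemble variance of such a walk grows as 2Dt, which is the sense in which a Markov diffusion process models diffusion.

```python
# A minimal sketch (assumed parameters, not the paper's model) of a 1-D Markov
# diffusion process: particle positions follow dX = U dt + sqrt(2 D) dW, so the
# ensemble variance should grow as 2 * D * t.
import numpy as np

def simulate_markov_diffusion(n_particles=10000, n_steps=1000, dt=1e-3,
                              drift=0.0, diffusivity=0.1, seed=0):
    """Euler-Maruyama integration of a 1-D Markov diffusion (random walk)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_particles)   # Wiener increments
        x += drift * dt + np.sqrt(2.0 * diffusivity) * dw
    return x

positions = simulate_markov_diffusion()
elapsed = 1000 * 1e-3                      # n_steps * dt for the default arguments
print("sample variance:", positions.var(), "theory 2*D*t:", 2 * 0.1 * elapsed)
```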
Abstract:
In 3 experiments, the authors examined the role of memory for prior instances in making relative judgments in conflict detection. Participants saw pairs of aircraft either repeatedly conflict with each other or pass safely before being tested on new aircraft pairs, which varied in similarity to the training pairs. Performance was influenced by the similarity between aircraft pairs. Detection time was faster when a conflict pair resembled a pair that had repeatedly conflicted. Detection time was slower, and participants missed conflicts, when a conflict pair resembled a pair that had repeatedly passed safely. The findings identify aircraft features that are used as inputs into the memory decision process and provide an indication of the processes involved in the use of memory for prior instances to make relative judgments.
Abstract:
Flow control in computer communication systems is generally a multi-layered structure, consisting of several mechanisms operating independently at different levels. Evaluation of the performance of networks in which different flow control mechanisms act simultaneously is an important area of research, and is examined in depth in this thesis. This thesis presents the modelling of a finite-resource computer communication network equipped with three levels of flow control, based on closed queueing network theory. The flow control mechanisms considered are: end-to-end control of virtual circuits, network access control of external messages at the entry nodes, and hop-level control between nodes. The model is solved by a heuristic technique, based on an equivalent reduced network and heuristic extensions to the mean value analysis algorithm. The method has significant computational advantages and overcomes the limitations of the exact methods; it can be used to solve large network models with finite buffers and many virtual circuits. The model and its heuristic solution are validated by simulation. The interactions between the three levels of flow control are investigated. A queueing model is developed for the admission delay on virtual circuits with end-to-end control, in which messages arrive from independent Poisson sources. The selection of the optimum window limit is considered. Several advanced network access schemes are postulated to improve the performance of the network as well as that of selected traffic streams, and numerical results are presented. A model for the dynamic control of input traffic is developed. Based on Markov decision theory, an optimal control policy is formulated. Numerical results are given, and throughput-delay performance is shown to be better with dynamic control than with static control.
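For readers unfamiliar with the mean value analysis (MVA) algorithm that the thesis's heuristic solution extends, the sketch below shows exact single-class MVA for a closed product-form network of load-independent queueing stations. The per-station service demands and customer population are made-up values, and the thesis's heuristic extensions for finite buffers and multiple virtual circuits are not reproduced.

```python
# Background sketch only: exact single-class Mean Value Analysis (MVA) for a
# closed product-form queueing network. Service demands and population are
# placeholder values; the thesis's heuristic MVA extensions are not shown.
def mva(service_demands, n_customers):
    """service_demands: total service demand D_k at each queueing station.
    Returns (throughput, per-station mean queue lengths) for n_customers."""
    K = len(service_demands)
    queue_len = [0.0] * K                      # Q_k(0) = 0
    throughput = 0.0
    for n in range(1, n_customers + 1):
        # Residence time at station k with n customers in the network:
        # R_k(n) = D_k * (1 + Q_k(n - 1))
        resid = [d * (1.0 + q) for d, q in zip(service_demands, queue_len)]
        throughput = n / sum(resid)            # Little's law over one cycle
        queue_len = [throughput * r for r in resid]
    return throughput, queue_len

X, Q = mva([0.05, 0.08, 0.02], n_customers=20)
print("throughput:", X, "mean queue lengths:", Q)
```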
Abstract:
In a global environment, a company has to make many decisions that impact its position in global supply chain networks, such as outsourcing, offshoring, joint ventures and vertical/horizontal integration. All these decisions affect the company's strategic position, and hence its competitive space and performance. It is therefore important for a company to manage its strategic positioning carefully by making deliberate decisions about the adoption of alternative manufacturing and supply chain activities. Unfortunately, no complete decision process has been studied for the strategic positioning of manufacturing operations within global supply chains. The work presented in this paper has therefore investigated leading research and industrial practices to create a formal and rational decision process. An analysis of previous literature, industrial practices, and the resulting decision process are all presented in this paper.
Abstract:
Purpose – The purpose of this paper is to report on an investigation into the selection and evaluation of a suitable strategic positioning methodology for SMEs in Singapore. Design/methodology/approach – The research methodology is based on a critical review of the literature to identify the potentially most suitable strategic positioning methodology, evaluation and testing of the methodology within the context of SMEs in Singapore, and analysis to determine the strengths and weaknesses of the methodology and opportunities for further research. Findings – This paper illustrates a leading integrated strategic positioning decision-making process that has been found to be potentially suitable for SMEs in Singapore; the process is then applied and evaluated in two industrial case studies. Results in the form of strengths, weaknesses and opportunities are evaluated and discussed in detail, and further research to improve the process has been identified. Practical implications – The paper provides a practical and integrated strategic supply chain positioning methodology that enables SMEs to define their own competitive space, among other companies in the manufacturing supply chain, so as to maximize business competitiveness. Originality/value – This paper contributes to knowledge of the strategic positioning decision process and identifies further research to adapt the process for SMEs in Singapore.
Abstract:
Increasingly in the UK, companies that have traditionally considered themselves as manufacturers are being advised to now see themselves as service providers and to reconsider whether to have any production capability. A key challenge is to translate this strategy into a selection of product- and service-centred activities within the company's supply chain networks. Strategic positioning is concerned with the choice of business activities a company carries out itself, compared to those provided by suppliers, partners, distributors and even customers. In practice, strategic positioning is directly impacted by such decisions as outsourcing, off-shoring, partnering, technology innovation, acquisition and exploitation. If companies can better understand their strategic positioning, they can make more informed decisions about the adoption of alternative manufacturing and supply chain activities. Similarly, they are more likely to reject those that, like off-shoring, are currently in vogue but are highly likely to erode competitive edge and business success. Our research has developed a new concept we call 'competitive space' as a means of appreciating the strategic positioning of companies, along with a structured decision process for managing competitive space. Our ideas about competitive space, along with the decision process itself, have been developed and tested on a range of manufacturers. As more and more manufacturers are encouraged to move towards system integration and a serviceable business model, the challenge is to identify the appropriate strategic position for their organisations, or in other words, to identify their optimum competitive space for manufacture.
Abstract:
Smart grid technologies have given rise to a liberalised and decentralised electricity market, enabling energy providers and retailers to have a better understanding of the demand side and its response to pricing signals. This paper puts forward a reinforcement-learning-powered tool that helps an electricity retailer define the tariff prices it offers, in a bid to optimise its retail strategy. In a competitive market, an energy retailer aims to simultaneously increase the number of contracted customers and its profit margin. We abstract the problem of deciding on a tariff price, as faced by a retailer, as a semi-Markov decision problem (SMDP). A hierarchical reinforcement learning approach, MaxQ value function decomposition, is applied to solve the SMDP through interactions with the market. To evaluate our trading strategy, we developed a retailer agent (termed AstonTAC) that uses the proposed SMDP framework to act in an open multi-agent simulation environment, the Power Trading Agent Competition (Power TAC). An evaluation and analysis of the 2013 Power TAC finals show that AstonTAC successfully selects sell prices that attract as many customers as necessary to maximise the profit margin. Moreover, during the competition, AstonTAC was the only retailer agent performing well across all retail market settings.
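As a rough illustration of the SMDP formulation (though not of the MaxQ hierarchy AstonTAC actually uses), the sketch below applies a flat SMDP Q-learning update to a hypothetical choice among three tariff price levels. The state names, prices, reward, sojourn time and learning parameters are all assumptions made for the example.

```python
# Illustrative sketch only: a flat SMDP Q-learning update for choosing among a
# few hypothetical tariff price levels. AstonTAC actually uses the MaxQ
# hierarchical decomposition, which is not reproduced here. All states, prices,
# rewards and parameters below are invented for illustration.
import random
from collections import defaultdict

ACTIONS = [0.10, 0.12, 0.15]          # hypothetical tariff prices (currency/kWh)
GAMMA, ALPHA, EPSILON = 0.95, 0.1, 0.1
Q = defaultdict(float)                # Q[(state, action)], default 0.0

def smdp_q_update(state, action, reward, sojourn, next_state):
    """SMDP Q-learning: 'reward' is accumulated over the sojourn time tau spent
    holding the tariff, and the next state's value is discounted by gamma**tau."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    target = reward + (GAMMA ** sojourn) * best_next
    Q[(state, action)] += ALPHA * (target - Q[(state, action)])

def choose_price(state):
    """Epsilon-greedy selection over the tariff prices."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

# One hypothetical interaction: offering 0.12 in state "low_demand" for a
# 6-hour sojourn yields profit 40.0 and leads to state "high_demand".
smdp_q_update("low_demand", 0.12, reward=40.0, sojourn=6, next_state="high_demand")
print(choose_price("high_demand"))
```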
Abstract:
This paper details the development and evaluation of AstonTAC, an energy broker that successfully participated in the 2012 Power Trading Agent Competition (Power TAC). AstonTAC buys electrical energy from the wholesale market and sells it in the retail market. The main focus of the paper is on the broker's bidding strategy in the wholesale market. In particular, it employs Markov Decision Processes (MDPs) to purchase energy at low prices in a day-ahead power wholesale market while keeping energy supply and demand balanced. Moreover, we explain how the agent uses a Non-Homogeneous Hidden Markov Model (NHHMM) to forecast energy demand and prices. An evaluation and analysis of the 2012 Power TAC finals show that AstonTAC was the only agent able to buy energy at low prices in the wholesale market and keep its energy imbalance low.
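The broker's actual MDP and NHHMM models are not given in this abstract. As a generic reminder of how an MDP purchasing policy can be computed, the sketch below runs value iteration on a toy MDP with randomly generated transition probabilities and rewards; all states, actions and numbers are placeholders, not AstonTAC's model.

```python
# Minimal value-iteration sketch on a toy MDP. The states, actions, transition
# probabilities and rewards below are randomly generated placeholders; AstonTAC's
# actual day-ahead bidding MDP and its NHHMM forecasts are not reproduced here.
import numpy as np

n_states, n_actions = 4, 2                 # e.g. coarse price levels x {bid, wait}
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a, s']
R = rng.normal(size=(n_states, n_actions))                        # reward R[s, a]
gamma = 0.9

V = np.zeros(n_states)
for _ in range(500):                       # value iteration to (near) convergence
    Q = R + gamma * (P @ V)                # Q[s, a] = R[s, a] + gamma * sum_s' P[s,a,s'] * V[s']
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new
policy = Q.argmax(axis=1)                  # greedy action in each state
print("state values:", np.round(V, 3), "policy:", policy)
```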
Abstract:
2000 Mathematics Subject Classification: 60J80.
Abstract:
2000 Mathematics Subject Classification: 60J80, 60J85
Abstract:
Recognising the importance of alliance decision making in a virtual enterprise (VE), this paper proposes an analysis template to facilitate this process. The existing transaction-cost and resource-based theories in the literature are first reviewed, showing some deficiencies in both types of theory and the potential of resource-based explanations. The paper then goes on to propose a resource-based analysis template, integrating both the motives for using certain business forms and the factors explaining why different forms help achieve different objectives. Resource-combination effectiveness, management complexity and flexibility are identified as the three factors providing fundamental explanations of an organization's alliance-making decision process. The template provides a comprehensive and generic approach for analysing alliance decisions.
Abstract:
Cloud computing is a new technological paradigm offering computing infrastructure, software and platforms as a pay-as-you-go, subscription-based service. Many potential customers of cloud services require essential cost assessments to be undertaken before transitioning to the cloud. Current assessment techniques are imprecise as they rely on simplified specifications of resource requirements that fail to account for probabilistic variations in usage. In this paper, we address these problems and propose a new probabilistic pattern modelling (PPM) approach to cloud costing and resource usage verification. Our approach is based on a concise expression of probabilistic resource usage patterns translated to Markov decision processes (MDPs). Key costing and usage queries are identified and expressed in a probabilistic variant of temporal logic and calculated to a high degree of precision using quantitative verification techniques. The PPM cost assessment approach has been implemented as a Java library and validated with a case study and scalability experiments.
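The paper's PPM library and temporal-logic query syntax are not reproduced here. The sketch below merely illustrates the flavour of a "maximum expected cumulative cost" query on a small, hypothetical resource-usage MDP, computed by finite-horizon dynamic programming; the usage states, transition probabilities, per-step costs and 24-step horizon are all invented for illustration.

```python
# Toy illustration only: maximum expected cumulative cost over a finite horizon
# for a small, hypothetical resource-usage MDP -- the flavour of query that the
# paper resolves with quantitative verification. Not the paper's PPM library.
import numpy as np

# Hypothetical usage states: 0 = idle, 1 = normal load, 2 = peak load
P = np.array([                            # P[a, s, s'] for actions {keep, scale-up}
    [[0.7, 0.2, 0.1], [0.3, 0.5, 0.2], [0.1, 0.4, 0.5]],
    [[0.9, 0.1, 0.0], [0.5, 0.4, 0.1], [0.3, 0.5, 0.2]],
])
cost = np.array([[1.0, 3.0, 8.0],         # per-step cost[a, s] (e.g. $ per hour)
                 [2.0, 4.0, 6.0]])

horizon = 24                              # e.g. 24 one-hour billing steps
V = np.zeros(3)                           # expected cost-to-go from each state
for _ in range(horizon):
    Q = cost + P @ V                      # Q[a, s] = cost[a, s] + sum_s' P[a,s,s'] * V[s']
    V = Q.max(axis=0)                     # worst case over the nondeterministic choices
print("max expected 24-step cost from each usage state:", np.round(V, 2))
```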
Abstract:
The theoretical analysis and research of cultural activities have been limited, for the most part, to the study of the role the public sector plays in the funding and support of nonprofit Arts organizations. The tools used to evaluate this intervention follow a macroeconomic perspective and fail to account for microeconomic principles and assumptions that affect the behavior of these organizations. This dissertation describes, through conceptual models, the behavior of the agents involved in the artistic process and the economic sectors affected by it. The first paper deals with issues related to economic impact studies and formulates a set of guidelines that should be followed when conducting this type of study. One of the ways to assess more accurately the impact culture has on a community is by assuming that artists can re-create the public space of a blighted community and get it ready for a regeneration process. The second paper of this dissertation assumes just that and explains in detail all the cultural, political, economic and sociological interactions that are taking place in the Arts-led regeneration process in Miami Beach, Florida. The paper models the behavior of these agents by indicating what their goals and decision process mechanisms are. The results give support to the claim that the public space artists create in a city actually stimulates development. The third paper discusses the estimation of a demand function for artistic activities, specifically the New World Symphony (NWS) located in Miami Beach, Florida. The behavior of the consumers and producers of NWS concerts is modeled. The results support the notion that consumers make their decisions based, among other things, on the perceived value these concerts have. Economists engage in the analysis of the effects of cultural activities in a community since many cities rely on them for their development. The history of many communities is no longer told by their assembly lines and machinery but by their centers of entertainment, hotels and restaurants. Many cities in Europe and North America that have seen the manufacturing sector migrate to the South are trying to face the demands of the new economy by using the Arts as catalysts for development.
Abstract:
Performance-based maintenance contracts differ significantly from the material- and method-based contracts that have traditionally been used to maintain roads. Road agencies around the world have moved towards a performance-based contract approach because it offers several advantages, such as cost savings, better budgeting certainty, and better customer satisfaction through improved road services and conditions. In these contracts, payments for road maintenance are explicitly linked to the contractor successfully meeting certain clearly defined minimum performance indicators. Quantitative evaluation of the cost of performance-based contracts presents several difficulties due to the complexity of the pavement deterioration process. Based on a probabilistic analysis of failures to achieve multiple performance criteria over the length of the contract period, an effort has been made to develop a model that is capable of estimating the cost of these performance-based contracts. One of the essential functions of such a model is to predict the performance of the pavement as accurately as possible. Prediction of future pavement degradation is done using a Markov chain process, which requires estimating transition probabilities from previous deterioration rates for similar pavements. Transition probabilities were derived using historical pavement condition rating data, both for predicting pavement deterioration when there is no maintenance and for predicting pavement improvement when maintenance activities are performed. A methodological framework has been developed to estimate the cost of maintaining roads based on multiple performance criteria such as cracking, rutting, and roughness. The application of the developed model has been demonstrated via a real case study of Miami Dade Expressways (MDX), using pavement condition rating data from the Florida Department of Transportation (FDOT) for a typical performance-based asphalt pavement maintenance contract. Results indicated that the pavement performance model developed could predict pavement deterioration quite accurately. Sensitivity analysis shows that the model is very responsive to even slight changes in pavement deterioration rate and performance constraints. It is expected that the use of this model will assist highway agencies and contractors in arriving at a fair contract value for executing long-term performance-based pavement maintenance works.
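As an illustration of the kind of Markov-chain projection described above (not the calibrated FDOT/MDX model), the sketch below estimates a transition matrix from consecutive condition ratings and propagates the "no maintenance" condition distribution forward year by year; the rating sequences are placeholder data.

```python
# Illustrative sketch, not the paper's calibrated model: estimating a Markov
# transition matrix from consecutive pavement condition ratings and projecting
# the condition distribution forward under "no maintenance". The rating
# sequences below are made-up placeholder data.
import numpy as np

def estimate_transition_matrix(ratings, n_states):
    """ratings: per-section sequences of yearly condition states (0 = best)."""
    counts = np.zeros((n_states, n_states))
    for seq in ratings:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

observed = [[0, 0, 1, 1, 2, 2],            # placeholder yearly ratings per section
            [0, 1, 1, 2, 3, 3],
            [0, 0, 0, 1, 1, 2]]
P = estimate_transition_matrix(observed, n_states=4)

state = np.array([1.0, 0.0, 0.0, 0.0])     # all sections start in the best state
for year in range(1, 6):
    state = state @ P                      # propagate the condition distribution
    print(f"year {year}: condition distribution {np.round(state, 3)}")
```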