898 results for Multi-access systems
Abstract:
When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision-making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but run, and the changes of system state can be observed at any point in time. This provides insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision-making tool but a decision support tool, allowing better-informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight and to test new theories and practices without disrupting the daily routine of the focal organisation. What one can expect to gain from a simulation study is well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, the model allows one to answer some of the following questions:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?
The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. Such predictions involve showing trends rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best-practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to introduce the newcomer to what we think is a valuable addition to the toolset of analysts and decision makers. We give a summary of information we have gathered from the literature and of the experience we have gained first-hand during the last five years, whilst obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered. Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system.
Section 6 provides a collection of resources for further studies and finally in Section 7 we will conclude the chapter with a short summary.
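The abstract's core idea – a simulation model as a set of rules that update the system state over time, which is run rather than solved – can be illustrated with a minimal sketch. The discrete-time single-server queue below is an invented toy example, not a model from the chapter:

```python
import random

def simulate_queue(steps, arrival_prob, service_prob, seed=0):
    """Minimal discrete-time stochastic simulation: at each step a customer
    may arrive and, if the queue is non-empty, one may be served. The rules
    update the state (queue length) given its current value."""
    rng = random.Random(seed)
    queue = 0
    trace = []
    for _ in range(steps):
        if rng.random() < arrival_prob:                # stochastic arrival
            queue += 1
        if queue > 0 and rng.random() < service_prob:  # stochastic service
            queue -= 1
        trace.append(queue)  # the state can be observed at any point in time
    return trace

trace = simulate_queue(1000, arrival_prob=0.3, service_prob=0.5)
```

Rather than solving for the queue-length distribution analytically, the model is run and the state trajectory is observed – the distinction the abstract draws between simulation and analytical models.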
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based, but human understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – or the disregard for the fact that instances can be conceptualized independent of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins both the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. 
Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling through the layered information model can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence like the one Parsons and Wand propose for information modeling.
Abstract:
Multi-phase electrical drives are potential candidates for employment in innovative electric vehicle powertrains, in response to the demand for high efficiency and reliability in this type of application. In addition to multi-phase technology, multilevel technology has also been developed in recent decades. These two technologies are somewhat complementary, since both allow increasing the power rating of the system without increasing the current and voltage ratings of the single power switches of the inverter. In this thesis, several topics concerning the inverter, the motor and the fault diagnosis of an electric vehicle powertrain are addressed. In particular, the attention is focused on multi-phase and multilevel technologies and their potential advantages with respect to traditional technologies. First of all, the mathematical models of two multi-phase machines, a five-phase induction machine and an asymmetrical six-phase permanent magnet synchronous machine, are developed using the Vector Space Decomposition approach. Then, a new modulation technique for multi-phase multilevel T-type inverters is developed, which solves the voltage balancing problem of the DC-link capacitors while ensuring flexible management of the capacitor voltages. The technique is based on the proper selection of the zero-sequence component of the modulating signals. Subsequently, a diagnostic technique for detecting the state of health of the rotor magnets in a six-phase permanent magnet synchronous machine is established. The technique is based on analysing the electromotive force induced in the stator windings by the rotor magnets. Furthermore, an innovative algorithm able to extend the linear modulation region for five-phase inverters, taking advantage of the multiple degrees of freedom available in multi-phase systems, is presented. Finally, the mathematical model of an eighteen-phase squirrel cage induction motor is defined. This activity aims to develop a motor drive able to change the number of poles of the machine during operation.
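The zero-sequence degree of freedom mentioned in the abstract exists because a common offset added to all phase modulating signals does not change the line-to-line voltages. The sketch below shows only the classic min-max ("centring") choice of offset as an illustration; the thesis selects the zero-sequence component specifically to balance the DC-link capacitor voltages, which this sketch does not attempt:

```python
def inject_zero_sequence(m):
    """Add a common zero-sequence offset to all phase modulating signals.
    Min-max injection centres the signals in the linear range; line-to-line
    quantities are unaffected, so the offset is a free parameter."""
    offset = -(max(m) + min(m)) / 2.0
    return [x + offset for x in m]

# five-phase example: reference signals before and after injection
reference = [0.9, 0.2, -0.5, -0.1, 0.4]
centred = inject_zero_sequence(reference)
```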
Abstract:
Squeezed light is of interest as an example of a non-classical state of the electromagnetic field and because of its applications both in technology and in fundamental quantum physics. This review concentrates on one aspect of squeezed light, namely its application in atomic spectroscopy. The general properties, detection and application of squeezed light are first reviewed. The basic features of the main theoretical methods (master equations, quantum Langevin equations, coupled systems) used to treat squeezed light spectroscopy are then outlined. The physics of squeezed light interactions with atomic systems is dealt with first for the simpler case of two-level atoms and then for the more complex situation of multi-level atoms and multi-atom systems. Finally the specific applications of squeezed light spectroscopy are reviewed.
Abstract:
This paper presents a means of structuring specifications in real-time Object-Z: an integration of Object-Z with the timed refinement calculus. Incremental modification of classes using inheritance and composition of classes to form multi-component systems are examined. Two approaches to the latter are considered: using Object-Z's notion of object instantiation and introducing a parallel composition operator similar to those found in process algebras. The parallel composition operator approach is both more concise and allows more general modelling of concurrency. Its incorporation into the existing semantics of real-time Object-Z is presented.
Abstract:
The phase and microstructural evolution of multi-cation Sm-Ca-alpha-sialon ceramics was investigated. Six samples were prepared, ranging from a pure Sm-sialon to a pure Ca-sialon, with calcium replacing samarium in 20 eq% increments, thus maintaining an equivalent design composition in all samples. After pressureless sintering at 1820 °C for 2 h, all samples were subsequently heat treated for up to 192 h at 1450 and 1300 °C. The amount of grain boundary glass in the samples after sintering was observed to decrease with increasing calcium levels. An M'ss or M'ss-gehlenite solid solution was observed to form during the 1450 °C heat treatment of all Sm-containing samples, and this phase forms in clusters in the high-Sm samples. The thermal stability of the alpha-sialon phase was improved in the multi-cation systems. Heat treatment at 1300 °C produces SmAlO3 in the high-Sm samples, an M'ss-gehlenite solid solution in the high-Ca samples, and a Sm-Ca-apatite phase in some intermediate samples. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Measurement of the exchange of substances between blood and tissue has been a long-standing challenge to physiologists, and considerable theoretical and experimental accomplishments were achieved before the development of positron emission tomography (PET). Today, when modeling data from modern PET scanners, little use is made of earlier microvascular research in compartmental models, which have become the standard by which the vast majority of dynamic PET data are analysed. However, modern PET scanners provide data with a sufficient temporal resolution and good counting statistics to allow estimation of parameters in models with more physiological realism. We explore the standard compartmental model and find that incorporation of blood flow leads to paradoxes, such as kinetic rate constants being time-dependent, and tracers being cleared from a capillary faster than they can be supplied by blood flow. The inability of the standard model to incorporate blood flow consequently raises a need for models that include more physiology, and we develop microvascular models which remove the inconsistencies. The microvascular models can be regarded as a revision of the input function. Whereas the standard model uses the organ inlet concentration as the concentration throughout the vascular compartment, we consider models that make use of spatial averaging of the concentrations in the capillary volume, which is what the PET scanner actually registers. The microvascular models are developed for both single- and multi-capillary systems and include the effects of non-exchanging vessels. They are suitable for analysing dynamic PET data from any capillary bed using either intravascular or diffusible tracers, in terms of physiological parameters which include regional blood flow. (C) 2003 Elsevier Ltd. All rights reserved.
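For context, the standard one-tissue compartmental model the paper criticises can be written as dCt/dt = K1·Ca(t) − k2·Ct(t), with the organ inlet concentration Ca used throughout the vascular compartment. The forward-Euler sketch below is a generic illustration with invented parameter values, not the authors' microvascular model:

```python
import math

def one_tissue_model(ca, dt, K1, k2):
    """Forward-Euler integration of the standard one-tissue compartment
    model dCt/dt = K1*Ca(t) - k2*Ct(t). `ca` is the arterial input
    function sampled every `dt` (e.g. minutes)."""
    ct = 0.0
    tissue = []
    for a in ca:
        ct += dt * (K1 * a - k2 * ct)  # one Euler step of the rate equation
        tissue.append(ct)
    return tissue

# toy input function: exponentially decaying bolus
ca = [math.exp(-0.1 * i) for i in range(100)]
tissue = one_tissue_model(ca, dt=0.1, K1=0.5, k2=0.2)
```

The microvascular models revise the input function by replacing the inlet concentration Ca with a spatial average over the capillary volume – the quantity the PET scanner actually registers.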
Abstract:
Metaheuristic performance is highly dependent on parameters, which need to be tuned. Parameter tuning may allow greater flexibility and robustness but requires careful initialization, and the process of deciding which parameter settings should be used is not obvious. The values for the parameters depend mainly on the problem, the instance to be solved, the search time available for solving the problem, and the required solution quality. This paper presents a learning module proposal for the autonomous parameterization of metaheuristics, integrated into a Multi-Agent System for the resolution of Dynamic Scheduling problems. The proposed learning module is inspired by the Autonomic Computing Self-Optimization concept, which states that systems must continuously and proactively improve their performance. The learning is implemented with Case-based Reasoning, which uses previous similar data to solve new cases, on the assumption that similar cases have similar solutions. After a literature review of the topics used, both the AutoDynAgents system and the Self-Optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated, the results obtained are compared with previous ones, some conclusions are reached, and future work is outlined. We expect this proposal to be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
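The retrieval-and-reuse step of Case-based Reasoning described above can be sketched in a few lines. The case features and parameter names below are invented for illustration; the AutoDynAgents module will have its own case representation:

```python
def retrieve_parameters(new_features, case_base):
    """CBR retrieval/reuse: pick the stored case whose problem features are
    closest (Euclidean distance) and reuse its parameter setting, on the
    assumption that similar cases have similar solutions."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best = min(case_base, key=lambda c: dist(c["features"], new_features))
    return best["parameters"]

# hypothetical cases: features = (number of jobs, number of machines)
case_base = [
    {"features": (10, 3),  "parameters": {"pop_size": 50,  "mutation": 0.10}},
    {"features": (200, 8), "parameters": {"pop_size": 200, "mutation": 0.02}},
]
# a new instance with 12 jobs on 4 machines reuses the nearest case
params = retrieve_parameters((12, 4), case_base)
```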
Abstract:
This paper presents a swarm-based cooperation mechanism for scheduling optimization. We intend to conceptualize real manufacturing systems as interacting autonomous entities in order to support decision making in agile manufacturing environments. Agents coordinate their actions automatically, without human supervision, towards a common objective – a global scheduling solution – taking advantage of the collective behavior of species through implicit and explicit cooperation. The performance of the cooperation mechanism is evaluated considering, in a first stage, implicit cooperation through the ACS, PSO and ABC algorithms, and then explicit cooperation through the application of the cooperation mechanism.
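Implicit cooperation in the sense used above – individuals sharing information through a common best-so-far solution – is the core of PSO. The minimal one-dimensional PSO below is a generic textbook sketch with arbitrary coefficients, not the paper's cooperation mechanism:

```python
import random

def pso(f, n_particles=20, iters=100, seed=1):
    """Minimal 1-D particle swarm optimisation: particles cooperate
    implicitly by sharing the best position found by the swarm (gbest)."""
    rng = random.Random(seed)
    xs = [rng.uniform(-10.0, 10.0) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]            # each particle's personal best position
    gbest = min(xs, key=f)   # swarm-wide best: the shared knowledge
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (0.7 * vs[i]
                     + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest

best = pso(lambda x: (x - 3.0) ** 2)  # minimum at x = 3
```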
Abstract:
A novel approach to scheduling resolution that combines Autonomic Computing (AC), Multi-Agent Systems (MAS), Case-based Reasoning (CBR), and Bio-Inspired Optimization Techniques (BIT) is described. AC has emerged as a paradigm aiming to endow applications with a management structure similar to the central nervous system, with the main intentions of improving resource utilization and service quality. In this paper we envisage the use of the MAS paradigm for supporting dynamic and distributed scheduling in Manufacturing Systems with AC properties, in order to reduce the complexity of managing manufacturing systems and the need for human interference. The proposed CBR-based Intelligent Scheduling System was evaluated under different dynamic manufacturing scenarios.
Abstract:
This chapter addresses the resolution of dynamic scheduling by means of meta-heuristics and multi-agent systems. Scheduling is an important aspect of automation in manufacturing systems. Several contributions have been proposed, but the problem is far from being solved satisfactorily, especially when scheduling concerns real-world applications. The proposed multi-agent scheduling system assumes the existence of several resource agents (decision-making entities based on meta-heuristics) distributed inside the manufacturing system that interact with other agents in order to obtain optimal or near-optimal global performance.
Abstract:
Electricity Markets are not only a new reality but an evolving one, as the involved players and rules change at a relatively high rate. Multi-agent simulation combined with Artificial Intelligence techniques may result in sophisticated tools that are very helpful in this context. Some simulation tools have already been developed, several of them very interesting. However, at the present stage it is important to take a step forward in Electricity Market simulators, as this is crucial for facing the changes in Power Systems. This paper explains the context and needs of electricity market simulation, describing the most important characteristics of the available simulators. We present our work on the MASCEM simulator, presenting its features as well as the improvements being made to cope with the changing and challenging reality of Electricity Markets.
Abstract:
A manufacturing system has a naturally dynamic nature, observed through several kinds of random occurrences and perturbations in working conditions and requirements over time. In this kind of environment, the ability to efficiently and effectively adapt existing schedules on a continuous basis, in response to such disturbances while keeping performance levels, is important. The application of Meta-Heuristics and Multi-Agent Systems to the resolution of this class of real-world scheduling problems seems very promising. This paper presents a prototype of MASDScheGATS (Multi-Agent System for Distributed Manufacturing Scheduling with Genetic Algorithms and Tabu Search).
Abstract:
Competitive electricity markets are complex environments involving a large number of different entities, playing in a dynamic scene to obtain the best advantages and profits. MASCEM is an electricity market simulator able to model market players and simulate their operation in the market. As market players are complex entities, each having its own characteristics and objectives, making its own decisions and interacting with other players, a multi-agent architecture is used and has proved to be adequate. MASCEM players have learning capabilities and different risk preferences. They are able to refine their strategies according to their past experience (both real and simulated) and considering other agents' behavior. Each agent's behavior is also subject to its risk preferences.
Abstract:
Electricity market players operating in a liberalized environment require adequate decision support tools that allow them to consider all the business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. This paper deals with short-term prediction of the day-ahead spinning reserve (SR) requirement, which helps the ISO to make effective and timely decisions. Based on this forecast information, market participants can bid strategically in the day-ahead SR market. The proposed concepts and methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A case study based on California ISO (CAISO) data is included; the forecast results are presented and compared with the CAISO published forecast.
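As a point of comparison for any SR requirement forecaster, a naive day-ahead baseline predicts each hour of the next day as the mean of the same hour over recent days. This is only an invented placeholder baseline, not the forecasting method used in the paper:

```python
def same_hour_average_forecast(hourly_history, days=7):
    """Naive day-ahead forecast: each of the 24 hours of the next day is
    predicted as the mean of that hour over the previous `days` days."""
    assert len(hourly_history) >= days * 24
    recent = hourly_history[-days * 24:]
    # recent[h::24] collects the value of hour h on each of the `days` days
    return [sum(recent[h::24]) / days for h in range(24)]

# toy history: 14 days at a constant 500 MW spinning reserve requirement
forecast = same_hour_average_forecast([500.0] * (14 * 24))
```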