67 results for linear machine modeling
Abstract:
In the energy management of an isolated small power system, the economic scheduling of the generation units is a crucial problem, and adequate scheduling can maximize the performance of the supply. The optimal operation of a wind turbine, a solar unit, a fuel cell and a storage battery is determined by a mixed-integer linear programming model implemented in General Algebraic Modeling Systems (GAMS). A Virtual Power Producer (VPP) can optimally operate the generation units, ensuring the proper functioning of the equipment, including maintenance, operation cost and generation measurement and control. A central control system allows the VPP to manage the optimal generation and the associated load control. The application of the methodology to a real case study at Budapest Tech demonstrates its effectiveness in solving the optimal isolated dispatch of the DC micro-grid renewable energy park. The problem converged in 0.09 s and 30 iterations.
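The abstract does not reproduce the model itself; the sketch below is only a minimal illustration of this kind of isolated-dispatch MILP, written with the open-source PuLP library rather than the GAMS/CPLEX implementation used in the paper, and all unit capacities, costs and load values are assumptions made for the example:

# Minimal sketch of an isolated micro-grid dispatch MILP (illustrative data,
# not the GAMS/CPLEX model used in the paper; storage is omitted here).
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

T = range(4)                               # assumed 4-period horizon
load = [3.0, 4.5, 5.0, 2.5]                # assumed demand per period (kW)
units = {                                  # assumed capacities and marginal costs
    "wind":      {"pmax": 2.0, "cost": 0.0},
    "solar":     {"pmax": 1.5, "cost": 0.0},
    "fuel_cell": {"pmax": 4.0, "cost": 0.12},
}

prob = LpProblem("isolated_dispatch", LpMinimize)
p = {(u, t): LpVariable(f"p_{u}_{t}", 0, units[u]["pmax"]) for u in units for t in T}
on = {(u, t): LpVariable(f"on_{u}_{t}", cat=LpBinary) for u in units for t in T}

# Objective: total generation cost over the horizon.
prob += lpSum(units[u]["cost"] * p[u, t] for u in units for t in T)

for u in units:
    for t in T:
        # A unit can only produce when it is committed (binary on/off decision).
        prob += p[u, t] <= units[u]["pmax"] * on[u, t]
for t in T:
    # Supply must meet the assumed load in every period.
    prob += lpSum(p[u, t] for u in units) == load[t]

prob.solve(PULP_CBC_CMD(msg=False))
for (u, t), var in sorted(p.items()):
    print(u, t, var.value())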
Abstract:
In Angola, only about 30% of the population has access to electricity, a figure that drops below 10% in the more remote rural areas. The problem is aggravated by the fact that, in most cases, the existing infrastructure is damaged or has not kept pace with the development of the region. This is particularly true in the Angolan capital, Luanda, which, although the smallest province of Angola, currently has the highest population density. With a population of about 5 million inhabitants, there are not only frequent failures of the electricity supply but also a considerable share of municipalities that the grid has not yet reached at all. The Angolan government, in its effort to grow and exploit the country's enormous potential, has defined the energy sector as one of the critical factors for the sustainable development of the country, making it one of the priority axes up to 2016. There are clear objectives regarding the rehabilitation and expansion of the infrastructure of the electricity sector, increasing the country's installed capacity and creating an adequate national grid, with the aim not only of improving the quality and reliability of the existing network but also of extending it. This dissertation consisted of gathering real data on the electricity distribution network of Luanda, analysing and planning the most pressing expansion needs, choosing the locations where it is feasible to site new substations, modelling the real problem adequately and proposing an optimal solution for the expansion of the existing network. After analysing different mathematical models applied to the distribution network expansion problem found in the literature, a mixed-integer linear programming (MILP) model was chosen and proved adequate. Once the model of the problem was developed, it was solved using the optimization software Analytic Solver and CPLEX. To validate the results obtained, the network solution was implemented in the PowerWorld 8.0 OPF simulator, software that allows the power flow operation of the system to be simulated.
Abstract:
The theory of fractional calculus goes back to the beginning of the theory of differential calculus, but its inherent complexity postponed the application of the associated concepts. In the last decade, progress in the areas of chaos and fractals has revealed subtle relationships with fractional calculus, leading to an increasing interest in the development of the new paradigm. In the area of automatic control, preliminary work has already been carried out, but the proposed algorithms are restricted to the frequency domain. The paper discusses the design of fractional-order discrete-time controllers. The algorithms studied adopt the time domain, which makes them suited to z-transform analysis and discrete-time implementation. The performance of discrete-time fractional-order controllers with linear and non-linear systems is also investigated.
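Discrete-time fractional-order algorithms of this kind are commonly built on truncated Grunwald-Letnikov expansions of the fractional derivative; the sketch below illustrates that approximation only, with an assumed order, sampling period and test signal, and is not the controller design proposed in the paper:

# Minimal sketch: truncated Grunwald-Letnikov approximation of a fractional
# derivative of order alpha, a usual starting point for discrete-time
# fractional-order controllers (illustrative parameters, not the paper's design).
import math

def gl_coefficients(alpha: float, n_terms: int) -> list[float]:
    """Coefficients c_k = (-1)^k * binom(alpha, k), via the standard recursion."""
    c = [1.0]
    for k in range(1, n_terms):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def gl_derivative(signal: list[float], alpha: float, T: float) -> float:
    """Approximate D^alpha of `signal` at its last sample, with sampling period T."""
    c = gl_coefficients(alpha, len(signal))
    return sum(ck * signal[-1 - k] for k, ck in enumerate(c)) / T**alpha

# Example: half-order derivative of f(t) = t, sampled every 0.01 s.
T = 0.01
samples = [k * T for k in range(200)]
print(gl_derivative(samples, alpha=0.5, T=T))
# Analytical value for f(t) = t is 2*sqrt(t/pi), printed here for comparison.
print(2 * math.sqrt(samples[-1] / math.pi))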
Abstract:
This paper proposes a novel framework for modelling the Value for the Customer, the so-called Conceptual Model for Decomposing Value for the Customer (CMDVC). This conceptual model is first validated through an exploratory case study in which the authors validate both the proposed constructs of the model and their relations. In a second step, the authors propose a mathematical formulation for the CMDVC as well as a computational method. This enabled the final quantitative discussion of how the CMDVC can be applied and used in the enterprise environment, and the final validation by the people in the enterprise. Throughout this research, we were able to confirm that the results of this novel quantitative approach to modelling the Value for the Customer are consistent with the company's empirical experience. The paper further discusses the merits and limitations of this approach, proposing that the model is likely to bring value in supporting not only contract preparation in an Ex-Ante Negotiation Phase, as demonstrated, but also the actual negotiation process, as finally confirmed by an enterprise testimonial.
Abstract:
The paper proposes a methodology to increase the probability of delivering power to any load point by identifying new investments in distribution energy systems. The proposed methodology is based on statistical failure and repair data of distribution components and uses fuzzy-probabilistic modeling of the components' outage parameters. The fuzzy membership functions of the outage parameters of each component are based on statistical records. A mixed-integer nonlinear programming optimization model is developed in order to identify the adequate investments in distribution energy system components that increase the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study considering a 180-bus distribution network.
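The membership functions referred to are derived from statistical outage records; the sketch below shows one simple way such a triangular fuzzy membership function could be built from a handful of recorded failure rates, with all numbers invented for illustration rather than taken from the paper:

# Minimal sketch: a triangular fuzzy membership function for a component's
# failure rate, built from statistical records (illustrative values only).
def triangular_membership(x: float, low: float, mode: float, high: float) -> float:
    """Degree of membership of x in the triangular fuzzy number (low, mode, high)."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)
    return (high - x) / (high - mode)

# Assumed statistical record of failures per year for one feeder section.
records = [0.08, 0.10, 0.11, 0.12, 0.15]
low, mode, high = min(records), sorted(records)[len(records) // 2], max(records)

for rate in (0.09, 0.11, 0.14):
    print(rate, triangular_membership(rate, low, mode, high))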
Abstract:
The management of energy resources for islanded operation is of crucial importance for the successful use of renewable energy sources. A Virtual Power Producer (VPP) can optimally operate the resources, taking into account maintenance, operation and load control and considering all the involved costs. This paper presents a methodology to formulate and solve the problem of determining the optimal resource allocation, applied to a real case study at Budapest Tech. The problem is formulated as a mixed-integer linear programming (MILP) model and solved by a CPLEX-based deterministic optimization technique implemented in General Algebraic Modeling Systems (GAMS). The problem has also been solved by Evolutionary Particle Swarm Optimization (EPSO). The obtained results are presented and compared.
Abstract:
In the energy management of a small power system, the scheduling of the generation units is a crucial problem for which adequate methodologies can maximize the performance of the energy supply. This paper proposes an innovative methodology for distributed energy resources management. The optimal operation of distributed generation, demand response and storage resources is formulated as a mixed-integer linear programming (MILP) model and solved by a CPLEX-based deterministic optimization technique implemented in General Algebraic Modeling Systems (GAMS). The paper presents a vision for the grids of the future, focusing on conceptual and operational aspects of electrical grids characterized by an intensive penetration of DG, in the scope of competitive environments, and using artificial intelligence methodologies to attain the envisaged goals. These concepts are implemented in a computational framework which includes both grid and market simulation.
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool, allowing them to consider all the business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this, decision support tools must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
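Of the two approaches mentioned, the Linear Programming one lends itself to a very small illustration: the sketch below dispatches two of the listed services from a handful of bids at minimum cost using SciPy's linprog; the bid prices, capacities and service requirements are assumptions made for the example and are unrelated to MASCEM data.

# Minimal sketch of the LP side of an ancillary-services dispatch: procure the
# required amount of each service at minimum cost from the agents' bids
# (illustrative bids and requirements, not MASCEM data).
from scipy.optimize import linprog

services = ["RegUp", "Spin"]
requirement = {"RegUp": 20.0, "Spin": 35.0}             # MW needed per service
bids = [                                                # (agent, service, price, capacity)
    ("A1", "RegUp", 12.0, 15.0), ("A2", "RegUp", 15.0, 25.0),
    ("A1", "Spin",   8.0, 20.0), ("A3", "Spin",  10.0, 30.0),
]

cost = [price for _, _, price, _ in bids]
bounds = [(0.0, cap) for _, _, _, cap in bids]
# One equality row per service: accepted quantities must cover the requirement.
A_eq = [[1.0 if b[1] == s else 0.0 for b in bids] for s in services]
b_eq = [requirement[s] for s in services]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
for (agent, service, price, _), q in zip(bids, res.x):
    print(f"{agent} {service}: {q:.1f} MW at {price}/MW")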
Abstract:
In the context of previous publications, we propose a new lightweight UM process, intended to work as a tourism recommender system in a commercial environment. The new process tackles issues such as cold start, gray sheep and over-specialization through a rich user model and the application of a gradual forgetting function to the collected user action history. Significant performance improvements were also achieved with respect to the previously proposed UM process.
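A gradual forgetting function of the kind mentioned is usually a decay weight applied to each recorded user action according to its age; the sketch below shows one such exponential scheme, where the half-life and the sample history are assumptions for illustration, not the function used in the paper:

# Minimal sketch of a gradual forgetting function: older user actions get
# exponentially smaller weights when building the user model (the half-life
# below is an illustrative assumption).
from datetime import datetime, timedelta

def forgetting_weight(action_time: datetime, now: datetime, half_life_days: float = 30.0) -> float:
    """Weight in (0, 1] that halves every `half_life_days`."""
    age_days = (now - action_time).total_seconds() / 86400.0
    return 0.5 ** (age_days / half_life_days)

now = datetime(2024, 1, 1)
history = [                                     # assumed toy user action history
    ("viewed museum page", now - timedelta(days=2)),
    ("booked beach hotel", now - timedelta(days=90)),
]
for action, when in history:
    print(action, round(forgetting_weight(when, now), 3))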
Abstract:
The paper introduces an approach to solve the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System based algorithm is validated with benchmark problems available in the OR library. The obtained results were compared with the best available results and were found to be close to the optimal. The computational results obtained allow conclusions to be drawn about the algorithm's efficiency and effectiveness.
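The objective being minimized is the total weighted tardiness of a job sequence; the sketch below evaluates that objective on a toy instance and builds sequences with a simple pheromone-biased rule in the spirit of an Ant Colony System, but omits the pheromone updates and local search a full ACS would use; the jobs and parameters are invented for illustration.

# Minimal sketch: total weighted tardiness of a job sequence on a single machine,
# plus a pheromone-biased construction step (toy instance, not the paper's ACS).
import random

jobs = {  # job: (processing_time, due_date, weight) -- assumed toy instance
    1: (4, 6, 2), 2: (3, 4, 1), 3: (5, 12, 3),
}

def total_weighted_tardiness(sequence):
    t, twt = 0, 0
    for j in sequence:
        p, d, w = jobs[j]
        t += p
        twt += w * max(0, t - d)
    return twt

def construct_sequence(pheromone, beta=2.0):
    """Build one sequence, choosing each next job with probability proportional
    to pheromone * (1/due_date)^beta, a common heuristic for tardiness problems."""
    remaining, seq = set(jobs), []
    while remaining:
        cand = list(remaining)
        scores = [pheromone[j] * (1.0 / jobs[j][1]) ** beta for j in cand]
        j = random.choices(cand, weights=scores, k=1)[0]
        seq.append(j)
        remaining.remove(j)
    return seq

pheromone = {j: 1.0 for j in jobs}    # uniform initial pheromone
best = min((construct_sequence(pheromone) for _ in range(50)), key=total_weighted_tardiness)
print(best, total_weighted_tardiness(best))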
Abstract:
This paper presents a multi-agent model for a simulation whose goal is to help one specific participant of a multi-criteria group decision making process. This model has five main types of intervenients: the human participant, who is using the simulation and argumentation support system; the participant agents, one associated with the human participant and the others simulating the other human members of the decision meeting group; the directory agent; the proposal agents, representing the different alternatives for a decision (the alternatives are evaluated based on criteria); and the voting agent, responsible for all voting mechanisms. At this stage, a two-phase algorithm is proposed. In the first phase, each participant agent makes its own evaluation of the proposals under discussion, and the voting agent proposes a simulation of a voting process. In the second phase, after the dissemination of the voting results, each one of the participant agents will argue to convince the others to choose one of the possible alternatives. The arguments used to convince a specific participant depend on the agent's knowledge about that participant. This two-phase algorithm is applied iteratively.
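As a minimal illustration of the first (voting) phase described above, the sketch below has each participant agent score the proposals on the decision criteria with its own weights and vote for its preferred one; the criteria, weights and scores are invented for the example and do not come from the paper.

# Minimal sketch of the voting phase: each participant agent evaluates the
# proposals on the criteria and votes; the voting agent tallies the result
# (illustrative criteria, weights and scores).
criteria = ["cost", "quality"]
proposals = {                # criterion scores per proposal (assumed 0-10 scale)
    "P1": {"cost": 8, "quality": 5},
    "P2": {"cost": 4, "quality": 9},
}
agents = {                   # each participant agent's criterion weights
    "agent_A": {"cost": 0.7, "quality": 0.3},
    "agent_B": {"cost": 0.2, "quality": 0.8},
    "agent_C": {"cost": 0.5, "quality": 0.5},
}

def vote(weights):
    score = lambda p: sum(weights[c] * proposals[p][c] for c in criteria)
    return max(proposals, key=score)

tally = {}
for name, weights in agents.items():
    choice = vote(weights)
    tally[choice] = tally.get(choice, 0) + 1
print(tally)   # result the voting agent would broadcast before the argumentation phase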
Abstract:
Master's degree in Informatics Engineering
Abstract:
Introduction: A major focus of the data mining process, and especially of machine learning research, is to automatically learn to recognize complex patterns and help to take adequate decisions strictly based on the acquired data. Since imaging techniques like MPI (Myocardial Perfusion Imaging) in Nuclear Cardiology can account for a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data could have advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI Stress studies and in the decision-making process concerning the continuation, or not, of the evaluation of each patient. The objective was to automatically classify each patient test into one of three groups: “Positive”, “Negative” and “Indeterminate”. “Positive” patients would proceed directly to the Rest part of the exam, “Negative” patients would be directly exempted from continuation, and only the “Indeterminate” group would require the clinician's analysis, thus saving the clinician's effort, increasing workflow fluidity at the technologist's level and probably sparing time for patients. Methods: The WEKA v3.6.2 open source software was used to make a comparative analysis of three WEKA algorithms (“OneR”, “J48” and “Naïve Bayes”) in a retrospective study, using as reference the corresponding clinical results signed by nuclear cardiologist experts, on the “SPECT Heart Dataset” available from the University of California, Irvine, Machine Learning Repository. For evaluation purposes, criteria such as “Precision”, “Incorrectly Classified Instances” and “Receiver Operating Characteristic (ROC) Areas” were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed, and apparently supported by the findings, that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of scintigraphic data obtained with MPI, namely after Stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both Technologists and Nuclear Cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
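The study itself used WEKA's OneR, J48 and Naïve Bayes; the sketch below only reproduces the shape of such a comparison with scikit-learn analogues (a majority-class baseline in place of OneR, DecisionTreeClassifier in place of J48, BernoulliNB for Naïve Bayes on binary features), and the CSV path, column layout and 0/1 labels are placeholder assumptions rather than the actual study setup.

# Minimal sketch of the kind of algorithm comparison described above, using
# scikit-learn analogues instead of the WEKA implementations actually used.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import BernoulliNB
from sklearn.dummy import DummyClassifier

# Assumed layout: binary SPECT features plus a final 0/1 "label" column.
data = pd.read_csv("spect_heart.csv")            # placeholder path
X, y = data.drop(columns="label"), data["label"]

models = {
    "baseline (majority class)": DummyClassifier(strategy="most_frequent"),
    "decision tree (J48 analogue)": DecisionTreeClassifier(random_state=0),
    "naive Bayes": BernoulliNB(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"{name}: mean ROC AUC = {scores.mean():.3f}")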
Abstract:
This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well-known User Models; to analyze the existing standards related to UM; and to compare existing systems. In the scientific area of User Modeling (UM), numerous research and developed systems already seem to promise good results, but experimentation and implementation of these systems are still too scarce to draw conclusions about the utility of the UM and of some of the referred applications. At present, Student Modeling research is moving towards making it possible to reuse a student model in different systems. Standards are more and more relevant for this purpose, allowing systems to communicate and to share data, components and structures at the syntactic and semantic levels, even if most of them still only allow syntactic integration.