65 results for Real data
at Instituto Politécnico do Porto, Portugal
Abstract:
The study of electricity market operation has been gaining increasing importance in recent years, as a result of the new challenges brought by the restructuring process. Currently, a large amount of information concerning electricity markets is available, as market operators provide, after a confidentiality period, data regarding market proposals and transactions. These data can be used as a source of knowledge to define realistic scenarios, which are essential for understanding and forecasting electricity market behaviour. The development of tools able to extract, transform, store and dynamically update data is of great importance to go a step further in the comprehension of electricity markets and of the behaviour of the involved entities. In this paper, an adaptable tool capable of downloading, parsing and storing data from market operators' websites is presented, ensuring constant updating and reliability of the stored data.
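To make the kind of download-parse-store cycle described above concrete, the following is a minimal sketch in Python, assuming a hypothetical market operator URL and CSV column names (hour, unit, price, energy); it is an illustration only, not the tool presented in the paper.

```python
# Minimal sketch of a download-parse-store cycle, assuming the market
# operator publishes daily results as CSV at a hypothetical URL.
import csv
import io
import sqlite3
import urllib.request

SOURCE_URL = "https://example-market-operator.org/results/2023-01-01.csv"  # hypothetical

def fetch_csv(url: str) -> list[dict]:
    """Download a CSV file and parse it into a list of row dictionaries."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))

def store_rows(rows: list[dict], db_path: str = "market.db") -> None:
    """Store the parsed proposals, replacing rows for the same hour/unit."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS proposals "
        "(hour INTEGER, unit TEXT, price REAL, energy REAL, "
        "PRIMARY KEY (hour, unit))"
    )
    con.executemany(
        "INSERT OR REPLACE INTO proposals VALUES (:hour, :unit, :price, :energy)",
        rows,
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    store_rows(fetch_csv(SOURCE_URL))
```

Scheduling this script periodically would keep the stored data constantly updated, which is the property the abstract emphasizes.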
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test a hardware product under development. Although this is an interesting solution (low cost and an easy, fast way to carry out course work), it also has major disadvantages. As everything is currently done with or in a computer, students are losing the "feel" for the real values of the magnitudes involved. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithmics, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in most cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that allows it to be used in different courses where computers support the teaching/learning process, in order to give students a more realistic feeling by using real data. The framework is based on a set of low cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are connected to a central server that the students access through an Ethernet protocol, or connected directly to the student's computer/laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, the students are encouraged to use the sensor values in their different courses and, consequently, in different types of software such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools. This allows sensors that are not available in a given school to be used by getting the values from other places that share them. Another remark is that students in the more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronics development to build new sensor pieces and expand the framework further. The final solution is very interesting: low cost, simple to develop and flexible in its use of resources, since the same materials can be used in several courses, bringing real world data into the students' computer work.
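As a rough illustration of the sensor-to-server path described above (not the framework itself), the sketch below reads a value from a serial-attached sensor using the pyserial library and serves it over a plain TCP socket; the device name, baud rate and line format are assumptions.

```python
# Sketch: read a temperature value from a serial-attached sensor and serve
# it to student applications over a plain TCP socket (one reading per request).
import socketserver
import serial  # pyserial; assumes the sensor prints one numeric reading per line

SERIAL_PORT = "/dev/ttyUSB0"   # assumption: adjust to the actual device
BAUD_RATE = 9600

def read_temperature() -> float:
    """Take one reading from the sensor; the line format is an assumption."""
    with serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=2) as port:
        line = port.readline().decode("ascii", errors="ignore").strip()
    return float(line)

class SensorHandler(socketserver.BaseRequestHandler):
    def handle(self) -> None:
        # Reply with the latest reading so spreadsheets or scripts can poll it.
        self.request.sendall(f"{read_temperature():.2f}\n".encode("ascii"))

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 5050), SensorHandler) as server:
        server.serve_forever()
```

Any course software able to open a TCP connection (or a small wrapper script) could then pull real readings from this server instead of using random numbers.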
Abstract:
This paper presents the development of a solar photovoltaic (PV) model in PSCAD/EMTDC (Power System Computer Aided Design), including a mathematical model study. An additional algorithm was implemented in MATLAB to calculate several parameters required by the model developed in PSCAD. The whole simulation study was performed with the PSCAD/MATLAB simulation tools. A real database of irradiance, cell temperature and PV power generation was used to support the evaluation of the implemented PV model.
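For orientation, a common first-order PV output approximation, scaling rated power with irradiance and derating it with cell temperature, is sketched below; the coefficients are illustrative and this is not necessarily the exact model implemented in PSCAD by the authors.

```python
# Simplified PV output model: power scales with irradiance and is derated
# with cell temperature. This is a common first-order approximation, not
# necessarily the exact PSCAD model described in the abstract.
def pv_power(g_wm2: float, t_cell_c: float,
             p_stc_w: float = 250.0,       # rated power at STC (assumption)
             gamma: float = -0.004,        # power temperature coefficient [1/degC]
             g_stc: float = 1000.0,        # STC irradiance [W/m^2]
             t_stc: float = 25.0) -> float:
    """Estimate PV power [W] from irradiance [W/m^2] and cell temperature [degC]."""
    return p_stc_w * (g_wm2 / g_stc) * (1.0 + gamma * (t_cell_c - t_stc))

# Example: 800 W/m^2 and a 45 degC cell gives 0.8 * 250 * 0.92 = 184 W.
print(pv_power(800.0, 45.0))
```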
Abstract:
This paper presents a methodology based on the knowledge discovery in databases (KDD) process to determine the failure probability of electrical equipment belonging to a real high voltage network. Data Mining (DM) techniques are used to discover a set of failure probabilities and, therefore, to extract knowledge concerning the unavailability of electrical equipment such as power transformers and high-voltage power lines. The framework includes several steps: the analysis of the real database, data pre-processing, the application of DM algorithms and, finally, the interpretation of the discovered knowledge. To validate the proposed methodology, a case study that includes real databases is used. These data carry heavy uncertainty due to climate conditions; for this reason, fuzzy logic was used to determine the set of electrical component failure probabilities required to re-establish the service. The results reflect the interesting potential of this approach and encourage further research on the topic.
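The sketch below illustrates, in a highly simplified form, the final steps of such a KDD pipeline: an empirical failure probability per equipment class, weighted by a simple fuzzy "severe weather" membership. The record schema and the membership shape are assumptions, not the authors' model.

```python
# Sketch of the last steps of such a KDD pipeline: after pre-processing,
# estimate an empirical failure probability per equipment class and weight
# it with a simple fuzzy "severe weather" membership. Field names and the
# membership shape are illustrative assumptions.
from collections import defaultdict

def severe_weather_membership(wind_kmh: float) -> float:
    """Ramp-style fuzzy membership: 0 below 40 km/h, 1 above 90 km/h."""
    return min(1.0, max(0.0, (wind_kmh - 40.0) / 50.0))

def failure_probabilities(records: list[dict]) -> dict[str, float]:
    """records: pre-processed outage/operation records with 'equipment',
    'failed' (bool) and 'wind_kmh' fields (assumed schema)."""
    weight_sum = defaultdict(float)
    failure_sum = defaultdict(float)
    for r in records:
        w = 0.5 + 0.5 * severe_weather_membership(r["wind_kmh"])
        weight_sum[r["equipment"]] += w
        failure_sum[r["equipment"]] += w * (1.0 if r["failed"] else 0.0)
    return {eq: failure_sum[eq] / weight_sum[eq] for eq in weight_sum}

sample = [{"equipment": "power transformer", "failed": False, "wind_kmh": 20},
          {"equipment": "power transformer", "failed": True, "wind_kmh": 100},
          {"equipment": "HV line", "failed": False, "wind_kmh": 60}]
print(failure_probabilities(sample))
```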
Abstract:
Power system operation currently produces huge volumes of data that are still exploited in a very limited way. Knowledge discovery and machine learning can make use of these data, producing relevant knowledge with a very positive impact. In the context of competitive electricity markets these data are of even higher value, reinforcing the trend towards applying data mining techniques in power systems. This paper presents two cases based on real data, showing the importance of data mining for supporting demand response and players' strategic behavior.
Abstract:
This document presents a tool able to automatically gather data provided by real energy markets, to generate scenarios, and to capture and improve market players' profiles and strategies using knowledge discovery in databases, supported by artificial intelligence techniques, data mining algorithms and machine learning methods. It provides the means for generating scenarios with different dimensions and characteristics, ensuring the representation of real and adapted markets and their participating entities. The scenario generation module enhances the MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) simulator, making it a more effective tool for decision support. The implementation of the proposed module enables researchers and electricity market participants to analyze data, create realistic scenarios and experiment with them. On the other hand, applying knowledge discovery techniques to real data also allows the improvement of MASCEM agents' profiles and strategies, resulting in a better representation of real market players' behavior. This work aims to improve the comprehension of electricity markets and the interactions among the involved entities through adequate multi-agent simulation.
Abstract:
Worldwide, electricity markets have been evolving to regional and even continental scales. One of the main reasons is the aim of using renewable-based generation efficiently in places where it exceeds local needs. A reference case of this evolution is the European Electricity Market, in which countries are interconnected and several regional markets were created, each grouping several countries and supporting transactions of huge amounts of electrical energy. The continuous transformations electricity markets have been experiencing over the years create the need for simulation platforms to support operators, regulators and involved players in understanding and dealing with this complex environment. This paper demonstrates the advantage that real electricity market data brings to the creation of realistic simulation scenarios, which allow studying the impacts and implications that electricity market transformations will bring to the participating countries. A case study using MASCEM (Multi-Agent System for Competitive Electricity Markets) is presented, with a scenario based on real data, simulating the European Electricity Market environment and comparing its performance under several different market mechanisms.
Abstract:
Sustainable development concerns are being addressed with increasing attention in general, and within the power industry in particular. The use of distributed generation (DG), mainly based on renewable sources, has been seen as an interesting approach to this problem. However, the increasing share of DG in power systems raises complex technical and economic issues. This paper presents ViProd, a simulation tool that allows modeling and simulating DG operation and participation in electricity markets. The paper mainly focuses on the operation of Virtual Power Producers (VPP), which are aggregations of producers, mainly of the DG type. It presents several reserve management strategies implemented within ViProd and the results of a case study based on real data.
Abstract:
This paper describes the establishment of a Virtual Producer/Consumer Agent (VPCA) to optimize the integrated management of distributed energy resources and to improve and control Demand Side Management (DSM) and its aggregated loads. The paper presents the VPCA architecture and the proposed function-based organization used to coordinate the several generation technologies, the different load types and the storage systems. This VPCA organization uses a framework based on data mining techniques to characterize the customers. The paper includes the results of several experimental test cases, using real data and taking into account electricity generation resources as well as consumption data.
Abstract:
This paper presents MASCEM, a multi-agent based electricity market simulator. MASCEM uses game theory, machine learning techniques, scenario analysis and optimization techniques to model market agents and to provide them with decision support. This paper mainly focuses on MASCEM's ability to model and simulate Virtual Power Players (VPP). VPPs are represented as coalitions of agents, with specific characteristics and goals. The paper details some of the most important aspects considered in VPP formation and in the aggregation of new producers, and includes a case study based on real data.
Abstract:
Power Systems (PS) have been affected by the substantial penetration of Distributed Generation (DG) and by operation in competitive environments. The future PS will have to deal with large-scale integration of DG and other distributed energy resources (DER), such as storage, and provide market agents with the means to ensure flexible and secure operation. Virtual power players (VPP) can aggregate a diversity of players, namely generators and consumers, and a diversity of energy resources, including electricity generation based on several technologies, storage and demand response. This paper proposes an artificial neural network (ANN) based methodology to support VPP resource scheduling. The trained network is able to achieve good schedule results while requiring modest computational means. A real data test case is presented.
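The following sketch shows the general idea of learning a resource schedule with a small neural network, here using scikit-learn's MLPRegressor on synthetic stand-in data; the features, targets and network size are placeholders, not the case-study setup.

```python
# Sketch of training a small neural network to reproduce resource schedules,
# in the spirit described above (not the authors' network or data). Inputs
# and outputs are placeholders: forecast load/renewables in, per-unit
# dispatch out. Requires numpy and scikit-learn.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for historical optimisation results:
# features = [forecast load (MW), forecast wind (MW)], targets = 3 unit set-points.
X = rng.uniform([50.0, 0.0], [150.0, 60.0], size=(500, 2))
y = np.column_stack([
    np.clip(X[:, 0] - X[:, 1], 0.0, 80.0),          # thermal unit
    np.minimum(X[:, 1], 60.0),                       # wind accepted
    np.clip(X[:, 0] - X[:, 1] - 80.0, 0.0, 40.0),    # storage/backup
])

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, y)

# Once trained, producing a schedule is a single cheap forward pass.
print(model.predict([[120.0, 30.0]]))
```

The point mirrored here is the one made in the abstract: after training, obtaining a schedule requires only a forward pass, i.e. modest computational means.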
Abstract:
The smart grid concept is rapidly evolving towards practical implementations able to bring its advantages into practice. Evolution of legacy equipment and infrastructures is not sufficient to accomplish the smart grid goals, as it does not address the needs of the players operating in an environment that is dynamic and competitive in nature. Artificial intelligence based applications can provide solutions to these problems, supporting decentralized intelligence and decision-making. A case study illustrates the importance of Virtual Power Players (VPP) and multi-player negotiation in the context of smart grids. This case study is based on real data and aims at optimizing energy resource management, considering generation, storage and demand response.
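As a deliberately small illustration of the kind of resource optimization mentioned above, the sketch below solves a single-period dispatch with generation, storage discharge and curtailable (demand-response) load using scipy's linear programming routine; all costs and limits are invented for the example.

```python
# A small single-period version of energy resource optimisation: meet a load
# at minimum cost using local generation, storage discharge and curtailable
# (demand-response) load. Costs and limits are illustrative, not case-study data.
from scipy.optimize import linprog

load_mw = 10.0
# decision variables: [generation, storage_discharge, demand_response]
cost = [45.0, 30.0, 80.0]                        # euro/MWh for each resource
bounds = [(0.0, 8.0), (0.0, 3.0), (0.0, 2.0)]    # capacity limits in MW
# equality constraint: generation + discharge + DR == load
result = linprog(c=cost, A_eq=[[1.0, 1.0, 1.0]], b_eq=[load_mw], bounds=bounds)

print(result.x)      # [7. 3. 0.] -> cheap storage used first, no DR needed
print(result.fun)    # total cost: 7*45 + 3*30 = 405 euro
```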
Abstract:
This paper presents an integrated system that helps both retail companies and electricity consumers in the definition of the best retail contracts and tariffs. The integrated system is composed of a Decision Support System (DSS) based on a Consumer Characterization Framework (CCF). The CCF relies on data mining techniques applied to large amounts of consumption data to obtain useful knowledge about electricity consumers. This knowledge is acquired following an innovative and systematic approach able to identify different consumer classes, each represented by a load profile, and to characterize them using decision trees. The framework generates the inputs for the knowledge base and the database of the DSS. The rule sets derived from the decision trees are integrated in the knowledge base of the DSS. The load profiles, together with the information about contracts and electricity prices, form the database of the DSS. The DSS is able to classify different consumers, present their load profiles and test different electricity tariffs and contracts. The final outputs of the DSS are a comparative economic analysis of different contracts and advice on the most economical contract for each consumer class. The presentation of the DSS is completed with an application example using a real database of consumers from the Portuguese distribution company.
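A compact sketch of the two data mining steps described above, clustering consumers into load-profile classes and characterizing the classes with a decision tree, is given below using scikit-learn on synthetic daily diagrams; the attribute names are illustrative, not the CCF's real features.

```python
# Sketch of the two data mining steps described above: cluster consumers
# into load-profile classes, then fit a decision tree that explains class
# membership from interpretable attributes. All data here is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)

# 200 consumers x 24 hourly consumption values (synthetic daily diagrams).
night = rng.normal(1.0, 0.1, (100, 24)) * np.r_[np.ones(8) * 2, np.ones(16)]
day = rng.normal(1.0, 0.1, (100, 24)) * np.r_[np.ones(8), np.ones(16) * 2]
profiles = np.vstack([night, day])

classes = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)

# Characterise the classes with interpretable features (assumed attributes).
features = np.column_stack([profiles[:, :8].mean(axis=1),   # night consumption
                            profiles[:, 8:].mean(axis=1)])  # day consumption
tree = DecisionTreeClassifier(max_depth=2).fit(features, classes)
print(export_text(tree, feature_names=["night_avg", "day_avg"]))
```

The printed rules play the role the abstract assigns to the decision trees: a rule set that can be stored in the knowledge base and used to classify new consumers.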
Abstract:
This dissertation presents a solution to the problem of three-dimensional modelling of underground galleries. The work employs techniques from the field of mobile robotics to obtain an autonomous mobile modelling system capable of operating in unstructured environments without access to global positioning systems, namely GPS. A mobile, autonomous modelling system can be very advantageous, as it constitutes a fast and simple way of monitoring the structures and creating virtual representations of the galleries with a high level of detail. The modelling system moves inside the tunnels to collect sensor information about the geometry of the structure. Organizing these data to build a coherent model requires exact knowledge of the path travelled by the system, so the localization problem of the sensor platform has to be solved. The formulation of an autonomous localization system has to overcome obstacles that are particularly pronounced in underground environments, such as structural monotony and the aforementioned absence of global positioning systems. In this context, the concept of SLAM (Simultaneous Localization and Mapping) was adopted to determine the localization of the sensor platform in six degrees of freedom. Following the traditional approach, the core of the SLAM algorithm is the Extended Kalman Filter (EKF). The proposed system incorporates advanced state-of-the-art methods, namely Inverse Depth Parametrization and the 1-Point RANSAC outlier rejection method. The most important contribution of the proposed method to the state of the art is the fusion of visual information with inertial information. The localization algorithm was tested with real data acquired inside a road tunnel. The results show that, by fusing inertial measurements with visual information, the scale-factor degeneration phenomenon common in localization applications based on purely monocular systems can be avoided. We also showed that correcting an inertial localization system with visual information is effective, as it suppresses the trajectory drift that characterizes dead reckoning systems. Based on the estimated localization, the modelling algorithm organizes the acquired geometric data in three-dimensional space, resulting in a point-cloud model that is later converted into a triangular mesh, thus achieving a more realistic representation of the original scene.
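The toy example below illustrates only the fusion principle highlighted in the abstract: an inertial dead-reckoning prediction corrected by occasional visual position measurements, which bounds the drift. The real system is a full 6-DoF EKF with inverse-depth landmarks; this one-dimensional linear sketch is merely indicative.

```python
# Toy 1-D illustration of inertial/visual fusion: dead reckoning from an
# acceleration input, periodically corrected by a noisy position measurement.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition: [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])        # how acceleration enters the state
Q = 1e-3 * np.eye(2)                       # process noise (assumption)
H = np.array([[1.0, 0.0]])                 # "vision" observes position only
R = np.array([[0.05]])                     # visual measurement noise (assumption)

x, P = np.zeros((2, 1)), np.eye(2)
true_pos, true_vel, acc = 0.0, 0.0, 0.2
rng = np.random.default_rng(2)

for step in range(100):
    # "Inertial" propagation: dead reckoning from the measured acceleration.
    true_vel += acc * dt
    true_pos += true_vel * dt
    x = F @ x + B * acc
    P = F @ P @ F.T + Q
    # Occasional "visual" fix: correct the drifting estimate with a position cue.
    if step % 10 == 9:
        z = np.array([[true_pos + rng.normal(0.0, 0.05)]])
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P

print(f"true position {true_pos:.2f} m, estimate {x[0, 0]:.2f} m")
```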
Abstract:
With the growing generation, storage and dissemination of information in recent years, the former problem of lack of information has turned into the problem of extracting useful knowledge from the available information. Visual representations of abstract information have been used to aid data interpretation and to reveal otherwise hidden patterns. Information visualization seeks to amplify human cognition by taking advantage of human visual capabilities, making abstract information perceptible and providing the means for a human to absorb increasing amounts of information through his or her perceptual abilities. The goal of data clustering techniques is to divide a dataset into several groups, in which similar data are placed in the same group and dissimilar data in different groups. More specifically, constrained clustering aims to incorporate a priori knowledge into the clustering process, in order to increase the quality of the clustering and, simultaneously, to find solutions appropriate to specific tasks and interests. This dissertation studies the Interactive Visual Clustering approach, which allows the user, by interacting with a visual representation of the information, to incorporate his or her prior knowledge about the data domain and thus influence the resulting clustering so that it meets his or her goals. This approach combines and extends techniques from interactive information visualization, force-directed graph drawing and constrained clustering. In order to evaluate the performance of different user interaction strategies, comparative studies are carried out using synthetic and real datasets.
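The sketch below shows one simple way of injecting a priori knowledge into clustering, in the spirit of the constrained clustering described above (not the dissertation's interactive method): a k-means-style assignment that respects a user-supplied cannot-link constraint. Data, seeds and the constraint are synthetic.

```python
# Minimal constraint-aware clustering sketch: a k-means-style assignment step
# that refuses to place two "cannot-link" points in the same cluster.
import numpy as np

rng = np.random.default_rng(3)
# Two synthetic groups of points; indices 0 and 20 are known a priori to differ.
points = np.vstack([rng.normal(0.0, 0.5, (20, 2)), rng.normal(3.0, 0.5, (20, 2))])
cannot_link = {(0, 20)}             # user-supplied constraint (assumption)
k = 2
centroids = points[[0, 20]]         # seeds could also come from user interaction

for _ in range(10):
    labels = np.full(len(points), -1)
    for i, p in enumerate(points):
        # Clusters already holding a cannot-link partner of point i are banned.
        banned = {labels[b] if a == i else labels[a]
                  for a, b in cannot_link if i in (a, b)}
        order = np.argsort(np.linalg.norm(centroids - p, axis=1))
        labels[i] = next(c for c in order if c not in banned)
    centroids = np.array([points[labels == c].mean(axis=0) for c in range(k)])

print(np.bincount(labels))          # cluster sizes after constrained clustering
```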