960 results for Real Electricity Markets Data


Relevance: 100.00%

Abstract:

Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information tailored to the user. In particular, we are interested in allowing a user to make direct queries about the weather at any location, even one that is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
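
The abstract does not name a specific interpolation method, so the sketch below uses inverse distance weighting purely as an illustration of querying the weather at an uninstrumented point. The station readings are invented, and the weighted variance stands in for the interpolation uncertainty that UncertML would carry; it is not the paper's actual pipeline.

```python
import math

# Hypothetical station readings: (latitude, longitude, temperature in °C).
# The paper does not publish its data; these values are illustrative only.
STATIONS = [
    (52.45, -1.93, 14.2),
    (52.48, -1.89, 13.8),
    (52.41, -1.95, 14.9),
    (52.50, -2.01, 13.1),
]

def idw_estimate(lat, lon, stations, power=2):
    """Inverse-distance-weighted estimate at an uninstrumented point, plus
    the weighted variance of the contributing readings as a crude proxy
    for the interpolation uncertainty."""
    weights, values = [], []
    for s_lat, s_lon, value in stations:
        d = math.hypot(lat - s_lat, lon - s_lon)
        if d == 0:
            return value, 0.0  # query point coincides with a station
        weights.append(1.0 / d ** power)
        values.append(value)
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    var = sum(w * (v - mean) ** 2 for w, v in zip(weights, values)) / total
    return mean, var

estimate, variance = idw_estimate(52.46, -1.94, STATIONS)
print(f"estimated temperature: {estimate:.1f} °C (variance {variance:.2f})")
```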

Relevance: 100.00%

Abstract:

This thesis contributes to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment, yet such performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results of our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
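
The abstract defines capture latency as the time from a change being applied to the OLTP database until the CDC mechanism captures it. A minimal sketch of how such a measurement could be instrumented is shown below; both callables are hypothetical stand-ins, not the thesis's actual TPC-C extension.

```python
import time

def measure_capture_latency(apply_change, wait_for_capture):
    """Capture latency per the abstract's definition: elapsed time between
    a data change being committed to the OLTP database and the CDC
    mechanism capturing that change.

    apply_change     -- callable that commits one change and returns its key
    wait_for_capture -- callable that blocks until the CDC mechanism has
                        captured the change identified by that key
    """
    change_key = apply_change()
    committed_at = time.perf_counter()
    wait_for_capture(change_key)
    captured_at = time.perf_counter()
    return captured_at - committed_at
```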

Relevance: 100.00%

Abstract:

This research investigates the claim that Change Data Capture (CDC) technologies capture data changes in real time. Based on theory, our hypothesis states that real-time CDC is not achievable with traditional approaches (log scanning, triggers and timestamps), because traditional approaches to CDC require a resource to be polled, which prevents true real-time CDC. We propose an approach to CDC that encapsulates the data source with a set of web services. These web services propagate the changes to the targets, eliminating the need for polling. Additionally, we propose a framework for CDC technologies that allows changes to flow from source to target. This paper discusses current CDC technologies and presents the theory of why they are unable to deliver changes in real time. We then discuss our web service approach to CDC and the accompanying framework, explaining how they can produce real-time CDC. The paper concludes with a discussion of the research required to investigate the real-time capabilities of CDC technologies. © 2010 IEEE.
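
A minimal sketch of the push idea the abstract describes: the data source is wrapped so that every write is propagated to subscribed targets, removing the polling step. All class and payload names are hypothetical; the paper's actual web service interface is not reproduced.

```python
class ChangePublishingDataSource:
    """Wraps a data store so every write is pushed to subscribed targets,
    eliminating the polling that the abstract argues prevents real-time
    CDC. Names here are hypothetical illustrations only."""

    def __init__(self):
        self._rows = {}
        self._subscribers = []  # callables invoked with each change event

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def write(self, key, value):
        self._rows[key] = value
        event = {"key": key, "value": value}
        for notify in self._subscribers:
            notify(event)  # change is pushed immediately, never polled

source = ChangePublishingDataSource()
source.subscribe(lambda event: print("target received:", event))
source.write("order-42", {"status": "shipped"})
```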

Relevance: 100.00%

Abstract:

The variability of non-dispatchable power generation raises important challenges for the integration of renewable energy sources into the electric power grid. This paper addresses the coordinated trading of wind and photovoltaic energy to mitigate risks due to wind and solar power variability, electricity prices, and the financial penalties arising out of generation shortfall and surplus. The problem of wind-photovoltaic coordinated trading is formulated as a linear programming problem. The goal is to obtain the optimal bidding strategy that maximizes the total profit. The wind-photovoltaic coordinated operation is modeled and compared with the uncoordinated operation. A comparison of the models and relevant conclusions are drawn from an illustrative case study of the Iberian day-ahead electricity market.
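
A minimal sketch of the linear programming structure the abstract describes, for a single hour with a joint wind-plus-PV bid and linear shortfall/surplus penalties. Every number is hypothetical, not taken from the paper's Iberian case study.

```python
import numpy as np
from scipy.optimize import linprog

price = 50.0      # day-ahead price, EUR/MWh
c_short = 70.0    # shortfall penalty, EUR/MWh (assumed to include buying
                  # back the undelivered energy, hence above the price)
c_surplus = 10.0  # surplus penalty, EUR/MWh
gen = 80.0        # combined wind + PV delivery, MWh
capacity = 120.0  # combined installed capacity, MWh

# Decision variables x = [bid, shortfall, surplus]; maximize
# price*bid - c_short*shortfall - c_surplus*surplus via its negation.
c = np.array([-price, c_short, c_surplus])
# shortfall >= bid - gen   ->   bid - shortfall <= gen
# surplus   >= gen - bid   ->  -bid - surplus  <= -gen
A_ub = np.array([[1.0, -1.0, 0.0],
                 [-1.0, 0.0, -1.0]])
b_ub = np.array([gen, -gen])
bounds = [(0, capacity), (0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
bid, shortfall, surplus = res.x
print(f"optimal bid: {bid:.1f} MWh, profit: {-res.fun:.1f} EUR")
```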

Relevance: 100.00%

Abstract:

The variability of non-dispatchable power generation raises important challenges for the integration of renewable energy sources into the electric power grid. This paper addresses the coordinated trading of wind and photovoltaic energy, assisted by a cyber-physical system for supporting management decisions, to mitigate risks due to wind and solar power variability, electricity prices, and the financial penalties arising out of generation shortfall and surplus. The problem of wind-photovoltaic coordinated trading is formulated as a stochastic linear programming problem. The goal is to obtain the optimal bidding strategy that maximizes the total profit. The wind-photovoltaic coordinated operation is modelled and compared with the uncoordinated operation. A comparison of the models and relevant conclusions are drawn from an illustrative case study of the Iberian day-ahead electricity market.
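
Since this abstract's formulation is stochastic, the sketch below replaces the known generation of the previous sketch with weighted generation scenarios and maximizes expected profit. Scenario values, probabilities and penalties are all hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

price, c_short, c_surplus, capacity = 50.0, 70.0, 10.0, 120.0
scenarios = np.array([60.0, 80.0, 100.0])  # combined wind+PV outputs, MWh
probs = np.array([0.3, 0.4, 0.3])

n = len(scenarios)
# Variables: [bid, shortfall_1..n, surplus_1..n]; maximize
# price*bid - sum_k probs[k]*(c_short*shortfall_k + c_surplus*surplus_k).
c = np.concatenate(([-price], c_short * probs, c_surplus * probs))
A_ub, b_ub = [], []
for k, g in enumerate(scenarios):
    row_s = np.zeros(1 + 2 * n); row_s[0] = 1.0; row_s[1 + k] = -1.0
    A_ub.append(row_s); b_ub.append(g)       # shortfall_k >= bid - g_k
    row_x = np.zeros(1 + 2 * n); row_x[0] = -1.0; row_x[1 + n + k] = -1.0
    A_ub.append(row_x); b_ub.append(-g)      # surplus_k >= g_k - bid
bounds = [(0, capacity)] + [(0, None)] * (2 * n)

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
print(f"optimal bid: {res.x[0]:.1f} MWh, expected profit: {-res.fun:.1f} EUR")
```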

Relevance: 100.00%

Abstract:

The newly inaugurated Navile District of the University of Bologna is a complex along the Navile canal that now houses various teaching and research activities for the disciplines of Chemistry, Industrial Chemistry, Pharmacy, Biotechnology and Astronomy. A Building Information Modeling (BIM) system gives staff of the Navile campus several ways to monitor buildings in the complex throughout their life cycle, one of which is the ability to access real-time environmental data such as room temperature, humidity and air composition, thereby simplifying operations like finding faults and optimizing environmental resource usage. But smart features at Navile are not only available to the staff: AlmaMap Navile is a web application, whose development is documented in this thesis, that powers the public touch kiosks available throughout the campus, offering maps of the district and indications on how to reach buildings and spaces. Even if these two systems, BIM and AlmaMap, don't seem to have much in common, they share the intent of promoting awareness for informed decision making on the campus, and both rely on web standards for communication. This opens up interesting possibilities and is the idea behind AlmaMap Navile 2.0, an app that interfaces with the BIM system and combines real-time sensor data with a comfort calculation algorithm, giving users the ability not just to ask for directions to a space, but also to see its comfort level in advance and, should they want to, check the environmental measurements coming from each sensor in a granular manner. The end result is a first step towards building a smart campus Digital Twin that can support all the people who are part of campus life in their daily activities, improving their efficiency and satisfaction, giving them the ability to make informed decisions, and promoting awareness and sustainability.
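
The thesis mentions a comfort calculation algorithm fed by real-time sensor data but this abstract does not specify it. The sketch below is a hypothetical stand-in that scores a room from temperature and humidity readings; the setpoints, weights and the rule itself are assumptions.

```python
def comfort_score(temperature_c, humidity_pct,
                  ideal_temp=21.0, ideal_humidity=45.0):
    """Score a room from 0 (uncomfortable) to 1 (ideal) by penalizing
    deviation from nominal comfort setpoints. Hypothetical stand-in for
    the thesis's (unpublished) comfort algorithm."""
    temp_penalty = min(abs(temperature_c - ideal_temp) / 10.0, 1.0)
    humidity_penalty = min(abs(humidity_pct - ideal_humidity) / 30.0, 1.0)
    return 1.0 - (0.7 * temp_penalty + 0.3 * humidity_penalty)

# Example: a warm, slightly humid lecture room.
print(f"comfort: {comfort_score(25.0, 55.0):.2f}")
```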

Relevance: 100.00%

Abstract:

In recent decades, competition in the electric power sector all over the world has deeply changed the way this sector's agents play their roles. In most countries, deregulation was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. Sector liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required; however, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, electricity must be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive, closer generation to reduce the exceeded power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation, or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power is lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas that may provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator (CAISO) for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
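
A minimal sketch of the grouping step, assuming scikit-learn and synthetic per-bus LMP features in place of the CAISO 2009 database, which is not reproduced here. The feature choice (mean LMP and mean congestion component) and cluster count are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in for the CAISO 2009 LMP database: one feature vector
# per bus, e.g. [mean LMP, mean congestion component] in $/MWh.
rng = np.random.default_rng(0)
lmp_features = np.vstack([
    rng.normal([30.0, 1.0], 1.5, size=(40, 2)),   # low-priced zone
    rng.normal([38.0, 5.0], 1.5, size=(40, 2)),   # mid-priced zone
    rng.normal([55.0, 15.0], 2.0, size=(20, 2)),  # congested zone
])

# Group buses into classes with approximately equal LMP values, as the
# article does with two-step and k-means clustering.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(lmp_features)
for zone in range(3):
    members = lmp_features[kmeans.labels_ == zone]
    print(f"zone {zone}: {len(members)} buses, "
          f"mean LMP {members[:, 0].mean():.1f} $/MWh")
```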

Relevance: 100.00%

Abstract:

This paper presents MASCEM, a multi-agent based electricity market simulator. MASCEM uses game theory, machine learning techniques, scenario analysis and optimization techniques to model market agents and to provide them with decision support. This paper mainly focuses on MASCEM's ability to model and simulate Virtual Power Players (VPP). VPPs are represented as coalitions of agents with specific characteristics and goals. The paper details some of the most important aspects considered in VPP formation and in the aggregation of new producers, and includes a case study based on real data.
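
The abstract describes VPPs as coalitions of producer agents that act as a single market agent. The sketch below shows only that coalition structure with a hypothetical acceptance rule for new producers; class and attribute names are illustrative, not MASCEM's API.

```python
from dataclasses import dataclass, field

@dataclass
class ProducerAgent:
    name: str
    capacity_mw: float
    marginal_cost: float  # EUR/MWh

@dataclass
class VirtualPowerPlayer:
    """A coalition of producer agents bidding as one market agent.
    MASCEM's actual aggregation criteria are richer than this."""
    name: str
    members: list = field(default_factory=list)

    def aggregate(self, producer, max_marginal_cost=60.0):
        # Hypothetical acceptance rule for aggregating a new producer.
        if producer.marginal_cost <= max_marginal_cost:
            self.members.append(producer)

    @property
    def capacity_mw(self):
        return sum(p.capacity_mw for p in self.members)

vpp = VirtualPowerPlayer("VPP-1")
vpp.aggregate(ProducerAgent("wind-farm-A", 40.0, 25.0))
vpp.aggregate(ProducerAgent("solar-park-B", 15.0, 30.0))
print(f"{vpp.name}: {len(vpp.members)} members, {vpp.capacity_mw} MW")
```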

Relevance: 100.00%

Abstract:

Presently, power system operation produces huge volumes of data that are still treated in a very limited way. Knowledge discovery and machine learning can make use of these data, resulting in relevant knowledge with a very positive impact. In the context of competitive electricity markets these data are of even higher value, making the application of data mining techniques in power systems increasingly relevant. This paper presents two cases based on real data, showing the importance of the use of data mining for supporting demand response and for supporting player strategic behavior.

Relevance: 100.00%

Abstract:

This paper presents an integrated system that helps both retail companies and electricity consumers with the definition of the best retail contracts and tariffs. The integrated system is composed of a Decision Support System (DSS) based on a Consumer Characterization Framework (CCF). The CCF is based on data mining techniques, applied to obtain useful knowledge about electricity consumers from large amounts of consumption data. This knowledge is acquired following an innovative and systematic approach able to identify different consumer classes, each represented by a load profile, and to characterize them using decision trees. The framework generates inputs for the knowledge base and the database of the DSS: the rule sets derived from the decision trees are integrated in the knowledge base, while the load profiles, together with information about contracts and electricity prices, form the database. The DSS is able to classify different consumers, present their load profiles, and test different electricity tariffs and contracts. Its final outputs are a comparative economic analysis of different contracts and advice on the most economic contract for each consumer class. The presentation of the DSS is completed with an application example using a real database of consumers from the Portuguese distribution company.
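
A minimal sketch of the characterization step, assuming scikit-learn: consumers already labeled with a class (as the CCF obtains from clustering) are characterized by an induced decision tree whose rules would feed the DSS knowledge base. The features and data are synthetic stand-ins, not the paper's Portuguese dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic consumption features: share of consumption at night and peak
# demand in kW, for two hypothetical consumer classes.
rng = np.random.default_rng(1)
night_ratio = np.concatenate([rng.uniform(0.1, 0.3, 50),   # daytime-heavy
                              rng.uniform(0.5, 0.8, 50)])  # night-heavy
peak_kw = np.concatenate([rng.uniform(2, 5, 50),
                          rng.uniform(6, 12, 50)])
X = np.column_stack([night_ratio, peak_kw])
y = np.array([0] * 50 + [1] * 50)  # class labels from prior clustering

# Characterize the classes with a decision tree; the extracted rule set
# is the kind of input the CCF feeds into the DSS knowledge base.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["night_ratio", "peak_kw"]))
```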

Relevance: 100.00%

Abstract:

The electricity industry throughout the world, long dominated by vertically integrated utilities, has experienced major changes. Deregulation, unbundling, wholesale and retail wheeling, and real-time pricing were abstract concepts a few years ago. Today, market forces drive the price of electricity and reduce the net cost through increased competition. As power markets continue to evolve, there is a growing need for advanced modeling approaches. This article addresses the challenge of maximizing the profit (or return) of power producers through the optimization of their share of customers. Power producers have fixed production marginal costs and decide the quantity of energy to sell both in day-ahead markets and to a set of target clients, by negotiating bilateral contracts involving a three-rate tariff. Producers sell energy by considering the prices of a reference week and five different types of clients with specific load profiles. They analyze several tariffs and determine the best share of customers, i.e., the share that maximizes profit. © 2014 IEEE.
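
A minimal sketch of the trade-off the abstract describes: a producer with a fixed marginal cost ranks five client types by contract margin under a three-rate tariff and fills its capacity, selling the remainder in the day-ahead market. The greedy rule, tariff rates and load profiles are all assumptions for illustration, not the article's actual optimization.

```python
# Hypothetical three-rate tariff (EUR/MWh) and five client load profiles
# (MWh per rate period over the reference week).
TARIFF = {"peak": 90.0, "shoulder": 60.0, "off_peak": 40.0}
CLIENTS = {
    "residential":  {"peak": 20, "shoulder": 30, "off_peak": 50},
    "commercial":   {"peak": 60, "shoulder": 50, "off_peak": 20},
    "industrial":   {"peak": 80, "shoulder": 90, "off_peak": 70},
    "agricultural": {"peak": 10, "shoulder": 25, "off_peak": 40},
    "services":     {"peak": 45, "shoulder": 40, "off_peak": 25},
}
MARGINAL_COST = 50.0    # EUR/MWh, fixed production marginal cost
DAY_AHEAD_PRICE = 55.0  # EUR/MWh, reference-week day-ahead price
CAPACITY = 400.0        # MWh available over the reference week

def contract_margin(profile):
    """Profit and energy for one client of this type under the tariff."""
    revenue = sum(TARIFF[p] * mwh for p, mwh in profile.items())
    energy = sum(profile.values())
    return revenue - MARGINAL_COST * energy, energy

# Rank client types by margin per MWh; fill capacity greedily, keeping a
# client only if it beats the day-ahead margin; sell the rest day-ahead.
ranked = sorted(CLIENTS.items(),
                key=lambda kv: contract_margin(kv[1])[0] / contract_margin(kv[1])[1],
                reverse=True)
remaining, profit = CAPACITY, 0.0
for name, profile in ranked:
    margin, energy = contract_margin(profile)
    if energy <= remaining and margin / energy > DAY_AHEAD_PRICE - MARGINAL_COST:
        remaining -= energy
        profit += margin
        print(f"contract {name}: {energy} MWh, margin {margin:.0f} EUR")
profit += remaining * (DAY_AHEAD_PRICE - MARGINAL_COST)
print(f"day-ahead: {remaining} MWh; total profit {profit:.0f} EUR")
```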

Relevance: 100.00%

Abstract:

The increasing importance of the integration of distributed generation and demand response in power system operation and planning, namely at the lower voltage levels of distribution networks and in the competitive environment of electricity markets, leads us to the concept of smart grids. In both traditional and smart grid operation, non-technical losses are a major economic concern that must be addressed. In this context, the ELECON project addresses the use of demand response contributions to the identification of non-technical losses. The present paper proposes a methodology to be used by Virtual Power Players (VPPs), which are entities able to aggregate distributed small-size resources, aiming to define the best electricity tariffs for several clusters of consumers. A case study based on real consumption data demonstrates the application of the proposed methodology.
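
The abstract does not detail how a VPP derives a tariff for a consumer cluster, so the sketch below uses a hypothetical rule: price each rate period at the VPP's supply cost plus a markup that grows with the cluster's peak-period consumption share. The numbers and the rule itself are illustrative only, not the paper's methodology.

```python
# Hypothetical VPP supply costs per rate period, EUR/MWh.
SUPPLY_COST = {"peak": 70.0, "off_peak": 35.0}

def cluster_tariff(profile, base_markup=0.05, peak_sensitivity=0.2):
    """Tariff for one consumer cluster: supply cost plus a markup scaled
    by the cluster's share of consumption at peak (assumed rule)."""
    total = sum(profile.values())
    peak_share = profile["peak"] / total
    markup = base_markup + peak_sensitivity * peak_share
    return {period: cost * (1 + markup)
            for period, cost in SUPPLY_COST.items()}

# Two clusters found from real consumption data get different tariffs.
print(cluster_tariff({"peak": 300.0, "off_peak": 700.0}))  # off-peak-heavy
print(cluster_tariff({"peak": 600.0, "off_peak": 400.0}))  # peak-heavy
```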