960 results for Real Electricity Markets Data
Abstract:
An important application of Big Data analytics is the real-time analysis of streaming data. Streaming data poses unique challenges to data mining algorithms, such as concept drift, the need to analyse the data on the fly because streams are unbounded, and the need for scalable algorithms to cope with potentially high data throughput. Real-time classification algorithms that are fast and adaptive to concept drift exist; however, most are not naturally parallel and are therefore limited in their scalability. This paper presents work on the Micro-Cluster Nearest Neighbour (MC-NN) classifier. MC-NN relies on an adaptive statistical data summary built from Micro-Clusters. It is very fast and adaptive to concept drift whilst retaining the parallel properties of the underlying KNN classifier, and it is competitive with existing data stream classifiers in terms of accuracy and speed.
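A minimal sketch of the micro-cluster idea behind nearest-neighbour stream classification. The class names, fields, and update rules below are illustrative assumptions, not the paper's exact algorithm; in particular, the splitting and removal of micro-clusters that MC-NN performs when error counts grow is omitted here.

```python
import numpy as np

class MicroCluster:
    """Statistical summary of the instances absorbed for one class (illustrative fields)."""
    def __init__(self, x, label):
        self.linear_sum = np.array(x, dtype=float)  # running sum of absorbed instances
        self.count = 1                              # number of absorbed instances
        self.label = label
        self.error_count = 0                        # misclassifications attributed to this cluster

    def centroid(self):
        return self.linear_sum / self.count

    def absorb(self, x):
        self.linear_sum += x
        self.count += 1

class MicroClusterNNSketch:
    """Nearest-centroid stream classifier over per-class micro-clusters."""
    def __init__(self):
        self.clusters = []

    def predict(self, x):
        if not self.clusters:
            return None
        nearest = min(self.clusters, key=lambda c: np.linalg.norm(c.centroid() - x))
        return nearest.label

    def train(self, x, label):
        x = np.asarray(x, dtype=float)
        predicted = self.predict(x)
        same_class = [c for c in self.clusters if c.label == label]
        if not same_class:
            self.clusters.append(MicroCluster(x, label))
            return
        # absorb the instance into the nearest micro-cluster of its true class
        min(same_class, key=lambda c: np.linalg.norm(c.centroid() - x)).absorb(x)
        if predicted is not None and predicted != label:
            # penalise the cluster responsible for the misclassification
            wrong = min((c for c in self.clusters if c.label == predicted),
                        key=lambda c: np.linalg.norm(c.centroid() - x))
            wrong.error_count += 1
```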
Abstract:
In the present study, we propose a graph-theoretical procedure to investigate multiple pathways in brain functional networks. By taking into account all possible paths consisting of h links between pairs of nodes in the network, we measured the global network redundancy R(h) as the number of parallel paths and the global network permeability P(h) as the probability of getting connected. We used this procedure to investigate the structural and dynamical changes in the cortical networks estimated from a dataset of high-resolution EEG signals in a group of spinal cord injured (SCI) patients during attempted foot movement. In a statistical contrast with a healthy population, the permeability index P(h) of the SCI networks increased significantly (P < 0.01) in the Theta frequency band (3-6 Hz) for distances h ranging from 2 to 4. On the contrary, no significant differences were found between the two populations for the redundancy index R(h). The most significant changes in the brain functional network of SCI patients occurred mainly in the lower spectral contents. These changes were related to an improved propagation of communication between the closest cortical areas rather than to a different level of redundancy. This evidence strengthens the hypothesis of a need for higher functional interaction among the closest ROIs as a mechanism to compensate for the lack of feedback from the peripheral nerves to the sensorimotor areas.
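A minimal sketch of how path-based indices of this kind can be computed from a binary adjacency matrix: powers of the adjacency matrix count walks of length h between node pairs. The aggregation into single redundancy and permeability scores below is an illustrative assumption, not the paper's exact definition.

```python
import numpy as np

def path_indices(adjacency, h):
    """Count h-step walks between node pairs of a binary graph.

    Returns an illustrative redundancy score (total number of h-step walks between
    distinct pairs) and permeability score (fraction of distinct pairs connected by
    at least one walk of length <= h).
    """
    A = np.asarray(adjacency, dtype=np.int64)
    n = A.shape[0]
    walks_h = np.linalg.matrix_power(A, h)                        # walks of exactly length h
    reach = sum(np.linalg.matrix_power(A, k) for k in range(1, h + 1))
    off_diag = ~np.eye(n, dtype=bool)
    redundancy = int(walks_h[off_diag].sum())
    permeability = float((reach[off_diag] > 0).mean())
    return redundancy, permeability

# Example: a 4-node ring network, paths of length 2
ring = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])
print(path_indices(ring, h=2))
```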
Abstract:
Drinking water utilities in urban areas are focused on finding smart solutions to the new challenges of real-time operation posed by limited water resources, intensive energy requirements, a growing population, a costly and ageing infrastructure, increasingly stringent regulations, and increased attention to the environmental impact of water use. Such challenges force water managers to monitor and control not only water supply and distribution but also consumer demand. This paper presents and discusses novel methodologies and procedures towards an integrated water resource management system based on advanced ICT technologies for automation and telecommunications, aimed at greatly improving the efficiency of drinking water networks (DWN) in terms of water use, energy consumption, water loss minimization, and water quality guarantees. In particular, the paper reports the first results of the European project EFFINET (FP7-ICT2011-8-318556), devoted to the monitoring and control of the DWN in Barcelona (Spain). Results are split into two levels according to different management objectives: (i) the monitoring level is concerned with all aspects involved in observing the current state of the system and detecting/diagnosing abnormal situations; it is achieved through sensors and communications technology, together with mathematical models; (ii) the control level is concerned with computing the best suitable and admissible control strategies for the network actuators so as to optimize a given set of operational goals related to the performance of the overall system. This level covers network control (optimal management of water and energy) and demand management (smart metering, efficient supply). Considering the Barcelona DWN as the case study makes it possible to demonstrate the general applicability of the proposed integrated ICT solutions and their effectiveness in the management of DWNs, with considerable savings in electricity costs and reduced water losses while ensuring the high European standards of water quality for citizens.
Abstract:
The purpose of the present study is to discuss the possible relationship between foreign direct investment in Brazil and the trade balance, considering the period after the beginning of the "Plano Real" in 1994, which introduced a new currency regime and a new profile for the Brazilian macroeconomy. It is important to note that the question is controversial, since such investments are seen as positive for receiving countries by some authors and as negative by others. Those in favour argue that the recent move by many companies towards internationalization is changing the modus operandi in some markets, providing a much more competitive framework; they also point to the potential advantages brought by these new strategies. On the other hand, some authors argue that it increases the exposure of the receiving country, since the subsidiaries of those companies operate under profit-maximizing marketing strategies, given the competitive environment they face. We go over these opinions throughout the study, also trying to capture the reasons that usually motivate foreign companies to look for new markets and branches, and the effects on the receiving country's Balance of Payments. Beyond that point, the approach presented tries to answer whether the increase in the foreign capital stock in Brazil helps to explain a positive response in the country's trade balance and, more broadly, in its Balance of Payments. It is also important to mention that the period considered is extremely representative, mainly because of the huge amounts involved and the increasing liberalization of Brazil's external policies since 1990. Special attention is given, throughout the study, to defining the pattern of such investments and the impacts that those resources had on the public budget. The study relies on official data published by the Central Bank of Brazil, mainly the Census of Foreign Capitals and the data relating to the evolution of the Balance of Payments. Finally, based on statistical procedures, multiple regressions on the available data are provided to help the reader capture the effects of selected variables, bringing a much more focused analysis to the discussion.
Abstract:
The sharp rise in Brazilian property prices in recent years has started a debate about the possible existence of a speculative bubble. Given the recent credit crisis in the United States, it is reasonable to ask whether the current situation in Brazil can be compared to the American crisis. Considering quantitative and fundamental arguments, we examine the Brazilian real estate context and question its sustainability in the near future. First, we analyse rental rates and the level of housing affordability, and we also use a real-cost model to assess whether or not the market is in equilibrium. We then examine some fundamental factors that affect property prices (supply and demand, credit and regulation, cultural factors) to find evidence that justifies the rise in property prices. From these observations we attempt to reach a conclusion about the evolution of prices in the Brazilian real estate market. While the data suggest that property prices are overvalued relative to rents, there is evidence of legitimate demand for new homes from Brazil's emerging middle class. A greater risk may lie in the credit market, which is highly leveraged relative to the Brazilian consumer. Nevertheless, no evidence was found suggesting more than a temporary stabilization or correction in property prices.
Abstract:
This paper discusses two key aspects regarding the efficiency of the Argentinean electricity market. Using hourly data on prices, marginal costs, and the operational status of generators, it is argued that, unlike the former British and Californian electricity spot markets, this market is not subject to the conventional forms of exercise of market power by generators. We then use Chao's (1983) model of optimal configuration of electricity supply to evaluate the social desirability of the change in the supply pattern of the Argentinean electricity industry that took place over the last ten years.
Abstract:
What is the effect of elections on real assets? Is it possible to measure the price difference directly even though we can only observe one of the potential outcomes? This dissertation estimates these effects using a methodology based on stock options. The model developed here adapts the traditional Black-Scholes framework to incorporate two new parameters: a perfectly anticipated jump in the asset price and a series of daily probabilities reflecting beliefs about who would win the electoral race. We apply this method to the Brazilian case of the 2014 Presidential Elections and Petrobras, a major company in the country's oil sector, using exchange data from the run-off round of the elections. The results show a difference of 65-77% in the value of the company depending on who won at the polls. This is equivalent to approximately 2.5% of the country's 2014 GDP.
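A minimal sketch of the kind of adjustment described above. The functional form below, a probability-weighted, perfectly anticipated jump applied to the spot price before standard Black-Scholes pricing, is an illustrative assumption and not the dissertation's exact model; all numerical inputs are hypothetical.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, r, sigma, T):
    """Standard Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

def call_with_anticipated_jump(S, K, r, sigma, T, jump, p_win):
    """Hypothetical adjustment: the spot used for pricing is the probability-weighted
    average of the post-election prices under the two outcomes (a jump of size `jump`
    if one candidate wins, no jump otherwise)."""
    expected_spot = p_win * S * (1.0 + jump) + (1.0 - p_win) * S
    return black_scholes_call(expected_spot, K, r, sigma, T)

# Example: 30% anticipated jump, 50/50 election odds (all values hypothetical)
print(call_with_anticipated_jump(S=20.0, K=20.0, r=0.11, sigma=0.45, T=0.1,
                                 jump=0.30, p_win=0.5))
```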
Abstract:
This work presents a new approach to rainfall measurement using weather radar data for real-time application to the radar systems operated by the Institute of Meteorological Research (IPMET) - UNESP - Bauru - SP - Brazil. Several real-time adjustment techniques have been presented, most of them based on surface rain-gauge networks. However, some of these methods do not take into account the effects of the integration area, the integration time, and the rainfall-radar distance. In this paper, artificial neural networks are applied to generate radar reflectivity-rainfall relationships that account for all the effects described above. To evaluate the prediction procedure, cross-validation was performed using data from the IPMET Doppler weather radar and the rain-gauge network under the radar umbrella. The preliminary results were acceptable for rainfall prediction. The small errors observed result from the spatial density and the temporal resolution of the rain-gauge networks used to calibrate the radar.
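For context, the conventional baseline that such neural approaches replace is a fixed power-law Z-R relationship. A minimal sketch using the classical Marshall-Palmer coefficients (a = 200, b = 1.6), which are standard textbook values and not taken from this paper:

```python
def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
    """Invert a power-law Z-R relation Z = a * R**b.

    dbz : radar reflectivity in dBZ, with Z = 10**(dbz/10) in mm^6/m^3.
    Returns the rain rate R in mm/h. Coefficients default to Marshall-Palmer.
    """
    z = 10.0 ** (dbz / 10.0)
    return (z / a) ** (1.0 / b)

# Example: 40 dBZ corresponds to roughly 11-12 mm/h under Marshall-Palmer
print(rain_rate_from_reflectivity(40.0))
```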
Abstract:
This work presents a methodological proposal for the acquisition of biometric data through telemetry, basing its development on action research and a case study. Nowadays, qualified physical evaluation professionals have to use specific devices to obtain biometric signals and data. Most of the time these devices are expensive and difficult to use and handle. Therefore, the methodological proposal was elaborated in order to develop, conceptually, a biotelemetric device able to acquire the desired biometric signals (oximetry, biometry, body temperature, and pedometry), which are essential for the area of physical evaluation. Existing biometric sensors, the possible means of remote signal transmission, and the available computer systems were researched so that data acquisition could be made possible. The methodological proposal for remote acquisition of biometric signals is structured in four modules: Biometric Data Acquirer; Biometric Signal Converter and Transmitter; Biometric Signal Receiver and Processor; and Interpretative Graph Generator. The modules aim at obtaining interpretative graphs of human biometric signals. In order to validate this proposal, a functional prototype was developed and is presented in the course of this work.
Abstract:
This paper adapts decentralized OPF optimization to the AC power flow problem in power systems with interconnected areas operated by different transmission system operators (TSOs). The proposed methodology allows finding the operating point of a particular area without explicit knowledge of the network data of the other interconnected areas; it is only necessary to exchange border information related to the tie-lines between areas. The methodology is based on the decomposition of the first-order optimality conditions of the AC power flow, which is formulated as a nonlinear programming problem. To allow better visualization of the concept of independent operation of each TSO, an artificial neural network has been used to compute the border information of the interconnected TSOs. A multi-area power flow tool can be seen as a basic building block able to address a large number of problems under a multi-TSO competitive market philosophy. The IEEE RTS-96 power system is used to show the operation and effectiveness of the decentralized AC power flow.
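For reference, the AC power flow problem referred to above is conventionally written as the nodal balance equations below (standard textbook form, not reproduced from the paper):

\[
P_i = V_i \sum_{j=1}^{N} V_j \left( G_{ij} \cos\theta_{ij} + B_{ij} \sin\theta_{ij} \right),
\qquad
Q_i = V_i \sum_{j=1}^{N} V_j \left( G_{ij} \sin\theta_{ij} - B_{ij} \cos\theta_{ij} \right),
\]

where V_i is the voltage magnitude at bus i, \theta_{ij} = \theta_i - \theta_j is the angle difference between buses i and j, and G_{ij} + jB_{ij} are the entries of the bus admittance matrix.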
Abstract:
Following the thermodynamic formulation of a multifractal measure that was shown to enable the detection of large fluctuations at an early stage, here we propose a new index which allows us to distinguish events such as financial crises in real time. We calculate the partition function, from which we can obtain thermodynamic quantities analogous to the free energy and the specific heat. The index is defined as the normalized energy variation and can be used to study the behavior of stochastic time series, such as daily financial market data. Famous financial market crashes, namely Black Thursday (1929), Black Monday (1987) and the subprime crisis (2008), are identified with clear and robust results. The method is also applied to the market fluctuations of 2011; from these results the apparent crisis of 2011 appears to be of a different nature from the other three. We also show that the analysis has forecasting capabilities.
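As a reminder of the thermodynamic formalism alluded to (standard multifractal definitions; the paper's exact normalizations may differ), for a signal partitioned into boxes of size \varepsilon with normalized measures p_i the partition function is

\[
Z(q,\varepsilon) = \sum_i p_i(\varepsilon)^{\,q} \;\sim\; \varepsilon^{\tau(q)},
\]

where the mass exponent \tau(q) plays the role of a free energy and its curvature, C(q) = -\partial^2 \tau / \partial q^2, that of a specific heat; the index proposed in the paper is then defined as a normalized variation of the energy-like quantity.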
Abstract:
This thesis is dedicated to the analysis of non-linear pricing in oligopoly. Non-linear pricing is a fairly predominant practice in most real markets, which are mostly characterized by some amount of competition. The sophistication of pricing practices has increased in recent decades due to technological advances that have allowed companies to gather more and more data on consumers' preferences. The first essay of the thesis highlights the main characteristics of oligopolistic non-linear pricing. Non-linear pricing is a special case of price discrimination. The theory of price discrimination has to be modified in the presence of oligopoly: in particular, a crucial role is played by the competitive externality, which implies that product differentiation is closely related to the possibility of discriminating. The essay reviews the theory of competitive non-linear pricing starting from its foundations, mechanism design under common agency. The different approaches to modelling non-linear pricing are then reviewed; in particular, the difference between price and quantity competition is highlighted. Finally, the close link between non-linear pricing and recent developments in the theory of vertical differentiation is explored. The second essay shows how the effects of non-linear pricing are determined by the relationship between the demand and the technological structure of the market. The chapter focuses on a model in which firms supply a homogeneous product in two different sizes. Information about consumers' reservation prices is incomplete and the production technology is characterized by size economies. The model provides insights into the sizes of the products found in the market: four equilibrium regions are identified, depending on the intensity of size economies relative to consumers' valuation of the good, in which the product is supplied in a single size, in several different sizes, or only in a very large one. Both the private and the social desirability of non-linear pricing vary across the different equilibrium regions. The third essay considers the broadband internet market. Non-discrimination issues seem to be at the core of the recent debate on whether or not to regulate the internet. One of the main questions posed is whether the telecom companies owning the networks that constitute the internet should be allowed to offer quality-contingent contracts to content providers. The aim of this essay is to analyze the issue through a stylized two-sided market model of the web that highlights the effects of such discrimination on quality, prices and the participation of content providers and final users in the internet. An overall welfare comparison is proposed, concluding that the final effects of regulation crucially depend on both the technology and the preferences of agents.
Abstract:
Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and are not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Due to the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows considerable improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
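A minimal sketch of the 8-hour average defined above, combining four past observed hours, the current hour, and three forecast hours (the function name, argument layout, and values are illustrative assumptions):

```python
def eight_hour_ozone_average(observed, forecast):
    """observed : hourly ozone values ending with the current hour (needs >= 5 values);
    forecast : model predictions for the upcoming hours (needs >= 3 values)."""
    window = list(observed[-5:]) + list(forecast[:3])  # 4 past hours + current hour + 3 ahead = 8 hours
    return sum(window) / len(window)

# Example with hypothetical hourly values in ppb
print(eight_hour_ozone_average(observed=[41, 44, 50, 55, 58], forecast=[60, 57, 52]))
```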
Abstract:
This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large-magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of aspects that have been analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are first discussed. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. The limitations of standard approaches for large events emerge in this chapter; the difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger number of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of the recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation of the observations are proposed. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested. The first is a threshold-based method that uses traditional seismic data; the second is an innovative approach using continuous GPS data. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
Abstract:
Over time, Twitter has become a fundamental source of information for news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the past, in the financial field, a great deal of research has been done to propose a function that takes as input all the tweets about a particular stock or index s, analyses them, and predicts the price of s. In this work, we take an alternative approach: using stock prices and tweet information, we investigate the following questions. 1. Is there any relation between the number of tweets generated and the volume of stock traded? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
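A minimal sketch of how the first two questions might be examined as simple correlations between daily series (the series names and values below are illustrative assumptions, not data from this work):

```python
import numpy as np

def pearson_corr(x, y):
    """Pearson correlation between two equally long daily series."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.corrcoef(x, y)[0, 1]

# Hypothetical daily series for one stock
tweet_counts   = [120, 340, 210, 580, 400, 300]               # tweets per day mentioning the ticker
traded_volume  = [1.1e6, 2.3e6, 1.5e6, 3.9e6, 2.8e6, 2.0e6]   # shares traded per day
mean_sentiment = [0.1, -0.2, 0.0, -0.5, -0.1, 0.2]            # average tweet polarity per day
daily_return   = [0.004, -0.012, 0.001, -0.031, -0.002, 0.009]

print("tweets vs. volume:   ", pearson_corr(tweet_counts, traded_volume))
print("sentiment vs. return:", pearson_corr(mean_sentiment, daily_return))
```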