44 results for Market data approach

in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

100.00%

Publisher:

Abstract:

The importance of wind power for energy and environmental policies has grown in recent years. However, because of its random nature over time, wind generation cannot be reliably dispatched or perfectly forecasted, which becomes a challenge when integrating this production into power systems. In addition, wind energy has to cope with the diversity of production resulting from alternative wind power profiles located in different regions. In 2012, Portugal's cumulative installed capacity was distributed over 223 wind farms [1]. In this work, circular data statistical methods are used to analyze and compare alternative spatial wind generation profiles. Variables indicating extreme situations are analyzed. The hour(s) of the day at which a farm's production attains its daily maximum is considered. This variable was converted into a circular variable, and the use of circular statistics makes it possible to identify the daily hour distribution for different wind production profiles. The methodology was applied to a real case, considering data from the Portuguese power system for the year 2012 at 15-minute intervals. Six geographical locations were considered, representing different wind generation profiles in the Portuguese system.
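As an illustration of the abstract's core step, the minimal sketch below (Python) maps hours of daily maximum production onto the unit circle and computes the circular mean and the mean resultant length; the sample peak hours and variable names are our assumptions, not the paper's data.

```python
import numpy as np

def circular_stats(hours, period=24.0):
    """Circular mean and mean resultant length for hour-of-day data.

    Hours are mapped onto the unit circle (24 h = 2*pi) so that,
    e.g., 23h and 1h average to midnight rather than to noon.
    """
    angles = 2.0 * np.pi * np.asarray(hours) / period
    c, s = np.cos(angles).mean(), np.sin(angles).mean()
    mean_angle = np.arctan2(s, c) % (2.0 * np.pi)
    mean_hour = mean_angle * period / (2.0 * np.pi)
    resultant_length = np.hypot(c, s)  # 1 = concentrated, 0 = uniform
    return mean_hour, resultant_length

# Hypothetical daily peak-production hours for one wind farm
peak_hours = [22.5, 23.25, 0.75, 1.0, 23.5, 0.25]
mean_hour, r = circular_stats(peak_hours)
print(f"circular mean hour: {mean_hour:.2f}, concentration R: {r:.2f}")
```

The resultant length R measures how concentrated the daily-maximum hours are: a farm whose peak always occurs around the same hour has R close to 1, while a uniform spread over the day gives R close to 0.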

Relevance:

50.00%

Publisher:

Abstract:

This paper studies the persistence of stock market volatility in the G7 countries, using the GARCH, IGARCH and FIGARCH models. The data set consists of the daily returns of the S&P/TSX 60, CAC 40, DAX 30, MIB 30, NIKKEI 225, FTSE 100 and S&P 500 indexes over the period 1999-2009. The results evidence long memory in volatility, which is more pronounced in Germany, Italy and France. Japan, on the other hand, appears to be the country where this phenomenon is least evident; persistence still prevails there, but with lower intensity.
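A minimal sketch of how such persistence estimates are typically obtained, here with the `arch` Python package on simulated returns (the study itself uses the listed index returns). In GARCH(1,1) the persistence is alpha + beta (IGARCH corresponds to alpha + beta = 1), while the FIGARCH fractional-integration parameter d measures long memory. Data and settings below are illustrative assumptions only.

```python
import numpy as np
import pandas as pd
from arch import arch_model  # pip install arch

# Hypothetical daily returns in percent (replace with, e.g., FTSE 100 data)
rng = np.random.default_rng(0)
returns = pd.Series(rng.standard_t(df=5, size=2500) * 0.8)

# GARCH(1,1): persistence is alpha + beta
garch = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
print("alpha + beta =", garch.params["alpha[1]"] + garch.params["beta[1]"])

# FIGARCH: d in (0, 1) quantifies long memory in volatility
figarch = arch_model(returns, vol="FIGARCH", p=1, q=1).fit(disp="off")
print("d =", figarch.params["d"])
```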

Relevance:

40.00%

Publisher:

Abstract:

Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. Sometimes, however, these fluctuations become determinant, especially when unforeseen large drops in asset prices are observed, which can result in huge losses or even in market crashes. The evidence shows that such events happen far more often than would be expected under the common assumption of normally distributed financial returns. It is therefore crucial to properly model the distribution tails so as to be able to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of the returns of three financial indexes (DJI, FTSE 100 and NIKKEI 225) representing three important financial areas of the world. Our results indicate that EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distributed innovations when doing out-of-sample estimation (in-sample, this holds for the right tail of the distribution of returns).
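The two-step McNeil-Frey procedure the abstract refers to can be sketched as follows, assuming the `arch` and `scipy` Python packages. The AR(1)-GARCH(1,1) specification and the 90th-percentile threshold are our illustrative assumptions; the GPD quantile formula used is the standard one for a nonzero shape parameter.

```python
import numpy as np
import pandas as pd
from scipy.stats import genpareto
from arch import arch_model  # pip install arch

def evt_conditional_var(returns, q=0.99, u_quantile=0.90):
    """McNeil-Frey style two-step estimate of the conditional q-quantile (VaR).

    Step 1: AR(1)-GARCH(1,1) filter. Step 2: GPD fit to the upper tail of
    the standardized residuals (losses are the negated returns).
    """
    losses = pd.Series(-np.asarray(returns))
    res = arch_model(losses, mean="AR", lags=1,
                     vol="GARCH", p=1, q=1).fit(disp="off")
    z = res.std_resid.dropna().to_numpy()          # standardized residuals

    u = np.quantile(z, u_quantile)                 # tail threshold
    exc = z[z > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)     # GPD shape and scale

    # GPD quantile of the residual distribution (valid for xi != 0)
    z_q = u + (beta / xi) * ((len(z) / len(exc) * (1.0 - q)) ** (-xi) - 1.0)

    # One-step-ahead conditional mean and volatility from the GARCH filter
    fc = res.forecast(horizon=1)
    mu, sigma = fc.mean.iloc[-1, 0], np.sqrt(fc.variance.iloc[-1, 0])
    return mu + sigma * z_q                        # conditional VaR at level q
```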

Relevance:

40.00%

Publisher:

Abstract:

In this paper, a novel hybrid approach is proposed for electricity price forecasting in a competitive market, considering a time horizon of one week. The proposed approach is based on the combination of particle swarm optimization and an adaptive-network-based fuzzy inference system. Results from a case study based on the electricity market of mainland Spain are presented. A thorough comparison is carried out, taking into account the results of previous publications, to demonstrate its effectiveness regarding forecasting accuracy and computation time.
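The particle swarm component can be illustrated with a minimal global-best PSO. In the hybrid, the objective would be the fuzzy inference system's forecasting error as a function of its parameters; here it is a toy quadratic, and all parameter values are illustrative assumptions.

```python
import numpy as np

def pso_minimize(loss, dim, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))   # positions
    v = np.zeros_like(x)                             # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(loss, 1, x)
    g = pbest[pbest_val.argmin()].copy()             # global best

    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = np.apply_along_axis(loss, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy usage: recover the minimum of a quadratic bowl
best, best_val = pso_minimize(lambda p: np.sum((p - 0.3) ** 2), dim=4)
```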

Relevance:

40.00%

Publisher:

Abstract:

In this paper, a hybrid intelligent approach is proposed for short-term electricity price forecasting in a competitive market. The proposed approach is based on the wavelet transform and a hybrid of neural networks and fuzzy logic. Results from a case study based on the electricity market of mainland Spain are presented, and a thorough comparison is carried out taking into account the results of previous publications.
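A minimal sketch of the wavelet preprocessing stage, using PyWavelets on a synthetic price series. The wavelet family, decomposition level, and data are assumptions of ours; the neural/fuzzy forecasting of each sub-series, which is the heart of the paper's method, is omitted.

```python
import numpy as np
import pywt  # pip install PyWavelets

# Hypothetical hourly price series; in the paper's setting each wavelet
# sub-series would be forecast separately and the forecasts recombined.
rng = np.random.default_rng(1)
prices = np.sin(np.linspace(0, 20, 168)) + rng.normal(0, 0.1, 168)

# 3-level discrete wavelet decomposition (Daubechies-4 is a common choice)
coeffs = pywt.wavedec(prices, "db4", level=3)   # [cA3, cD3, cD2, cD1]

# Reconstruct the smooth approximation alone by zeroing the detail parts
approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
smooth = pywt.waverec(approx_only, "db4")[: len(prices)]
```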

Relevance:

40.00%

Publisher:

Abstract:

Research on the problem of feature selection for clustering continues to develop. This is a challenging task, mainly due to the absence of class labels to guide the search for relevant features. Categorical feature selection for clustering has rarely been addressed in the literature, with most of the proposed approaches focusing on numerical data. In this work, we propose an approach to simultaneously cluster categorical data and select a subset of relevant features. Our approach is based on a modification of a finite mixture model (of multinomial distributions), where a set of latent variables indicates the relevance of each feature. To estimate the model parameters, we implement a variant of the expectation-maximization algorithm that simultaneously selects the subset of relevant features, using a minimum message length criterion. The proposed approach compares favourably with two baseline methods: a filter based on an entropy measure and a wrapper based on mutual information. The results obtained on synthetic data illustrate the ability of the proposed expectation-maximization method to recover the ground truth. An application to real data from official statistics shows its usefulness.
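The backbone of such a model, EM for a mixture of independent categorical features, can be sketched as below. The paper's feature-saliency latent variables and minimum message length penalty are omitted, so this is only the starting point of the proposed method, not the method itself.

```python
import numpy as np

def em_categorical_mixture(X, n_components, n_iter=100, seed=0):
    """EM for a mixture of independent categorical (multinomial) features.

    X: integer array (n_samples, n_features), X[i, j] in {0, ..., K_j - 1}.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    n_cats = X.max(axis=0) + 1
    pi = np.full(n_components, 1.0 / n_components)  # mixing weights
    theta = [rng.dirichlet(np.ones(k), size=n_components) for k in n_cats]

    for _ in range(n_iter):
        # E-step: responsibilities via log-probabilities
        log_r = np.log(pi)[None, :].repeat(n, axis=0)
        for j in range(d):
            log_r += np.log(theta[j][:, X[:, j]]).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: update weights and per-feature category probabilities
        pi = r.mean(axis=0)
        for j in range(d):
            counts = np.stack([r[X[:, j] == c].sum(axis=0)
                               for c in range(n_cats[j])], axis=1)
            theta[j] = counts / counts.sum(axis=1, keepdims=True)
    return pi, theta, r
```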

Relevance:

40.00%

Publisher:

Abstract:

This paper addresses the maximization of total profit in a day-ahead market for a price-taker producer needing short-term scheduling for the coordination of wind power plants with concentrated solar power plants having thermal energy storage systems. The optimization approach proposed for profit maximization is formulated as a mixed-integer linear programming problem. The approach considers not only transmission grid constraints, but also technical operating constraints on both the wind and the concentrated solar power plants, thus providing improved short-term scheduling coordination due to the more accurate modelling presented in this paper. Computer simulation results based on data for Iberian wind and concentrated solar power plants illustrate the coordination benefits and show the effectiveness of the approach.
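A toy version of such a scheduling problem can be written with the PuLP modeller. The horizon, prices, forecasts, and capacities below are hypothetical, and the binary variables that would capture discontinuous operating regions are omitted, so this LP only sketches the structure of the paper's MILP.

```python
import pulp  # pip install pulp

T = range(4)                       # hypothetical 4-period horizon
price = [40.0, 55.0, 70.0, 50.0]   # day-ahead prices (EUR/MWh)
wind = [30.0, 25.0, 10.0, 20.0]    # available wind power (MW)
solar = [0.0, 35.0, 45.0, 15.0]    # CSP solar-field thermal power (MW)
grid_cap, store_cap, turbine_cap = 55.0, 60.0, 30.0

m = pulp.LpProblem("wind_csp_coordination", pulp.LpMaximize)
pw = pulp.LpVariable.dicts("p_wind", T, 0)               # wind injection
pc = pulp.LpVariable.dicts("p_csp", T, 0, turbine_cap)   # CSP injection
ch = pulp.LpVariable.dicts("charge", T, 0)               # heat to storage
dis = pulp.LpVariable.dicts("discharge", T, 0)           # heat from storage
soc = pulp.LpVariable.dicts("soc", T, 0, store_cap)      # stored energy

m += pulp.lpSum(price[t] * (pw[t] + pc[t]) for t in T)   # market revenue
for t in T:
    m += pw[t] <= wind[t]                                # wind availability
    m += pc[t] == solar[t] - ch[t] + dis[t]              # CSP power balance
    m += ch[t] <= solar[t]                               # store solar heat only
    m += soc[t] == (soc[t - 1] if t else 0.0) + ch[t] - dis[t]  # storage
    m += pw[t] + pc[t] <= grid_cap                       # transmission limit

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([(pw[t].value(), pc[t].value()) for t in T])
```

The storage balance lets the model shift solar heat from low-price to high-price hours, which is the coordination benefit the abstract refers to.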


Relevance:

40.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is therefore decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8-10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate.

Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18-21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28-31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions; nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34-36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are computationally complex: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution.

Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets; in any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when their spectral angle distance (SAD) to an exemplar is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram-Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm, vertex component analysis (VCA), to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices, the latter based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]; we note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex, and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined; the new endmember signature corresponds to the extreme of this projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
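The projection idea behind the endmember-extraction step can be sketched as follows in NumPy. This simplified version assumes the pure-pixel condition and skips the signal-subspace identification and noise-estimation steps described above, so it is only a sketch of the geometric core, not the full algorithm.

```python
import numpy as np

def extract_endmembers(Y, p, seed=0):
    """Pure-pixel endmember extraction by iterative orthogonal projection.

    Y: (bands, pixels) data assumed to follow the linear mixing model
    with at least one pure pixel per endmember.
    """
    rng = np.random.default_rng(seed)
    bands, n = Y.shape
    E = np.zeros((bands, p))            # endmember signatures (columns)
    for k in range(p):
        # Project a random direction onto the orthogonal complement of
        # the subspace spanned by the endmembers found so far.
        w = rng.standard_normal(bands)
        if k > 0:
            Q, _ = np.linalg.qr(E[:, :k])
            w -= Q @ (Q.T @ w)
        # The pixel with the extreme projection is a new endmember.
        idx = np.abs(w @ Y).argmax()
        E[:, k] = Y[:, idx]
    return E
```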

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a mixed-integer nonlinear approach is proposed to support decision-making for a hydro power producer, considering a head-dependent hydro chain. The aim is to maximize the producer's profit from selling energy in the electricity market. As a new contribution to earlier studies, a risk-aversion criterion is taken into account, as well as head-dependency: the volatility of the expected profit is limited through the conditional value-at-risk (CVaR). The proposed approach has been applied successfully to a case study based on one of the main Portuguese cascaded hydro systems.
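For reference, over a finite set of profit scenarios the CVaR at level alpha is simply the average of the worst (1 - alpha) fraction of outcomes; the sketch below uses hypothetical scenario profits, not data from the paper.

```python
import numpy as np

def cvar(profits, alpha=0.95):
    """Conditional value-at-risk of a profit distribution.

    With scenario profits, CVaR_alpha is the expected profit in the
    worst (1 - alpha) tail; the scheduling model bounds this quantity
    to limit the volatility of the expected profit.
    """
    profits = np.sort(np.asarray(profits))       # ascending: worst first
    k = max(1, int(np.ceil((1.0 - alpha) * len(profits))))
    return profits[:k].mean()                    # mean of the worst scenarios

# Hypothetical scenario profits (EUR), e.g., from price uncertainty
scenarios = np.array([120.0, 95.0, 130.0, 60.0, 110.0, 40.0, 125.0])
print(f"CVaR_95: {cvar(scenarios):.1f}")
```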

Relevance:

30.00%

Publisher:

Abstract:

Benchmarking is an important tool for organisations to improve their productivity, product quality, process efficiency or services. Through benchmarking, organisations can compare their performance with competitors and identify their strengths and weaknesses. This study intends to carry out a benchmarking analysis of the main Iberian sea ports, with a special focus on the efficiency of their container terminals. To this end, data envelopment analysis (DEA) is used, since it is considered by several researchers to be the most effective method to quantify a set of key performance indicators. In order to obtain a more reliable diagnosis tool, DEA is used together with data mining in comparing the sea ports' container terminal operational data for 2007. Taking into account that sea ports are global logistics networks, performance evaluation is essential for effective decision-making to improve their efficiency and, therefore, their competitiveness.
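The DEA building block can be sketched as the input-oriented CCR envelopment linear program, solved here per terminal with SciPy; the container-terminal inputs and output below are hypothetical, not the study's 2007 data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k (envelopment form).

    X: (n_units, n_inputs), Y: (n_units, n_outputs). Returns theta in
    (0, 1]; theta = 1 means unit k lies on the efficient frontier.
    """
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta
    # Inputs: sum_j lam_j * x_ij <= theta * x_ik
    A_in = np.c_[-X[[k]].T, X.T]
    # Outputs: sum_j lam_j * y_rj >= y_rk
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[1]), -Y[k]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Hypothetical terminals: inputs (quay length m, cranes), output (kTEU)
X = np.array([[900.0, 6.0], [1200.0, 10.0], [700.0, 4.0]])
Y = np.array([[300.0], [520.0], [150.0]])
print([round(dea_ccr_efficiency(X, Y, k), 3) for k in range(3)])
```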

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a practical approach for profit-based unit commitment (PBUC) with emission limitations. Under deregulation, unit commitment has evolved from a minimum-cost optimisation problem into a profit-based optimisation problem. However, as a consequence of growing environmental concern, the impact of fossil-fuelled power plants must be considered, giving rise to emission limitations. Our practical approach addresses profit and emission simultaneously as a multiobjective optimisation (MO) problem. Hence, trade-off curves between profit and emission are obtained for different energy price profiles, to aid decision-makers concerning emission allowance trading. Moreover, a new parameter, the ratio of change, and the corresponding gradient angle are presented, enabling the proper selection of a compromise commitment for the units. A case study based on the standard IEEE 30-bus system is presented to illustrate the proficiency of our practical approach for the new competitive and environmentally constrained electricity supply industry.
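One standard way to trace such trade-off curves is a weighted-sum scan over the two objectives. The toy sketch below commits units by the sign of a weighted profit-emission score; all numbers and the emission scaling are our assumptions, and the scheme is far simpler than the paper's PBUC model.

```python
import numpy as np

# Hypothetical per-unit profit (EUR/h) and emission (t/h) at rated output
profit = np.array([300.0, 220.0, 180.0, 400.0])
emission = np.array([0.9, 0.5, 0.3, 1.1])

for w in np.linspace(0.0, 1.0, 6):
    # Commit a unit when its weighted score favours profit over emission;
    # the emission term is scaled so both objectives are comparable.
    score = w * profit - (1.0 - w) * 300.0 * emission
    commit = score > 0.0
    print(f"w={w:.1f}  profit={profit[commit].sum():6.1f}  "
          f"emission={emission[commit].sum():4.2f}")
```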

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the problem of short-term hydro scheduling, particularly concerning head-dependent cascaded hydro systems. We propose a novel mixed-integer quadratic programming approach, considering not only head-dependency, but also discontinuous operating regions and discharge ramping constraints. Thus, enhanced short-term hydro scheduling is provided due to the more realistic modeling presented in this paper. Numerical results from two case studies, based on Portuguese cascaded hydro systems, illustrate the proficiency of the proposed approach.
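The source of the quadratic terms can be made explicit: if head is modelled as an affine function of storage, the generated power contains a bilinear storage-discharge product, which is what makes the formulation quadratic rather than linear. The computation below is illustrative, with all numbers hypothetical.

```python
# Head-dependent hydro power: with h(v) = h0 + c*v, power p = eta*h(v)*q
# contains the bilinear term v*q, hence an MIQP instead of an MILP.
eta = 0.0085          # MW per (m of head * m^3/s), lumped efficiency constant
h0, c = 50.0, 2e-5    # base head (m) and head sensitivity to storage (m/m^3)
v, q = 1.2e6, 150.0   # storage (m^3) and discharge (m^3/s)

p = eta * (h0 + c * v) * q   # 0.0085 * (50 + 24) * 150 ≈ 94.3 MW
print(f"power: {p:.1f} MW")
```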

Relevance:

30.00%

Publisher:

Abstract:

The 27 December 1722 Algarve earthquake destroyed a large area in southern Portugal, generating a local tsunami that inundated the shallow areas of Tavira. It is unclear whether its source was located onshore or offshore and, in either case, which tectonic source was responsible for the event. We analyze the available historical information concerning macroseismicity and the tsunami to discuss the most probable location of the source. We also review the available seismotectonic knowledge of the offshore region close to the probable epicenter, selecting a set of four candidate sources. We simulate the tsunamis produced by these candidate sources, assuming that the sea bottom displacement is caused by a compressive dislocation over a rectangular fault, as given by the half-space homogeneous elastic approach, and we use numerical modeling to study wave propagation and run-up. We conclude that the 27 December 1722 Tavira earthquake and tsunami were probably generated offshore, close to 37°01'N, 7°49'W.