74 results for Real Electricity Markets Data


Relevance: 30.00%

Abstract:

This analysis was stimulated by the real data analysis problem of household expenditure data. The full dataset contains expenditure data for a sample of 1224 households. The expenditure is broken down at 2 hierarchical levels: 9 major levels (e.g. housing, food, utilities, etc.) and 92 minor levels. There are also 5 factors and 5 covariates at the household level. Not surprisingly, there are a small number of zeros at the major level, but many zeros at the minor level. The question is how best to model the zeros. Clearly, models that try to add a small amount to the zero terms are not appropriate in general, as at least some of the zeros are clearly structural, e.g. alcohol/tobacco for households that are teetotal. The key question then is how to build suitable conditional models. For example, is the sub-composition of spending excluding alcohol/tobacco similar for teetotal and non-teetotal households? In other words, we are looking for sub-compositional independence. Also, what determines whether a household is teetotal? Can we assume that it is independent of the composition? In general, whether a household is teetotal will clearly depend on the household-level variables, so we need to be able to model this dependence. The other tricky question is that, with zeros on more than one component, we need to be able to model dependence and independence of zeros on the different components. Lastly, while some zeros are structural, others may not be; for example, for expenditure on durables, it may be chance as to whether a particular household spends money on durables within the sample period.
This would clearly be distinguishable if we had longitudinal data, but may still be distinguishable by looking at the distribution, on the assumption that random zeros will usually occur in situations where any non-zero expenditure is not small. While this analysis is based on economic data, the ideas carry over to many other situations, including geological data, where minerals may be missing for structural reasons (similar to alcohol), or missing because they occur only in random regions which may be missed in a sample (similar to the durables).
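The sub-compositional question above has a simple computational core: a subcomposition is obtained by deleting the parts in question and re-closing the remainder to unit sum. A minimal sketch, where the category names and shares are purely illustrative, not taken from the dataset:

```python
def subcomposition(parts, exclude):
    """Drop the excluded parts and re-close the rest to sum to 1."""
    kept = {k: v for k, v in parts.items() if k not in exclude}
    total = sum(kept.values())
    return {k: v / total for k, v in kept.items()}

# Hypothetical household budget shares (a few major levels only).
household = {"housing": 0.40, "food": 0.30, "utilities": 0.20, "alcohol": 0.10}

# Spending pattern excluding alcohol/tobacco, now directly comparable
# between teetotal and non-teetotal households.
sub = subcomposition(household, exclude={"alcohol"})
print(sub)  # shares of housing/food/utilities, re-closed to 1
```

Sub-compositional independence would then amount to the distribution of such re-closed shares not differing between the two groups of households.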

Abstract:

The statistical analysis of compositional data is commonly used in geological studies. As is well known, compositions should be treated using logratios of parts, which are difficult to use correctly in standard statistical packages. In this paper we describe the new features of our freeware package, named CoDaPack, which implements most of the basic statistical methods suitable for compositional data. An example using real data is presented to illustrate the use of the package.
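As an illustration of the logratio approach the package supports, here is a minimal pure-Python sketch of the two classical Aitchison transforms; this is not CoDaPack code, just the underlying idea:

```python
import math

def alr(x):
    """Additive logratio: log of each part relative to the last part."""
    return [math.log(xi / x[-1]) for xi in x[:-1]]

def clr(x):
    """Centred logratio: log of each part relative to the geometric mean."""
    g = math.exp(sum(math.log(xi) for xi in x) / len(x))
    return [math.log(xi / g) for xi in x]

# A 3-part composition (e.g. mineral proportions closed to sum to 1).
comp = [0.6, 0.3, 0.1]
print(alr(comp))  # 2 coordinates
print(clr(comp))  # 3 coordinates that sum to 0
```

Both transforms move the composition off the constrained simplex into real coordinates, where standard multivariate methods apply.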

Abstract:

The automatic interpretation of conventional traffic signs is very complex and time consuming. The paper concerns an automatic warning system for driving assistance. It does not interpret the standard traffic signs on the roadside; the proposal is to incorporate into the existing signs another type of traffic sign whose information will be more easily interpreted by a processor. The type of information to be added is profuse, and therefore the most important objective is the robustness of the system. The basic proposal of this new philosophy is that the co-pilot system for automatic warning and driving assistance can interpret with greater ease the information contained in the new sign, whilst the human driver only has to interpret the "classic" sign. One of the codings that has been tested with good results, and which seems to us easy to implement, is one with a rectangular shape and 4 vertical bars of different colours. The size of these signs is equivalent to the size of the conventional signs (approximately 0.4 m2). The colour information from the sign can be easily interpreted by the proposed processor, and the interpretation is much easier and quicker than that of the pictographs of the classic signs.
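As a sketch of how such a 4-bar sign might be read, the snippet below classifies each bar's mean RGB value to the nearest colour in a palette. The palette, colour names, and measured values are hypothetical, not the coding actually tested by the authors:

```python
# Hypothetical palette of bar colours a sign coding might draw from.
PALETTE = {
    "red": (255, 0, 0),
    "green": (0, 255, 0),
    "blue": (0, 0, 255),
    "yellow": (255, 255, 0),
}

def classify_bar(rgb):
    """Assign a bar's mean RGB value to the nearest palette colour."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(PALETTE, key=lambda name: dist2(rgb, PALETTE[name]))

def decode_sign(bars):
    """Map the 4 vertical bars (left to right) to a colour code word."""
    return tuple(classify_bar(rgb) for rgb in bars)

# Mean RGB per bar as a camera might measure them (noisy values).
bars = [(240, 20, 30), (10, 230, 40), (20, 30, 250), (250, 240, 10)]
print(decode_sign(bars))  # ('red', 'green', 'blue', 'yellow')
```

Nearest-colour classification of large uniform bars is far more tolerant of noise and distance than recognizing a pictograph, which is the robustness argument the abstract makes.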

Abstract:

This article presents recent WMR (wheeled mobile robot) navigation experiences using local perception knowledge provided by monocular and odometer systems. A local narrow perception horizon is used to plan safe trajectories towards the objective. Monocular data are therefore proposed as a way to obtain real-time local information by building two-dimensional occupancy grids through time integration of the frames. Path planning is accomplished using attraction potential fields, while trajectory tracking is performed using model predictive control techniques. The results are tested in indoor situations using the available lab platform, a differential-drive mobile robot.
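The attraction potential field idea can be sketched as gradient descent on a quadratic potential centred at the goal; the sketch below omits the repulsive obstacle terms and the occupancy grid, and the gain value is illustrative:

```python
def attraction_step(pos, goal, gain=0.5):
    """One gradient-descent step on the attractive potential
    U(q) = 0.5 * gain * ||q - goal||^2, whose gradient is gain * (q - goal)."""
    return tuple(p - gain * (p - g) for p, g in zip(pos, goal))

def plan_path(start, goal, steps=20):
    """Iterate the attraction step so the robot is drawn toward the goal."""
    path = [start]
    for _ in range(steps):
        path.append(attraction_step(path[-1], goal))
    return path

path = plan_path(start=(0.0, 0.0), goal=(4.0, 3.0))
print(path[-1])  # very close to (4.0, 3.0) after 20 steps
```

In a full planner each obstacle cell of the occupancy grid would add a repulsive term to the potential, and the resulting waypoints would be fed to the model predictive tracking controller.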

Abstract:

The objective of this final-year project (PFC) is to develop a rain system for video games and virtual reality applications that is accurate, both in the sense of visual realism and of its behaviour. The project will allow game developers to incorporate into their applications rain zones of varying intensity using the most modern graphics hardware, so that the rain is not processed by the CPU and therefore cannot slow down the game being created. Two systems have been developed: a rain editor and a real-time viewer.

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts, and thus the metric properties, should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
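The multiplicative replacement admits a short sketch: rounded zeros are replaced by a small value delta and the non-zero parts are rescaled multiplicatively, so ratios among non-zero parts (and hence the covariance structure of zero-free subcompositions) are untouched. A minimal version, assuming the composition is closed to 1 and a common delta for every zero:

```python
def multiplicative_replacement(x, delta, total=1.0):
    """Replace rounded zeros by delta and rescale the non-zero parts
    multiplicatively so the composition still sums to `total`.
    Ratios between non-zero parts are left unchanged, which is what
    preserves the covariance structure of zero-free subcompositions."""
    n_zeros = sum(1 for xi in x if xi == 0)
    scale = (total - n_zeros * delta) / total
    return [delta if xi == 0 else xi * scale for xi in x]

comp = [0.5, 0.3, 0.2, 0.0]                    # one rounded zero
repl = multiplicative_replacement(comp, delta=0.01)
print(repl)             # zero replaced by 0.01, other parts scaled by 0.99
print(sum(repl))        # still sums to ~1.0 (closure preserved)
print(repl[0] / repl[1])  # ratio 0.5/0.3 unchanged
```

The additive method criticized above instead shifts every part by a constant, which distorts exactly these ratios.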

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as non-zero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models: an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential-zero compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
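The two-stage idea can be illustrated by simulation: a Bernoulli (binomial) stage decides which parts are essential zeros, and a logistic-normal stage distributes the unit over the non-zero parts. This toy sketch assumes independent incidences and illustrative parameters; estimation, the paper's actual subject, is not attempted:

```python
import math
import random

random.seed(0)  # reproducible toy simulation

def logistic_normal(mu, sigma):
    """Sample a D-part composition via the additive logistic transform of a
    (D-1)-dimensional normal vector; the last part acts as the baseline."""
    y = [random.gauss(m, sigma) for m in mu]
    expy = [math.exp(v) for v in y] + [1.0]
    total = sum(expy)
    return [v / total for v in expy]

def sample_row(incidence_probs, sigma=0.3):
    """Stage 1: Bernoulli incidence marks which parts are essential zeros.
    Stage 2: a logistic-normal composition fills the non-zero parts."""
    present = [random.random() < p for p in incidence_probs]
    k = sum(present)
    row = [0.0] * len(incidence_probs)
    if k == 0:
        return row  # all parts absent
    values = iter(logistic_normal([0.0] * (k - 1), sigma))
    for j, flag in enumerate(present):
        if flag:
            row[j] = next(values)
    return row

# A small simulated data set: the incidence matrix is implicit in the zeros,
# the conditional compositional matrix in the non-zero entries.
data = [sample_row([0.9, 0.7, 0.5, 0.2]) for _ in range(5)]
for row in data:
    print(row)
```

The hierarchical dependent variant of the paper would correlate the Bernoulli draws across parts instead of sampling them independently.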

Abstract:

It is generally accepted that financial markets are efficient in the long run, although there may be some deviations in the short run. It is also accepted that a good portfolio manager is one who beats the market persistently over time; this type of manager could not exist if markets were perfectly efficient. Accordingly, in a purely efficient market we should find that managers know they cannot beat the market, so they would undertake only pure passive management strategies. Assuming a certain degree of inefficiency in the short run, a market may show some managers who try to beat the market by undertaking active strategies. From Fama's efficient markets theory we can state that these active managers may beat the market occasionally, although they will not be able to enhance their performance significantly in the long run. On the other hand, in an inefficient market one would expect to find a higher level of activity, related to the higher probability of beating the market. In this paper we follow two objectives: first, we set a basis for analysing the level of efficiency in an asset investment funds market by measuring performance, strategy activity, and its persistence for a certain group of funds during the period of study. Second, we analyse individual performance persistence in order to determine the existence of skilled managers. The CAPM model is taken as the theoretical background, and the use of the Sharpe ratio as a suitable performance measure in a limited-information environment leads to a group performance measurement proposal. The empirical study takes quarterly data from the 1999-2007 period, for the whole population of the Spanish asset investment funds market, provided by the CNMV (Comisión Nacional del Mercado de Valores). This period of study has been chosen to ensure a wide enough range of efficient market observation, so as to allow us to set a proper basis for comparison with the following period.
As a result we develop a model that allows us to measure efficiency in a given asset mutual funds market, based on the level of strategy activity undertaken by managers. We also observe persistence in individual performance for a certain group of funds.
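The Sharpe ratio used as the performance measure is simply mean excess return over its standard deviation; a minimal sketch, where the return series is invented rather than taken from the CNMV data:

```python
import math

def sharpe_ratio(returns, risk_free=0.0):
    """Sharpe ratio: mean excess return divided by the sample standard
    deviation of excess returns."""
    excess = [r - risk_free for r in returns]
    mean = sum(excess) / len(excess)
    var = sum((e - mean) ** 2 for e in excess) / (len(excess) - 1)
    return mean / math.sqrt(var)

# Hypothetical quarterly fund returns and a constant risk-free rate.
fund = [0.02, 0.015, -0.01, 0.03, 0.005, 0.02]
print(sharpe_ratio(fund, risk_free=0.005))
```

Its appeal in the limited-information setting the abstract describes is that it needs only the fund's own return series and a risk-free rate, not portfolio holdings.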

Abstract:

We investigate the effects of the financial crisis on the stationarity of real interest rates in the Euro Area. We use a new unit root test developed by Pesaran et al. (2013) that allows for multiple unobserved factors in a panel setup. Our results suggest that while short-term and long-term real interest rates were stationary before the financial crisis, they became nonstationary during the crisis period, likely due to the persistent risk that characterized financial markets during that time. JEL codes: E43, C23. Keywords: Real interest rates, Euro Area, financial crisis, panel unit root tests, cross-sectional dependence.

Abstract:

One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: the search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, and the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of the sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, and the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis, we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…

Abstract:

We study the quantitative properties of a dynamic general equilibrium model in which markets are incomplete and agents face both idiosyncratic and aggregate income risk, as well as state-dependent borrowing constraints that bind in some but not all periods. Optimal individual consumption-savings plans and equilibrium asset prices are computed under various assumptions about income uncertainty. We then investigate whether our general equilibrium model with incomplete markets replicates two empirical observations: the high correlation between individual consumption and individual income, and the equity premium puzzle. We find that, when the driving processes are calibrated according to data on wage income in different sectors of the US economy, the results move in the direction of explaining these observations, but the model falls short of explaining the observed correlations quantitatively. If the incomes of agents are assumed to be independent of each other, the observations can be explained quantitatively.

Abstract:

It is common in econometric applications that several hypothesis tests are carried out at the same time. The problem then becomes how to decide which hypotheses to reject, accounting for the multitude of tests. In this paper, we suggest a stepwise multiple testing procedure which asymptotically controls the familywise error rate at a desired level. Compared to related single-step methods, our procedure is more powerful in the sense that it often will reject more false hypotheses. In addition, we advocate the use of studentization when it is feasible. Unlike some stepwise methods, our method implicitly captures the joint dependence structure of the test statistics, which results in increased ability to detect alternative hypotheses. We prove our method asymptotically controls the familywise error rate under minimal assumptions. We present our methodology in the context of comparing several strategies to a common benchmark and deciding which strategies actually beat the benchmark. However, our ideas can easily be extended and/or modified to other contexts, such as making inference on the individual regression coefficients in a multiple regression framework. Some simulation studies show the improvements of our methods over previous proposals. We also provide an application to a set of real data.
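The step-down idea can be illustrated with Holm's procedure, a simple member of the same family that controls the familywise error rate under arbitrary dependence; unlike the paper's method, this sketch is neither studentized nor does it exploit the joint dependence of the test statistics:

```python
def holm_stepdown(pvalues, alpha=0.05):
    """Holm's step-down procedure: visit p-values from smallest to largest,
    compare the k-th smallest (k = 0, 1, ...) to alpha / (m - k), and stop
    at the first failure. Controls the familywise error rate at alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    rejected = [False] * m
    for k, i in enumerate(order):
        if pvalues[i] <= alpha / (m - k):
            rejected[i] = True
        else:
            break  # step-down: all remaining (larger) p-values also fail
    return rejected

# Four hypothetical strategy-vs-benchmark tests.
pvals = [0.001, 0.04, 0.03, 0.005]
print(holm_stepdown(pvals))  # [True, False, False, True]
```

The step-down structure is why such procedures reject more hypotheses than a single-step Bonferroni correction at the same level.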

Abstract:

This paper tests for the market environment within which US fiscal policy operates; that is, we test for the incompleteness of the US government bond market. We document the stochastic properties of US debt and deficits and then consider the ability of competing optimal tax models to account for this behaviour. We show that when a government pursues an optimal tax policy and issues a full set of contingent claims, the value of debt has the same or less persistence than other variables in the economy and declines in response to higher deficit shocks. By contrast, if governments only issue one-period risk-free bonds (incomplete markets), debt shows more persistence than other variables and increases in response to expenditure shocks. Maintaining the hypothesis of Ramsey behavior, the US data conflict with the complete markets model.

Abstract:

This paper provides empirical evidence on the explanatory factors affecting introductory prices of new pharmaceuticals in a heavily regulated and highly subsidized market. We collect a data set consisting of all new chemical entities launched in Spain between 1997 and 2005, and model launching prices. We find that, unlike in the US and Sweden, therapeutically "innovative" products are not overpriced relative to "imitative" ones. Price setting is mainly used as a mechanism to adjust for inflation, independently of the degree of innovation. The drugs that enter through the centralized EMA approval procedure are overpriced, which may be a consequence of market globalization and international price setting.

Abstract:

In this paper I analyze the effects of insider trading on real investment and the insurance role of financial markets. There is a single entrepreneur who, at a first stage, chooses the level of investment in a risky business. At the second stage, an asset with random payoff is issued, and then the entrepreneur receives some privileged information on the likely realization of the production return. At the third stage, trading occurs on the asset market, where the entrepreneur faces the aggregate demand coming from a continuum of rational uninformed traders and some noise traders. I compare the equilibrium with insider trading (when the entrepreneur trades on her inside information in the asset market) with the equilibrium in the same market without insider trading. I find that permitting insider trading tends to decrease the level of real investment. Moreover, the asset market is thinner, and the entrepreneur's net supply of the asset and the hedge ratio are lower, although the asset price is more informative and volatile.