959 results for Polynomial distributed lag models
Abstract:
We present new population growth models: generalized logistic models that are proportional to beta densities with shape parameters p and 2, where p > 1, with Malthusian parameter r. The complex dynamical behaviour of these models is investigated in the parameter space (r, p), in terms of topological entropy, using explicit methods, as the Malthusian parameter r increases. This parameter space is split into different regions according to the chaotic behaviour of the models.
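For concreteness, a sketch of the family of maps described above, under the usual normalization of the population to the unit interval (an assumption, since the abstract does not state it): a beta density with shape parameters p and 2 is proportional to x^(p-1)(1-x), so the model is the discrete dynamical system

```latex
x_{n+1} \;=\; f_{r,p}(x_n) \;=\; r\, x_n^{\,p-1}\,(1 - x_n),
\qquad x_n \in [0,1],\; p > 1,
```

which recovers the classical logistic map at p = 2; the topological entropy is then studied as a function of (r, p).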
Abstract:
Recent literature has proved that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological, meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −∞ while their expected return tends to +∞, i.e., (risk = −∞, return = +∞). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to possible measurement errors is provided as well. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and then the good deal is given by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is interesting in itself, even if there are short-selling restrictions.
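To make the leverage construction concrete, a schematic sketch (not the paper's explicit expression), writing VaR_α(X) = −q_α(X) for the α-quantile q_α, and assuming the SRA's excess payoff over the financing cost, Z, is almost surely non-negative with q_α(Z) = z_α > 0:

```latex
x_n = n\,Z \quad\Longrightarrow\quad
\mathbb{E}[x_n] = n\,\mathbb{E}[Z] \to +\infty,
\qquad
\mathrm{VaR}_\alpha(x_n) = -n\,z_\alpha \to -\infty,
```

so the ever more leveraged positions (x_n) form a good deal in the sense above.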
Abstract:
Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. Sometimes, however, these fluctuations become decisive, especially when unforeseen large drops in asset prices are observed, which can result in huge losses or even market crashes. The evidence shows that such events happen far more often than would be expected under the common assumption of normally distributed financial returns. It is therefore crucial to model the distribution tails properly, so as to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of three financial index return series (DJI, FTSE 100 and NIKKEI 225) representing three important financial areas of the world. Our results indicate that, in out-of-sample estimation, EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distributed innovations (in-sample, this holds for the right tail of the return distribution).
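A minimal sketch of the McNeil-Frey two-step procedure, assuming the third-party `arch` and `scipy` packages; the function name, the 95% threshold and the AR(1)-GARCH(1,1) specification are illustrative choices, not taken from the paper.

```python
import numpy as np
from arch import arch_model          # GARCH filtering (pip install arch)
from scipy.stats import genpareto    # GPD for the residual tail

def conditional_var(returns, q_level=0.99, u_level=0.95):
    """McNeil-Frey style conditional VaR: AR-GARCH filter + GPD tail fit."""
    # Step 1: filter the returns to obtain (approximately) i.i.d. residuals.
    am = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
    res = am.fit(disp="off")
    z = res.resid / res.conditional_volatility   # standardized residuals
    z = z[~np.isnan(z)]

    # Step 2: fit a Generalized Pareto Distribution to losses above a threshold.
    losses = -z                          # left tail of returns = right tail of losses
    u = np.quantile(losses, u_level)     # threshold choice is an assumption here
    exceed = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exceed, floc=0.0)   # assumes xi != 0 below

    # EVT quantile of the standardized loss (McNeil & Frey, 2000).
    n, n_u = len(losses), len(exceed)
    z_q = u + (beta / xi) * (((1 - q_level) / (n_u / n)) ** (-xi) - 1.0)

    # One-step-ahead conditional mean and volatility from the fitted model.
    f = res.forecast(horizon=1)
    mu = f.mean.values[-1, 0]
    sigma = np.sqrt(f.variance.values[-1, 0])
    return sigma * z_q - mu    # conditional VaR, expressed as a positive loss
```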
Abstract:
The aim of this paper is to analyze the forecasting ability of the CARR model proposed by Chou (2005) using the S&P 500. We extend the data sample, allowing for the analysis of different stock market circumstances, and propose the use of various range estimators in order to analyze their forecasting performance. Our results show that two range-based models outperform the forecasting ability of the GARCH model: the Parkinson model is better for upward trends and for volatilities above or below the mean, while the CARR model is better for downward trends and volatilities around the mean.
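For reference, the two building blocks the abstract compares, in their standard textbook forms (the paper may use variants): the Parkinson estimator built from the daily high H_t and low L_t, and Chou's CARR(1,1) recursion for the conditional range λ_t driving the observed range R_t.

```latex
\hat{\sigma}^2_{P,t} \;=\; \frac{\left(\ln H_t - \ln L_t\right)^2}{4\ln 2},
\qquad
R_t = \lambda_t\,\varepsilon_t, \quad
\lambda_t = \omega + \alpha R_{t-1} + \beta \lambda_{t-1}, \quad
\mathbb{E}[\varepsilon_t] = 1 .
```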
Abstract:
We consider a simple model consisting of particles with four bonding sites ("patches"), two of type A and two of type B, on the square lattice, and investigate its global phase behavior by simulations and theory. We set the interaction between B patches to zero and calculate the phase diagram as the ratio between the AB and the AA interactions, ε*AB, varies. In line with previous work on three-dimensional off-lattice models, we show that the liquid-vapor phase diagram exhibits a re-entrant or "pinched" shape for the same range of ε*AB, suggesting that the ratio of the energy scales, and the corresponding empty-fluid regime, is independent of the dimensionality of the system and of the lattice structure. In addition, the model exhibits an order-disorder transition that is ferromagnetic in the re-entrant regime. The use of low-dimensional lattice models allows the simulation of sufficiently large systems to establish the nature of the liquid-vapor critical points and to describe the structure of the liquid phase in the empty-fluid regime, where the size of the "voids" increases as the temperature decreases. We find that the liquid-vapor critical point is in the 2D Ising universality class, with a scaling region that shrinks rapidly as the temperature decreases. The results of simulations and theoretical analysis suggest that, for patchy particle models with a re-entrant, empty-fluid regime, the line of order-disorder transitions intersects the condensation line at a multi-critical point at zero temperature and density. © 2011 American Institute of Physics. [doi:10.1063/1.3657406]
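To spell out the energy scales used above, directly from the abstract's definitions: with ε_AA the AA bond energy, the control parameter is the reduced AB interaction, and BB bonds are switched off:

```latex
\epsilon_{AB}^{*} \;=\; \frac{\epsilon_{AB}}{\epsilon_{AA}},
\qquad
\epsilon_{BB} \;=\; 0 .
```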
Abstract:
We study the implications for two-Higgs-doublet models of the recent LHC announcement giving a tantalizing hint of a Higgs boson of mass 125 GeV decaying into two photons. We require that the experimental result be within a factor of 2 of the theoretical standard model prediction, and analyze the type I and type II models as well as the lepton-specific and flipped models subject to this requirement. It is assumed that there is no new physics other than two Higgs doublets. In all of the models, we display the allowed region of parameter space taking the recent LHC announcement at face value, and we analyze the W⁺W⁻, ZZ, bb̄, and τ⁺τ⁻ expectations in these allowed regions. Throughout the entire range of parameter space allowed by the γγ constraint, the numbers of events for Higgs decays into WW, ZZ, and bb̄ do not differ from the standard model by more than a factor of 2. In contrast, in the lepton-specific model, decays to τ⁺τ⁻ are very sensitive across the entire γγ-allowed region.
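The "within a factor of 2" requirement is naturally phrased in terms of the diphoton signal strength relative to the standard model; a hedged restatement of the constraint (the paper may parametrize it differently):

```latex
\mu_{\gamma\gamma} \;\equiv\;
\frac{\sigma(pp \to h)\,\mathrm{BR}(h \to \gamma\gamma)}
     {\left[\sigma(pp \to h)\,\mathrm{BR}(h \to \gamma\gamma)\right]_{\mathrm{SM}}}
\;\in\; \left[\tfrac{1}{2},\, 2\right].
```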
Abstract:
The idiomatic expression “In Rome be a Roman” can be applied to leadership training and development as well. Leaders who can act as role models inspire future leaders in their behaviour, attitudes and ways of thinking. Based on two examples of current leaders in the fields of Politics and Public Administration, I support the idea that exposure to role models during their training was decisive for their career paths and for their current activities as prominent figures in their professions. Issues such as how students should be prepared for community or national leadership, as well as for cross-cultural engagement, are raised here. The hypothesis of transculturalism and cross-cultural commitment as a factor of leadership is presented. Based on the current literature on leadership as well as the case studies presented, I aim to spark a debate focusing on strategies for improving the training of leaders in cross-cultural awareness.
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. We propose to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to improve coding efficiency. At the decoder side, the multiple SI hypotheses are created with motion-compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bit rate savings of up to 8.0% are obtained for similar decoded quality.
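A minimal sketch of the syndrome-based Slepian-Wolf setup underlying the paper, assuming binary sources, a toy parity-check matrix, and a binary symmetric correlation channel per SI hypothesis; `H`, `p1`, `p2` and the additive LLR fusion rule are illustrative assumptions, not the paper's actual codec.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parity-check matrix; a real codec uses a large sparse LDPC matrix.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=np.uint8)

x = rng.integers(0, 2, size=7).astype(np.uint8)   # source bits (e.g., a bit-plane)
s = H @ x % 2                                     # encoder sends only the syndrome

# Two side-information hypotheses, each a noisy copy of x with its own
# (assumed known) crossover probability -- the "correlation noise channels".
p1, p2 = 0.1, 0.2
y1 = x ^ (rng.random(7) < p1)
y2 = x ^ (rng.random(7) < p2)

def bsc_llr(y, p):
    """Per-bit log-likelihood ratio log P(x=0|y)/P(x=1|y) for a BSC(p)."""
    return (1 - 2 * y.astype(float)) * np.log((1 - p) / p)

# One possible fusion rule: sum the per-hypothesis LLRs, treating the
# hypotheses as conditionally independent. These fused LLRs would seed the
# syndrome-constrained iterative belief-propagation decoder.
llr = bsc_llr(y1, p1) + bsc_llr(y2, p2)
x_hat = (llr < 0).astype(np.uint8)   # hard decision shown for brevity only;
print("syndrome satisfied:", np.array_equal(H @ x_hat % 2, s))
```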
Abstract:
The attached document is the post-print version (the version corrected by the publisher).
Abstract:
Long-term contractual decisions are the basis of efficient risk management. However, such decisions have to be supported by a robust price forecasting methodology. This paper reports a different approach to long-term price forecasting that aims to answer this need. Making use of regression models, the proposed methodology's main objective is to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the complexity of the problem, the Particle Swarm Optimization (PSO) meta-heuristic was used to find the best regression parameters, and the results are compared with those obtained using a Genetic Algorithm (GA). To validate these models, results obtained from realistic data are presented and discussed in detail.
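The abstract does not give the regression objective, so the sketch below rests on an assumption: that the confidence-level-α bounds are obtained by fitting quantile (pinball-loss) regressions, with a bare-bones PSO tuning the linear coefficients. All names, data and constants are illustrative. A GA baseline would plug into the same objective, which is how the two meta-heuristics can be compared.

```python
import numpy as np

rng = np.random.default_rng(1)

def pinball_loss(theta, X, y, alpha):
    """Quantile-regression loss; its minimizer estimates the alpha-quantile of y|X."""
    e = y - X @ theta
    return np.mean(np.maximum(alpha * e, (alpha - 1) * e))

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Bare-bones global-best PSO (constants are common defaults, not the paper's)."""
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Illustrative data: MCP explained by a load proxy plus noise.
n = 500
load = rng.uniform(0.5, 1.5, n)
mcp = 20 + 30 * load + rng.normal(0, 5, n)
X = np.column_stack([np.ones(n), load])

alpha = 0.95   # desired confidence level
upper = pso(lambda t: pinball_loss(t, X, mcp, alpha), dim=2)
lower = pso(lambda t: pinball_loss(t, X, mcp, 1 - alpha), dim=2)
print("upper-bound coefficients:", upper)
print("lower-bound coefficients:", lower)
```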
Abstract:
Power systems have been through deep changes in recent years, namely with the operation of competitive electricity markets and the increasingly intensive use of renewable energy sources and distributed generation. This requires new business models able to cope with the new opportunities that have emerged. Virtual Power Players (VPPs) are a new type of player that aggregates a diversity of players (Distributed Generation (DG), Storage Agents (SA), Electric Vehicles (V2G), and consumers) to facilitate their participation in electricity markets and to provide a set of new services promoting generation and consumption efficiency, while improving players' benefits. A major task of VPPs is the remuneration of generation and services (maintenance, market operation costs and energy reserves), as well as charging for energy consumption. This paper proposes a model to implement fair and strategic remuneration and tariff methodologies, allowing efficient VPP operation and the accomplishment of VPP goals in the scope of electricity markets.
Abstract:
Renewable-based power generation has increased significantly over recent years. However, this process has evolved separately from electricity markets, so the present market models are inadequate for coping with huge quantities of renewable energy resources and for taking full advantage of the existing and increasingly envisaged renewable-based and distributed energy resources. This paper proposes the modelling of electricity markets at several levels (continental, regional and micro), taking into account the specific characteristics of the players and resources involved at each level and ensuring that the proposed models accommodate adequate business models able to support the contribution of all the resources in the system, from the largest to the smallest. The proposed market models are integrated in MASCEM (Multi-Agent Simulator of Competitive Electricity Markets), taking advantage of the multi-agent approach to overcome the inadequacy and significant limitations of existing electricity market simulators in dealing with the complex electricity market models that must be adopted.
Abstract:
This chapter addresses the resolution of dynamic scheduling by means of meta-heuristics and multi-agent systems. Scheduling is an important aspect of automation in manufacturing systems. Several contributions have been proposed, but the problem is far from being solved satisfactorily, especially when scheduling concerns real-world applications. The proposed multi-agent scheduling system assumes the existence of several resource agents (decision-making entities based on meta-heuristics) distributed within the manufacturing system that interact with other agents in order to obtain optimal or near-optimal global performance.
Abstract:
In competitive electricity markets with deep concerns about efficiency, demand response programs gain considerable significance. In the same way, distributed generation has gained increasing importance in the operation and planning of power systems. Grid operators and utilities are taking new initiatives, recognizing the value of demand response and distributed generation for grid reliability and for the enhancement of organized spot markets' efficiency, and becoming able to act in both the energy and reserve components of electricity markets. This paper proposes a methodology for the joint dispatch of demand response and distributed generation to provide energy and reserve by a virtual power player that operates a distribution network. The proposed method has been computationally implemented and its application is illustrated in this paper using a 32-bus distribution network with 32 medium-voltage consumers.
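A minimal sketch of a joint energy-and-reserve dispatch in the spirit of the abstract, reduced to a linear program solved with scipy; the two-generator data, demand-response cost and reserve requirement are invented for illustration, not taken from the paper's 32-bus case study.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: x = [p1, p2, r1, r2, d, rd]
#   p1, p2 : energy from two DG units (MW)
#   r1, r2 : reserve offered by the DG units (MW)
#   d, rd  : demand-response energy reduction and DR reserve (MW)
cost = np.array([40.0, 55.0, 5.0, 6.0, 70.0, 8.0])   # EUR/MWh, illustrative

# Energy balance and reserve requirement (equalities).
A_eq = np.array([[1, 1, 0, 0, 1, 0],    # p1 + p2 + d  = load
                 [0, 0, 1, 1, 0, 1]])   # r1 + r2 + rd = reserve requirement
b_eq = np.array([10.0, 2.0])

# Capacity couplings: energy plus reserve cannot exceed each resource's limit.
A_ub = np.array([[1, 0, 1, 0, 0, 0],    # DG unit 1: p1 + r1 <= 8
                 [0, 1, 0, 1, 0, 0],    # DG unit 2: p2 + r2 <= 6
                 [0, 0, 0, 0, 1, 1]])   # DR:        d  + rd <= 3
b_ub = np.array([8.0, 6.0, 3.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method="highs")
print("dispatch [p1 p2 r1 r2 d rd]:", res.x, " total cost:", res.fun)
```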
Abstract:
Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase of new resources, such as storage systems and demand response programs, results in additional computational effort in the associated optimization problems. On the other hand, since natural resources such as wind and sun can only be precisely forecast a short time ahead, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems; this type of problem can be appropriately addressed by meta-heuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resources management problem in the scope of smart grids with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained with other methodologies, namely MINLP, Genetic Algorithms, original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO's performance is superior to that of the other tested PSO variants, demonstrating its adequacy for solving large-dimension problems that require a decision in a short period of time.