18 results for STOCHASTIC MARKETS
in Archivo Digital para la Docencia y la Investigación - Institutional Repository of the Universidad del País Vasco
Abstract:
30 p.
Abstract:
The aim of this paper is to explain under which circumstances using TACs as an instrument to manage a fishery, along with fishing periods, may be interesting from a regulatory point of view. To this end, the deterministic analysis of Homans and Wilen (1997) and Anderson (2000) is extended to a stochastic scenario where the resource cannot be measured accurately. The resulting endogenous stochastic model is solved numerically to find the optimal control rules for the Iberian sardine stock. Three relevant conclusions can be highlighted from the simulations. First, the higher the uncertainty about the state of the stock, the lower the probability of closing the fishery. Second, the use of TACs as a management instrument in fisheries already regulated with fishing periods leads to: i) an increase in the optimal season length and harvests, especially for medium and high numbers of licences; ii) an improvement in the biological and economic variables when the size of the fleet is large; and iii) the elimination of the extinction risk for the resource. And third, the regulator would rather select the number of licences and not restrict the season length.
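The core ingredient of such a setting — a stock whose true state the regulator only sees through noisy measurements — can be sketched as follows. This is a generic illustration, not the paper's model: the logistic growth law, lognormal measurement error, and every parameter value below are assumptions made for the example.

```python
import math
import random

def simulate_observed_stock(x0, r, k, obs_sd, years, rng):
    """Logistic stock growth observed with lognormal measurement error.

    All parameters are illustrative: x0 initial biomass, r intrinsic
    growth rate, k carrying capacity, obs_sd std. dev. of the
    log-scale measurement error.
    """
    true_stock, observed = [x0], []
    for _ in range(years):
        x = true_stock[-1]
        x_next = x + r * x * (1.0 - x / k)          # deterministic growth step
        true_stock.append(x_next)
        # the regulator sees the stock only up to multiplicative noise
        observed.append(x_next * math.exp(rng.gauss(0.0, obs_sd)))
    return true_stock, observed

rng = random.Random(42)
true_stock, observed = simulate_observed_stock(
    x0=200.0, r=0.4, k=1000.0, obs_sd=0.3, years=20, rng=rng)
```

An optimal control rule in this setting maps the noisy observations, not the true state, into a TAC and a season length, which is what makes the problem stochastic rather than deterministic.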
Abstract:
The purpose of this article is to characterize dynamic optimal harvesting trajectories that maximize discounted utility assuming an age-structured population model, along the same lines as Tahvonen (2009). The main novelty of our study is that it uses, as the age-structured population model, the standard stochastic cohort framework applied in Virtual Population Analysis for fish stock assessment. This allows us to compare optimal harvesting in a discounted economic context with standard reference points used by fisheries agencies for long-term management plans (e.g. Fmsy). Our main findings are the following. First, the optimal steady state is characterized, and sufficient conditions that guarantee its existence and uniqueness for the general case of n cohorts are shown. It is also proved that the optimal steady state coincides with the traditional target Fmsy when the utility function to be maximized is the yield and the discount rate is zero. Second, an algorithm to calculate the optimal path that easily drives the resource to the steady state is developed. And third, the algorithm is applied to the Northern Stock of hake. Results show that management plans based exclusively on traditional reference targets such as Fmsy may drive fishery economic results far from the optimum.
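The cohort bookkeeping behind Virtual Population Analysis rests on exponential survival and the Baranov catch equation, and scanning fishing mortality for the yield-maximizing rate gives the kind of reference point the abstract contrasts with the economic optimum. A minimal sketch — the mortality rate and weights-at-age below are invented for illustration, not the hake parameters used in the paper:

```python
import math

def project_cohort(n0, f, m, n_ages):
    """Survivors at age: N_{a+1} = N_a * exp(-(F + M))."""
    ns = [n0]
    for _ in range(n_ages - 1):
        ns.append(ns[-1] * math.exp(-(f + m)))
    return ns

def baranov_catch(n, f, m):
    """Baranov catch equation: C = F/Z * N * (1 - exp(-Z)), with Z = F + M."""
    z = f + m
    return (f / z) * n * (1.0 - math.exp(-z))

M = 0.2                                # natural mortality (illustrative)
WEIGHTS = [0.1, 0.3, 0.6, 1.0, 1.4]    # weight-at-age in kg (illustrative)

def yield_per_recruit(f):
    ns = project_cohort(1.0, f, M, len(WEIGHTS))
    return sum(baranov_catch(n, f, M) * w for n, w in zip(ns, WEIGHTS))

# scanning F for the yield-maximizing rate gives an Fmax-style
# reference point, a simpler relative of Fmsy
f_grid = [i * 0.01 for i in range(1, 201)]
f_max = max(f_grid, key=yield_per_recruit)
```

The paper's point is that a discounted-utility optimum generally picks a different fishing mortality than such purely biological yield targets whenever the discount rate is positive.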
Abstract:
This paper studies the behavior of the implied volatility function (smile) when the true distribution of the underlying asset is consistent with the stochastic volatility model proposed by Heston (1993). The main result of the paper is to extend previous results, applicable to the smile as a whole, to alternative degrees of moneyness. We give the conditions under which the implied volatility function changes whenever there is a change in the parameters of Heston's stochastic volatility model for a given degree of moneyness.
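The implied volatility at a given degree of moneyness is obtained by inverting the Black-Scholes formula at the observed option price. A minimal sketch using plain bisection and only the standard library (pricing the option under Heston's model itself would additionally require its characteristic-function integral, which is not shown here):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma * sigma) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-6, hi=5.0, tol=1e-10):
    """Bisection on the strictly increasing map sigma -> BS price."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Repeating the inversion across strikes at a fixed maturity traces out the smile whose comparative statics with respect to the Heston parameters the paper studies.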
Abstract:
Published as an article in: Investigaciones Economicas, 2005, vol. 29, issue 3, pages 483-523.
Abstract:
In this article, we analyze how to evaluate fishery resource management under “ecological uncertainty”. In this context, an efficient policy consists of applying a different exploitation rule depending on the state of the resource, and we could say that the stock is always in transition, jumping from one steady state to another. First, we propose a method for calibrating the growth path of the resource such that the observed dynamics of the resource and catches are matched. Second, we apply the proposed calibration procedure to two different fishing grounds: the European Anchovy (Division VIII) and the Southern Stock of Hake. Our results show that the role played by uncertainty is essential for the conclusions. For the European Anchovy fishery (Division VIII) we find, in contrast with Del Valle et al. (2001), that this is not an overexploited fishing ground. However, we show that the Southern Stock of Hake is in a dangerous situation. In both cases our results are in accordance with ICES advice.
Abstract:
In this paper we introduce four scenario Cluster based Lagrangian Decomposition (CLD) procedures for obtaining strong lower bounds on the (optimal) solution value of two-stage stochastic mixed 0-1 problems. At each iteration of the Lagrangian-based procedures, the traditional aim consists of obtaining the solution value of the corresponding Lagrangian dual by solving scenario submodels once the nonanticipativity constraints have been dualized. Instead of considering a splitting-variable representation over the set of scenarios, we propose to decompose the model into a set of scenario clusters. We compare the computational performance of the four Lagrange multiplier updating procedures, namely the Subgradient Method, the Volume Algorithm, the Progressive Hedging Algorithm and the Dynamic Constrained Cutting Plane scheme, for different numbers of scenario clusters and different dimensions of the original problem. Our computational experience shows that the CLD bound and its computational effort depend on the number of scenario clusters considered. In any case, our results show that the CLD procedures outperform the traditional LD scheme for single scenarios both in the quality of the bounds and in computational effort. All the procedures have been implemented in a C++ experimental code. A broad computational experience is reported on a testbed of randomly generated instances, using the MIP solvers COIN-OR and CPLEX for the auxiliary mixed 0-1 cluster submodels, the latter run within the open source engine COIN-OR. We also give computational evidence of the model tightening effect that preprocessing techniques, cut generation and appending, and parallel computing tools have in stochastic integer optimization. Finally, we have observed that the plain use of both solvers does not provide the optimal solution of the instances included in the testbed with which we have experimented in affordable elapsed time, except for two toy instances. On the other hand, the proposed procedures provide strong lower bounds (or the same solution value) in a considerably shorter elapsed time than that of the quasi-optimal solution obtained by other means for the original stochastic problem.
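On a toy two-scenario 0-1 problem, the mechanics of dualizing the nonanticipativity constraint and updating the multiplier with the Subgradient Method look roughly as follows. The cost coefficients are invented for illustration, and the paper's CLD procedures operate on scenario clusters rather than the single scenarios shown here:

```python
# Toy problem: min 1.5*x1 - 1.0*x2 over x1, x2 in {0, 1}
# s.t. the nonanticipativity constraint x1 = x2
# (split-variable form of min (1.5 - 1.0)*x, whose optimum is 0 at x = 0).
# Dualizing x1 - x2 = 0 with multiplier lam yields two independent
# single-scenario subproblems whose summed value is a lower bound.

def lagrangian_bound(lam):
    # subproblem s=1: min (1.5 + lam) * x1,  x1 in {0, 1}
    x1 = 1 if 1.5 + lam < 0 else 0
    # subproblem s=2: min (-1.0 - lam) * x2, x2 in {0, 1}
    x2 = 1 if -1.0 - lam < 0 else 0
    value = (1.5 + lam) * x1 + (-1.0 - lam) * x2
    return value, x1, x2

lam, best_bound = 0.0, float("-inf")
for k in range(1, 101):
    value, x1, x2 = lagrangian_bound(lam)
    best_bound = max(best_bound, value)
    g = x1 - x2                  # subgradient of the dual function at lam
    if g == 0:
        break                    # nonanticipativity satisfied: bound is tight
    lam += (0.5 / k) * g         # diminishing step, maximizing the dual
```

Grouping scenarios into clusters, as the paper proposes, keeps some nonanticipativity constraints inside each (larger) submodel instead of dualizing all of them, which is why the cluster bound can dominate the single-scenario LD bound.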
Abstract:
This paper models the mean and volatility spillovers of prices within the integrated Iberian and the interconnected Spanish and French electricity markets. Using the constant (CCC) and dynamic conditional correlation (DCC) bivariate models with three different specifications of the univariate variance processes, we study the extent to which increasing interconnection and harmonization in regulation have favoured price convergence. The data consist of daily prices calculated as the arithmetic mean of the hourly prices over a span from July 1st 2007 until February 29th 2012. The DCC model in which the variances of the univariate processes are specified with a VARMA(1,1) fits the data best for the integrated MIBEL, whereas a CCC model with a GARCH(1,1) specification for the univariate variance processes is selected to model the price series in Spain and France. Results show that there are significant mean and volatility spillovers in the MIBEL, indicating strong interdependence between the two markets, while there is weaker evidence of integration between the Spanish and French markets. We provide new evidence that the EU target of achieving a single electricity market largely depends on increasing trade between countries and homogeneous rules of market functioning.
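The univariate building block of both the CCC and DCC specifications is a conditional-variance recursion. A minimal GARCH(1,1) filter, with made-up parameter and return values (not estimates from the MIBEL data):

```python
def garch11_filter(returns, omega, alpha, beta):
    """GARCH(1,1) recursion: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}.

    Seeded with the unconditional variance omega / (1 - alpha - beta);
    requires alpha + beta < 1 (covariance stationarity).
    """
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

# illustrative daily price returns, not estimated from the data
returns = [0.01, -0.03, 0.02, 0.04, -0.01, 0.00, 0.02]
variances = garch11_filter(returns, omega=1e-5, alpha=0.08, beta=0.90)
```

In the CCC model a constant conditional correlation is applied on top of two such univariate variance series; in the DCC model the correlation follows its own recursion, which is what lets it track time-varying integration between the markets.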
Abstract:
The present study investigates the existence of long memory properties in ten developed stock markets across the globe. When return series exhibit long memory, the series realizations are not independent over time and past returns can help predict future returns, thus violating the market efficiency hypothesis. This poses a serious challenge to supporters of random walk behavior of stock returns, indicating a potentially predictable component in the series dynamics. We computed Hurst-Mandelbrot's classical R/S statistic, Lo's statistic and the semi-parametric GPH statistic using spectral regression. The findings suggest the existence of long memory in volatility and a random walk for the logarithmic return series in general for all the selected stock market indices. The findings are in line with the stylized facts of financial time series.
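The classical R/S computation behind the Hurst estimate can be sketched as follows: compute the rescaled range over windows of increasing size, then fit log(R/S) against log(window size) by least squares; the slope is the Hurst exponent H. The window sizes below are arbitrary choices for the example:

```python
import math
import random

def rescaled_range(x):
    """R/S of one window: range of cumulative mean-deviations over std. dev."""
    n = len(x)
    mean = sum(x) / n
    dev = [xi - mean for xi in x]
    cum, z = 0.0, []
    for d in dev:
        cum += d
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum(d * d for d in dev) / n)
    return r / s if s > 0 else 0.0

def hurst_rs(series, window_sizes):
    """Least-squares slope of log(mean R/S) on log(n): the Hurst exponent."""
    xs, ys = [], []
    for n in window_sizes:
        rs = [rescaled_range(series[i:i + n])
              for i in range(0, len(series) - n + 1, n)]
        rs = [v for v in rs if v > 0]
        if rs:
            xs.append(math.log(n))
            ys.append(math.log(sum(rs) / len(rs)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den

rng = random.Random(1)
iid_returns = [rng.gauss(0.0, 1.0) for _ in range(4096)]
h = hurst_rs(iid_returns, [16, 32, 64, 128, 256])
```

For i.i.d. noise H should land near 0.5 (the classical statistic is known to be biased slightly upward in short windows, which is what Lo's modified statistic corrects); H persistently above 0.5 is the long-memory signature the study tests for.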
Abstract:
We present a scheme to generate cluster submodels with stage ordering from a (symmetric or nonsymmetric) multistage stochastic mixed integer optimization model using a break stage. We consider a stochastic model in compact representation and MPS format with a known scenario tree. The cluster submodels are built by storing first the 0-1 variables, stage by stage, and then the continuous ones, also stage by stage. A C++ experimental code has been implemented for reordering the stochastic model as well as for the cluster decomposition after the relaxation of the non-anticipativity constraints up to the so-called break stage. The computational experience shows better performance of the stage ordering in terms of elapsed time on a randomly generated testbed of multistage stochastic mixed integer problems.
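The variable ordering described above — 0-1 variables stage by stage, then continuous variables stage by stage — amounts to a single composite sort key. A sketch with hypothetical variable records (the paper's implementation is a C++ code over the MPS columns; the names here are invented):

```python
# (name, stage, is_binary) -- hypothetical variable records from an MPS model
variables = [
    ("y2", 2, False), ("x1", 1, True),
    ("y1", 1, False), ("x2", 2, True),
]

# binaries first (stage by stage), then continuous (stage by stage):
# the key sorts False-before-True on "not is_binary", then by stage
ordered = sorted(variables, key=lambda v: (not v[2], v[1]))
names = [v[0] for v in ordered]   # -> x1, x2, y1, y2
```

Keeping the 0-1 block contiguous and stage-ordered is what lets the branch-and-bound solver of each cluster submodel exploit the structure, which is the source of the elapsed-time gains reported.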
Abstract:
We extend the classic Merton (1969, 1971) problem, which investigates the joint consumption-savings and portfolio-selection problem under capital risk, by assuming sophisticated but time-inconsistent agents. We introduce stochastic hyperbolic preferences as in Harris and Laibson (2013) and find closed-form solutions for Merton's optimal consumption and portfolio selection problem in continuous time. We find that the portfolio rule remains identical to the time-consistent solution with power utility and no borrowing constraints. However, the marginal propensity to consume out of wealth is unambiguously greater than in the time-consistent, exponential case and, importantly, it is also more responsive to changes in risk. These results suggest that hyperbolic discounting with sophisticated agents offers promise for explaining important aspects of asset market data.
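The portfolio rule the abstract reports as unchanged is Merton's classic constant risky share under power (CRRA) utility, pi* = (mu - r) / (gamma * sigma^2). As a one-line function, with illustrative parameter values:

```python
def merton_share(mu, r, sigma, gamma):
    """Merton's optimal risky-asset share pi* = (mu - r) / (gamma * sigma^2)
    under CRRA risk aversion gamma; independent of wealth and horizon."""
    return (mu - r) / (gamma * sigma ** 2)

# illustrative numbers: 8% expected return, 2% risk-free rate,
# 20% volatility, relative risk aversion of 3  ->  share of 0.5
share = merton_share(mu=0.08, r=0.02, sigma=0.20, gamma=3.0)
```

Per the paper, what stochastic hyperbolic discounting changes is the consumption rule (a higher, more risk-sensitive marginal propensity to consume), not this portfolio share.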
Abstract:
This paper deals with the economics of gasification facilities in general and IGCC power plants in particular. Regarding the prospects of these systems, passing the technological test is one thing; passing the economic test can be quite another. In this respect, traditional valuations assume constant input and/or output prices. Since this is hardly realistic, we allow for uncertainty in prices. We naturally look at the markets where many of the products involved are regularly traded. Futures markets on commodities are particularly useful for valuing uncertain future cash flows. Thus, revenues and variable costs can be assessed by means of sound financial concepts and actual market data. On the other hand, these complex systems provide a number of flexibility options (e.g., to choose among several inputs, outputs, modes of operation, etc.). Typically, flexibility contributes significantly to the overall value of real assets. Indeed, maximization of the asset value requires the optimal exercise of any flexibility option available. Yet the economic value of flexibility is elusive, the more so under (price) uncertainty. And the right choice of input fuels and/or output products is a main concern for the facility managers. As a particular application, we deal with the valuation of input flexibility. We follow the Real Options approach. In addition to economic variables, we also address technical and environmental issues such as energy efficiency, utility performance characteristics and emissions (note that carbon constraints are looming). Lastly, a brief introduction to some stochastic processes suitable for valuation purposes is provided.
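Stochastic processes for valuation purposes typically start from geometric Brownian motion for a traded commodity price. A minimal path simulator using the exact lognormal discretization (the drift, volatility, and price values are placeholders, not calibrated to any market):

```python
import math
import random

def gbm_path(s0, mu, sigma, horizon, steps, rng):
    """Exact GBM discretization:
    S_{t+dt} = S_t * exp((mu - sigma^2 / 2) * dt + sigma * sqrt(dt) * Z)."""
    dt = horizon / steps
    path = [s0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma * sigma) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

rng = random.Random(7)
path = gbm_path(s0=50.0, mu=0.03, sigma=0.35, horizon=1.0, steps=252, rng=rng)
```

Averaging the discounted payoff of a flexibility option (e.g., switching to the cheaper input fuel) over many such simulated paths is the Monte Carlo side of a Real Options valuation; mean-reverting processes are a common alternative specification for commodity prices.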
Abstract:
4 p.
Abstract:
36 p.
Abstract:
167 p.