30 results for "Stochastic discount factor"
Abstract:
International fisheries agencies recommend exploitation paths that satisfy two features. First, for precautionary reasons, exploitation paths should avoid high fishing mortality in fisheries where the biomass is depleted to a degree that jeopardises the stock's capacity to produce the Maximum Sustainable Yield (MSY). Second, for economic and social reasons, catches should be as stable (smooth) as possible over time. In this article we show that a conflict between these two goals may arise when searching for optimal exploitation paths with an age-structured bioeconomic approach. Our results show that this conflict can be overcome by using non-constant discount factors that value future stocks according to their relative intertemporal scarcity.
Abstract:
I consider cooperation situations where players have network relations. Networks evolve according to a stationary transition probability matrix, and at each moment in time players receive payoffs from a stationary allocation rule. Players discount the future by a common factor. The pair formed by an allocation rule and a transition probability matrix is called expected fair if, for every link in the network, both participants gain the same from it, marginally and in discounted expected terms; it is called a pairwise network formation procedure if the probability that a link is created (or eliminated) is positive whenever the discounted expected gains to its two participants are positive too. The main result is the existence, for a sufficiently small discount factor, of an expected fair pairwise network formation procedure whose allocation rule is component balanced, meaning it distributes the total value of any maximal connected subnetwork among its participants. This existence result holds for all discount factors when the pairwise network formation procedure is restricted. Finally, I provide a comparison with previous models of farsighted network formation.
Abstract:
27 p.
Abstract:
This paper analyzes the effects of personal income tax progressivity on long-run economic growth, income inequality and social welfare. The quantitative implications of income tax progressivity increments are illustrated for the US economy under three main headings: individual effects (reduced labor supply and savings, and increased dispersion of tax rates); aggregate effects (lower GDP growth and lower income inequality); and welfare effects (lower dispersion of consumption across individuals and higher leisure levels, but also lower growth of future consumption). The social discount factor proves to be crucial for this third effect: a higher valuation of future generations' well-being requires a lower level of progressivity. Additionally, if tax revenues are used to provide a public good rather than just being discarded, a higher private valuation of such public goods will also call for a lower level of progressivity.
Abstract:
The purpose of this article is to characterize dynamic optimal harvesting trajectories that maximize discounted utility under an age-structured population model, in the same line as Tahvonen (2009). The main novelty of our study is that it uses, as the age-structured population model, the standard stochastic cohort framework applied in Virtual Population Analysis for fish stock assessment. This allows us to compare optimal harvesting in a discounted economic context with the standard reference points used by fisheries agencies for long-term management plans (e.g. Fmsy). Our main findings are the following. First, the optimal steady state is characterized, and sufficient conditions that guarantee its existence and uniqueness for the general case of n cohorts are given. It is also proved that the optimal steady state coincides with the traditional target Fmsy when the utility function to be maximized is the yield and the discount rate is zero. Second, an algorithm that easily drives the resource to the steady state along the optimal path is developed. Third, the algorithm is applied to the Northern Stock of hake. Results show that management plans based exclusively on traditional reference targets such as Fmsy may leave the fishery's economic results far from optimal.
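As a rough illustration of how a reference point like Fmsy is computed for an age-structured stock (a minimal equilibrium yield-per-recruit sketch, not the paper's stochastic cohort model; all parameter values below are invented, not the Northern hake estimates):

```python
import math

M = 0.2                                   # natural mortality (assumed)
weights = [0.1, 0.4, 0.9, 1.5, 2.0]       # weight-at-age in kg (assumed)
select = [0.1, 0.5, 1.0, 1.0, 1.0]        # fishing selectivity-at-age (assumed)

def yield_per_recruit(F):
    """Equilibrium yield per recruit under fishing mortality F (Baranov catch)."""
    n, total = 1.0, 0.0
    for w, s in zip(weights, select):
        z = M + F * s                                      # total mortality at age
        total += (F * s / z) * n * (1.0 - math.exp(-z)) * w
        n *= math.exp(-z)                                  # survivors to next age
    return total

# Grid search for the F maximizing equilibrium yield: a zero-discount proxy for Fmsy.
grid = [i / 1000 for i in range(1, 2001)]
fmsy = max(grid, key=yield_per_recruit)
```

With a positive discount rate the economically optimal steady state would generally differ from this yield-maximizing F, which is the conflict the abstract points to.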
Abstract:
The aim of this paper is to explain under which circumstances using TACs as an instrument to manage a fishery, alongside fishing periods, may be of interest from a regulatory point of view. To this end, the deterministic analysis of Homans and Wilen (1997) and Anderson (2000) is extended to a stochastic scenario where the resource cannot be measured accurately. The resulting endogenous stochastic model is solved numerically to find the optimal control rules for the Iberian sardine stock. Three relevant conclusions can be highlighted from the simulations. First, the higher the uncertainty about the state of the stock, the lower the probability of closing the fishery. Second, the use of TACs as a management instrument in fisheries already regulated with fishing periods leads to: i) an increase in the optimal season length and harvests, especially for medium and high numbers of licences; ii) an improvement in the biological and economic variables when the size of the fleet is large; and iii) the elimination of extinction risk for the resource. Third, the regulator would rather select the number of licences and leave the season length unrestricted.
Abstract:
This paper deals with the valuation of energy assets related to natural gas. In particular, we evaluate a baseload Natural Gas Combined Cycle (NGCC) power plant and an ancillary installation, namely a Liquefied Natural Gas (LNG) facility, in a realistic setting; specifically, these investments enjoy a long useful life but require some non-negligible time to build. We then focus on the valuation of several investment options, again in a realistic setting. These include the option to invest in the power plant when there is uncertainty concerning the initial outlay, the option's time to maturity, or the cost of CO2 emission permits, or when there is a chance to double the plant size in the future. Our model comprises three sources of risk. We consider uncertain gas prices with regard to both the current level and the long-run equilibrium level; the current electricity price is also uncertain. All are assumed to show mean reversion. The two-factor model for the natural gas price is calibrated using data from NYMEX NG futures contracts, and the one-factor model for the electricity price is calibrated using data from the Spanish wholesale electricity market. We then use the estimated parameter values alongside actual physical parameters from a case study to value natural gas plants. Finally, the calibrated parameters are also used in a Monte Carlo simulation framework to evaluate several American-type options to invest in these energy assets. We accomplish this by following the least squares MC approach.
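The two-factor structure described above (a mean-reverting short-term deviation plus a slowly drifting long-run equilibrium level in log prices) can be sketched as a Monte Carlo path simulator. This is a generic Euler-scheme illustration under assumed placeholder parameters, not the paper's NYMEX-calibrated model:

```python
import math
import random

random.seed(0)

# Placeholder parameters (assumed for illustration, not calibrated values).
kappa, sigma_chi = 1.5, 0.3     # mean-reversion speed / vol of short-term factor
mu, sigma_xi = 0.01, 0.15       # drift / vol of long-run equilibrium factor
dt, n_steps = 1.0 / 252, 252    # daily steps over one year

def simulate_path(chi0=0.5, xi0=math.log(3.0)):
    """One year of daily prices S_t = exp(chi_t + xi_t), Euler discretization."""
    chi, xi, path = chi0, xi0, []
    for _ in range(n_steps):
        chi += -kappa * chi * dt + sigma_chi * math.sqrt(dt) * random.gauss(0, 1)
        xi += mu * dt + sigma_xi * math.sqrt(dt) * random.gauss(0, 1)
        path.append(math.exp(chi + xi))
    return path
```

Bundles of such paths are the raw input that a least-squares Monte Carlo valuation of American-type investment options would regress on.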
Abstract:
This paper studies the behavior of the implied volatility function (smile) when the true distribution of the underlying asset is consistent with the stochastic volatility model proposed by Heston (1993). The main result of the paper extends previous results, applicable to the smile as a whole, to alternative degrees of moneyness. We give the conditions under which the implied volatility function changes, for a given degree of moneyness, whenever there is a change in the parameters of Heston's stochastic volatility model.
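For context, the implied volatility function is obtained by inverting the Black-Scholes formula strike by strike against observed (or model-generated) option prices. A minimal bisection-based sketch of that inversion, independent of the Heston pricing itself:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    """Bisection on sigma: valid because bs_call is increasing in sigma."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Plotting implied_vol against K / S for prices generated by a stochastic volatility model traces out the smile whose comparative statics the paper analyzes.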
Abstract:
Published as an article in: The Quarterly Review of Economics and Finance, 2004, vol. 44, issue 2, pages 224-236.
Abstract:
In this article, we analyze how to evaluate fishery resource management under "ecological uncertainty". In this context, an efficient policy consists of applying a different exploitation rule depending on the state of the resource, and we could say that the stock is always in transition, jumping from one steady state to another. First, we propose a method for calibrating the growth path of the resource so that the observed dynamics of the resource and catches are matched. Second, we apply the proposed calibration procedure to two different fishing grounds: the European Anchovy (Division VIII) and the Southern Stock of Hake. Our results show that the role played by uncertainty is essential for the conclusions. For the European Anchovy fishery (Division VIII) we find, in contrast with Del Valle et al. (2001), that this is not an overexploited fishing ground. However, we show that the Southern Stock of Hake is in a dangerous situation. In both cases our results are in accordance with ICES advice.
Abstract:
[ES] For several years now, small retailers in Spain and other European countries have faced an environment of growing competition, with the emergence of new commercial formats that are more vertically or horizontally integrated and better prepared to compete on price and variety. In response, purchasing groups offer an alternative aimed at closing the gap on these two fronts, while also seeking to provide services that help small retailers manage their businesses more effectively and efficiently.
Abstract:
In this paper we introduce four scenario Cluster based Lagrangian Decomposition (CLD) procedures for obtaining strong lower bounds on the (optimal) solution value of two-stage stochastic mixed 0-1 problems. At each iteration of the Lagrangian-based procedures, the traditional aim is to obtain the solution value of the corresponding Lagrangian dual by solving scenario submodels once the nonanticipativity constraints have been dualized. Instead of considering a splitting-variable representation over the set of scenarios, we propose to decompose the model into a set of scenario clusters. We compare the computational performance of the four Lagrange multiplier updating procedures, namely the Subgradient Method, the Volume Algorithm, the Progressive Hedging Algorithm and the Dynamic Constrained Cutting Plane scheme, for different numbers of scenario clusters and different dimensions of the original problem. Our computational experience shows that the CLD bound and its computational effort depend on the number of scenario clusters considered. In any case, our results show that the CLD procedures outperform the traditional LD scheme for single scenarios both in the quality of the bounds and in computational effort. All the procedures have been implemented in an experimental C++ code. A broad computational experience is reported on a testbed of randomly generated instances, using the MIP solvers COIN-OR and CPLEX for the auxiliary mixed 0-1 cluster submodels, the latter within the open source engine COIN-OR. We also give computational evidence of the model-tightening effect that preprocessing techniques, cut generation and appending, and parallel computing tools have in stochastic integer optimization. Finally, we have observed that the plain use of both solvers does not provide the optimal solution of the instances included in the testbed, except for two toy instances, in affordable elapsed time. On the other hand, the proposed procedures provide strong lower bounds (or the same solution value) in a considerably shorter elapsed time than the quasi-optimal solution obtained by other means for the original stochastic problem.
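The multiplier update behind the Subgradient Method can be sketched on a toy two-scenario example where the nonanticipativity constraint x1 = x2 is dualized. Everything here (the binary variables, cost values, step rule) is invented for illustration; the paper's submodels are full mixed 0-1 programs solved by a MIP solver:

```python
# Scenario costs for a single first-stage binary decision x in {0, 1} (assumed).
c1 = {0: 0.0, 1: -3.0}
c2 = {0: 0.0, 1: 1.0}

def lagrangian(lmb):
    """Dual function: with x1 = x2 dualized, scenario submodels separate."""
    x1 = min(c1, key=lambda x: c1[x] + lmb * x)   # scenario-1 submodel
    x2 = min(c2, key=lambda x: c2[x] - lmb * x)   # scenario-2 submodel
    value = c1[x1] + lmb * x1 + c2[x2] - lmb * x2
    return value, x1 - x2                          # dual value and a subgradient

lmb, best = 0.0, float("-inf")
for k in range(1, 51):
    val, g = lagrangian(lmb)
    best = max(best, val)       # strongest lower bound found so far
    lmb += (1.0 / k) * g        # diminishing-step subgradient ascent
```

Here the bound `best` converges to the optimal value of the coupled problem (min of c1[x] + c2[x] over a common x, i.e. -2.0); in general the Lagrangian dual yields only a lower bound for mixed 0-1 problems.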
Abstract:
[ES] Trust in the business world is an essential factor that has been studied in the management literature for decades. Starting from the seminal proposal of Mayer et al. (1995) on what trust is and how it develops within organizations, we set out a constructive critique of that model, drawing on the contributions of Spaemann, an author with a long trajectory within the so-called "Moral Philosophy of Continental Europe".
Abstract:
[ES] In recent decades, the number of university spin-offs created in the Spanish University System has grown considerably; however, these firms face problems such as a lack of financing or of entrepreneurial skills on the part of their founders. Using data from a survey of 72 spin-offs created in Spain, we seek to detect and analyze the most common problems these firms encounter, and we propose possible solutions.
Abstract:
Due to the recent implementation of the Bologna process, the definition of competences in Higher Education is an important matter that deserves special attention and requires detailed analysis. For that reason, we study the importance given to several competences for professional activity and the degree to which these competences have been achieved through the education received. The answers also cover competences observed in two periods of time, given by individuals with multiple characteristics. In this context, and in order to obtain synthesized results, we propose the use of Multiple Table Factor Analysis. Through this analysis, individuals are described by several groups of variables, showing the most important factors of variability among the individuals and allowing the analysis of the common structure of the different data tables. The results obtained will allow us to find out whether a common structure exists in the answers of the various data tables, to know which competences have a similar answer structure across the groups of variables, and to characterize those answers through the individuals.