13 results for STOCHASTIC RESONANCE
in Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco
Abstract:
The aim of this paper is to explain under which circumstances using TACs as an instrument to manage a fishery, together with fishing periods, may be interesting from a regulatory point of view. To do this, the deterministic analysis of Homans and Wilen (1997) and Anderson (2000) is extended to a stochastic scenario where the resource cannot be measured accurately. The resulting endogenous stochastic model is solved numerically to find the optimal control rules for the Iberian sardine stock. Three relevant conclusions can be highlighted from the simulations. First, the higher the uncertainty about the state of the stock, the lower the probability of closing the fishery. Second, the use of TACs as a management instrument in fisheries already regulated with fishing periods leads to: i) an increase in the optimal season length and harvests, especially for medium and high numbers of licences; ii) an improvement in the biological and economic variables when the size of the fleet is large; and iii) the elimination of the extinction risk for the resource. And third, the regulator would rather select the number of licences and not restrict the season length.
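As a rough illustration of what a numerical solution of such a stochastic management model involves, the sketch below runs value iteration for a harvested stock with stochastic growth, choosing a TAC at each discretized stock level. The dynamics and every parameter value are illustrative assumptions; they do not reproduce the paper's endogenous season-length model or its Iberian sardine calibration.

```python
# Minimal sketch: value iteration for a harvested stock with stochastic growth.
# All parameters (r, K, price, cost, sigma, beta) are illustrative, not taken
# from the paper's Iberian sardine calibration.
import numpy as np

r, K = 0.6, 1.0            # logistic growth rate and carrying capacity
price, cost = 1.0, 0.3     # unit price of harvest and quadratic harvesting-cost coefficient
beta = 0.95                # discount factor
sigma = 0.2                # sd of multiplicative recruitment shocks

stock = np.linspace(0.05, 1.2, 60)                    # discretized stock levels
tac = np.linspace(0.0, 0.5, 26)                       # candidate TACs (harvest levels)
shocks = np.exp(sigma * np.array([-1.0, 0.0, 1.0]))   # 3-point shock approximation
probs = np.array([0.25, 0.5, 0.25])

V = np.zeros_like(stock)
for _ in range(500):                                  # value iteration
    V_new = np.empty_like(V)
    policy = np.empty_like(V)
    for i, x in enumerate(stock):
        best = -np.inf
        for h in tac:
            harvest = min(h, x)                       # cannot harvest more than the stock
            escapement = x - harvest
            growth = escapement + r * escapement * (1 - escapement / K)
            x_next = np.clip(growth * shocks, stock[0], stock[-1])
            cont = probs @ np.interp(x_next, stock, V)
            val = price * harvest - cost * harvest**2 + beta * cont
            if val > best:
                best, policy[i] = val, h
        V_new[i] = best
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

# policy[i] is the optimal TAC at stock level stock[i]; closures show up as policy == 0.
```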
Abstract:
The purpose of this article is to characterize dynamic optimal harvesting trajectories that maximize discounted utility assuming an age-structured population model, in the same line as Tahvonen (2009). The main novelty of our study is that it uses, as the age-structured population model, the standard stochastic cohort framework applied in Virtual Population Analysis for fish stock assessment. This allows us to compare optimal harvesting in a discounted economic context with standard reference points used by fisheries agencies for long-term management plans (e.g. Fmsy). Our main findings are the following. First, the optimal steady state is characterized, and sufficient conditions that guarantee its existence and uniqueness for the general case of n cohorts are shown. It is also proved that the optimal steady state coincides with the traditional target Fmsy when the utility function to be maximized is the yield and the discount rate is zero. Second, an algorithm to calculate the optimal path that easily drives the resource to the steady state is developed. And third, the algorithm is applied to the Northern Stock of hake. Results show that management plans based exclusively on traditional reference targets such as Fmsy may drive the economic results of the fishery far from the optimum.
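To illustrate the kind of steady-state comparison the abstract refers to, the sketch below computes equilibrium yield in a simple age-structured model (Baranov catch equation) and grid-searches the fishing mortality that maximizes it, which plays the role of Fmsy when recruitment is constant and the discount rate is zero. Weights-at-age, selectivity and natural mortality are illustrative assumptions, not the Northern hake data used in the paper.

```python
# Minimal sketch: steady-state yield in an age-structured (cohort) model and a grid
# search for the fishing mortality that maximizes it. All values are illustrative.
import numpy as np

M = 0.2                                              # natural mortality (per year)
weight = np.array([0.1, 0.3, 0.6, 1.0, 1.4, 1.7])    # mean weight at age (kg)
select = np.array([0.1, 0.4, 0.8, 1.0, 1.0, 1.0])    # selectivity at age
R = 1.0                                              # constant recruitment (per-recruit analysis)

def equilibrium_yield(F):
    """Steady-state yield for fully-recruited fishing mortality F (Baranov catch equation)."""
    Z = M + F * select
    N = R * np.concatenate(([1.0], np.exp(-np.cumsum(Z[:-1]))))   # survivors at age
    catch = F * select / Z * (1 - np.exp(-Z)) * N
    return float(np.sum(catch * weight))

F_grid = np.linspace(0.0, 1.5, 301)
yields = np.array([equilibrium_yield(F) for F in F_grid])
F_best = F_grid[int(np.argmax(yields))]
print(f"yield-maximising F ~ {F_best:.2f}")          # plays the role of Fmsy with zero discounting
```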
Abstract:
This paper studies the behavior of the implied volatility function (smile) when the true distribution of the underlying asset is consistent with the stochastic volatility model proposed by Heston (1993). The main result of the paper is to extend previous results, applicable to the smile as a whole, to alternative degrees of moneyness. We give the conditions under which the implied volatility function changes, for a given degree of moneyness, whenever there is a change in the parameters associated with Heston's stochastic volatility model.
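As a rough illustration of the object under study, the sketch below simulates the Heston model by Monte Carlo, prices European calls at several strikes and inverts the Black-Scholes formula to recover the implied-volatility smile across degrees of moneyness. The parameter values and the simple full-truncation Euler scheme are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: Monte Carlo Heston prices and the implied-volatility smile across
# moneyness. Parameters (kappa, theta, xi, rho, v0) are illustrative only.
import numpy as np
from math import erf, sqrt, log, exp

def bs_call(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return S * N(d1) - K * exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=3.0):
    for _ in range(100):                       # bisection on the Black-Scholes price
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Heston dynamics: dS = r S dt + sqrt(v) S dW1,  dv = kappa (theta - v) dt + xi sqrt(v) dW2
S0, r, T = 100.0, 0.02, 1.0
kappa, theta, xi, rho, v0 = 2.0, 0.04, 0.4, -0.7, 0.04
n_paths, n_steps = 100_000, 200
dt = T / n_steps
rng = np.random.default_rng(0)

S = np.full(n_paths, S0)
v = np.full(n_paths, v0)
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    v_pos = np.maximum(v, 0.0)                 # full-truncation Euler scheme
    S *= np.exp((r - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
    v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2

for K in (80.0, 90.0, 100.0, 110.0, 120.0):    # different degrees of moneyness
    price = exp(-r * T) * np.mean(np.maximum(S - K, 0.0))
    print(f"K={K:6.1f}  implied vol = {implied_vol(price, S0, K, T, r):.4f}")
```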
Abstract:
In this article, we analyze how to evaluate fishery resource management under “ecological uncertainty”. In this context, an efficient policy consists of applying a different exploitation rule depending on the state of the resource, and we could say that the stock is always in transition, jumping from one steady state to another. First, we propose a method for calibrating the growth path of the resource such that the observed dynamics of the resource and the captures are matched. Second, we apply the proposed calibration procedure to two different fishing grounds: the European Anchovy (Division VIII) and the Southern Stock of Hake. Our results show that the role played by uncertainty is essential for the conclusions. For the European Anchovy fishery (Division VIII) we find, in contrast with Del Valle et al. (2001), that this is not an overexploited fishing ground. However, we show that the Southern Stock of Hake is in a dangerous situation. In both cases our results are in accordance with ICES advice.
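A minimal sketch of the calibration idea, assuming a simple logistic (surplus-production) growth rule and a short synthetic data series: the growth parameters are chosen so that the biomass path implied by the observed catches matches the observed biomass as closely as possible. The functional form, the grid search and all numbers are illustrative assumptions, not the paper's method or data.

```python
# Minimal sketch: calibrating a logistic (surplus-production) growth path so that the
# simulated biomass, driven by observed catches, matches an observed biomass series.
# The short series below is synthetic and purely illustrative.
import numpy as np

catches = np.array([30., 35., 40., 38., 32., 30., 28.])                    # observed catches
biomass_obs = np.array([300., 290., 275., 255., 245., 250., 258., 265.])   # observed stock

def simulate(r, K, B0, catches):
    B = [B0]
    for c in catches:
        growth = r * B[-1] * (1 - B[-1] / K)
        B.append(max(B[-1] + growth - c, 1e-6))
    return np.array(B)

best = (np.inf, None)
for r in np.linspace(0.1, 1.0, 91):            # coarse grid search over (r, K)
    for K in np.linspace(300., 800., 101):
        sim = simulate(r, K, biomass_obs[0], catches)
        sse = float(np.sum((sim - biomass_obs) ** 2))
        if sse < best[0]:
            best = (sse, (r, K))

print("calibrated (r, K):", best[1], "SSE:", round(best[0], 1))
```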
Abstract:
In this paper we introduce four scenario Cluster based Lagrangian Decomposition (CLD) procedures for obtaining strong lower bounds on the (optimal) solution value of two-stage stochastic mixed 0-1 problems. At each iteration of the Lagrangian based procedures, the traditional aim consists of obtaining the solution value of the corresponding Lagrangian dual by solving scenario submodels once the nonanticipativity constraints have been dualized. Instead of considering a splitting variable representation over the set of scenarios, we propose to decompose the model into a set of scenario clusters. We compare the computational performance of the four Lagrange multiplier updating procedures, namely the Subgradient Method, the Volume Algorithm, the Progressive Hedging Algorithm and the Dynamic Constrained Cutting Plane scheme, for different numbers of scenario clusters and different dimensions of the original problem. Our computational experience shows that the CLD bound and its computational effort depend on the number of scenario clusters considered. In any case, our results show that the CLD procedures outperform the traditional LD scheme for single scenarios both in the quality of the bounds and in computational effort. All the procedures have been implemented in an experimental C++ code. A broad computational experience is reported on a testbed of randomly generated instances, using the MIP solvers COIN-OR and CPLEX for the auxiliary mixed 0-1 cluster submodels, the latter called from within the open source engine COIN-OR. We also give computational evidence of the model tightening effect that preprocessing techniques, cut generation and appending, and parallel computing tools have in stochastic integer optimization. Finally, we have observed that the plain use of both solvers does not provide the optimal solution of the instances included in the testbed with which we have experimented, except for two toy instances, in affordable elapsed time. On the other hand, the proposed procedures provide strong lower bounds (or the same solution value) in a considerably shorter elapsed time than that required to obtain, by other means, the quasi-optimal solution of the original stochastic problem.
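For intuition, the sketch below applies one of the four updating schemes mentioned above, the Subgradient Method, to a toy two-scenario 0-1 problem in which the nonanticipativity constraint is dualized and each scenario submodel is solved by enumeration; each cluster here degenerates to a single scenario. The instance data and step-size rule are illustrative assumptions, not the paper's testbed or implementation.

```python
# Minimal sketch: a subgradient scheme for the Lagrangian dual obtained by dualizing the
# nonanticipativity constraint x^1 = x^2 of a tiny two-scenario 0-1 minimization problem.
import itertools
import numpy as np

prob = {1: 0.5, 2: 0.5}                    # scenario probabilities
c = np.array([3.0, 2.0])                   # first-stage costs
q = {1: 4.0, 2: 1.0}                       # second-stage cost per scenario
demand = {1: 2, 2: 1}                      # cover requirement: x1 + x2 + y >= demand[s]

def scenario_min(s, lam):
    """Solve one scenario submodel by enumeration of its 0-1 variables."""
    sign = +1.0 if s == 1 else -1.0        # lam * (x^1 - x^2) split across scenarios
    best = (np.inf, None)
    for x1, x2, y in itertools.product((0, 1), repeat=3):
        if x1 + x2 + y < demand[s]:
            continue                       # infeasible for this scenario
        x = np.array([x1, x2], dtype=float)
        val = prob[s] * (c @ x + q[s] * y) + sign * (lam @ x)
        if val < best[0]:
            best = (val, x)
    return best

lam = np.zeros(2)                          # multipliers on x^1 - x^2 = 0
best_bound = -np.inf
for k in range(1, 51):                     # subgradient iterations
    v1, x_s1 = scenario_min(1, lam)
    v2, x_s2 = scenario_min(2, lam)
    best_bound = max(best_bound, v1 + v2)  # Lagrangian dual value: a lower bound
    g = x_s1 - x_s2                        # subgradient of the (concave) dual function
    lam += (1.0 / k) * g                   # diminishing step size
print("Lagrangian lower bound:", round(best_bound, 3))
```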
Abstract:
We present a scheme to generate cluster submodels with stage ordering from a (symmetric or nonsymmetric) multistage stochastic mixed integer optimization model using a break stage. We consider a stochastic model in compact representation and MPS format with a known scenario tree. The cluster submodels are built by storing first the 0-1 variables, stage by stage, and then the continuous ones, also stage by stage. A C++ experimental code has been implemented for reordering the stochastic model as well as for the cluster decomposition after the relaxation of the non-anticipativity constraints up to the so-called break stage. The computational experience shows a better performance of the stage ordering, in terms of elapsed time, on a randomly generated testbed of multistage stochastic mixed integer problems.
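A minimal sketch of the described column ordering, assuming an illustrative in-memory list of variable records rather than an actual MPS file: the 0-1 variables are stored first, stage by stage, followed by the continuous ones, also stage by stage.

```python
# Minimal sketch: reordering the columns of a multistage model so that all 0-1 variables
# come first, stage by stage, followed by the continuous ones, also stage by stage.
# The variable records are illustrative stand-ins for an MPS column list.
from dataclasses import dataclass

@dataclass
class Var:
    name: str
    stage: int
    is_binary: bool

columns = [
    Var("y2", 2, False), Var("x1", 1, True), Var("y1", 1, False),
    Var("x3", 3, True),  Var("x2", 2, True), Var("y3", 3, False),
]

# Sort key: continuous variables (is_binary == False) sort after binaries; within each
# block, variables are ordered by stage.
ordered = sorted(columns, key=lambda v: (not v.is_binary, v.stage))
print([v.name for v in ordered])   # ['x1', 'x2', 'x3', 'y1', 'y2', 'y3']
```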
Abstract:
We extend the classic Merton (1969, 1971) problem that investigates the joint consumption-savings and portfolio-selection problem under capital risk by assuming sophisticated but time-inconsistent agents. We introduce stochastic hyperbolic preferences as in Harris and Laibson (2013) and find closed-form solutions for Merton's optimal consumption and portfolio selection problem in continuous time. We find that the portfolio rule remains identical to the time-consistent solution with power utility and no borrowing constraints. However, the marginal propensity to consume out of wealth is unambiguously greater than in the time-consistent, exponential case and, importantly, it is also more responsive to changes in risk. These results suggest that hyperbolic discounting with sophisticated agents offers promise for helping to explain important aspects of asset market data.
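For reference, the sketch below evaluates the standard time-consistent Merton rules (portfolio share and marginal propensity to consume) that the abstract uses as the benchmark; under the paper's stochastic hyperbolic preferences the portfolio share is unchanged while the MPC is larger. The parameter values are illustrative, and the hyperbolic MPC formula itself is not reproduced here.

```python
# Minimal sketch: the time-consistent (exponential-discounting) Merton rules used as the
# benchmark in the abstract. Parameter values are illustrative.
mu, r = 0.07, 0.02      # expected risky return and risk-free rate
sigma = 0.2             # volatility of the risky asset
gamma = 3.0             # relative risk aversion (power utility)
rho = 0.04              # subjective discount rate

portfolio_share = (mu - r) / (gamma * sigma**2)            # fraction of wealth in the risky asset
risk_premium_term = (mu - r) ** 2 / (2 * gamma * sigma**2)
mpc_exponential = rho / gamma + (1 - 1 / gamma) * (r + risk_premium_term)

print(f"Merton portfolio share: {portfolio_share:.3f}")
print(f"MPC out of wealth (exponential case): {mpc_exponential:.4f}")
```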
Abstract:
6 p. Paper from the 17th Conference on Sensors and Their Applications, held in Dubrovnik, Croatia, 16-18 September 2013.
Abstract:
30 p.
Abstract:
167 p.
Abstract:
In this work we extend to the multistage case two recent risk averse measures for two-stage stochastic programs based on first- and second-order stochastic dominance constraints induced by mixed-integer linear recourse. Additionally, we consider Time Stochastic Dominance (TSD) along a given horizon. Given the dimensions of medium-sized problems augmented by the new variables and constraints required by those risk measures, it is unrealistic to solve the problem up to optimality by plain use of MIP solvers in a reasonable computing time. Instead, decomposition algorithms of some type should be used. We present an extension of our Branch-and-Fix Coordination algorithm, named BFC-TSD, where a special treatment is given to cross scenario group constraints that link variables from different scenario groups. A broad computational experience is presented comparing the risk neutral approach with the tested risk averse strategies. The performance of the new version of the BFC algorithm versus the plain use of a state-of-the-art MIP solver is also reported.
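To make the dominance relations concrete, the sketch below checks first- and second-order stochastic dominance between two discrete scenario profit profiles, the kind of pairwise comparison against a benchmark that the dominance constraints enforce. The profiles are illustrative assumptions, not data from the paper.

```python
# Minimal sketch: empirical checks of first- and second-order stochastic dominance
# between two discrete scenario profit profiles (larger outcomes are better).
import numpy as np

profile_a = np.array([8.0, 10.0, 12.0, 15.0])   # candidate policy, equiprobable scenarios
profile_b = np.array([5.0, 10.0, 11.0, 14.0])   # benchmark profile

grid = np.unique(np.concatenate([profile_a, profile_b]))

def cdf(sample, t):
    return np.mean(sample <= t)

def shortfall(sample, t):                        # E[(t - X)^+], used for second order
    return np.mean(np.maximum(t - sample, 0.0))

# First order: F_A(t) <= F_B(t) for all t; second order: expected shortfall no larger.
fsd = all(cdf(profile_a, t) <= cdf(profile_b, t) for t in grid)
ssd = all(shortfall(profile_a, t) <= shortfall(profile_b, t) for t in grid)
print("A first-order dominates B :", fsd)
print("A second-order dominates B:", ssd)
```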
Abstract:
In multisource industrial scenarios (MSIS), NOAA-generating activities coexist with other productive sources of airborne particles, such as parallel manufacturing processes or electrical and diesel machinery. A distinctive characteristic of MSIS is the spatially complex distribution of aerosol sources, as well as their potential differences in dynamics, due to the feasibility of multi-task configurations at a given time. Thus, the background signal is expected to challenge the aerosol analyzers over a probably wide range of concentrations and size distributions, depending on the multisource configuration at a given time. Monitoring and prediction using statistical analysis of time series captured by on-line particle analyzers in industrial scenarios have been proven feasible for predicting PNC evolution, provided a given quality of the net signal (the difference between the signal at the source and the background). However, the analysis and modelling of non-consistent time series, influenced by low levels of SNR (signal-to-noise ratio), could build a misleading basis for decision making. In this context, this work explores the use of stochastic models based on the ARIMA methodology to monitor and predict exposure values (PNC). The study was carried out in an MSIS where a case study focused on the manufacture of perforated tablets of nano-TiO2 by cold pressing was performed.
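A minimal sketch of the ARIMA-based monitoring step, assuming a synthetic PNC series and an illustrative (1, 0, 1) model order rather than the series and orders used in the study: the model is fitted to the series and then used to forecast exposure a few intervals ahead.

```python
# Minimal sketch: fitting an ARIMA model to a particle-number-concentration (PNC) series
# and producing short-horizon forecasts. The series is synthetic and the order illustrative.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
n = 300
noise = rng.normal(0.0, 50.0, n)
pnc = np.empty(n)                     # synthetic PNC series (particles/cm^3)
pnc[0] = 2000.0
for t in range(1, n):
    pnc[t] = 500.0 + 0.75 * pnc[t - 1] + noise[t]   # AR(1)-like background plus noise

model = ARIMA(pnc, order=(1, 0, 1))   # ARIMA(p=1, d=0, q=1)
result = model.fit()
forecast = result.forecast(steps=10)  # predicted exposure for the next 10 intervals
print(result.summary())
print("10-step PNC forecast:", np.round(forecast, 1))
```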
Abstract:
23rd Congress of the International Commission for Optics (ICO 23)