962 results for Stochastic frontier model
Abstract:
Classical metapopulation theory assumes a static landscape. However, empirical evidence indicates many metapopulations are driven by habitat succession and disturbance. We develop a stochastic metapopulation model, incorporating habitat disturbance and recovery coupled with patch colonization and extinction, to investigate the effect of habitat dynamics on persistence. We discover that habitat dynamics play a fundamental role in metapopulation dynamics. The mean number of suitable habitat patches is not adequate for characterizing the dynamics of the metapopulation. For a fixed mean number of suitable patches, we discover that the details of how disturbance affects patches and how patches recover influence metapopulation dynamics in a fundamental way. Moreover, metapopulation persistence depends not only on the average lifetime of a patch, but also on the variance in patch lifetime and the synchrony in patch dynamics that results from disturbance. Finally, there is an interaction between the habitat and metapopulation dynamics; for instance, declining metapopulations react differently to habitat dynamics than expanding metapopulations. We close by emphasizing the importance of using performance measures appropriate to stochastic systems when evaluating their behavior, such as the probability distribution of the state of the metapopulation, conditional on it being extant (i.e., the quasi-stationary distribution).
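For illustration only (the authors' exact model is not reproduced here), a minimal stochastic patch-occupancy simulation with habitat disturbance and recovery might look as follows; all rates and the Levins-type colonization rule are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_patches=100, steps=2000, c=0.2, e=0.05,
             disturb=0.01, recover=0.05):
    """Discrete-time stochastic patch-occupancy model on a dynamic
    landscape. suitable[i]: patch i is habitable; occupied[i]: patch i
    hosts a local population."""
    suitable = np.ones(n_patches, dtype=bool)
    occupied = np.zeros(n_patches, dtype=bool)
    occupied[:10] = True
    history = []
    for _ in range(steps):
        # habitat dynamics: disturbance destroys patches, succession restores them
        destroyed = suitable & (rng.random(n_patches) < disturb)
        restored = ~suitable & (rng.random(n_patches) < recover)
        suitable = (suitable & ~destroyed) | restored
        occupied &= suitable  # populations on destroyed patches go extinct
        # metapopulation dynamics: Levins-type colonization, local extinction
        p = occupied.mean()
        colonized = suitable & ~occupied & (rng.random(n_patches) < c * p)
        extinct = occupied & (rng.random(n_patches) < e)
        occupied = (occupied & ~extinct) | colonized
        history.append(occupied.sum())
    return np.array(history)

occ = simulate()
print("mean occupancy:", occ.mean(), "extinct at end:", occ[-1] == 0)
```

Repeated runs of such a simulation yield the occupancy distribution conditional on persistence, i.e., an empirical approximation of the quasi-stationary distribution the abstract refers to.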
Abstract:
Purpose – The data used in this study cover the period 1980-2000. Almost midway through this period (in 1992), the Kenyan government liberalized the sugar industry and the role of the market increased, while the government's role with respect to control of prices, imports and other aspects of the sector declined. This exposed local sugar manufacturers to external competition from other sugar producers, especially from the COMESA region. This study aims to find whether there were any changes in efficiency of production between the two periods (pre- and post-liberalization). Design/methodology/approach – The study utilized two methodologies for efficiency estimation: data envelopment analysis (DEA) and the stochastic frontier. DEA uses mathematical programming techniques and does not impose any functional form on the data; however, it attributes all deviation from the frontier to inefficiency. The stochastic frontier utilizes econometric techniques. Findings – The test for structural differences shows no statistically significant differences between the two periods. However, both methodologies show a decline in efficiency levels from 1992, with the lowest point in 1998. From then on, efficiency levels began to increase. Originality/value – To the best of the authors' knowledge, this is the first paper to use both methodologies in the sugar industry in Kenya. It is shown that in industries where the noise (error) term is minimal (such as manufacturing), DEA and the stochastic frontier give similar results.
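As a reference point for the DEA side, the input-oriented CCR efficiency of one unit can be computed as a linear program. The sketch below uses scipy.optimize.linprog with toy data; it is not the study's specification:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (m inputs x n units), Y: (s outputs x n units)."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]                # minimize theta over [theta, lambda]
    # inputs:  X @ lam <= theta * X[:, j0]
    A_in = np.hstack([-X[:, [j0]], X])
    b_in = np.zeros(m)
    # outputs: Y @ lam >= Y[:, j0]
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out])      # theta, lambda >= 0 by default
    return res.fun

# toy data: 2 inputs, 1 output, 4 firms (illustrative only)
X = np.array([[2.0, 3.0, 6.0, 4.0], [3.0, 1.0, 2.0, 5.0]])
Y = np.array([[1.0, 1.0, 2.0, 1.0]])
print([round(dea_ccr_input(X, Y, j), 3) for j in range(4)])
```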
Abstract:
The water and sewerage industry of England and Wales was privatized in 1989 and subjected to a new regime of environmental, water quality and RPI+K price cap regulation. This paper estimates a quality-adjusted input distance function using stochastic frontier techniques in order to estimate productivity growth rates for the period 1985-2000. Productivity is decomposed so as to account for the impact of technical change, efficiency change, and scale change. Compared with earlier studies by Saal and Parker [(2000) Managerial Decision Econ 21(6):253-268; (2001) J Regul Econ 20(1):61-90], these estimates allow a more careful consideration of how and whether privatization and the new regulatory regime affected productivity growth in the industry. Strikingly, they suggest that while technical change improved after privatization, productivity growth did not, and this was attributable to efficiency losses as firms appear to have struggled to keep up with technical advances after privatization. Moreover, the results also suggest that the excessive scale of the water and sewerage companies (WaSCs) contributed negatively to productivity growth.
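Schematically, the three-way decomposition referenced here can be written as follows (notation is illustrative, not the paper's):

```latex
\Delta \ln TFP_{it} \;=\;
    \underbrace{\Delta TC_{it}}_{\text{technical change (frontier shift)}}
  + \underbrace{\Delta EC_{it}}_{\text{efficiency change (catch-up)}}
  + \underbrace{\Delta SC_{it}}_{\text{scale change (movement along the frontier)}}
```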
Abstract:
Cost functions are estimated, using random effects and stochastic frontier methods, for English higher education institutions. The article advances the existing literature by employing finer disaggregation by subject, institution type and location, and by introducing consideration of quality effects. Estimates are provided of the average incremental costs attached to each output type, and of returns to scale and scope. Implications for the policy of expansion of higher education are discussed.
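For reference, the multiproduct cost concepts in question (average incremental cost, global scale economies, economies of scope) are conventionally defined as follows, here in generic two-output notation:

```latex
AIC_k = \frac{C(y) - C(y_{-k})}{y_k}, \qquad
S = \frac{C(y)}{\sum_k y_k \, \partial C/\partial y_k}, \qquad
SC = \frac{C(y_1, 0) + C(0, y_2) - C(y_1, y_2)}{C(y_1, y_2)}
```

Scale economies are indicated by S > 1 and scope economies by SC > 0.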
Abstract:
This study employs stochastic frontier analysis to analyze Malaysian commercial banks during 1996-2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalised Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68 percent, with the latter driven primarily by technical change, which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid technical change, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a more lasting negative impact by increasing the volume of non-performing loans.
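The generalised Malmquist decomposition referenced here multiplies three components (generic notation; the paper's exact index is not reproduced):

```latex
M_{t,t+1} \;=\; \underbrace{EC_{t,t+1}}_{\text{efficiency change}}
         \times \underbrace{TC_{t,t+1}}_{\text{technical change}}
         \times \underbrace{SC_{t,t+1}}_{\text{scale change}}
```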
Abstract:
This study employs stochastic frontier analysis to analyze Malaysian commercial banks during 1996-2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in outputs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalised Malmquist productivity index. On average, Malaysian banks experience mildly decreasing returns to scale and annual productivity change of 2.37 percent, with the latter driven primarily by technical change, which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. In addition, our productivity estimates indicate that the potential for full-fledged Islamic banks and conventional banks with Islamic banking operations to overcome the output disadvantages associated with Islamic banking is relatively limited. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had an interim output-increasing effect in 1998, the crisis prompted a continuing negative impact on output performance by increasing the volume of non-performing loans.
Abstract:
In efficiency studies using the stochastic frontier approach, the main focus is on explaining inefficiency in terms of exogenous variables and on computing the marginal effect of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are computed from the unconditional mean of inefficiency (Wang, 2002). In this paper we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
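A minimal sketch of the Jondrow et al. (1982) conditional-mean estimator, here under the common normal/half-normal specification with illustrative parameter values (the paper's specific model and marginal-effect formulas are not reproduced), together with a simple bootstrap:

```python
import numpy as np
from scipy.stats import norm

def jlms(eps, sigma_u, sigma_v):
    """Jondrow et al. (1982) estimator E[u | eps] for a production
    frontier with composed error eps = v - u, where u ~ N+(0, sigma_u^2)
    and v ~ N(0, sigma_v^2)."""
    sigma2 = sigma_u**2 + sigma_v**2
    sigma_star = sigma_u * sigma_v / np.sqrt(sigma2)
    mu_star = -eps * sigma_u**2 / sigma2
    z = mu_star / sigma_star
    return mu_star + sigma_star * norm.pdf(z) / norm.cdf(z)

# illustrative residuals standing in for those of a fitted frontier
rng = np.random.default_rng(1)
eps = rng.normal(0, 0.3, 500) - np.abs(rng.normal(0, 0.2, 500))
u_hat = jlms(eps, sigma_u=0.2, sigma_v=0.3)

# bootstrap a confidence interval for mean inefficiency
boot = [jlms(rng.choice(eps, eps.size), 0.2, 0.3).mean() for _ in range(999)]
print("95% CI:", np.percentile(boot, [2.5, 97.5]))
```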
Abstract:
Using the risk measure CVaR in financial analysis has become increasingly popular in recent years. In this paper we apply CVaR to portfolio optimization. The problem is formulated as a two-stage stochastic programming model, and the SRA algorithm, a recently developed heuristic algorithm, is applied to minimize CVaR.
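The SRA heuristic itself is not reproduced here; as a baseline, the same scenario-based CVaR minimization can be linearized following Rockafellar and Uryasev (2000) and solved as an LP. The data below are synthetic:

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_weights(R, beta=0.95):
    """Minimize portfolio CVaR_beta over long-only weights via the
    Rockafellar-Uryasev LP. R: (S scenarios x n assets) return matrix."""
    S, n = R.shape
    # variables: [w (n), alpha (1), u (S)]
    c = np.r_[np.zeros(n), 1.0, np.full(S, 1.0 / ((1 - beta) * S))]
    # u_s >= -R[s] @ w - alpha  <=>  -R[s] @ w - alpha - u_s <= 0
    A_ub = np.hstack([-R, -np.ones((S, 1)), -np.eye(S)])
    b_ub = np.zeros(S)
    A_eq = np.r_[np.ones(n), 0.0, np.zeros(S)].reshape(1, -1)  # sum(w) = 1
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=bounds)
    return res.x[:n], res.fun  # weights, optimal CVaR of the loss

rng = np.random.default_rng(2)
R = rng.normal([0.001, 0.0005, 0.002], [0.02, 0.01, 0.03], size=(1000, 3))
w, cvar = min_cvar_weights(R)
print(np.round(w, 3), round(cvar, 4))
```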
Abstract:
The CVaR risk measure has gained increasing importance in assessing portfolio risk. Minimizing CVaR over a whole portfolio can be formulated as a two-stage stochastic programming problem. The SRA algorithm is a recently developed heuristic for solving stochastic programming problems. In this paper, the SRA algorithm is applied to minimize the CVaR risk measure.
Abstract:
In this dissertation, I investigate three related topics in asset pricing: consumption-based asset pricing under long-run risks and fat tails, the pricing of VIX (CBOE Volatility Index) options, and the market price of risk embedded in stock returns and stock options. These three topics are fully explored in Chapters II through IV; Chapter V summarizes the main conclusions. In Chapter II, I explore the effects of fat tails on the equilibrium implications of the long-run risks model of asset pricing by introducing innovations with a dampened power law to the consumption and dividend growth processes. I estimate the structural parameters of the proposed model by maximum likelihood. I find that the stochastic volatility model with fat tails can, without resorting to high risk aversion, generate an implied risk premium, expected risk-free rate and volatilities comparable to the magnitudes observed in the data. In Chapter III, I examine the pricing performance of VIX option models. The contention that simpler is better is supported by the empirical evidence using actual VIX option market data. I find that no model has small pricing errors over the entire range of strike prices and times to expiration. In general, Whaley's Black-like option model produces the best overall results, supporting the simpler-is-better contention. However, the Whaley model does underprice/overprice out-of-the-money call/put VIX options, which is contrary to the behavior of stock index option pricing models. In Chapter IV, I explore risk pricing through a model of time-changed Lévy processes based on the joint evidence from individual stock options and underlying stocks. I specify a pricing kernel that prices idiosyncratic and systematic risks. This approach to examining risk premia on stocks deviates from existing studies. The empirical results show that the market pays positive premia for idiosyncratic and market jump-diffusion risk, and for idiosyncratic volatility risk. However, there is no consensus on the premium for market volatility risk: it can be positive or negative. The positive premium on idiosyncratic risk runs contrary to the implications of traditional capital asset pricing theory.
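Schematically, the time-changed Lévy construction referenced in Chapter IV models the log price as a Lévy process run on a stochastic business clock (generic form, not the dissertation's exact specification):

```latex
\ln \frac{S_t}{S_0} = (r - q)\, t + X_{T_t}, \qquad
T_t = \int_0^t v_s \, ds,
```

where X is a Lévy process capturing jumps and v_s is a stochastic activity rate capturing volatility clustering.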
Abstract:
This study evaluates the cost effectiveness of municipalities in the state of Rio Grande do Norte in the execution of spending on basic education in 2011, and analyzes the determinants of their inefficiency. For this, two methodological approaches were used: (i) a stochastic cost frontier, and (ii) data envelopment analysis (DEA), which identifies the efficient frontier of the municipalities non-parametrically. Results show that the municipalities under review achieved low efficiency rates under the stochastic cost frontier, while under the DEA method they achieved higher rates, with nineteen of them reaching full efficiency. The results suggest that a significant portion of the Potiguar municipalities should review their administrative practices, especially the means of allocating resources. With regard to the determinants of efficiency, the two methods produced distinct results.
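For reference, a stochastic cost frontier of the kind used here is conventionally specified as follows (generic notation; note that on the cost side the inefficiency term enters with a positive sign):

```latex
\ln C_i = f(y_i, w_i;\ \beta) + v_i + u_i, \qquad
v_i \sim N(0, \sigma_v^2), \quad u_i \ge 0,
```

where C_i is observed spending, y_i outputs, w_i input prices, and cost efficiency is measured as CE_i = exp(-u_i).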
Abstract:
A scenario-based two-stage stochastic programming model for gas production network planning under uncertainty is usually a large-scale nonconvex mixed-integer nonlinear programme (MINLP), which can be efficiently solved to global optimality with nonconvex generalized Benders decomposition (NGBD). This paper is concerned with the parallelization of NGBD to exploit multiple available computing resources. Three parallelization strategies are proposed, namely, naive scenario parallelization, adaptive scenario parallelization, and adaptive scenario and bounding parallelization. A case study of two industrial natural gas production network planning problems shows that, while NGBD without parallelization is already faster than a state-of-the-art global optimization solver by an order of magnitude, parallelization can improve efficiency severalfold on computers with multicore processors. The adaptive scenario and bounding parallelization achieves the best overall performance among the three proposed strategies.
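NGBD itself is beyond a short sketch, but the core idea of scenario parallelization, namely dispatching the independent scenario subproblems of an iteration concurrently, can be illustrated with Python's standard library; the subproblem solver below is a placeholder, not the paper's MINLP:

```python
from concurrent.futures import ProcessPoolExecutor

def solve_scenario_subproblem(scenario):
    """Placeholder for one scenario's subproblem; in NGBD each such
    solve contributes a cut or bound to the master problem."""
    demand, price = scenario
    profit = price * min(demand, 100.0)  # stand-in computation
    return profit

def parallel_lower_bounding(scenarios, workers=4):
    """'Naive' scenario parallelization: all subproblems of the current
    iteration are dispatched at once and the results are gathered."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(solve_scenario_subproblem, scenarios))
    return sum(results)

if __name__ == "__main__":
    scenarios = [(80.0, 3.0), (120.0, 2.5), (95.0, 3.2), (110.0, 2.8)]
    print(parallel_lower_bounding(scenarios))
```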
Abstract:
An RVE-based stochastic numerical model is used to calculate the permeability of randomly generated porous media at different values of the fiber volume fraction for the case of transverse flow in a unidirectional ply. Analysis of the numerical results shows that the permeability is not normally distributed. With the aim of proposing a new understanding of this particular topic, the permeability data are fitted using both a mixture model and a unimodal distribution. Our findings suggest that permeability can be fitted well using a mixture model based on the lognormal and power law distributions. In the case of a unimodal distribution, it is found, using maximum-likelihood estimation (MLE), that the generalized extreme value (GEV) distribution provides the best fit. Finally, an expression for the permeability as a function of the fiber volume fraction based on the GEV distribution is discussed in light of the previous results.
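A minimal sketch of the unimodal-fit comparison described above, using scipy's maximum-likelihood fitting; the synthetic data stand in for the permeability samples and the candidate set is illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
perm = rng.lognormal(mean=-20.0, sigma=0.4, size=500)  # stand-in permeability data

candidates = {
    "GEV": stats.genextreme,
    "lognormal": stats.lognorm,
    "normal": stats.norm,
}

for name, dist in candidates.items():
    params = dist.fit(perm)                  # maximum-likelihood fit
    ll = np.sum(dist.logpdf(perm, *params))  # log-likelihood at the MLE
    aic = 2 * len(params) - 2 * ll           # Akaike information criterion
    print(f"{name:10s} AIC = {aic:.1f}")
```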