962 results for Stochastic frontier model


Relevance: 90.00%

Abstract:

It has long been assumed that HIV-1 evolution is best described by deterministic evolutionary models because of the large population size. Recently, however, it was suggested that the effective population size (Ne) may be rather small, thereby allowing chance to influence evolution, a situation best described by a stochastic evolutionary model. To gain experimental evidence supporting one of the evolutionary models, we investigated whether the development of resistance to the protease inhibitor ritonavir affected the evolution of the env gene. Sequential serum samples from five patients treated with ritonavir were used for analysis of the protease gene and the V3 domain of the env gene. Multiple reverse transcription–PCR products were cloned, sequenced, and used to construct phylogenetic trees and to calculate the genetic variation and Ne. Genotypic resistance to ritonavir developed in all five patients, but each patient displayed a unique combination of mutations, indicating a stochastic element in the development of ritonavir resistance. Furthermore, development of resistance induced clear bottleneck effects in the env gene. The mean intrasample genetic variation, which ranged from 1.2% to 5.7% before treatment, decreased significantly (P < 0.025) during treatment. In agreement with these findings, Ne was estimated to be very small (500–15,000) compared with the total HIV-1 RNA copy number. This study combines three independent observations: strong population bottlenecking, small Ne, and selection of different combinations of protease-resistance mutations, all of which indicate that HIV-1 evolution is best described by a stochastic evolutionary model.

Relevance: 90.00%

Abstract:

This paper uses a stochastic translog cost frontier model and panel data on five key mining industries in Australia over 1968-69 to 1994-95 to investigate the sources of output growth and the effects of cost inefficiency on total factor productivity (TFP) growth. The results indicate that mining output growth was largely input-driven rather than productivity-driven. Although there were some gains from technological progress and economies of scale in production, cost inefficiency, which barely exceeded 1.1% from the mid-1970s onwards in the mining industries, was the main factor behind low TFP growth.
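
For reference, a stochastic translog cost frontier of the kind this abstract describes can be written in the following generic textbook form; this is a sketch of the model class, not the authors' exact specification (one output y, input prices w_j, time trend t):

```latex
\ln C_{it} = \alpha_0 + \alpha_y \ln y_{it} + \sum_j \beta_j \ln w_{jit}
           + \tfrac{1}{2}\alpha_{yy}(\ln y_{it})^2
           + \tfrac{1}{2}\sum_j \sum_k \beta_{jk}\,\ln w_{jit}\,\ln w_{kit}
           + \sum_j \gamma_{jy}\,\ln w_{jit}\,\ln y_{it}
           + \delta_1 t + \tfrac{1}{2}\delta_2 t^2 + v_{it} + u_{it}
```

Here v_it is a symmetric noise term and u_it ≥ 0 is the cost-inefficiency term; TFP growth can then be decomposed into technical change, scale effects, and changes in cost efficiency, which is how studies of this kind attribute growth to inputs versus productivity.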

Relevance: 90.00%

Abstract:

The paper investigates the effects of trade liberalisation on the technical efficiency of the Bangladesh manufacturing sector by estimating a combined stochastic frontier-inefficiency model, using panel data for the period 1978-94 for 25 three-digit-level industries. The results show that the overall technical efficiency of the manufacturing sector, as well as the technical efficiencies of the majority of the individual industries, has increased over time. The findings also clearly suggest that trade liberalisation, proxied by export orientation and capital deepening, has had a significant impact on reducing overall technical inefficiency. Similarly, the scale of operation and the proportion of non-production labour in total employment appear to be important determinants of technical inefficiency. The evidence also indicates that both export-promoting and import-substituting industries have experienced rises in technical efficiency over time. The results are suggestive of neutral technical change, although at the 5 per cent level of significance they indicate that there was no technical change in the manufacturing industries. Finally, a joint likelihood ratio (LR) test rejects the Cobb-Douglas production technology as a description of the data, given the specification of the translog production technology.
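
For orientation, a combined stochastic frontier-inefficiency model in the Battese and Coelli (1995) tradition, which abstracts like this one typically estimate in a single stage, has the generic form:

```latex
y_{it} = x_{it}\beta + v_{it} - u_{it}, \qquad
v_{it} \sim N(0, \sigma_v^2), \qquad
u_{it} \sim N^{+}\!\left(z_{it}\delta, \sigma_u^2\right)
```

where z_it collects the inefficiency determinants (here export orientation, capital deepening, scale, and the non-production labour share) and technical efficiency is recovered as TE_it = exp(-u_it). This is the standard published form of the model class, not a restatement of this paper's exact equations.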

Relevance: 90.00%

Abstract:

This paper presents a metafrontier production function model for firms in different groups having different technologies. The metafrontier model enables the calculation of comparable technical efficiencies for firms operating under different technologies. The model also enables the technology gaps to be estimated for firms under different technologies relative to the potential technology available to the industry as a whole. The metafrontier model is applied in the analysis of panel data on garment firms in five different regions of Indonesia, assuming that the regional stochastic frontier production function models have technical inefficiency effects with the time-varying structure proposed by Battese and Coelli (1992).
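
The key identity behind the metafrontier approach is the decomposition of efficiency measured against the metafrontier into efficiency against the group's own frontier and a technology gap ratio (TGR); in the standard notation:

```latex
TE^{*}_{it} \;=\; TE_{it} \times TGR_{it}, \qquad 0 < TGR_{it} \le 1
```

where TE_it is efficiency relative to the region's own frontier, TE*_it is efficiency relative to the metafrontier enveloping all regional frontiers, and TGR_it measures how far the regional technology lies from the industry-wide potential.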

Relevance: 90.00%

Abstract:

The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when the regularity restrictions are imposed.
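
To make the sampling scheme concrete, here is a minimal, self-contained sketch of a random-walk Metropolis-Hastings step restricted to a constraint region, the kind of move used inside a Gibbs sweep to draw parameters subject to regularity restrictions. All names are illustrative and the toy target is a stand-in for a conditional posterior over translog parameters; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mh_step_truncated(theta, log_target, constraint, scale, rng):
    """One random-walk Metropolis-Hastings step whose target density is
    restricted to the region where `constraint` holds (e.g. monotonicity
    and curvature restrictions on translog parameters)."""
    proposal = theta + scale * rng.standard_normal(theta.shape)
    if not constraint(proposal):
        return theta  # proposal leaves the regularity region: reject
    log_alpha = log_target(proposal) - log_target(theta)
    if np.log(rng.uniform()) < log_alpha:
        return proposal  # accept with probability min(1, ratio)
    return theta

# Toy example: a standard normal truncated to the positive orthant,
# standing in for a constrained conditional posterior.
log_target = lambda t: -0.5 * t @ t
constraint = lambda t: bool(np.all(t > 0))

theta = np.ones(2)
draws = []
for _ in range(5000):
    theta = mh_step_truncated(theta, log_target, constraint, 0.5, rng)
    draws.append(theta)
```

Rejecting any proposal that violates the constraints is equivalent to targeting the posterior multiplied by an indicator over the regularity region, which is why the draws respect the restrictions by construction.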

Relevance: 90.00%

Abstract:

This paper analyses the mechanisms through which binding finance constraints can induce debt-constrained firms to improve technical efficiency in order to guarantee positive profits. This hypothesis is tested on a sample of Italian manufacturing firms. Technical efficiency scores are computed by estimating parametric production frontiers using the one-stage approach of Battese and Coelli [Battese, G., Coelli, T., 1995. A model for technical efficiency effects in a stochastic frontier production function for panel data. Empirical Economics 20, 325-332]. The results support the hypothesis that a restriction in the availability of financial resources can positively affect efficiency.

Relevance: 90.00%

Abstract:

The airline industry is at the forefront of many technological developments and is often a pioneer in adopting such innovations on a large scale. It needs to improve its efficiency, as current trends in input prices and competitive pressures show that every airline will face increasingly challenging market conditions. This paper focuses on the relationship between ICT investments and efficiency in the airline industry, employing a two-stage analytical investigation based on DEA, SFA, and a Tobit regression model. We first estimate the productivity of the airline industry using a balanced panel of 17 airlines over the period 1999–2004 with the Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. We then evaluate the determinants of productivity in the industry, concentrating on ICT. The results suggest that, despite all the negative shocks to the airline industry during the sample period, ICT had a positive effect on productivity during 1999–2004.
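
As a concrete illustration of the DEA stage, the following sketch solves the input-oriented, constant-returns (CCR) envelopment problem for one decision-making unit with scipy. The function name and the toy data are made up for illustration; this is a minimal version of the kind of linear program a study like this solves once per airline per year.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.
    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],
                             Y @ lam >= Y[:, o],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]            # decision vars: [theta, lam_1..lam_n]
    A_in = np.c_[-X[:, [o]], X]            # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]    # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                          # efficiency score in (0, 1]

# Toy data: 2 inputs, 1 output, 4 hypothetical airlines (columns).
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
scores = [dea_ccr_input(X, Y, j) for j in range(4)]
```

A two-stage design like the paper's would then regress these scores on ICT investment and other covariates with a Tobit model, since the scores are bounded above at 1.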

Relevance: 90.00%

Abstract:

Financial institutions are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of the banking sector in GCC countries. Because the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform the evaluation. Furthermore, since the SORM results alone provide limited information for decision makers (bankers, investors, etc.), we propose a second-stage analysis using the classification and regression (C&R) method, combining the SORM results with other environmental data (financial, economic, and political) to derive rules characterising the efficient banks; the results are therefore useful to bankers seeking to improve their banks' performance and to investors seeking to maximise their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), each comprising different methods with different assumptions: the parametric approach, based on econometric regression theory, and the nonparametric approach, based on mathematical linear programming. The nonparametric approach includes two methods, Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH), while the parametric approach includes three: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA), and Distribution-Free Analysis (DFA). The literature shows that DEA and SFA are the most widely applied methods in the banking sector, with DEA apparently the most popular among researchers. However, DEA, like SFA, still faces many challenges; one of them is how to deal with negative data, since DEA requires all input and output values to be non-negative, whereas in many applications negative outputs can appear, e.g. losses as opposed to profits. Although a few DEA models have been developed to deal with negative data, each has its own limitations, which is why we developed the Semi-Oriented Radial Model (SORM) to handle the negativity issue in DEA. The application results using SORM show that the overall performance of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007) because of the second Gulf War and the international financial crisis, it remains higher than the efficiency scores of counterpart banks in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani, and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient, these two countries having been the most affected by the second Gulf War. The results also show no statistically significant relationship between operating style (Islamic or conventional) and bank efficiency. Even so, Islamic banks appear somewhat more efficient than conventional banks, with an average efficiency score of 86.33% compared with 85.38%. Furthermore, Islamic banks appear to be more affected by the political crisis (the second Gulf War), whereas conventional banks appear to be more affected by the financial crisis.
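
A common way to state the semi-oriented radial idea, sketched here in generic form rather than as the study's exact model, is to split each variable that can take negative values into two non-negative components:

```latex
x_j = x_j^{+} - x_j^{-}, \qquad
x_j^{+} = \max(x_j, 0), \qquad
x_j^{-} = \max(-x_j, 0)
```

Both components then satisfy the non-negativity that radial DEA requires; the positive part is treated in the variable's usual orientation, while the negative part enters the envelopment constraints with the orientation reversed, so losses and profits can coexist in one model.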

Relevance: 90.00%

Abstract:

This study presents some quantitative evidence from a number of simulation experiments on the accuracy of the productivity-growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional form misspecification, but their accuracy greatly diminishes otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error and when measurement error becomes larger, the accuracy of all approaches (including stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
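
All of the frontier-based methods compared here feed into the same Malmquist productivity index; for distance functions D^t defined against the period-t technology it takes the standard form:

```latex
M\!\left(x^{t}, y^{t}, x^{t+1}, y^{t+1}\right) =
\left[
\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
\cdot
\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2}
```

The index factors into an efficiency-change term and a technical-change term; the simulation experiments differ only in how the distance functions are estimated (DEA, COLS, or SFA), which is what drives the accuracy comparison.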

Relevance: 90.00%

Abstract:

This article analyses the technical efficiency of Hungarian crop farms between 2001 and 2009 using panel data, employing both a standard stochastic frontier analysis (SFA) model and the latent class model (LCM), which accounts for technological differences, to estimate technical efficiency. The findings suggest that technological heterogeneity can be important even in a sector such as arable crop production, where a relatively homogeneous technology is employed. A comparison of the standard SFA model, which assumes a technology common to all farms, with the LCM estimates shows that traditional SFA models may underestimate the technical efficiency of crop farms.
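
For reference, the latent class stochastic frontier model replaces the single-technology likelihood with a finite mixture over C technology classes; in the generic textbook form (not this article's exact notation):

```latex
L_i = \sum_{c=1}^{C} P_{ic}\, f\!\left(y_i \mid x_i;\ \beta_c, \sigma_{v,c}, \sigma_{u,c}\right),
\qquad
P_{ic} = \frac{\exp\!\left(\theta_c' q_i\right)}{\sum_{m=1}^{C} \exp\!\left(\theta_m' q_i\right)}
```

Each class c has its own frontier parameters, and the prior class probabilities P_ic typically follow a multinomial logit in class-separating variables q_i; farms are assigned to classes via posterior probabilities, which is how the model captures technological heterogeneity that a common-frontier SFA model averages away.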

Relevance: 90.00%

Abstract:

An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, emergency managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but without incorporating the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes, modeling them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. For the meta-model, however, the demand is an input determined using Hazards United States (HAZUS), software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic based on simulated annealing has been developed. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA) algorithm, in which the initial solution and the cooling rate were determined via a design of experiments. The experiments showed that the initial temperature (T0) is irrelevant, but that the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner, and an illustrative example shows that the meta-model is practical.
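
To illustrate the discrete-Markov-chain treatment of hurricane intensity, here is a minimal sketch; the transition matrix below is entirely hypothetical (the abstract does not report the estimated probabilities) and the state space is taken to be the five Saffir-Simpson categories.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical transition matrix over Saffir-Simpson categories 1-5:
# P[i, j] is the probability of moving from category i+1 to category j+1
# between successive forecast periods. Each row sums to 1.
P = np.array([
    [0.70, 0.25, 0.05, 0.00, 0.00],
    [0.20, 0.55, 0.20, 0.05, 0.00],
    [0.05, 0.20, 0.50, 0.20, 0.05],
    [0.00, 0.05, 0.25, 0.55, 0.15],
    [0.00, 0.00, 0.10, 0.30, 0.60],
])

def simulate_intensity(start_cat, n_periods, rng):
    """Simulate one hurricane intensity path as a discrete Markov chain.
    States are 0-indexed (state 0 = category 1)."""
    path = [start_cat]
    for _ in range(n_periods):
        path.append(rng.choice(5, p=P[path[-1]]))
    return path

# One simulated 48-hour path at 6-hour steps, starting at category 3.
path = simulate_intensity(start_cat=2, n_periods=8, rng=rng)
```

Sampling many such paths (and an analogous chain for the track) yields the scenario distribution over which a stochastic location-allocation model of this kind optimizes.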

Relevance: 90.00%

Abstract:

This dissertation analyzes hospital efficiency using various econometric techniques. The first essay provides additional and recent evidence of contract management behavior in the U.S. hospital industry. Unlike previous studies, which focus on either an input-demand equation or the cost function of the firm, this paper estimates the two jointly using a system of nonlinear equations. Moreover, it addresses the longitudinal problem of institutions adopting contract management in different years by creating a matched control group of non-adopters with the same longitudinal distribution as the group under study. The estimation procedure finds that labor, and not capital, is the preferred input in U.S. hospitals regardless of managerial contract status, with institutions that adopt contract management benefiting from lower labor inefficiencies than the simulated non-adopters. These results suggest that while there is a propensity for expense-preference behavior towards the labor input, contract-managed firms are able to introduce efficiencies over conventional, owner-controlled firms. Using data for the years 1998 through 2007, the second essay investigates the production technology and cost efficiency of Florida hospitals. A stochastic frontier multiproduct cost function is estimated in order to test for economies of scale, economies of scope, and relative cost efficiencies. The results suggest that small hospitals experience economies of scale, while large and medium-sized institutions do not. The empirical findings show that Florida hospitals enjoy significant scope economies regardless of size. Lastly, the evidence suggests a link between hospital size and relative cost efficiency. The results of the study imply that state policy makers should focus on increasing hospital scale for smaller institutions while facilitating the expansion of multiproduct production for larger hospitals. The third and final essay employs a two-stage approach to analyzing the efficiency of hospitals in the state of Florida. In the first stage, the Banker, Charnes, and Cooper model of Data Envelopment Analysis is employed to derive overall technical efficiency scores for each non-specialty hospital in the state. Additionally, input slacks are calculated and reported in order to identify the factors of production that each hospital may be over-utilizing. In the second stage, we employ a Tobit regression model to analyze the effects that a number of structural, managerial, and environmental factors may have on a hospital's efficiency. The results indicate that most non-specialty hospitals in the state operate away from the efficient production frontier, and that the structural make-up, managerial choices, and level of competition Florida hospitals face all have an impact on their overall technical efficiency.
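
To illustrate the second-stage Tobit idea (not the dissertation's own code): DEA efficiency scores are censored at 1, so the likelihood combines a normal density for interior scores with a survival-function term for the fully efficient units. A minimal sketch with hypothetical variable names and toy data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negll(params, y, X, upper=1.0):
    """Negative log-likelihood of a Tobit model right-censored at `upper`
    (DEA scores pile up at 1 for hospitals on the frontier)."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)          # parameterise on the log scale so sigma > 0
    xb = X @ beta
    cens = y >= upper
    ll = np.where(
        cens,
        norm.logsf((upper - xb) / sigma),               # P(latent score >= upper)
        norm.logpdf((y - xb) / sigma) - np.log(sigma),  # density of interior scores
    )
    return -ll.sum()

# Toy second stage: regress efficiency scores on two covariates.
rng = np.random.default_rng(2)
X = np.c_[np.ones(200), rng.normal(size=(200, 2))]
y = np.clip(X @ np.array([0.8, 0.05, -0.1]) + 0.1 * rng.normal(size=200), None, 1.0)
res = minimize(tobit_negll, x0=np.zeros(4), args=(y, X))
# res.x[:-1] are the slope estimates; np.exp(res.x[-1]) is sigma.
```

In an application like the third essay, the columns of X would hold the structural, managerial, and competition variables, and the signs of the fitted coefficients indicate which factors raise or lower technical efficiency.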

Relevance: 90.00%

Abstract:

In Brazil, the National Agency of Electric Energy (ANEEL) is the energy regulator. Rate reviews, one of its main tasks, set prices at a level that covers efficient operating costs and provides an appropriate return on the distributors' investments. The changes in the procedures for redefining efficient costs, and the several studies on the methodologies employed to regulate this segment, illustrate the challenge regulators face in choosing the best methodological strategy. In this context, this research proposes a benchmarking evaluation applied to the national regulatory system for establishing the efficient operating costs of electricity distribution utilities. The model is formulated to promote the development of the electricity market, in partnership with government policies and to the benefit of society. To conduct this research, an integration of Data Envelopment Analysis (DEA) with Stochastic Frontier Analysis (SFA) is adopted in a three-stage procedure that corrects efficiency for environmental effects: (i) a DEA evaluation measures the operating-cost slacks of the utilities, omitting environmental variables; (ii) the slacks calculated in the first stage are regressed on a set of environmental variables by means of SFA, and operating costs are adjusted to account for environmental impact and statistical noise; and (iii) the performance of the electricity distribution utilities is reassessed by means of DEA. This methodology yields a performance evaluation expressed exclusively in terms of management efficiency, with the operating environment and statistical noise effects controlled.
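
The second-stage levelling can be written compactly. After the SFA regression decomposes each utility's first-stage slack into an environmental effect f(z_i; β̂) and a noise term v̂_i, the operating cost is adjusted so that every utility faces the least favourable environment and the worst luck observed in the sample; this is a generic statement in the spirit of the three-stage literature (e.g. Fried et al., 2002), not necessarily ANEEL's exact formula:

```latex
x_i^{adj} \;=\; x_i
\;+\; \Bigl[\max_k f\!\left(z_k;\hat\beta\right) - f\!\left(z_i;\hat\beta\right)\Bigr]
\;+\; \Bigl[\max_k \hat v_k - \hat v_i\Bigr]
```

Re-running DEA on the adjusted costs in the third stage then isolates differences attributable to management alone.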

Relevance: 90.00%

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work in terms of the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69–104), especially for developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
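
The positive-definiteness guarantee follows directly from the spectral form of the matrix exponential: for a real symmetric matrix A_t with eigendecomposition A_t = QΛQ',

```latex
\Sigma_t = \exp(A_t) = Q \exp(\Lambda)\, Q',
\qquad
\exp(\Lambda) = \operatorname{diag}\!\left(e^{\lambda_1}, \dots, e^{\lambda_n}\right)
```

Every eigenvalue of Σ_t is e^{λ_i} > 0, so the covariance matrix is positive definite for any real symmetric A_t, with no parameter restrictions needed; this is the general property the model exploits, stated here independently of the paper's specific dynamics for A_t.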

Relevance: 90.00%

Abstract:

This Ph.D. thesis contains four essays in mathematical finance, focusing on pricing Asian options (Chapter 4), pricing futures and futures options (Chapters 5 and 6) and time-dependent volatility in futures options (Chapter 7). In Chapter 4, the applicability of the comonotonicity approach of Albrecher et al. (2005) is investigated in the context of various benchmark models for equities and commodities. Instead of the classical Levy models of Albrecher et al. (2005), the focus is on the Heston stochastic volatility model, the constant elasticity of variance (CEV) model and the Schwartz (1997) two-factor model. It is shown that the method delivers rather tight upper bounds for the prices of Asian options in these models and, as a by-product, delivers super-hedging strategies which can be easily implemented. In Chapter 5, two types of three-factor models that allow volatility to be stochastic are studied for valuing commodity futures contracts. Both models have closed-form solutions for futures contract prices. However, it is shown that Model 2 is better than Model 1 theoretically and also performs very well empirically; moreover, Model 2 can easily be implemented in practice. In comparison to the Schwartz (1997) two-factor model, Model 2 has its own unique advantages and is therefore also a good choice for pricing commodity futures contracts. Furthermore, if the two models are used together, a more accurate price for commodity futures contracts can be obtained in most situations. In Chapter 6, the applicability of the asymptotic approach developed in Fouque et al. (2000b) is investigated for pricing commodity futures options in a Schwartz (1997) multi-factor model featuring both stochastic convenience yield and stochastic volatility. It is shown that the zero-order term in the expansion coincides with the Schwartz (1997) two-factor term with averaged volatility, and an explicit expression for the first-order correction term is provided. Using empirical data from the natural gas futures market, it is also demonstrated that a significantly better calibration can be achieved by using the correction term, as compared to the standard Schwartz (1997) two-factor expression, at virtually no extra effort. In Chapter 7, a new pricing formula is derived for futures options in the Schwartz (1997) two-factor model with time-dependent spot volatility. The pricing formula can also be used to back out the time-dependent spot volatility from futures option prices observed in the market. Furthermore, the limitations of the method used to find the time-dependent spot volatility are explained, and it is shown how to verify its accuracy.
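
Since the Schwartz (1997) two-factor model recurs throughout Chapters 4-7, it helps to recall its dynamics in the standard published form (the thesis's own notation may differ): the spot price S_t and the instantaneous convenience yield δ_t follow

```latex
dS_t = (\mu - \delta_t)\, S_t\, dt + \sigma_1\, S_t\, dW_t^{1},
\qquad
d\delta_t = \kappa(\alpha - \delta_t)\, dt + \sigma_2\, dW_t^{2},
\qquad
dW_t^{1}\, dW_t^{2} = \rho\, dt
```

The convenience yield mean-reverts to α at rate κ, and both the three-factor extensions of Chapter 5 and the time-dependent-volatility formula of Chapter 7 build on this structure.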