5 results for structural efficiency
in Aston University Research Archive
Abstract:
Purpose – The data used in this study cover the period 1980-2000. Almost midway through this period (in 1992), the Kenyan government liberalized the sugar industry and the role of the market increased, while the government's role with respect to control of prices, imports and other aspects of the sector declined. This exposed the local sugar manufacturers to external competition from other sugar producers, especially from the COMESA region. This study aims to find whether there were any changes in efficiency of production between the two periods (pre- and post-liberalization). Design/methodology/approach – The study utilized two methodologies for efficiency estimation: data envelopment analysis (DEA) and the stochastic frontier. DEA uses mathematical programming techniques and does not impose any functional form on the data; however, it attributes all deviations from the frontier to inefficiency. The stochastic frontier utilizes econometric techniques. Findings – The test for structural differences between the two periods does not show any statistically significant differences. However, both methodologies show a decline in efficiency levels from 1992, with the lowest level reached in 1998. From then on, efficiency levels began to increase. Originality/value – To the best of the authors' knowledge, this is the first paper to use both methodologies in the sugar industry in Kenya. It is shown that in industries where the noise (error) term is minimal (such as manufacturing), DEA and the stochastic frontier give similar results.
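As a rough illustration of the DEA side of this comparison, the efficiency score of one production unit can be obtained from a small linear program (the input-oriented CCR envelopment model). The sketch below uses SciPy's linprog on made-up input/output figures, not the study's sugar-mill data.

```python
# A minimal sketch of an input-oriented CCR DEA model solved as a linear program.
# The inputs/outputs below are illustrative assumptions, not the study's data.
import numpy as np
from scipy.optimize import linprog

# inputs X (n_inputs x n_units) and outputs Y (n_outputs x n_units)
X = np.array([[20.0, 30.0, 40.0, 20.0, 10.0],        # e.g. labour
              [300.0, 200.0, 100.0, 200.0, 400.0]])  # e.g. capital
Y = np.array([[30.0, 40.0, 50.0, 30.0, 20.0]])       # e.g. output

def ccr_efficiency(X, Y, o):
    """Score of unit o: minimise theta such that a convex combination of units
    produces at least unit o's outputs using at most theta times its inputs."""
    n_inputs, n_units = X.shape
    n_outputs = Y.shape[0]
    c = np.r_[1.0, np.zeros(n_units)]                 # minimise theta
    A_in = np.hstack([-X[:, [o]], X])                 # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((n_outputs, 1)), -Y]) # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(n_inputs), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n_units   # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

for o in range(X.shape[1]):
    print(f"unit {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```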
Using interior point algorithms for the solution of linear programs with special structural features
Abstract:
Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages, the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least-squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions are included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented taking sparsity into account. The relative performance of these algorithms on randomly generated separable and non-separable problems is also reported.
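To make the preconditioned conjugate gradient step concrete: each interior point iteration requires solving a normal-equations system of the form (A D² Aᵀ) Δy = r, which a preconditioned CG routine can handle without factorising the matrix. The sketch below is not the thesis's implementation; it uses randomly generated sparse data and a simple Jacobi (diagonal) preconditioner as an assumption.

```python
# A rough sketch of solving the normal-equations system (A D^2 A^T) dy = r that
# arises in each interior point iteration, using Jacobi-preconditioned CG.
# The sparse matrix, scaling and right-hand side are randomly generated assumptions.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
m, n = 50, 200                                   # constraints x variables
A = sp.random(m, n, density=0.05, random_state=0, format="csr") + sp.eye(m, n)
d2 = rng.uniform(0.1, 10.0, size=n)              # diagonal scaling D^2 from the current iterate
r = rng.standard_normal(m)                       # right-hand side residual

normal_eq = (A @ sp.diags(d2) @ A.T).tocsr()     # A D^2 A^T, kept sparse throughout
diag = normal_eq.diagonal()
jacobi = spla.LinearOperator((m, m), matvec=lambda v: v / diag)  # Jacobi preconditioner

dy, info = spla.cg(normal_eq, r, M=jacobi)       # info == 0 means CG converged
print("converged" if info == 0 else f"cg returned info = {info}")
```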
Abstract:
This article focuses on the deviations from normality of stock returns before and after a financial liberalisation reform, and shows the extent to which inference based on statistical measures of stock market efficiency can be affected by not controlling for breaks. Drawing on recent advances in the econometrics of structural change, it compares the distribution of the returns of five East Asian emerging markets when breaks in the mean and variance are either (i) imposed using certain official liberalisation dates or (ii) detected non-parametrically using a data-driven procedure. The results suggest that measuring deviations from normality of stock returns without allowing for potential breaks introduces substantial bias. This is likely to severely affect any inference based on the corresponding descriptive or test statistics.
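For concreteness, the distortion the article warns about can be seen with a standard normality test applied to a return series whose mean and variance shift at a break: the pooled sample fails the Jarque-Bera test even though each regime is normal. The simulated returns and break index below are illustrative assumptions, not the article's data or its data-driven break-detection procedure.

```python
# Illustration only: normality tests on returns pooled across a break versus
# split at the break. The series and break point are simulated assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
pre = rng.normal(0.0005, 0.010, size=1000)    # pre-liberalisation regime
post = rng.normal(0.0010, 0.020, size=1000)   # post-liberalisation regime (higher variance)
returns = np.concatenate([pre, post])
break_idx = 1000                               # e.g. an official liberalisation date

for label, sample in [("full sample", returns),
                      ("pre-break", returns[:break_idx]),
                      ("post-break", returns[break_idx:])]:
    jb_stat, p_value = stats.jarque_bera(sample)
    print(f"{label:12s}  JB = {jb_stat:8.2f}  p = {p_value:.4f}")
```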
Abstract:
It is generally believed that the structural reforms that were introduced in India following the macro-economic crisis of 1991 ushered in competition and forced companies to become more efficient. However, whether the post-1991 growth is an outcome of more efficient use of resources or greater use of factor inputs remains an open empirical question. In this paper, we use plant-level data from 1989–1990 and 2000–2001 to address this question. Our results indicate that while there was an increase in the productivity of factor inputs during the 1990s, most of the growth in value added is explained by growth in the use of factor inputs. We also find that median technical efficiency declined in all but one of the industries between 1989–1990 and 2000–2001, and that change in technical efficiency explains a very small proportion of the change in gross value added.
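The inputs-versus-efficiency question can be framed as a standard growth-accounting decomposition: value-added growth is split into the contribution of factor input growth (weighted by output elasticities) and a residual productivity term. The sketch below uses assumed factor shares and hypothetical aggregates, not the paper's plant-level estimates.

```python
# An illustrative growth-accounting decomposition. Factor shares and the 1989-90
# and 2000-01 figures are assumptions, not the paper's estimates.
import numpy as np

alpha_k, alpha_l = 0.4, 0.6          # assumed output elasticities of capital and labour
va = np.array([100.0, 220.0])        # gross value added in the two years
k = np.array([300.0, 580.0])         # capital stock
l = np.array([50.0, 78.0])           # employment

d_ln_va = np.log(va[1] / va[0])
input_contribution = alpha_k * np.log(k[1] / k[0]) + alpha_l * np.log(l[1] / l[0])
tfp_growth = d_ln_va - input_contribution   # Solow residual (productivity)

print(f"Value-added growth (log): {d_ln_va:.3f}")
print(f"Contribution of factor inputs: {input_contribution:.3f} "
      f"({input_contribution / d_ln_va:.0%} of the total)")
print(f"Residual productivity growth: {tfp_growth:.3f}")
```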
Abstract:
Theoretical and empirical studies show that deindustrialisation, broadly observed in developed countries, is an inherent part of the economic development pattern. However, post-communist countries, while being only middle-income economies, have also experienced deindustrialisation. Building on the model developed by Rowthorn and Wells (1987), we explain this phenomenon and show that there is a strong negative relationship between the magnitude of deindustrialisation and the efficiency and consistency of market reforms. We also demonstrate that reforms of the agricultural sector play a significant role in placing a transition country on a development path that guarantees convergence to EU employment structures.