941 results for Stochastic Frontier Production Function Analysis
Abstract:
There is an urgent need for high-purity, single-chain, fully functional Eph/ephrin membrane proteins. This report outlines the pTIg-BOS-Fc vector and purification approach, which resulted in rapidly increased production of fully functional, single-chain extracellular proteins that were isolated at high purity and used in structure-function analysis and pre-clinical studies.
Abstract:
In recent years, the multiparametric approach for evaluating perceptual rating of voice quality has been advocated. This study evaluates the accuracy of predicting perceived overall severity of voice quality with a minimal set of aerodynamic, voice range profile (phonetogram), and acoustic perturbation measures. One hundred and twelve dysphonic persons (93 women and 19 men) with laryngeal pathologies and 41 controls (35 women and six men) with normal voices participated in this study. Perceptual severity judgement was carried out by four listeners rating the G (overall grade) parameter of the GRBAS scale (1). The minimal set of instrumental measures was selected based on each measure's ability to discriminate between dysphonic and normal voices and to attain at least a moderate correlation with perceived overall severity. Results indicated that perceived overall severity was best described by maximum phonation time of sustained /a/, peak intraoral pressure during production of consonant-vowel /pi/ strings, voice range profile area, and acoustic jitter. Direct-entry discriminant function analysis revealed that these four voice measures in combination correctly predicted 67.3% of perceived overall severity levels.
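A minimal sketch of the kind of direct-entry discriminant function analysis described above, assuming a table of the four selected measures and listener-rated G levels; the file and column names are hypothetical, not the study's data:

```python
# Minimal sketch (not the authors' code): predict perceived overall severity
# (e.g. GRBAS G levels) from four instrumental voice measures entered at once.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score

df = pd.read_csv("voice_measures.csv")                   # hypothetical data file
X = df[["mpt_a", "peak_pio_pi", "vrp_area", "jitter"]]   # four predictor measures
y = df["perceived_g"]                                    # listener-rated G level

lda = LinearDiscriminantAnalysis().fit(X, y)             # direct entry of all predictors
pred = lda.predict(X)
print(f"Correctly classified: {accuracy_score(y, pred):.1%}")
```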
Abstract:
Judicial interest in ‘scientific’ evidence has driven recent work to quantify results in forensic linguistic authorship analysis. Through a methodological discussion and a worked example, this paper examines the issues that complicate attempts to quantify results in such work. The solution suggested for some of these difficulties is a sampling and testing strategy which helps to identify potentially useful, valid and reliable markers of authorship. An important feature of the sampling strategy is that markers identified as generally valid and reliable are retested for use in specific authorship analysis cases. The suggested approach for drawing quantified conclusions combines discriminant function analysis with Bayesian likelihood measures. The worked example starts with twenty comparison texts for each of three potential authors and then uses a progressively smaller comparison corpus, reducing to fifteen, ten, five and finally three texts per author. The worked example demonstrates how reducing the amount of data affects the way conclusions can be drawn. With greater numbers of reference texts, quantified and safe attributions are shown to be possible, but as the number of reference texts falls, the analysis shows that the conclusion which should be reached is that no attribution can be made. At no point does the testing process result in a misattribution.
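A minimal sketch of how discriminant scores can be turned into Bayesian likelihood ratios for a questioned text, under the assumption that marker frequencies for the reference texts are already extracted; the files, marker set, and use of scikit-learn are illustrative assumptions, not the paper's procedure:

```python
# Minimal sketch (assumptions, not the paper's method): fit a discriminant
# function on reference texts, then report a likelihood ratio per candidate
# author for one questioned text (posterior odds divided by prior odds).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X_ref = np.loadtxt("reference_markers.csv", delimiter=",")            # hypothetical
y_ref = np.loadtxt("reference_authors.csv", dtype=str, delimiter=",") # hypothetical

lda = LinearDiscriminantAnalysis().fit(X_ref, y_ref)

x_q = np.loadtxt("questioned_markers.csv", delimiter=",").reshape(1, -1)
posterior = lda.predict_proba(x_q)[0]     # posterior probability per candidate author
priors = lda.priors_                      # class priors estimated from the reference corpus

for author, p, pi in zip(lda.classes_, posterior, priors):
    lr = (p / (1 - p)) / (pi / (1 - pi))  # Bayesian likelihood ratio for this candidate
    print(f"{author}: LR = {lr:.2f}")
```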
Abstract:
Discriminant analysis (also known as discriminant function analysis or multiple discriminant analysis) is a multivariate statistical method of testing the degree to which two or more populations may overlap with each other. It was devised independently by several statisticians, including Fisher, Mahalanobis, and Hotelling. The technique has several possible applications in microbiology. First, in a clinical microbiological setting, if two different infectious diseases were defined by a number of clinical and pathological variables, it may be useful to decide which measurements were the most effective at distinguishing between the two diseases. Second, in an environmental microbiological setting, the technique could be used to study the relationships between different populations, e.g., to what extent do the properties of soils in which the bacterium Azotobacter is found differ from those in which it is absent? Third, the method can be used as a multivariate ‘t’ test, i.e., given a number of related measurements on two groups, the analysis can provide a single test of the hypothesis that the two populations have the same means for all the variables studied. This statnote describes one of the most popular applications of discriminant analysis: identifying the descriptive variables that can distinguish between two populations.
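A minimal sketch of the two-group use described above (soil properties where Azotobacter is present versus absent), assuming a simple table of soil measurements; the variable names and data file are hypothetical:

```python
# Minimal sketch of two-group discriminant analysis: which soil variables
# best separate sites where Azotobacter is present from sites where it is absent?
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

df = pd.read_csv("soils.csv")                         # hypothetical data
X = df[["pH", "moisture", "organic_carbon", "nitrogen"]]
y = df["azotobacter_present"]                         # 0 = absent, 1 = present

lda = LinearDiscriminantAnalysis().fit(X, y)
# Discriminant coefficients indicate which variables drive the separation
for name, coef in zip(X.columns, lda.coef_[0]):
    print(f"{name}: discriminant coefficient = {coef:.3f}")
print(f"Resubstitution accuracy = {lda.score(X, y):.1%}")
```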
Abstract:
The airline industry is at the forefront of many technological developments and is often a pioneer in adopting such innovations on a large scale. It needs to improve its efficiency, as current trends in input prices and competitive pressures show that any airline will face increasingly challenging market conditions. This paper focuses on the relationship between ICT investments and efficiency in the airline industry and employs a two-stage analytical investigation using DEA, SFA, and a Tobit regression model. In this study, we first estimate the productivity of the airline industry using a balanced panel of 17 airlines over the period 1999–2004 with the Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) methods. We then evaluate the impacts of the determinants of productivity in the industry, concentrating on ICT. The results suggest that, despite all the negative shocks to the airline industry during the sample period, ICT had a positive effect on productivity during 1999–2004.
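A minimal sketch of the second-stage idea, assuming first-stage efficiency scores censored at one are regressed on ICT and other determinants with a standard censored-normal (Tobit) likelihood; the simulated data and the hand-rolled likelihood are illustrative assumptions, not the paper's code:

```python
# Minimal sketch: Tobit regression of efficiency scores (censored at 1) on
# determinants such as ICT intensity, estimated by maximum likelihood.
import numpy as np
from scipy import optimize, stats

def tobit_negloglik(params, y, X, upper=1.0):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = y >= upper
    ll = np.where(
        censored,
        stats.norm.logsf((upper - xb) / sigma),               # P(latent score >= upper)
        stats.norm.logpdf((y - xb) / sigma) - np.log(sigma),  # density of observed score
    )
    return -ll.sum()

# Illustrative data: y = efficiency score in (0, 1]; X = constant, ICT intensity, size
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
y = np.clip(0.8 + 0.05 * X[:, 1] + rng.normal(0, 0.1, 100), None, 1.0)

res = optimize.minimize(tobit_negloglik, x0=np.zeros(X.shape[1] + 1),
                        args=(y, X), method="BFGS")
print("Tobit coefficients:", res.x[:-1], "sigma:", np.exp(res.x[-1]))
```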
Abstract:
This study employs stochastic frontier analysis to analyze Malaysian commercial banks during 1996-2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalised Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68 percent, with the latter driven primarily by technical change, which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid technical change, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a more lasting negative impact by increasing the volume of non-performing loans.
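A minimal sketch of a normal/half-normal stochastic cost frontier of the kind referred to above, written directly as a maximum-likelihood problem; the simulated data, variable content, and parameterisation are illustrative only and are not the authors' specification:

```python
# Minimal sketch: stochastic cost frontier ln C = x'b + v + u, with noise v ~ N(0, s_v^2)
# and one-sided inefficiency u >= 0 (half-normal), estimated by maximum likelihood.
import numpy as np
from scipy import optimize, stats

def sfa_cost_negloglik(params, y, X):
    k = X.shape[1]
    beta = params[:k]
    sigma = np.exp(params[k])        # composed-error scale, sigma^2 = s_v^2 + s_u^2
    lam = np.exp(params[k + 1])      # lambda = s_u / s_v
    eps = y - X @ beta               # eps = v + u for a cost frontier
    ll = (np.log(2) - np.log(sigma)
          + stats.norm.logpdf(eps / sigma)
          + stats.norm.logcdf(eps * lam / sigma))
    return -ll.sum()

# Illustrative data: y = log total cost; X = constant, log input prices, log output
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
y = X @ np.array([1.0, 0.6, 0.3]) + rng.normal(0, 0.1, 200) + np.abs(rng.normal(0, 0.2, 200))

res = optimize.minimize(sfa_cost_negloglik, x0=np.zeros(X.shape[1] + 2),
                        args=(y, X), method="BFGS")
print("Frontier coefficients:", res.x[:X.shape[1]])
```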
Abstract:
The performance of the manufacturing sector has been a major factor contributing to Sweden's economic growth. This paper comprises eight short cases describing a range of Swedish organisations together with the principal features of their production function. The cases are intended to generate discussion and provide a greater understanding of the technical and organisational factors which influence the efficiency of production systems.
Abstract:
New techniques in manufacturing, popularly referred to as mechanization and automation, have been a preoccupation of social and economic theorists since the industrial revolution. A selection of relevant literature is reviewed, including the neoclassical economic treatment of technical change. This incorporates alterations to the mathematical production function and an associated increase in the efficiency with which the factors of production are converted into output. Other work emphasises the role of research and development and the process of diffusion, whereby new production techniques are propagated throughout industry. Some sociological writings attach importance to the type of production technology and its effect on the organisational structure and social relations within the factory. Nine detailed case studies are undertaken of examples of industrial innovation in the rubber, automobile, vehicle components, confectionery and clothing industries. The old and new techniques are compared for a range of variables, including capital equipment, labour employed, raw materials used, space requirements and energy consumption, which in most cases exhibit significant change with the innovation. The rate of output, labour productivity, product quality, maintenance requirements and other aspects are also examined. The process by which the change in production method was achieved is documented, including the development of new equipment and the strategy of its introduction into the factory, where appropriate. The firm, its environment, and the attitude of different sectors of the workforce are all seen to play a part in determining the motives for and consequences which flow from the innovations. The traditional association of technical progress with its labour-saving aspect, though an accurate enough description of the cases investigated, is clearly seen to afford an inadequate perspective for the proper understanding of this complex phenomenon, which also induces change in a wide range of other social, economic and technical variables.
Abstract:
Financial institutes are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of financial institutes (the banking sector) in GCC countries. Since the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform this evaluation. Furthermore, since the SORM evaluation result provides limited information for any decision maker (bankers, investors, etc.), we proposed a second-stage analysis using the classification and regression (C&R) method to obtain further results, combining SORM results with other environmental data (financial, economic, and political) to set rules for the efficient banks; hence the results will be useful to bankers seeking to improve their banks' performance and to investors seeking to maximize their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), and under each there are different methods with different assumptions. The parametric approach is based on econometric regression theory, and the nonparametric approach is based on mathematical linear programming theory. Under the nonparametric approach there are two methods: Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH). There are three methods under the parametric approach: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA), and Distribution-Free Analysis (DFA). The results show that DEA and SFA are the most applicable methods in the banking sector, but DEA seems to be the most popular among researchers. However, DEA, like SFA, still faces many challenges; one of these is how to deal with negative data, since DEA requires the assumption that all input and output values are non-negative, while in many applications negative outputs can appear, e.g., losses in contrast with profits. Although a few DEA models have been developed to deal with negative data, we believe that each of them has its own limitations; we therefore developed the Semi-Oriented Radial Model (SORM), which can handle the negativity issue in DEA. The application results using SORM show that the overall performance of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007) due to the second Gulf War and the international financial crisis, it remained higher than the efficiency scores of counterpart banks in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani, and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; this is because these two countries were the most affected by the second Gulf War. The results also show that there is no statistical relationship between operating style (Islamic or conventional) and bank efficiency. Even though there are no statistically significant differences due to operating style, Islamic banks appear to be more efficient than conventional banks, with an average efficiency score of 86.33% compared with 85.38% for conventional banks. Furthermore, Islamic banks seem to be more affected by the political crisis (the second Gulf War), whereas conventional banks seem to be more affected by the financial crisis.
Abstract:
Most empirical work in economic growth assumes either a Cobb–Douglas production function expressed in logs or a log-approximated constant elasticity of substitution specification. Estimates from each are likely biased due to logging the model, and the latter can also suffer from approximation bias. We illustrate this with a successful replication of Masanjala and Papageorgiou (The Solow model with CES technology: nonlinearities and parameter heterogeneity, Journal of Applied Econometrics 2004; 19: 171–201) and then estimate both models in levels to avoid these biases. Our estimation in levels gives results in line with conventional wisdom.
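A minimal sketch of the logs-versus-levels contrast, assuming a single-regressor Cobb–Douglas relation and simulated data; it only illustrates how OLS on the logged model and nonlinear least squares in levels can be compared, and does not reproduce the paper's estimates:

```python
# Minimal sketch: estimate y = A * k^alpha by OLS on logs and by nonlinear
# least squares in levels. With a multiplicative error the two need not agree.
import numpy as np
import statsmodels.api as sm
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
k = rng.lognormal(mean=1.0, sigma=0.5, size=500)
y = 2.0 * k ** 0.4 * np.exp(rng.normal(0, 0.3, 500))   # illustrative: A = 2, alpha = 0.4

# (1) Conventional approach: OLS on the log-linearised model
ols = sm.OLS(np.log(y), sm.add_constant(np.log(k))).fit()
print("log-OLS:    A =", np.exp(ols.params[0]), "alpha =", ols.params[1])

# (2) Estimation in levels by nonlinear least squares
cobb_douglas = lambda k, A, alpha: A * k ** alpha
(A_hat, alpha_hat), _ = curve_fit(cobb_douglas, k, y, p0=[1.0, 0.5])
print("levels-NLS: A =", A_hat, "alpha =", alpha_hat)
```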
Abstract:
This study employs Stochastic Frontier Analysis (SFA) to analyse Malaysian commercial banks during 1996–2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalized Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68%, with the latter driven primarily by Technical Change (TC), which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid TC, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a long-lasting negative impact by increasing the volume of non-performing loans.
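As a notational sketch (our own shorthand, not necessarily the exact generalized Malmquist index used in the paper), the decomposition referred to above separates total factor productivity change between periods t and t+1 into efficiency change, technical change, and scale change:

\[
\Delta TFP_{t,t+1} \;=\; \underbrace{\frac{TE_{t+1}}{TE_{t}}}_{\text{efficiency change}} \;\times\; \underbrace{TC_{t,t+1}}_{\text{technical change}} \;\times\; \underbrace{SC_{t,t+1}}_{\text{scale change}}
\]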
Abstract:
Waste biomass is generated during the conservation management of semi-natural habitats and represents an unused resource and potential bioenergy feedstock that does not compete with food production. Thermogravimetric analysis was used to characterise a representative range of biomass generated during conservation management in Wales. Of the biomass types assessed, those dominated by rush (Juncus effusus) and bracken (Pteridium aquilinum) exhibited the highest and lowest volatile compositions, respectively, and were selected for bench-scale conversion via fast pyrolysis. Each biomass type was ensiled, and a sub-sample of silage was washed and pressed. Demineralisation of conservation biomass through washing and pressing was associated with higher oil yields following fast pyrolysis. The oil yields were within the published range established for the dedicated energy crops miscanthus and willow. In order to examine the potential, a multiple-output energy system was developed, with gross power production estimates following valorisation of the press fluid, char, and oil. If used in multi-fuel industrial burners, the char and oil alone would displace 3.9 × 10⁵ tonnes per year of No. 2 light oil using Welsh biomass from conservation management. Bioenergy and product development using these feedstocks could simultaneously support biodiversity management and displace fossil fuels, thereby reducing GHG emissions. Gross power generation predictions show good potential.
Abstract:
Amidst concerns about achieving high levels of technology to remain competitive in the global market without compromising economic development, national economies are experiencing a high demand for human capital. As higher education is assumed to be the main source of human capital, this analysis focused on a more specific and less explored area of the generally accepted idea that higher education contributes to economic growth. The purpose of this study, therefore, was to find whether higher education also contributes to economic development, and whether that contribution is more substantial in a globalized context. Consequently, a multiple linear regression analysis was conducted to support, with statistical significance, the answer to the research question: does higher education contribute to economic development in the context of globalization? The information analyzed was obtained from historical data for 91 selected countries, and the period of the study was 10 years (1990–2000). Some variables, however, were lagged back 5, 10 or 15 years along a 15-year timeframe (1975–1990). The resulting comparative static model was based on the Cobb-Douglas production function and the Solow model to specify economic growth as a function of physical capital, labor, technology, and productivity. Then, formal education, economic development, and globalization were added to the equation. The findings of this study supported the assumption that the independent contribution of changes in higher education completion and globalization to changes in economic growth is more substantial than the contribution of their interaction. The results also suggested that changes in higher and secondary education completion contribute much more to changes in economic growth in less developed countries than in their more developed counterparts. As a conclusion, based on the results of this study, I proposed the implementation of public policy in less developed countries to promote and expand adequate secondary and higher education systems with the purpose of helping to achieve economic development. I also recommended further research efforts on this topic to emphasize the contribution of education to the economy, mainly in less developed countries.
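A minimal sketch of the kind of multiple linear regression described, with an interaction between higher-education completion and globalization; the data file, column names, and functional form are hypothetical and do not reproduce the study's actual specification:

```python
# Minimal sketch: growth regressed on factor inputs, higher-education completion,
# globalization, and their interaction, using an OLS formula interface.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("country_panel.csv")    # hypothetical 91-country dataset
model = smf.ols(
    "gdp_growth ~ capital_growth + labor_growth "
    "+ higher_ed_completion + globalization "
    "+ higher_ed_completion:globalization",
    data=df,
).fit()
print(model.summary())
```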
Abstract:
In Brazil, the National Agency of Electric Energy (ANEEL) is the energy regulator. Rate reviews have been one of its main tasks; they establish prices at a level that covers efficient operating costs and provides an appropriate return on the distributors' investments. The changes in the procedures used to redefine efficient costs, and the several studies on the methodologies employed to regulate this segment, illustrate the challenge regulators face in choosing the best methodological strategy. In this context, this research proposes a benchmarking evaluation applied to the national regulatory system for establishing the efficient operating costs of electricity distribution utilities. The model is formulated to promote the development of the electricity market, in partnership with government policies and to the benefit of society. To conduct this research, an integration of Data Envelopment Analysis (DEA) with Stochastic Frontier Analysis (SFA) is adopted in a three-stage procedure to correct the efficiency estimates for environmental effects: (i) evaluation by means of DEA to measure the operating-cost slacks of the utilities, with environmental variables omitted; (ii) the slacks calculated in the first stage are regressed on a set of environmental variables by means of SFA, and operating costs are adjusted to account for environmental and statistical noise effects; and (iii) the performance of the electricity distribution utilities is reassessed by means of DEA. Based on this methodology it is possible to obtain a performance evaluation expressed exclusively in terms of management efficiency, in which operating-environment and statistical noise effects are controlled.
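A minimal sketch of stage (i), assuming an input-oriented, constant-returns DEA envelopment model with operating cost as the single input; the data, inputs, and outputs are illustrative assumptions, not ANEEL's actual model:

```python
# Minimal sketch: input-oriented CRS DEA for one utility, solved as a linear
# programme: min theta  s.t.  X'lambda <= theta * x0,  Y'lambda >= y0,  lambda >= 0.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, dmu):
    n, m = inputs.shape          # n utilities, m inputs
    s = outputs.shape[1]         # s outputs
    c = np.r_[1.0, np.zeros(n)]  # minimise theta; the lambdas carry no cost
    A_in = np.hstack([-inputs[dmu].reshape(m, 1), inputs.T])   # X'lam - theta*x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -outputs.T])          # -Y'lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -outputs[dmu]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Illustrative data: operating cost as the input; network length and customers as outputs
opex = np.array([[120.0], [90.0], [150.0], [80.0]])
outputs = np.array([[300, 50], [280, 45], [320, 60], [260, 40]], dtype=float)
for j in range(len(opex)):
    print(f"Utility {j}: efficiency = {dea_efficiency(opex, outputs, j):.3f}")
```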
Abstract:
This paper considers the analysis of data from randomized trials which offer a sequence of interventions and suffer from a variety of problems in implementation. In experiments that provide treatment in multiple periods (T > 1), subjects have up to 2^T - 1 counterfactual outcomes that must be estimated to determine the full sequence of causal effects from the study. Traditional program evaluation and non-experimental estimators are unable to recover parameters of interest to policy makers in this setting, particularly if there is non-ignorable attrition. We examine these issues in the context of Tennessee's highly influential randomized class size study, Project STAR. We demonstrate how a researcher can estimate the full sequence of dynamic treatment effects using a sequential difference-in-differences strategy that accounts for attrition due to observables using inverse probability weighting M-estimators. These estimates allow us to recover the structural parameters of the small-class effects in the underlying education production function and to construct dynamic average treatment effects. We present a complete and different picture of the effectiveness of reduced class size and find that accounting for both attrition due to observables and selection due to unobservables is crucial and necessary with data from Project STAR.
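A minimal sketch of the inverse-probability-weighting step, assuming attrition is modelled on baseline observables and the resulting weights are then used in a weighted difference-in-differences regression; the data file, column names, and two-step layout are hypothetical and this is not the authors' estimator:

```python
# Minimal sketch: (1) model the probability of remaining in the sample on
# baseline observables, (2) weight observed students by its inverse in a
# weighted difference-in-differences regression.
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("star_panel.csv")        # hypothetical student-by-year panel
baseline = ["baseline_score", "free_lunch", "minority", "small_class"]

# 1. Probability of being observed (not attrited), given baseline observables
obs_model = LogisticRegression(max_iter=1000).fit(df[baseline], df["observed"])
df["p_obs"] = obs_model.predict_proba(df[baseline])[:, 1]
df["ipw"] = 1.0 / df["p_obs"]

# 2. Weighted difference-in-differences on the observed sample
obs = df[df["observed"] == 1]
did = smf.wls("test_score ~ small_class * post", data=obs, weights=obs["ipw"]).fit()
print(did.params)
```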