924 results for Efficiency analysis
Abstract:
The paper investigates the efficiency of a sample of Islamic and conventional banks in 10 countries that operate Islamic banking for the period 1996–2002, using an output distance function approach. We obtain measures of efficiency after allowing for environmental influences such as country macroeconomic conditions, accessibility of banking services and bank type. While these factors are assumed to directly influence the shape of the technology, we assume that country dummies and bank size directly influence technical inefficiency. The parameter estimates highlight that during the sample period, Islamic banking appears to be associated with higher input usage. Furthermore, by allowing for bank size and international differences in the underlying inefficiency distributions, we are also able to demonstrate statistically significant differences in inefficiency related to these factors even after controlling for specific environmental characteristics and Islamic banking. Thus, for example, our results suggest that Sudan and Yemen have relatively higher inefficiency while Bahrain and Bangladesh have lower estimated inefficiency. Except for Sudan, where banks exhibit relatively strong returns to scale, most sample banks exhibit very slight returns to scale, although Islamic banks are found to have moderately higher returns to scale than conventional banks. While this suggests that Islamic banks may benefit from increased scale, we would emphasize that our results suggest that identifying and overcoming the factors that cause Islamic banks to have relatively low potential outputs for given input usage levels will be the key challenge for Islamic banking in the coming decades.
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In the subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables that can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: the modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with 'natural' negative outputs and inputs. Journal of the Operational Research Society 57 (11), 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of the Operational Research Society 55 (10), 1111–1121]. A further example explores the advantages of using the new model.
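One of the comparison methods cited, the range directional model (RDM) of Portela et al., can be written as one small linear program per DMU, and it handles negative data because the direction vector is built from non-negative ranges to an ideal point. The sketch below is a minimal variable-returns-to-scale version using SciPy's LP solver; it illustrates the RDM only (not the semi-oriented radial measure proposed in the paper), and the function name and toy data are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def rdm_scores(X, Y):
    """Range directional model (Portela et al. 2004) under VRS.

    For each DMU o, maximise beta subject to
        sum_j lam_j x_ij <= x_io - beta * Rx_i   (inputs)
        sum_j lam_j y_rj >= y_ro + beta * Ry_r   (outputs)
        sum_j lam_j = 1, lam_j >= 0,
    where Rx, Ry are distances to the ideal point (always >= 0,
    which is why negative data cause no difficulty).
    Returns 1 - beta per DMU (1.0 = efficient)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        Rx = X[o] - X.min(axis=0)          # input ranges to the ideal point
        Ry = Y.max(axis=0) - Y[o]          # output ranges to the ideal point
        c = np.zeros(n + 1)
        c[0] = -1.0                        # variables: [beta, lam_1..lam_n]
        A_ub, b_ub = [], []
        for i in range(m):                 # beta*Rx_i + sum lam_j x_ij <= x_io
            A_ub.append(np.concatenate(([Rx[i]], X[:, i])))
            b_ub.append(X[o, i])
        for r in range(s):                 # beta*Ry_r - sum lam_j y_rj <= -y_ro
            A_ub.append(np.concatenate(([Ry[r]], -Y[:, r])))
            b_ub.append(-Y[o, r])
        A_eq = np.array([np.concatenate(([0.0], np.ones(n)))])  # VRS
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1), method="highs")
        beta = -res.fun
        scores.append(1.0 - beta)
    return scores
```

For instance, with inputs [1, 2, 3] and outputs [-1, 3, 1] (one output negative), the first two units lie on the VRS frontier (score 1.0) while the third scores 0.4.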
Abstract:
We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, which represent the most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we showed, using nonlinearity management considerations, that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. Therefore, we implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerically modelling the Fokker-Planck equation using the MMC simulation technique. Moreover, applying the powerful MMC method we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty.
We also present a statistical analysis of the RZ-DPSK optical signal at a direct detection receiver with Mach-Zehnder interferometer demodulation.
Abstract:
This chapter provides the theoretical foundation and background on the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter then reviews evidence showing that DEA has been extensively applied for measuring the efficiency and productivity of services, including financial services (banking, insurance, securities, and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA). A free limited version of this software and the downloading procedure are also included in this chapter.
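As a concrete illustration of the basic DEA model introduced in such chapters, the input-oriented CCR efficiency of each DMU is obtained from one small linear program per unit. The sketch below is a minimal implementation under constant returns to scale, not the PIM-DEA software mentioned above; the function name and toy data are illustrative, and NumPy/SciPy availability is assumed.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y):
    """Input-oriented CCR (constant-returns) efficiency for each DMU.

    X: (n_dmu, n_inputs) input matrix; Y: (n_dmu, n_outputs) output matrix.
    For each DMU o, solve:  min theta
        s.t. sum_j lam_j x_ij <= theta * x_io   (all inputs i)
             sum_j lam_j y_rj >= y_ro           (all outputs r)
             lam_j >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.zeros(n + 1)
        c[0] = 1.0                     # variables: [theta, lam_1..lam_n]
        A_ub, b_ub = [], []
        for i in range(m):             # sum_j lam_j x_ij - theta*x_io <= 0
            A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):             # -sum_j lam_j y_rj <= -y_ro
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return scores
```

For example, three DMUs with single inputs [2, 4, 4] and single outputs [2, 2, 4] obtain efficiency scores 1.0, 0.5 and 1.0: the second unit produces the same output as the first from twice the input.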
Abstract:
The topography of the visual evoked magnetic response (VEMR) to a pattern onset stimulus was studied in five normal subjects using a single channel BTi magnetometer. Topographic distributions were analysed at regular intervals following stimulus onset (chronotopography). Two distinct field distributions were observed with half field stimulation: (1) activity corresponding to the CIIm, which remains stable for an average of 34 msec, and (2) activity corresponding to the CIIIm, which remains stable for about 50 msec. However, the full field topography of the largest peak within the first 130 msec does not have a predictable latency or topography in different subjects. The data suggest that the appearance of this peak is dependent on the amplitude, latency and duration of the half field CIIm peaks and the efficiency of half field summation. Hence, topographic mapping is essential to correctly identify the CIIm peak in a full field response, as waveform morphology, peak latency and polarity are not reliable indicators. © 1993.
Abstract:
Grafting of antioxidants and other modifiers onto polymers by reactive extrusion has been performed successfully by the Polymer Processing and Performance Group at Aston University. Traditionally, the optimum conditions for the grafting process have been established within a Brabender internal mixer. Transfer of this batch process to a continuous processor, such as an extruder, has typically been empirical. To have more confidence in the success of direct transfer of the process requires knowledge of, and comparison between, residence times, mixing intensities, shear rates and flow regimes in the internal mixer and in the continuous processor. The continuous processor chosen for the current work is the closely intermeshing, co-rotating twin-screw extruder (CICo-TSE). CICo-TSEs contain screw elements that convey material with a self-wiping action and are widely used for polymer compounding and blending. Of the different mixing modules contained within the CICo-TSE, the trilobal elements, which impose intensive mixing, and the mixing discs, which impose extensive mixing, are of importance when establishing the intensity of mixing. In this thesis, the flow patterns within the various regions of the single-flighted conveying screw elements, and within both the trilobal element and mixing disc zones of a Betol BTS40 CICo-TSE, have been modelled using the computational fluid dynamics package Polyflow. A major obstacle encountered when solving the flow problem within all of these sets of elements arises from both the complex geometry and the time-dependent flow boundaries as the elements rotate about their fixed axes. The problem of time-dependent boundaries was overcome by selecting a number of sequential 2D and 3D geometries, used to represent partial mixing cycles.
The flow fields were simulated using the ideal rheological properties of polypropylene and characterised in terms of velocity vectors, shear stresses generated and a parameter known as the mixing efficiency. The majority of the large 3D simulations were performed on the Cray J90 supercomputer situated at the Rutherford-Appleton laboratories, with pre- and postprocessing operations achieved via a Silicon Graphics Indy workstation. A mechanical model was constructed consisting of various CICo-TSE elements rotating within a transparent outer barrel. A technique has been developed using coloured viscous clays whereby the flow patterns and mixing characteristics within the CICo-TSE may be visualised. In order to test and verify the simulated predictions, the patterns observed within the mechanical model were compared with the flow patterns predicted by the computational model. The flow patterns within the single-flighted conveying screw elements in particular, showed good agreement between the experimental and simulated results.
Abstract:
The suitability of a new plastic supporting medium for biofiltration was tested over a three year period. Tests were carried out on the stability, surface properties, mechanical strength, and dimensions of the medium. There was no evidence to suggest that the medium was deficient in any of these respects. The specific surface (320m2m-3) and the voidage (94%) of the new medium are unlike any other used in biofiltration, and a pilot plant containing two filters was built to observe its effects on ecology and performance. Performance was estimated by chemical analysis and ecology studied by film examination and fauna counts. A system of removable sampling baskets was designed to enable samples to be obtained from two intermediate depths of the filter. One of the major operating problems of percolating filters is excessive accumulation of film. The amount of film is influenced by hydraulic and organic load, and each filter was run at a different loading. One was operated at 1.2m3m-3day-1 (BOD load 0.24kgm-3day-1), judged at the time to be the lowest filtration rate to offer advantages over conventional media. The other filter was operated at more than twice this loading (2.4m3m-3day-1, BOD load 0.55kgm-3day-1), giving roughly 2.5 and 6 times the conventional loadings recommended for a Royal Commission effluent. The amount of film in each filter was normally low (0.05-3kgm-3 as volatile solids) and did not affect efficiency. The evidence collected during the study indicated that the ecology of the filters was normal when compared with data from the literature relating to filters with mineral media. There were indications that full ecological stability was yet to be reached and that this was affecting the efficiency of the filters. The lower rate filter produced an average 87% BOD removal, giving a consistent Royal Commission effluent during the summer months. The higher rate filter produced a mean 83% BOD removal but at no stage a consistent Royal Commission effluent.
From the data on ecology and performance the filters resembled conventional filters rather than high rate filters.
Abstract:
This thesis presents a number of methodological developments that were prompted by a real-life application to measuring the efficiency of bank branches. The advent of internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This fact requires the development of new forms of assessing and comparing branches of a bank. In addition, performance assessment models must also take into account that bank branches are service and for-profit organisations for which providing adequate service quality as well as being profitable are crucial objectives. This study analyses bank branches' performance in their new roles in three different areas: their effectiveness in fostering the use of new transaction channels such as the internet and the telephone (transactional efficiency); their effectiveness in increasing sales and their customer base (operational efficiency); and their effectiveness in generating profits without compromising the quality of service (profit efficiency). The chosen methodology for the overall analysis is Data Envelopment Analysis (DEA). The application attempted here required some adaptations to existing DEA models, and indeed some new models, so that particular features of our data could be handled. These concern the development of models that can account for negative data, models to measure profit efficiency, and models that yield production units with targets nearer to their observed levels than the targets yielded by traditional DEA models. The application of the developed models to a sample of Portuguese bank branches allowed their classification according to the three performance dimensions (transactional, operational and profit efficiency).
It also provided useful insights to bank managers regarding how bank branches compare with one another in terms of performance, and how, in general, the three performance dimensions are interrelated.
Abstract:
Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the historicised case study and statistical analysis of large populations, which examine the relationship between environment and organisation strategy and/or structure but ignore the product-process relationship. This study combines the historicised case study and the multi-variable, ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure, and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five year period, to provide a sector perspective of organisational adaptation.
The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.
Abstract:
This investigation is in two parts, theory and experimental verification. (1) Theoretical study: in this study it is, for obvious reasons, necessary to analyse the concept of formability first. For the purpose of the present investigation it is sufficient to define the four aspects of formability as follows: (a) the formability of the material at a critical section, (b) the formability of the material in general, (c) process efficiency, (d) proportional increase in surface area. A method of quantitative assessment is proposed for each of the four aspects of formability. The theoretical study also includes the distinction between coaxial and non-coaxial strains, which occur, respectively, in axisymmetrical and unsymmetrical forming processes, and the inadequacy of the circular grid system for the assessment of formability is explained in the light of this distinction. (2) Experimental study: as one of the bases of the experimental work, the determination of the end point of a forming process, which sets the limit to the formability of the work material, is discussed. The effects of three process parameters on draw-in are shown graphically. Then the delay of fracture in sheet metal forming resulting from draw-in is analysed in kinematical terms, namely through the radial displacements, the radial and circumferential strains, and the projected thickness of the workpiece. Through the equilibrium equation of the membrane stresses, the effect on the shape of the unsupported region of the workpiece, and hence the position of the critical section, is explained. The effect of draw-in on the four aspects of formability is discussed throughout this investigation. The triangular coordinate system is used to present and analyse the triaxial strains involved. This coordinate system has the advantage of showing all three principal strains in a material simultaneously, as well as representing clearly the many types of strains involved in sheet metal work.
Abstract:
With the competitive challenge facing business today, the need to keep cost down and quality up is a matter of survival. One way in which wire manufacturers can meet this challenge is to possess a thorough understanding of deformation, friction and lubrication during the wire drawing process, and therefore to make good decisions regarding the selection and application of lubricants as well as the die design. Friction, lubrication and die design during wire drawing thus become the subject of this study. Although theoretical and experimental investigations have been carried out ever since the establishment of wire drawing technology, many problems remain unsolved. It is therefore necessary to conduct further research on traditional and fundamental subjects such as the mechanics of deformation, friction, lubrication and die design in wire drawing. Drawing experiments were carried out on an existing bull-block under different cross-sectional area reductions, different speeds and different lubricants. Instrumentation to measure drawing load and drawing speed was set up and connected to the wire drawing machine, together with a data acquisition system. A die box connected to the existing die holder for using dry soap lubricant was designed and tested. The experimental results, in terms of drawing stress versus percentage area reduction curves under different drawing conditions, were analysed and compared. The effects on drawing stress of friction, lubrication, drawing speed and the pressure die nozzle are discussed. In order to determine the flow stress of the material during deformation, tensile tests were performed on an Instron universal test machine, using the wires drawn under different area reductions. A polynomial function is used to correlate the flow stress of the material with the plastic strain, and a general computer program has been written to determine the coefficients of the stress-strain function.
The residual lubricant film on the steel wire after drawing was examined both radially and longitudinally using an SEM and an optical microscope. The lubricant film on the drawn wire was clearly observed. Micro-analysis by SEM therefore provides a means of assessing friction and lubrication in wire drawing.
Abstract:
The main advantage of Data Envelopment Analysis (DEA) is that it does not require any a priori weights for inputs and outputs, and allows individual DMUs to evaluate their efficiencies with the input and output weights that are most favourable to them. It can be argued that if DMUs are experiencing similar circumstances, then the pricing of inputs and outputs should apply uniformly across all DMUs. That is, the use of different weights across DMUs means that their efficiencies cannot be compared or ranked on a common basis. This is a significant drawback of DEA; however, the literature offers many solutions, including the use of a common set of weights (CSW). In addition, conventional DEA methods require accurate measurement of both inputs and outputs; however, crisp input and output data may not always be available in real-world applications. This paper develops a new model for the calculation of a CSW in fuzzy environments using fuzzy DEA. Further, a numerical example is used to show the validity and efficacy of the proposed model and to compare the results with previous models available in the literature.
Abstract:
Financial institutions are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of financial institutions (the banking sector) in GCC countries. Since the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform this evaluation. Furthermore, since the SORM evaluation result provides limited information for decision makers (bankers, investors, etc.), we proposed a second-stage analysis using the classification and regression (C&R) method to obtain further results, combining SORM results with other environmental data (financial, economic and political) to set rules for the efficient banks; hence, the results will be useful to bankers in improving their banks' performance and to investors in maximizing their returns. There are mainly two approaches to evaluating the performance of Decision Making Units (DMUs), and under each there are different methods with different assumptions. The parametric approach is based on econometric regression theory, while the nonparametric approach is based on mathematical linear programming theory. Under the nonparametric approach there are two methods: Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH). There are three methods under the parametric approach: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA) and Distribution-Free Analysis (DFA). The review shows that DEA and SFA are the most applicable methods in the banking sector, with DEA seemingly the most popular among researchers.
However, DEA, like SFA, still faces many challenges. One of these is how to deal with negative data, since DEA requires the assumption that all input and output values are non-negative, while in many applications negative outputs can appear, e.g. losses in contrast with profits. Although a few DEA models have been developed to deal with negative data, we believe that each has its own limitations; therefore we developed the Semi-Oriented Radial Model (SORM), which can handle negative data in DEA. The application results using SORM show that the overall performance of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007) due to the second Gulf War and the international financial crisis, it remained higher than the efficiency scores of counterpart banks in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; this is because these two countries were the most affected by the second Gulf War. The results also show that there is no statistical relationship between operating style (Islamic or conventional) and bank efficiency. Even though there are no statistically significant differences due to operating style, Islamic banks appear to be more efficient than conventional banks, with an average efficiency score of 86.33% compared to 85.38% for conventional banks. Furthermore, Islamic banks seem to have been more affected by the political crisis (the second Gulf War), whereas conventional banks seem to have been more affected by the financial crisis.
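The core device of semi-oriented radial models is to decompose each variable that takes both positive and negative values into two non-negative components, v = v+ − v−, which then enter the DEA model with opposite orientations. A minimal sketch of that decomposition step is shown below; the full SORM formulation is specified in the cited papers, and the function name and sample figures here are purely illustrative.

```python
def split_signed(values):
    """Split a signed series v into non-negative parts (v_plus, v_minus)
    with v = v_plus - v_minus, as required before the DEA stage of a
    semi-oriented radial model. Exactly one part is nonzero per entry."""
    v_plus = [max(v, 0.0) for v in values]
    v_minus = [max(-v, 0.0) for v in values]
    return v_plus, v_minus

# Illustrative bank profits (millions): the loss becomes a non-negative
# "minus" component that the model treats with reversed orientation.
profits = [120.5, -35.2, 0.0]
p_plus, p_minus = split_signed(profits)
```

After the split, `p_plus` is [120.5, 0.0, 0.0] and `p_minus` is [0.0, 35.2, 0.0], and each original value is recovered as `p_plus - p_minus`.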
Abstract:
Liposomes, due to their biphasic characteristics and diversity in design, composition and construction, offer a dynamic and adaptable technology for enhancing drug solubility. Starting with equimolar egg-phosphatidylcholine (PC)/cholesterol liposomes, the influence of liposomal composition and surface charge on the incorporation and retention of a model poorly water-soluble drug, ibuprofen, was investigated. Both the incorporation and the release of ibuprofen were influenced by the lipid composition of the multi-lamellar vesicles (MLV), with inclusion of the long alkyl chain lipid dilignoceroyl phosphatidylcholine (C24PC) resulting in enhanced ibuprofen incorporation efficiency and retention. The cholesterol content of the liposome bilayer was also shown to influence ibuprofen incorporation, with maximum incorporation efficiency achieved when 4 μmol of cholesterol was present in the MLV formulation. Addition of the anionic lipid dicetylphosphate (DCP) reduced ibuprofen drug loading, presumably due to electrostatic repulsive forces between the carboxyl group of ibuprofen and the anionic head-group of DCP. In contrast, the addition of 2 μmol of the cationic lipid stearylamine (SA) to the liposome formulation (PC:Chol, 16 μmol:4 μmol) increased ibuprofen incorporation efficiency by approximately 8%. However, further increases of the SA content to 4 μmol and above reduced incorporation by almost 50% compared to liposome formulations excluding the cationic lipid. Environmental scanning electron microscopy (ESEM) was used to dynamically follow changes in liposome morphology during dehydration, providing an alternative assay of liposome stability. ESEM analysis clearly demonstrated that ibuprofen incorporation improved the stability of PC:Chol liposomes, as evidenced by an increased resistance to coalescence during dehydration. These findings suggest a positive interaction between amphiphilic ibuprofen molecules and the bilayer structure of the liposome.
© 2004 Elsevier B.V. All rights reserved.
Abstract:
This study employs Stochastic Frontier Analysis (SFA) to analyse Malaysian commercial banks during 1996–2002, focusing particularly on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalized Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68%, with the latter driven primarily by Technical Change (TC), which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid TC, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a long-lasting negative impact by increasing the volume of nonperforming loans.
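The productivity decomposition described above follows the standard two-period Malmquist construction: productivity change is the product of efficiency change and technical change, computed from four distance-function values. A minimal sketch of that arithmetic is given below; the distance values used in the usage note are invented for illustration, not the paper's estimates.

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Two-period Malmquist productivity index and its decomposition.

    d_a_b denotes the distance function of period-b data measured
    against the period-a frontier, e.g. d_t_t1 = D^t(x^{t+1}, y^{t+1}).
    Returns (M, efficiency_change, technical_change) with
    M = efficiency_change * technical_change."""
    efficiency_change = d_t1_t1 / d_t_t          # catching up to the frontier
    technical_change = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return efficiency_change * technical_change, efficiency_change, technical_change
```

For example, with D^t(t)=0.8, D^t(t+1)=1.0, D^{t+1}(t)=0.7 and D^{t+1}(t+1)=0.9, efficiency change is 1.125, and the overall index equals the geometric-mean form sqrt(D^{t+1}(t+1)·D^t(t+1) / (D^t(t)·D^{t+1}(t))), confirming the decomposition multiplies back to M.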