94 results for Eco efficiency performance


Relevance:

30.00%

Publisher:

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the two most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the new approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, using nonlinearity management considerations, we showed that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerical modelling of the Fokker-Planck equation using the MMC simulation technique. Moreover, applying the MMC method we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. We also present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
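The MMC machinery itself is beyond an abstract, but the motivation for rare-event techniques is easy to demonstrate. The sketch below uses plain importance sampling, a far simpler relative of MMC, with entirely hypothetical parameters: naive Monte Carlo almost never lands in a far Gaussian tail, while a reweighted sampler resolves the same probability with the same sample budget.

```python
import math
import random

# Why rare-event methods matter for BER-style tail estimation: naive
# Monte Carlo almost never lands in a far tail, while a reweighted
# sampler resolves it with the same budget.  All numbers are
# hypothetical; this is plain importance sampling, not the MMC method.

random.seed(1)
N, a = 20_000, 4.0

# Naive Monte Carlo estimate of P(X > a) for X ~ N(0, 1).
naive = sum(random.gauss(0.0, 1.0) > a for _ in range(N)) / N

# Importance sampling: draw from N(a, 1), keep tail hits, and reweight
# each by the likelihood ratio phi(x) / phi_shifted(x).
total = 0.0
for _ in range(N):
    x = random.gauss(a, 1.0)
    if x > a:
        total += math.exp(-x * x / 2.0) / math.exp(-(x - a) ** 2 / 2.0)
is_est = total / N

exact = 0.5 * math.erfc(a / math.sqrt(2.0))  # closed form for comparison
print(naive, is_est, exact)
```

With this budget the naive count is typically zero, while the reweighted estimate lands close to the closed-form value of about 3.2e-5.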

Relevance:

30.00%

Publisher:

Abstract:

We study the effect of fibre base and grating profile on the efficiency of ultra-long Raman lasers. We show that for the studied parameters, FBG profile does not affect the performance when operating away from the zero-dispersion wavelength.

Relevance:

30.00%

Publisher:

Abstract:

The main aim of this work was to study the effect of two comonomers, trimethylolpropane trimethacrylate (TRIS) and divinylbenzene (DVB), on the nature and efficiency of grafting of two different monomers, glycidyl methacrylate (GMA) and maleic anhydride (MA), on polypropylene (PP) and on natural rubber (NR) using reactive processing methods. Four different peroxides, benzoyl peroxide (BPO), dicumyl peroxide (DCP), 2,5-dimethyl-2,5-bis-(tert-butyl peroxy) hexane (T-101), and 1,1-di(tert-butylperoxy)-3,3,5-trimethyl cyclohexene (T-29B90), were examined as free radical initiators. An appropriate methodology was established, and the chemical composition and reactive processing parameters were examined and optimised. It was found that in the absence of the coagents DVB and TRIS, the grafting degree of GMA and MA increased with increasing peroxide concentration, but the level of grafting was low, and the homopolymerisation of GMA and the crosslinking of NR or chain scission of PP were identified as the main side reactions competing with the desired grafting reaction in the polymers. At high concentrations of the peroxide T-101 (>0.02 mr), crosslinking of NR and chain scission of PP became dominant and unacceptable. An attempt to add a reactive coagent such as TRIS during grafting of GMA on natural rubber resulted in excessive crosslinking because of the very high reactivity of this comonomer with the C=C of the rubber. Therefore, multifunctional and highly reactive coagents such as TRIS could not be used in the grafting of GMA onto natural rubber. In the case of PP, however, the use of TRIS and DVB was shown to greatly enhance the grafting degree and reduce chain scission, with very little monomer homopolymerisation taking place. The results showed that the grafting degree increased with increasing GMA and MA concentrations.
It was also found that T-101 was a suitable peroxide to initiate the grafting reaction of these monomers on NR and PP, and the optimum temperature for this peroxide was approximately 160°C. Very preliminary work was also conducted on the use of the functionalised PP (f-PP), in the absence and presence of the two comonomers (f-PP-DVB or f-PP-TRIS), to compatibilise PP-PBT blends through reactive blending. Examination of the morphology of the blends suggested that effective compatibilisation had been achieved when using f-PP-DVB and f-PP-TRIS; however, more work is required in this area.

Relevance:

30.00%

Publisher:

Abstract:

Accurate prediction of shellside pressure drop in a baffled shell-and-tube heat exchanger is very difficult because of the complicated shellside geometry. Ideally, all the shellside fluid should be alternately deflected across the tube bundle as it traverses from inlet to outlet. In practice, up to 60% of the shellside fluid may bypass the tube bundle or leak through the baffles. This short-circuiting of the main flow reduces the efficiency of the exchanger. Of the various shellside methods, it is shown that only the multi-stream methods, which attempt to obtain the shellside flow distribution, predict the pressure drop with any degree of accuracy, the various predictions ranging from -30% to +70% and generally overpredicting. It is shown that the inaccuracies are mainly due to the manner in which baffle leakage is modelled. The present multi-stream methods do not allow for interactions of the various flowstreams, yet three main effects are identified: a) there is a strong interaction between the main crossflow and the baffle leakage streams, enhancing the crossflow pressure drop; b) there is a further short-circuit not considered previously, i.e. leakage in the window; and c) the crossflow does not penetrate as far, on average, as previously supposed. Models are developed for each of these three effects, along with a new windowflow pressure drop model, and it is shown that the effect of baffle leakage in the window is the most significant. These models, developed to allow for the various interactions, lead to an improved multi-stream method, named the "STREAM-INTERACTION" method. The overall method is shown to be consistently more accurate than previous methods, with virtually all the available shellside data being predicted to within ±30% and over 60% to within ±20%. The method is thus strongly recommended for use as a design method.
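As a toy illustration of the multi-stream idea (not the thesis's models, and with hypothetical resistance coefficients), parallel shellside paths must share the same pressure drop, which fixes how a given total flow divides between crossflow and leakage:

```python
import math

# Toy two-stream split (hypothetical coefficients): the crossflow and
# leakage paths are in parallel, so both see the same pressure drop
# dp = k * m**2, which determines the division of the total flow.

def two_stream_split(k_cross, k_leak, m_total):
    """Return (m_cross, m_leak, dp) for two parallel quadratic paths."""
    ratio = math.sqrt(k_leak / k_cross)   # m_cross / m_leak at equal dp
    m_leak = m_total / (1.0 + ratio)
    m_cross = m_total - m_leak
    return m_cross, m_leak, k_cross * m_cross ** 2

# A leakage path with a quarter of the crossflow resistance carries
# twice the flow: with k_cross=4, k_leak=1 and 3 kg/s total, 2 kg/s leak.
m_cross, m_leak, dp = two_stream_split(4.0, 1.0, 3.0)
```

Real multi-stream methods solve a network of several such paths per baffle space, but the equal-pressure-drop constraint is the same.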

Relevance:

30.00%

Publisher:

Abstract:

The flow characteristics of neutral sodium silicate glass in an open-hearth regenerative furnace have been studied using a one-tenth scale physical model. The constraints of similarity have been investigated and discussed, and the use of sodium liquor as a cold modelling solution has been developed. Methylene Blue and Sulphacid Brill Pink are used as delineators, and a technique for analysing the concentration of each, even in a mixture, has been developed. The residence-time distributions from the model have been simulated using a mixed-model computer program which identifies the nature and size of the most significant flow streams within the furnace. The results clearly show that the model gives a true representation of the furnace and illustrate a number of alternatives for operating or design changes which will lead to improved production efficiency.
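The mixed-model program itself is not described in the abstract, but one classic building block for representing a measured residence-time distribution is the tanks-in-series model. The sketch below is an illustrative stand-in, not the study's program: it evaluates the dimensionless RTD for n equal stirred tanks and checks that it integrates to one.

```python
import math

# Illustrative mixed-model component (not the study's program): the
# tanks-in-series model gives the dimensionless residence-time
# distribution E(theta) = n * (n*theta)**(n-1) * exp(-n*theta) / (n-1)!

def tanks_in_series_rtd(theta, n):
    """E(theta) for n equal continuous stirred tanks in series."""
    return (n * (n * theta) ** (n - 1)
            * math.exp(-n * theta) / math.factorial(n - 1))

# Any RTD must integrate to 1; a crude rectangle rule confirms it here.
dt = 0.001
area = sum(tanks_in_series_rtd(i * dt, 3) * dt for i in range(1, 20_000))
```

Fitting such components (in series and parallel, with bypass and dead-zone fractions) to measured tracer curves is what identifies the dominant flow streams.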

Relevance:

30.00%

Publisher:

Abstract:

This work is concerned with the nature of liquid flow across industrial sieve trays operating in the spray, mixed, and emulsified flow regimes. In order to overcome the practical difficulties of removing many samples from a commercial tray, the mass transfer process was investigated in an air-water simulator column by heat transfer analogy. The temperature of the warm water was measured by many thermocouples as the water flowed across the single-pass, 1.2 m diameter sieve tray. The thermocouples were linked to a minicomputer for storage of the data. The temperature data were then transferred to a mainframe computer to generate temperature profiles, analogous to concentration profiles. A comprehensive study of the existing tray efficiency models was carried out using computerised numerical solutions. The calculated results were compared with experimental results published by Fractionation Research Inc. (FRI), and the existing models did not show any agreement with the experimental results. Only the Porter and Lockett model showed reasonable agreement with the experimental results, for certain tray efficiency values. A rectangular active-section tray was constructed and tested to establish the channelling effect and its consequences for circular tray designs. The developed flow patterns showed predominantly flat profiles and some indication of significant liquid flow through the central region of the tray. This confirms that the rectangular tray configuration might not be a satisfactory solution for liquid maldistribution on sieve trays. For a typical industrial tray, the flow of liquid as it crosses the tray from the inlet to the outlet weir could be affected by the mixing of liquid by eddies, by momentum, and by the weir shape, in the axial or the transverse direction or both.
Conventional U-shaped profiles developed when the operating conditions were such that the froth dispersion was in the mixed regime, with good liquid temperature distribution in the spray regime. For the 12.5 mm hole diameter tray the constant-temperature profiles were found to run in the axial direction in the spray regime, and in the transverse direction for the 4.5 mm hole tray. It was observed that the extent of the liquid stagnant zones at the sides of the tray depended on the tray hole diameter and was larger for the 4.5 mm hole tray. The liquid hold-up results show a high liquid hold-up in the areas of the tray with low liquid temperatures; this supports the doubts about the assumption of constant point efficiency across an operating tray. Liquid flow over the outlet weir showed more flow at the centre of the tray at high liquid loading, with low flow at both ends of the weir. The calculated results of the point and tray efficiency models showed a general increase in the calculated point and tray efficiencies with an increase in weir loading; as the flow regime changed from spray to mixed, the point and tray efficiencies increased from approximately 30 to 80%. Through the mixed flow regime the efficiencies remained fairly constant, and as the operating conditions were changed to maintain an emulsified flow regime there was a decrease in the resulting efficiencies. The results of the estimated coefficient of mixing for the small and large hole diameter trays show that the extent of liquid mixing on an operating tray generally increased with increasing capacity factor but decreased with increasing weir loads. This demonstrates that above certain weir loads, the effect of the eddy diffusion mechanism on liquid mixing on an operating tray is negligible.

Relevance:

30.00%

Publisher:

Abstract:

The suitability of a new plastic supporting medium for biofiltration was tested over a three-year period. Tests were carried out on the stability, surface properties, mechanical strength, and dimensions of the medium. There was no evidence to suggest that the medium was deficient in any of these respects. The specific surface (320 m² m⁻³) and the voidage (94%) of the new medium are unlike any other used in biofiltration, and a pilot plant containing two filters was built to observe its effects on ecology and performance. Performance was estimated by chemical analysis, and ecology was studied by film examination and fauna counts. A system of removable sampling baskets was designed to enable samples to be obtained from two intermediate depths of the filter. One of the major operating problems of percolating filters is excessive accumulation of film. The amount of film is influenced by hydraulic and organic load, and each filter was run at a different loading. One was operated at 1.2 m³ m⁻³ day⁻¹ (BOD load 0.24 kg m⁻³ day⁻¹), judged at the time to be the lowest filtration rate to offer advantages over conventional media. The other filter was operated at more than twice this loading, 2.4 m³ m⁻³ day⁻¹ (BOD load 0.55 kg m⁻³ day⁻¹), giving roughly 2.5× and 6× the conventional loadings recommended for a Royal Commission effluent. The amount of film in each filter was normally low (0.05-3 kg m⁻³ as volatile solids) and did not affect efficiency. The evidence collected during the study indicated that the ecology of the filters was normal when compared with data from the literature relating to filters with mineral media. There were indications that full ecological stability was yet to be reached, and that this was affecting the efficiency of the filters. The lower-rate filter produced an average 87% BOD removal, giving a consistent Royal Commission effluent during the summer months. The higher-rate filter produced a mean 83% BOD removal, but at no stage a consistent Royal Commission effluent. From the data on ecology and performance the filters resembled conventional filters rather than high-rate filters.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a number of methodological developments that were raised by a real-life application to measuring the efficiency of bank branches. The advent of internet banking and phone banking is changing the role of bank branches from a predominantly transaction-based one to a sales-oriented role. This fact requires the development of new forms of assessing and comparing branches of a bank. In addition, performance assessment models must also take into account the fact that bank branches are service and for-profit organisations for which providing adequate service quality as well as being profitable are crucial objectives. This study analyses bank branch performance in these new roles in three different areas: their effectiveness in fostering the use of new transaction channels such as the internet and the telephone (transactional efficiency); their effectiveness in increasing sales and their customer base (operational efficiency); and their effectiveness in generating profits without compromising the quality of service (profit efficiency). The chosen methodology for the overall analysis is Data Envelopment Analysis (DEA). The application attempted here required some adaptations of existing DEA models, and indeed some new models, so that certain special features of our data could be handled. These concern the development of models that can account for negative data, models to measure profit efficiency, and models that yield production units with targets nearer to their observed levels than the targets yielded by traditional DEA models. The application of the developed models to a sample of Portuguese bank branches allowed their classification according to the three performance dimensions (transactional, operational, and profit efficiency). It also provided useful insights to bank managers regarding how bank branches compare with one another in terms of performance and how, in general, the three performance dimensions are interrelated.
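DEA in general solves one linear program per unit, but in the special case of a single input and a single output the constant-returns efficiency score collapses to a ratio comparison, which makes the idea easy to see. The sketch below uses entirely hypothetical branch figures, not the thesis's data or models:

```python
# Toy DEA illustration with hypothetical branch figures: with a single
# input and a single output, the constant-returns efficiency of each
# unit reduces to its output/input ratio divided by the best ratio
# observed in the sample.

branches = {                 # branch: (staff_cost, sales), hypothetical
    "A": (100.0, 80.0),
    "B": (120.0, 120.0),
    "C": (90.0, 45.0),
}

best_ratio = max(sales / cost for cost, sales in branches.values())

efficiency = {name: (sales / cost) / best_ratio
              for name, (cost, sales) in branches.items()}
```

Here branch B defines the efficient frontier, and the others are scored relative to it; multi-input, multi-output DEA generalises exactly this comparison via linear programming.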

Relevance:

30.00%

Publisher:

Abstract:

Since 1988, quasi-markets have been introduced into many areas of social policy in the UK; the NHS internal market is one example. Markets operate by price signals, and the NHS internal market, if it is to operate efficiently, requires purchasers and providers to respond to them. The research hypothesis is that cost accounting methods can be developed to enable healthcare contracts to be priced on a cost basis in a manner which will facilitate the achievement of economic efficiency in the NHS internal market. Surveys of hospitals in 1991 and 1994 established the cost methods adopted in deriving the prices for healthcare contracts in the first year of the market and three years on. An in-depth view of the costing-for-pricing process was gained through case studies. Hospitals had inadequate cost information on which to price healthcare contracts at the inception of the internal market: prices did not reflect the relative performance of healthcare providers sufficiently closely to enable the market's espoused efficiency aims to be achieved. Price variations were often due to differing costing approaches rather than to efficiency. Furthermore, price comparisons were often meaningless because of inadequate definition of the services (products). In April 1993, the NHS Executive issued guidance on costing for contracting to all NHS providers in an attempt to improve the validity of price comparisons between alternative providers. The case studies and the 1994 survey show that although price comparison has improved, considerable problems remain. Consistency is not assured, and the problem of adequate product definition is still to be solved. Moreover, the case studies clearly highlight the mismatch of rigid, full-cost pricing rules with both the financial management considerations at local level and the emerging internal market(s). Incentives exist to cost-shift, and healthcare prices can easily be manipulated. In the search for a new health policy paradigm to replace traditional bureaucratic provision, cost-based pricing cannot be used to ensure a more efficient allocation of healthcare resources.

Relevance:

30.00%

Publisher:

Abstract:

Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation, afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the approaches of the historicised case study and the statistical analysis of large populations, which examine the relationship between environment and organisation strategy and/or structure while ignoring the product-process relationship. This study combines the historicised case study with the multi-variable, ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy, and structure, and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period to provide a sector perspective of organisational adaptation.
The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.

Relevance:

30.00%

Publisher:

Abstract:

The contributions of this research fall into three distinct but related areas. The focus of the work is on improving the efficiency of video content distribution in networks that are liable to packet loss, such as the Internet. Initially, the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP) are presented. Since added FEC can be used to reduce the number of retransmissions, the requirement for TCP to deal with any losses is greatly reduced. When real-time applications are needed, delay must be kept to a minimum and retransmissions are not desirable, so a balance between additional bandwidth and delays due to retransmissions must be struck. This is followed by the proposal of a hybrid transport, specifically for H.264 encoded video, as a compromise between the delay-prone TCP and the loss-prone UDP. It is argued that the playback quality at the receiver often need not be 100% perfect, provided a certain level is assured. Reliable TCP is used to transmit and guarantee delivery of the most important packets. The delay associated with the proposal is measured, and its potential as an alternative to the conventional methods of transporting video by either TCP or UDP alone is demonstrated. Finally, a new objective measurement is investigated for assessing the playback quality of video transported using TCP. A new metric is defined to characterise the quality of playback in terms of its continuity. Using packet traces generated from real TCP connections in a lossy environment, the playback of a video can be simulated whilst monitoring buffer behaviour to calculate pause intensity values. Subjective tests are conducted to verify the effectiveness of the metric and show that the objective and subjective scores are closely correlated.
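A buffer-driven playback simulation of the kind described can be sketched in a few lines. This is a deliberately simplified stand-in (the frame trace is hypothetical, and the crude pause ratio below is not the thesis's pause intensity definition): each frame becomes playable when it arrives, and the player stalls whenever the next frame is late.

```python
# Simplified buffer-driven playback simulation (hypothetical frame
# trace; the thesis's pause intensity metric is more elaborate than the
# crude pause ratio computed here).  A frame is playable once it has
# arrived; the player stalls whenever the next frame is late.

def playback_pauses(arrival_times, frame_period):
    """Return (finish_time, total_stall_time) for a simple player."""
    clock = 0.0
    stall = 0.0
    for t in arrival_times:
        if t > clock:             # next frame not buffered yet: stall
            stall += t - clock
            clock = t
        clock += frame_period     # play the frame
    return clock, stall

arrivals = [0.0, 0.1, 0.5, 0.6, 0.7]   # hypothetical arrival times (s)
finish, stall = playback_pauses(arrivals, frame_period=0.1)
pause_ratio = stall / finish            # crude continuity summary
```

Running the same simulation over packet traces from real TCP connections is what allows pause behaviour to be scored objectively without viewing the video.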

Relevance:

30.00%

Publisher:

Abstract:

This thesis consists of two major parts, one determining the masking characteristics of pixel noise and the other investigating the properties of the detection filter employed by the visual system. The theoretical cut-off frequency of white pixel noise can be defined from the size of the noise pixel. The empirical cut-off frequency, i.e. the largest size of noise pixel that mimics the effect of white noise in detection, was determined by measuring contrast energy thresholds for grating stimuli in the presence of spatial noise consisting of noise pixels of various sizes and shapes. The critical, i.e. minimum, number of noise pixels per grating cycle needed to mimic the effect of white noise in detection was found to decrease with the bandwidth of the stimulus. The shape of the noise pixels did not have any effect on the whiteness of pixel noise as long as there was at least the minimum number of noise pixels in all spatial dimensions. Furthermore, the masking power of white pixel noise is best described when the spectral density is calculated by taking into account all the dimensions of the noise pixels, i.e. width, height, and duration, even when there is random luminance in only one of these dimensions. The properties of the detection mechanism employed by the visual system were studied by measuring contrast energy thresholds for complex spatial patterns as a function of area in the presence of white pixel noise. Human detection efficiency was obtained by comparing human performance with that of an ideal detector. The stimuli consisted of band-pass filtered symbols, uniform and patched gratings, and point stimuli with randomised phase spectra. In agreement with the existing literature, detection performance was found to decline with an increasing amount of detail and contour in the stimulus. A measure of image complexity was developed and successfully applied to the data. The accuracy of the detection mechanism seems to depend on the spatial structure of the stimulus and the spatial spread of contrast energy.
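The efficiency comparison mentioned above is a simple ratio of threshold contrast energies; a minimal sketch with hypothetical threshold values:

```python
# Minimal sketch of the efficiency calculation: detection efficiency is
# the contrast energy an ideal detector needs at threshold divided by
# the energy a human observer needs.  Both values below are
# hypothetical, not measurements from the thesis.

def detection_efficiency(e_ideal, e_human):
    """Ideal-to-human ratio of threshold contrast energies."""
    return e_ideal / e_human

eta = detection_efficiency(e_ideal=2.0e-5, e_human=1.0e-4)
```

An efficiency of 1 would mean the human observer uses all the stimulus information that the ideal detector does; real values are well below that and fall further as stimulus complexity grows.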

Relevance:

30.00%

Publisher:

Abstract:

Performance optimization of ultra-long Raman laser links is studied theoretically and experimentally. We demonstrate that it is possible to reduce the signal power excursion by adjusting FBG reflectivity without compromising pump efficiency. Furthermore, we experimentally demonstrate an OSNR improvement of 4.3 dB in our system after 4000 km transmission by switching from conventional erbium-doped fibre amplifiers to quasi-lossless transmission.

Relevance:

30.00%

Publisher:

Abstract:

Financial institutions are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of financial institutions (the banking sector) in GCC countries. Since the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform the evaluation. Furthermore, since the SORM results alone provide limited information for decision makers (bankers, investors, etc.), we proposed a second-stage analysis using the classification and regression (C&R) method, combining the SORM results with other environmental data (financial, economic, and political) to derive rules for efficient banks; the results will thus be useful to bankers seeking to improve their banks' performance and to investors seeking to maximise their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), and under each there are different methods with different assumptions. The parametric approach is based on econometric regression theory, and the nonparametric approach on mathematical linear programming. Under the nonparametric approach there are two methods: Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH). There are three methods under the parametric approach: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA), and Distribution-Free Analysis (DFA). The review shows that DEA and SFA are the most applicable methods in the banking sector, with DEA seemingly the most popular among researchers.
However, DEA, like SFA, still faces many challenges. One of these is how to deal with negative data, since DEA requires the assumption that all input and output values are non-negative, while in many applications negative outputs can appear, e.g. losses in contrast with profits. Although a few DEA models have been developed to deal with negative data, we believe that each has its own limitations; we therefore developed the Semi-Oriented Radial Model (SORM) to handle the negativity issue in DEA. The application results using SORM show that the overall performance of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007), due to the second Gulf War and to the international financial crisis, it remained higher than the efficiency scores of counterparts in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani, and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; these two countries were the most affected by the second Gulf War. The results also show no statistical relationship between operating style (Islamic or conventional) and bank efficiency. Even though the difference is not statistically significant, Islamic banks appear to be more efficient than conventional banks, with an average efficiency score of 86.33% compared with 85.38% for conventional banks. Furthermore, the Islamic banks appear to have been more affected by the political crisis (the second Gulf War), whereas conventional banks appear to have been more affected by the financial crisis.
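The device that makes negative data admissible in SORM-style models can be shown in a minimal sketch (illustrative only; the full SORM linear programs are beyond this summary): a variable that can take either sign is decomposed into two non-negative parts.

```python
# Illustrative only: SORM-style models make negative data admissible by
# splitting each signed variable into two non-negative parts,
# x = x_pos - x_neg, which can then enter a DEA-type linear program
# under the usual non-negativity assumptions.

def split_signed(x):
    """Return (x_pos, x_neg) with x == x_pos - x_neg and both >= 0."""
    return (x, 0.0) if x >= 0 else (0.0, -x)

profits = [12.5, -3.2, 0.0]                 # hypothetical bank profits
parts = [split_signed(p) for p in profits]
```

Treating the two parts with different orientations in the model is what lets a loss-making bank be projected towards profit without violating non-negativity.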

Relevance:

30.00%

Publisher:

Abstract:

Using a novel modeling approach and cross-country firm-level data for the textiles industry, we examine the impact of institutional quality on firm performance. Our methodology allows us to estimate the marginal impact of institutional quality on the productivity of each firm. Our results bring into question conventional wisdom about the desirable characteristics of market institutions, which is based on empirical evidence about the impact of institutional quality on the average firm. We demonstrate, for example, that once both the direct impact of a change in institutional quality on total factor productivity and the indirect impact through changes in the efficiency of use of factor inputs are taken into account, an increase in labor market rigidity may have a positive impact on firm output, at least for some firms. We also demonstrate that there are significant intra-country variations in the marginal impact of institutional quality, such that the characteristics of “winners” and “losers” will have to be taken into account before policy is introduced to change institutional quality in any direction.