33 results for Design efficiency
in Aston University Research Archive
Abstract:
Presents a simulation study of the costing of police custody operations at a UK police force. The custody operation incorporates the arrest, booking-in, interview, detention and court appearance activities. The Activity Based Costing (ABC) approach is used as a framework to show how costs are generated by the three “drivers” of cost, activity and resource. These relate to the design efficiency of the process, the timing and mix of demand on the process and the cost of resources used to undertake the process respectively. The use of discrete-event simulation allows the incorporation of dynamic (time-dependent) and stochastic (variability) elements in the cost analysis. This enables both the amount and timing of the use of capacity and the generation of cost to be established. The concept of committed and flexible resources directs management decisions to the redeployment of unused capacity or alternatively the identification of additional capacity requirements.
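As a rough illustration of how discrete-event simulation can attribute cost to used and unused (committed) capacity, the following minimal Python sketch simulates a single booking-in activity. All arrival rates, service times and cost figures are invented for illustration and are not taken from the study.

```python
import heapq, random

# Minimal discrete-event sketch of a custody "booking-in" activity.
# All parameters below are invented, illustrative values.
random.seed(1)
N_OFFICERS = 2          # committed resource capacity
RATE_PER_HOUR = 30.0    # assumed hourly cost of a custody officer
MEAN_ARRIVAL = 0.5      # hours between arrests
MEAN_BOOKING = 0.7      # hours per booking-in

def simulate(horizon=1000.0):
    t, free, queue, busy_time = 0.0, N_OFFICERS, 0, 0.0
    events = [(random.expovariate(1 / MEAN_ARRIVAL), "arrive")]
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrive":
            queue += 1
            heapq.heappush(events, (t + random.expovariate(1 / MEAN_ARRIVAL), "arrive"))
        else:                       # a booking finishes, freeing an officer
            free += 1
        if queue and free:          # start a booking when an officer is free
            queue -= 1
            free -= 1
            service = random.expovariate(1 / MEAN_BOOKING)
            busy_time += service
            heapq.heappush(events, (t + service, "done"))
    used_cost = busy_time * RATE_PER_HOUR
    committed_cost = horizon * N_OFFICERS * RATE_PER_HOUR
    return used_cost, committed_cost - used_cost

used, unused = simulate()
print(f"cost of used capacity: {used:.0f}; cost of unused (redeployable) capacity: {unused:.0f}")
```

The split between used and unused cost is what directs the management decision the abstract mentions: redeploying idle committed capacity versus identifying additional capacity needs.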
Abstract:
With the competitive challenge facing business today, the need to keep costs down and quality up is a matter of survival. One way in which wire manufacturers can meet this challenge is to possess a thorough understanding of deformation, friction and lubrication during the wire drawing process, and therefore to make good decisions regarding the selection and application of lubricants as well as the die design. Friction, lubrication and die design during wire drawing are thus the subject of this study. Although theoretical and experimental investigations have been carried out ever since the establishment of wire drawing technology, many problems remain unsolved. It is therefore necessary to conduct further research on traditional and fundamental subjects such as the mechanics of deformation, friction, lubrication and die design in wire drawing. Drawing experiments were carried out on an existing bull-block under different cross-sectional area reductions, different speeds and different lubricants. Instrumentation to measure drawing load and drawing speed was set up and connected to the wire drawing machine, together with a data acquisition system. A die box connected to the existing die holder for using dry soap lubricant was designed and tested. The experimental results, in the form of drawing stress versus percentage area reduction curves under different drawing conditions, were analysed and compared. The effects of friction, lubrication, drawing speed and the pressure die nozzle on drawing stress are discussed. In order to determine the flow stress of the material during deformation, tensile tests were performed on an Instron universal test machine, using the wires drawn under different area reductions. A polynomial function is used to correlate the flow stress of the material with the plastic strain, and a general computer program was written to determine the coefficients of the stress–strain function. The residual lubricant film on the steel wire after drawing was examined both radially and longitudinally using an SEM and an optical microscope. The lubricant film on the drawn wire was clearly observed; micro-analysis by SEM therefore provides a means of assessing friction and lubrication in wire drawing.
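The polynomial flow-stress correlation described above can be reproduced in a few lines; the sketch below fits a flow stress versus plastic strain curve to tensile-test data using numpy. The data points and polynomial degree are invented assumptions, not values from the thesis.

```python
import numpy as np

# Illustrative flow-stress fit: correlate flow stress (MPa) with plastic
# strain using a polynomial. The data points below are invented.
strain = np.array([0.05, 0.10, 0.20, 0.35, 0.50])
stress = np.array([420., 480., 545., 600., 635.])    # MPa

coeffs = np.polyfit(strain, stress, deg=3)           # sigma = c3*e^3 + c2*e^2 + c1*e + c0
sigma = np.poly1d(coeffs)

print("coefficients:", np.round(coeffs, 1))
print("flow stress at strain 0.25: %.1f MPa" % sigma(0.25))
```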
Abstract:
We propose a novel approach to ultra-narrow optical filtering based on a specially designed, slightly asymmetric filter which can be fabricated using fibre Bragg gratings. The feasibility of 8×40 Gbit/s DWDM RZ transmission with 0.8 bit/s/Hz spectral efficiency (without polarisation multiplexing) over 1280 km of an SMF/DCF link without FEC has been confirmed by numerical modelling. © 2004 Elsevier Inc. All rights reserved.
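The quoted spectral efficiency is consistent with the 40 Gbit/s channel rate on what is implied to be a 50 GHz channel grid (the spacing is inferred from the stated figures, not given explicitly):

```latex
\[
\eta = \frac{R_{\text{channel}}}{\Delta f}
     = \frac{40\ \text{Gbit/s}}{50\ \text{GHz}}
     = 0.8\ \text{bit/s/Hz}.
\]
```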
Abstract:
Supply chain operations directly affect service levels. Decisions on amending facilities are generally based on overall cost, leaving out the efficiency of each unit. By decomposing the supply chain superstructure, efficiency analysis of the facilities (warehouses or distribution centers) that serve customers can be easily implemented. With the proposed algorithm, the selection of a facility is based on service level maximization and not just cost minimization, as the analysis filters all the feasible solutions using the Data Envelopment Analysis (DEA) technique. Through multiple iterations, solutions are filtered via DEA and only the efficient ones are selected, leading to cost minimization. In this work, the problem of optimal supply chain network design is addressed with a DEA-based algorithm. A Branch and Efficiency (B&E) algorithm is deployed for the solution of this problem. In this DEA approach, each solution (a potentially installed warehouse, plant, etc.) is treated as a Decision Making Unit and is thus characterized by inputs and outputs. Through additional constraints named “efficiency cuts”, the algorithm selects only efficient solutions providing better objective function values. The applicability of the proposed algorithm is demonstrated through illustrative examples.
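For readers unfamiliar with the DEA building block, the sketch below computes input-oriented CCR efficiency scores for candidate facilities using scipy. The data are invented toy values, and the B&E branching and "efficiency cuts" themselves are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR DEA score for one DMU (envelopment form):
#   min theta  s.t.  sum_j lam_j x_j <= theta x_k,  sum_j lam_j y_j >= y_k,  lam >= 0
# Toy data: 4 candidate facilities, 2 inputs (e.g. cost, capacity used),
# 1 output (e.g. service level). All values invented.
X = np.array([[4., 3.], [7., 3.], [8., 1.], [4., 2.]])   # inputs, one row per DMU
Y = np.array([[1.], [1.], [1.], [1.]])                   # outputs

def ccr_efficiency(k):
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1); c[0] = 1.0                      # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])         # X^T lam - theta x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])          # -Y^T lam <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                                       # optimal theta in (0, 1]

for k in range(len(X)):
    print(f"facility {k}: efficiency = {ccr_efficiency(k):.3f}")
```

In the B&E scheme described above, scores like these would be used to discard inefficient candidate facilities during the branch-and-bound search.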
Abstract:
Purpose – The data used in this study cover the period 1980-2000. Almost midway through this period (in 1992), the Kenyan government liberalized the sugar industry and the role of the market increased, while the government's role with respect to control of prices, imports and other aspects of the sector declined. This exposed the local sugar manufacturers to external competition from other sugar producers, especially from the COMESA region. This study aims to determine whether there were any changes in efficiency of production between the two periods (pre- and post-liberalization). Design/methodology/approach – The study utilized two methodologies for efficiency estimation: data envelopment analysis (DEA) and the stochastic frontier. DEA uses mathematical programming techniques and does not impose any functional form on the data; however, it attributes all deviation from the frontier to inefficiency. The stochastic frontier utilizes econometric techniques. Findings – The test for structural differences between the two periods does not show any statistically significant differences. However, both methodologies show a decline in efficiency levels from 1992, with the lowest level experienced in 1998. From then on, efficiency levels began to increase. Originality/value – To the best of the authors' knowledge, this is the first paper to use both methodologies in the sugar industry in Kenya. It is shown that in industries where the noise (error) term is minimal (such as manufacturing), DEA and the stochastic frontier give similar results.
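For reference, the stochastic frontier decomposes deviations into noise and inefficiency, which is precisely what DEA does not do. The standard composed-error form (the paper's exact specification may differ) is:

```latex
\[
\ln y_i = f(x_i;\beta) + v_i - u_i,
\qquad v_i \sim N(0,\sigma_v^2),
\qquad u_i \ge 0,
\]
```

where \(v_i\) is statistical noise and \(u_i\) the one-sided inefficiency term. When the noise component is small, both methods attribute roughly the same deviations to inefficiency, consistent with the similar results reported above.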
Abstract:
The purpose of this research is to propose a procurement system that works across disciplines and retrieves information from the relevant parties, so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyze data with an agent-based procurement system (APS) to re-engineer and improve the existing procurement process. The intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers, and evaluating the performance of suppliers against the selection criteria using a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient data collection with high accuracy is one of the key success factors for quality procurement, that is, purchasing the right material at the right quality from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) has been conducted. APS supports data and information analysis techniques to facilitate decision making, such that the agent can enhance negotiation and supplier evaluation efficiency, saving time and cost.
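A common form for such a supplier-evaluation step is a weighted scoring model; the Python sketch below is illustrative only, and the criteria, weights and ratings are invented rather than those of the APS.

```python
# Illustrative weighted-score supplier evaluation, one common form of
# supplier-selection "mathematical model". All values below are invented.
criteria_weights = {"price": 0.4, "quality": 0.35, "delivery": 0.25}

suppliers = {
    "Supplier A": {"price": 0.8, "quality": 0.9,  "delivery": 0.7},
    "Supplier B": {"price": 0.9, "quality": 0.7,  "delivery": 0.8},
    "Supplier C": {"price": 0.6, "quality": 0.95, "delivery": 0.9},
}

def score(ratings):
    # ratings are normalized to [0, 1]; higher is better for every criterion
    return sum(criteria_weights[c] * r for c, r in ratings.items())

for name in sorted(suppliers, key=lambda s: score(suppliers[s]), reverse=True):
    print(f"{name}: {score(suppliers[name]):.3f}")
```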
Abstract:
The key to the use of polymersomes as effective molecular delivery systems lies in the ability to design processing routes that can efficiently encapsulate the molecular payload. We have evaluated various surface rehydration mechanisms for encapsulation, in each case characterizing the morphologies formed using DLS and confocal microscopy, as well as determining the encapsulation efficiency for the hydrophilic dye Rhodamine B. In contrast to bulk methods, where the encapsulation efficiencies are low, we find that higher efficiencies can be obtained by the rehydration of thin films. We relate these results to the non-equilibrium mechanisms that underlie vesicle formation and discuss how an understanding of these mechanisms can help optimize encapsulation efficiencies. Our conclusion is that, even considering the good encapsulation efficiency, surface methods remain unsuitable for the massive scale-up needed for commercial mass-market molecular delivery scenarios. However, targeting more specialized applications for high-value ingredients (such as pharmaceuticals) may be more feasible.
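Encapsulation efficiency in such studies is typically defined as follows (the paper's exact assay may differ):

```latex
\[
EE\ (\%) = \frac{m_{\text{encapsulated}}}{m_{\text{total added}}} \times 100,
\]
```

where \(m_{\text{encapsulated}}\) is the mass of Rhodamine B retained inside the vesicles after free dye is removed.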
Abstract:
The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal, and a disk and doughnut quench column combined with a wet-walled electrostatic precipitator, which is directly mounted on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling of the reaction system was undertaken, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and the reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr. This is an area that should be considered in the future, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry wood fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and if longer operating runs had been attempted to offset the product losses that occur because of the difficulty of collecting all available product from a large scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency of above 99% on a mass basis, and this was validated experimentally, since a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit enabled mass measurements of the condensable product exiting the product collection unit, confirming a collection efficiency in excess of 99% on a mass basis.
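A back-of-envelope check relates the reported ablation rate to mass throughput. In the Python sketch below, only the ablation rate comes from the work; the wood density and hot-surface contact area are assumptions for illustration.

```python
# Back-of-envelope throughput from the reported ablation rate.
# Density and contact area below are assumed values, not from the thesis.
ablation_rate = 0.63e-3      # m/s, reported for pine at 525 degC, 12.1 m/s relative velocity
wood_density = 450.0         # kg/m^3, assumed dry pine density
contact_area = 0.003         # m^2, assumed particle/hot-surface contact area

mass_rate = wood_density * ablation_rate * contact_area   # kg/s ablated
print(f"throughput ~ {mass_rate * 3600:.1f} kg/hr")       # ~3.1 kg/hr with these numbers
```

With these assumed values the estimate lands near the 2.3 kg/hr actually achieved, which illustrates why throughput scales directly with the contact area the blades can sweep.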
Abstract:
In this paper we present a novel method for emulating a stochastic, or random output, computer model and show its application to a complex rabies model. The method is evaluated both in terms of accuracy and computational efficiency on synthetic data and the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian process based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and better understanding of the complex behaviour of the rabies model.
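The Mahalanobis error measure referred to above takes its standard form (the paper may use a variant):

```latex
\[
D^2 = (\mathbf{y} - \boldsymbol{\mu})^{\mathsf{T}}\, \boldsymbol{\Sigma}^{-1}\, (\mathbf{y} - \boldsymbol{\mu}),
\]
```

where \(\mathbf{y}\) holds held-out simulator outputs and \(\boldsymbol{\mu}\), \(\boldsymbol{\Sigma}\) are the emulator's predictive mean and covariance; for a well-calibrated Gaussian emulator, \(D^2\) should be comparable to the number of validation points (its expectation under a correctly specified Gaussian predictive distribution).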
Abstract:
Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that, to properly represent the kind of delays that give rise to support department constraints, a model should actually portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation be used to model design decisions, but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that will overcome the limitations of existing software and so enable decision-makers to conduct a more holistic evaluation of design decisions. It is argued that the application of object-oriented techniques offers a potentially better way of addressing both the functional and ease-of-use requirements of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented, which extends to modelling a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models created using a working version of WBS/Office that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
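To illustrate the object-oriented modelling idea, whereby shop-floor jobs must wait on support-department delays before processing can begin, here is a minimal Python sketch. The class names, attributes and delay values are invented and are not those of WBS/Office.

```python
import random

# Illustrative object-oriented sketch: a job's lead time includes delays
# in non-manufacturing support departments, not just machining time.
random.seed(2)

class SupportDepartment:
    """A non-manufacturing function that delays jobs (e.g. order processing)."""
    def __init__(self, name, mean_delay):
        self.name, self.mean_delay = name, mean_delay
    def process(self, job):
        job.elapsed += random.expovariate(1 / self.mean_delay)

class Job:
    def __init__(self, name, machining_time):
        self.name, self.machining_time, self.elapsed = name, machining_time, 0.0
    def run(self, departments):
        for d in departments:        # paperwork must clear each department first
            d.process(self)
        self.elapsed += self.machining_time
        return self.elapsed

office = [SupportDepartment("order entry", 2.0), SupportDepartment("planning", 4.0)]
job = Job("batch-001", machining_time=3.0)
print(f"{job.name} lead time: {job.run(office):.1f} h (machining alone: {job.machining_time} h)")
```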
Abstract:
This industrial based research project was undertaken for British Leyland and arose as a result of poor system efficiency on the Maxi and Marina vehicle body build lines. The major factors in the deterioration of system efficiency were identified as: a) the introduction of a 'Gateline' system of vehicle body build; b) the degeneration of a newly introduced measured daywork payment scheme. By relating the conclusions of past work on payment systems to the situation at Cowley, it was concluded that a combination of poor industrial relations and a lack of managerial control had caused the measured daywork scheme to degenerate into a straightforward payment for time at work. This eliminated the monetary incentive to achieve schedule, with the consequence that both inefficiency and operating costs increased. To analyse further the causes of inefficiency, a study of Marina gateline stoppage logs was carried out. This revealed that poor system efficiency on the gateline was caused more by the nature of its design than by poor reliability of individual items of plant. The consideration given to system efficiency at the design stage was found to be negligible, the main obstacles being: a) a lack of understanding of the influence of certain design factors on the efficiency of a production line; b) the absence of data and techniques to predict system efficiency at the design stage. To remedy this situation, a computer simulation study of the design factors was carried out, from which relationships with system efficiency were established and empirical efficiency equations developed. Sets of tables were compiled from the equations, and efficiency data relevant to vehicle body building were established from the gateline stoppage logs. Computer simulation, the equations and the tables, when used in conjunction with good efficiency data, are shown to be accurate methods of predicting production line system efficiency.
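In the spirit of the simulation study described above, the following Python sketch estimates system efficiency for an unbuffered serial line from per-station stoppage behaviour. Every parameter is invented for illustration; the thesis derived its figures from the gateline stoppage logs.

```python
import random

# Monte Carlo sketch of production-line system efficiency.
# All stoppage parameters below are invented, illustrative values.
random.seed(3)
CYCLE = 1.0            # minutes of running time per body per cycle
P_STOP = 0.005         # probability a station stops on any given cycle
MEAN_REPAIR = 5.0      # minutes to clear a stoppage
N_STATIONS = 20        # unbuffered stations: one stoppage halts the whole line

def line_efficiency(n_cycles=100_000):
    run_time = lost_time = 0.0
    for _ in range(n_cycles):
        run_time += CYCLE
        for _ in range(N_STATIONS):
            if random.random() < P_STOP:
                lost_time += random.expovariate(1 / MEAN_REPAIR)
    return run_time / (run_time + lost_time)

print(f"estimated system efficiency: {line_efficiency():.1%}")
```

Relationships like the ones the thesis tabulates emerge by re-running such a model while varying the design factors (station count, stoppage rates, repair times).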
Abstract:
Background: We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific-purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives: The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods: Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results: The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content-based design outperforms the traditional VLE-based design. © 2011 Wessa et al.
Abstract:
Three novel solar thermal collector concepts derived from the Linear Fresnel Reflector (LFR) are developed and evaluated through a multi-criteria decision-making methodology, comprising the following techniques: Quality Function Deployment (QFD), the Analytical Hierarchy Process (AHP) and the Pugh selection matrix. Criteria are specified by technical and customer requirements gathered from Gujarat, India. The concepts are compared to a standard LFR for reference, and as a result, a novel 'Elevation Linear Fresnel Reflector' (ELFR) concept using elevating mirrors is selected. A detailed version of this concept is proposed and compared against two standard LFR configurations, one using constant and the other using variable horizontal mirror spacing. Annual performance is analysed for a typical meteorological year. Financial assessment is made through the construction of a prototype. The novel LFR has an annual optical efficiency of 49% and increases exergy by 13-23%. Operational hours above a target temperature of 300 °C are increased by 9-24%. A 17% reduction in land usage is also achievable. However, the ELFR suffers from additional complexity and a 16-28% increase in capital cost. It is concluded that this novel design is particularly promising for industrial applications and locations with restricted land availability or high land costs. The decision analysis methodology adopted is considered to have a wider potential for applications in the fields of renewable energy and sustainable design. © 2013 Elsevier Ltd. All rights reserved.
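The AHP step in such a methodology derives criterion weights from the principal eigenvector of a pairwise-comparison matrix (Saaty's method). The Python sketch below uses an invented example matrix, not the study's actual judgements.

```python
import numpy as np

# AHP priority vector via the principal eigenvector of a pairwise-comparison
# matrix. The matrix below is an invented example comparing three criteria,
# e.g. optical efficiency, land usage and capital cost.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                     # normalized priority weights

CI = (eigvals[k].real - len(A)) / (len(A) - 1)   # consistency index
print("weights:", np.round(w, 3), " CI = %.3f" % CI)
```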
Abstract:
This paper describes the strategies used by AstonCAT-Plus, the post-tournament version of the specialist designed for the TAC Market Design Tournament 2010. It details how AstonCAT-Plus accepts shouts, clears the market, sets transaction prices and charges fees. Through empirical evaluation, we show that AstonCAT-Plus not only outperforms AstonCAT (the tournament version) significantly but also achieves the second best overall score against some top entrants of the competition. In particular, it achieves the highest allocative efficiency, transaction success rate and average trader profit among all the specialists in our controlled experiments.
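Allocative efficiency in this setting is commonly defined as the surplus actually extracted by executed trades relative to the maximum attainable surplus (a standard definition in the double-auction literature, not quoted from the paper):

```latex
\[
E_a \;=\; \frac{\sum_{(b,s)\ \text{executed}} (v_b - v_s)}
               {\displaystyle\max_{\text{matchings}} \sum_{(b,s)} (v_b - v_s)},
\]
```

where \(v_b\) and \(v_s\) are the private values of the matched buyer and seller, so \(E_a = 1\) corresponds to the competitive-equilibrium allocation.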
Abstract:
Based on the rate equations describing the operation of Er3+, Pr3+-codoped ZBLAN fiber lasers with different pump configurations, theoretical calculations relating to the population characteristics and the optimization of CW operation of high-power Er3+, Pr3+:ZBLAN double-clad fiber lasers are presented. Using the measured ET (energy-transfer), ETU (energy-transfer-upconversion) and CR (cross-relaxation) parameters relevant to Er3+, Pr3+-codoped ZBLAN, good agreement between the theoretical results from the model and recently reported experimental measurements is obtained. The effects on the slope efficiency of a number of laser parameters, including fiber length, reflectance of the output mirror and pumping configuration, are quantitatively analyzed and used for the design and optimization of high-power Er3+, Pr3+-codoped ZBLAN fiber lasers.
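As a schematic of the rate-equation approach (not the actual multi-level Er3+/Pr3+ model), the Python sketch below integrates a toy two-level system with an upconversion loss term; every parameter value is invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Schematic two-level rate equation with an energy-transfer-upconversion
# (ETU) loss term. Toy model only; the real Er3+/Pr3+ system has many more
# levels, and every parameter below is invented.
R_pump = 5e3      # 1/s, pump rate into the upper level
tau = 8e-3        # s, upper-level lifetime
W_etu = 1e-22     # m^3/s, upconversion coefficient
N_total = 1e26    # 1/m^3, dopant density

def rate(t, y):
    n2 = y[0]                                  # upper-level population density
    n1 = N_total - n2
    return [R_pump * n1 - n2 / tau - W_etu * n2**2]

sol = solve_ivp(rate, (0.0, 0.05), [0.0], method="LSODA")
print(f"steady-state upper-level fraction ~ {sol.y[0, -1] / N_total:.2f}")
```

Steady-state populations obtained this way feed directly into slope-efficiency estimates as a function of fiber length, mirror reflectance and pump configuration, which is the optimization the abstract describes.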