941 results for 290601 Chemical Engineering Design
Abstract:
A Lewis acidic chlorogallate(III) ionic liquid, 1-ethyl-3-methylimidazolium heptachlorodigallate(III), [C2mim][Ga2Cl7], was successfully used to oligomerise 1-pentene. The influence of temperature, time, catalyst concentration and stirring rate on conversion and product distribution was modelled using a design of experiments (DoE, chemometrics) approach. The process was optimised for the production of lubricant base oils: the C20-C50 fraction (where Cn indicates the number of carbons in the oligomer) was maximised, while the heavier oligomer fraction (>C50) was minimised.
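A minimal sketch of the DoE approach described above, assuming a hypothetical two-factor coded design and a synthetic quadratic response (the study's actual factor levels and conversion data are not reproduced here):

```python
import numpy as np

# Hypothetical two-factor DoE for a response surface:
# x1 = temperature (coded -1..1), x2 = catalyst concentration (coded -1..1).
# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2

def design_matrix(x1, x2):
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# 3x3 full factorial design points in coded units
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], float)
x1, x2 = pts[:, 0], pts[:, 1]

# Synthetic "conversion" generated from an assumed true surface (noise-free,
# so least squares should recover the coefficients exactly)
true_b = np.array([80.0, 5.0, 3.0, -1.5, -4.0, -2.0])
y = design_matrix(x1, x2) @ true_b

# Fit the quadratic model by ordinary least squares
b, *_ = np.linalg.lstsq(design_matrix(x1, x2), y, rcond=None)
```

With real data the residuals would quantify lack of fit, and the fitted surface would be maximised over the factor ranges to locate the optimum.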
Abstract:
The advantage of using an available and abundant residual biomass, such as lignin, as a raw material for activated carbons is that it adds economic interest to the technical studies. In the current investigation, a more complete understanding of the adsorption of Cr(VI) from aqueous systems onto H3PO4-activated lignin has been achieved via microcolumns operated under various process conditions. Microcolumns are well suited to defining adsorption parameters and to screening a large number of potential adsorbents. The effects of solution pH (2-8), initial metal ion concentration (0.483-1.981 mmol·L-1), flow rate (1.0-3.1 cm3·min-1), ionic strength (0.01-0.30 mmol·L-1) and adsorbent mass (0.11-0.465 g) on Cr(VI) adsorption were studied by assessing the microcolumn breakthrough curve. The microcolumn data were fitted with the Thomas model, the modified Dose model and the BDST model. As expected, the adsorption capacity increased with initial Cr(VI) concentration. High linear flow rates, pH values and ionic strength led to early breakthrough of Cr(VI). The model constants obtained in this study can be used for the design of pilot-scale adsorption processes. © 2012 Chemical Industry and Engineering Society of China (CIESC) and Chemical Industry Press (CIP).
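The Thomas model used above has a closed form that is straightforward to evaluate. The sketch below uses hypothetical parameter values, not the constants fitted in the study:

```python
import math

def thomas_ct_over_c0(t, k_th, q0, m, Q, c0):
    """Thomas model breakthrough curve:
    Ct/C0 = 1 / (1 + exp(k_th*q0*m/Q - k_th*c0*t))"""
    return 1.0 / (1.0 + math.exp(k_th * q0 * m / Q - k_th * c0 * t))

# Hypothetical parameters (illustrative only, not the paper's data):
k_th = 0.05   # Thomas rate constant, L/(mmol*min)
q0   = 1.2    # equilibrium uptake, mmol/g
m    = 0.3    # adsorbent mass, g
Q    = 0.002  # flow rate, L/min
c0   = 1.0    # feed concentration, mmol/L

# The curve passes through Ct/C0 = 0.5 at t = q0*m/(c0*Q),
# i.e. when the cumulative feed equals the bed's total capacity
t50 = q0 * m / (c0 * Q)
```

Fitting k_th and q0 to measured breakthrough data (e.g. by nonlinear least squares) yields the constants that the abstract says can be carried over to pilot-scale design.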
Abstract:
This paper presents a thorough investigation of combined allocator design for Networks-on-Chip (NoC). In particular, we discuss the interlock of the combined NoC allocator, which is caused by the lock mechanism of priority updating between the local and global arbiters. Architectures and implementations of three interlock-free combined allocators are presented in detail. Their cost, critical path and network-level performance are evaluated based on 65-nm standard cell technology.
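Combined NoC allocators are built from round-robin arbiters whose priority pointers are the source of the interlock discussed above. A minimal behavioural sketch of a single round-robin arbiter (illustrative only, not the paper's 65-nm RTL design):

```python
class RoundRobinArbiter:
    """Behavioural model of a round-robin arbiter, the building block
    of separable/combined NoC allocators (illustrative sketch)."""

    def __init__(self, n):
        self.n = n
        self.prio = 0  # input index with highest priority this cycle

    def grant(self, requests):
        # requests: list of bools, one per input port
        for i in range(self.n):
            idx = (self.prio + i) % self.n
            if requests[idx]:
                # Update the priority pointer only after a grant, so the
                # arbiter rotates fairly among active requesters
                self.prio = (idx + 1) % self.n
                return idx
        return None  # no request this cycle
```

In a combined allocator, the local and global arbiters' pointer updates must be coordinated; the paper's interlock arises precisely from the locking between those two update steps.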
Abstract:
Background: In this study, the efficiency of Guar gum, a biopolymer, has been compared with that of two widely used inorganic coagulants, ferric chloride (FeCl3) and aluminum chloride (AlCl3), for the treatment of effluent collected from the rubber-washing tanks of a rubber concentrate factory. Settling velocity distribution curves were plotted to demonstrate the flocculating effect of FeCl3, AlCl3 and Guar gum. FeCl3 and AlCl3 displayed better turbidity removal than Guar gum at all settling velocities.
Result: FeCl3, AlCl3 and Guar gum removed 92.8%, 88.2% and 88.1% of the turbidity of raw wastewater, respectively, at a settling velocity of 0.1 cm·min-1. A scanning electron microscopy (SEM) study of the flocs revealed that Guar gum and FeCl3 produced strong, intercoiled, honeycomb-patterned floc structures capable of entrapping suspended particulate matter. Response Surface Methodology (RSM), a statistical experimental design technique, was used to design all experiments, with the type and dosage of flocculant, pH and mixing speed taken as control factors, and an optimum operational setting was proposed.
Conclusion: Because Guar gum is biodegradable, its use as a flocculating agent for industrial wastewater treatment is highly recommended.
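A settling-velocity distribution curve of the kind plotted in this study can be derived from jar-test data: supernatant turbidity sampled at depth h after time t reflects particles settling slower than v = h/t. The sketch below uses hypothetical measurements, not the paper's data:

```python
# Hypothetical jar-test readings: residual turbidity of supernatant
# sampled at a fixed depth after increasing settling times.
h_cm = 10.0                        # sampling depth below surface, cm
times_min = [5, 10, 20, 50, 100]   # sampling times, min
ntu = [48, 30, 18, 9, 4]           # residual turbidity at each time, NTU
raw_ntu = 120.0                    # turbidity of raw wastewater, NTU

# Each sample corresponds to settling velocity v = h / t; the residual
# fraction gives the distribution curve, and (1 - fraction) the removal.
curve = [(h_cm / t, n / raw_ntu) for t, n in zip(times_min, ntu)]
removal = [(v, 100.0 * (1.0 - f)) for v, f in curve]
```

At v = 0.1 cm·min-1 (the velocity quoted in the Result section) this synthetic data set gives about 96.7% removal; the study's measured values for FeCl3, AlCl3 and Guar gum were 92.8%, 88.2% and 88.1%.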
Abstract:
This article examines how the primary objective of validation, whether it is proving a model, a technology or a product, influences the engineering design process. Through the examination of a number of stiffened panel case studies, the relationships between simulation, validation, design and the final product are established and discussed. The work demonstrates the complex interactions between the original (or anticipated) design model, the analysis model, the validation activities and the product in service. The outcome clearly shows some unintended consequences. High-fidelity validation test simulations require a different set of detailed parameters to accurately capture behaviour. In doing so, they diverge from the original computer-aided design model, intrinsically limiting the value of the validation with respect to the product. This work represents a shift from the traditional perspective of encapsulating and controlling errors between simulation and experimental test to consideration of the wider design-test process. Specifically, it is a reflection on the implications of how models are built and validated, and the effect on results and understanding of structural behaviour. This article then identifies key checkpoints in the design process and how these should be used to update the computer-aided design system parameters for a design. This work strikes at a fundamental challenge in understanding the interaction between design, certification and operation of any complex system.
Abstract:
Cascade control is one of the most widely used control strategies in industrial processes because it can dramatically improve the performance of single-loop control, reducing both the maximum deviation and the integral error of the disturbance response. Most existing performance assessment methods for cascade control loops assume that all disturbances follow a Gaussian distribution. In practice, however, disturbances entering through the manipulated variable or the upstream loop often exhibit non-Gaussian behaviour. In this paper, a general and effective performance assessment index for cascade control systems subject to disturbances of unknown distribution is proposed. As in minimum variance control (MVC) design, the output variances of the primary and secondary loops are decomposed into a cascade-invariant and a cascade-dependent term, but the ARMA model for the cascade control loop is estimated using the minimum entropy criterion, instead of the minimum mean-square error, to handle non-Gaussian disturbances. Unlike the MVC index, the proposed control performance index is based on information theory and the minimum entropy criterion. The index is informative and agrees with expected control knowledge. To demonstrate the wide applicability and effectiveness of the minimum entropy cascade control index, a simulation problem and a cascade control case from an oil refinery are studied. A comparison with MVC-based cascade control assessment is also included.
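For the Gaussian baseline that this paper generalises, the minimum-variance (Harris-type) index has a simple form: with process time delay d, the first d closed-loop impulse-response coefficients are feedback-invariant, so their variance contribution is the best achievable. A sketch with a hypothetical impulse response:

```python
import numpy as np

# Hypothetical closed-loop impulse response y_t = sum_i psi_i * e_{t-i}
# (in practice psi is obtained from an ARMA model fitted to routine
# operating data; these numbers are illustrative only).
psi = np.array([1.0, 0.8, 0.5, 0.3, 0.15, 0.05])
d = 2  # assumed process time delay, in samples

# The first d terms are feedback-invariant: no controller can remove them.
sigma_y2 = np.sum(psi**2)        # total output variance (unit noise variance)
sigma_mv2 = np.sum(psi[:d]**2)   # minimum-variance benchmark
eta = sigma_mv2 / sigma_y2       # performance index: 1 = minimum variance
```

The paper's contribution replaces this variance-based benchmark with an entropy-based one, so that the index remains meaningful when the driving disturbances are non-Gaussian.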
Abstract:
In this study thermodynamically stable dispersions of amorphous quinine, a model BCS class 2 therapeutic agent, within an amorphous polymeric platform (HPC), termed a solid-in-solid dispersion, were produced using hot melt extrusion. Characterisation of the pre-extrudates and extrudates was performed using hyper-differential scanning calorimetry (DSC), powder X-ray diffraction (PXRD) and Raman spectroscopy. Water uptake by the raw materials was determined using dynamic vapour sorption (DVS) analysis. Furthermore, the presence or absence of crystalline drug following storage at 25 °C/60% relative humidity and 40 °C/75% relative humidity in a sealed glass jar, and at 40 °C/75% relative humidity in an open glass jar for 3 months was determined using PXRD. Amorphous quinine was generated in situ during extrusion from both quinine base (5%, 10%, 20% w/w drug loading) and from quinine hydrochloride (5%, 10% w/w drug loading) and remained thermodynamically stable as a solid-in-solid dispersion within the HPC extrudates. When processed with HPC, quinine hydrochloride (20% w/w) was converted to amorphous quinine hydrochloride. Whilst stable for up to 3 months when stored under sealed conditions, this amorphous form was unstable, resulting in recrystallisation of the hydrochloride salt following storage for 1 month at 40 °C/75% relative humidity in an open glass jar. The behaviour of the amorphous quinine hydrochloride (20% w/w) HPC extrudate was related, at least in part, to the lower stability and the hygroscopic properties of this amorphous form.
Abstract:
In this work, a highly instrumented single screw extruder has been used to study the effect of polymer rheology on the thermal efficiency of the extrusion process. Three different molecular weight grades of high density polyethylene (HDPE) were extruded at a range of conditions. Three geometries of extruder screws were used at several set temperatures and screw rotation speeds. The extruder was equipped with real-time quantification of energy consumption; thermal dynamics of the process were examined using thermocouple grid sensors at the entrance to the die. Results showed that polymer rheology had a significant effect on process energy consumption and thermal homogeneity of the melt. The highest specific energy consumption and poorest homogeneity were observed for the highest viscosity grade of HDPE. Extruder screw geometry, set extrusion temperature and screw rotation speed were also found to have a direct effect on energy consumption and melt consistency. In particular, specific energy consumption was lower using a barrier flighted screw compared to single flighted screws at the same set conditions. These results highlight the complex nature of extrusion thermal dynamics and provide evidence that rheological properties of the polymer can significantly influence the thermal efficiency of the process.
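Specific energy consumption, the figure of merit compared across grades and screws above, is simply the measured energy input divided by the mass throughput. A sketch with hypothetical readings (not the study's measurements):

```python
# Hypothetical real-time extruder readings:
power_kw = 12.5          # measured drive + heater power, kW
throughput_kg_h = 25.0   # melt output rate, kg/h

# Specific energy consumption: energy input per unit mass of polymer
sec_kwh_per_kg = power_kw / throughput_kg_h        # kWh per kg of melt

# Convert to J/g for comparison with, e.g., the enthalpy needed to melt HDPE
sec_j_per_g = sec_kwh_per_kg * 3.6e6 / 1000.0      # 1 kWh = 3.6e6 J
```

Comparing SEC at matched set conditions is what allows the study to isolate the effects of melt viscosity and screw geometry on thermal efficiency.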
Abstract:
Ionic liquids (ILs) are popular designer green chemicals with great potential for use in diverse energy-related applications. Apart from the well-known low vapor pressure, the physical properties of ILs, such as hydrogen-bond-forming capacity, physical state, shape, and size, can be fine-tuned for specific applications. Natural gas hydrates are easily formed in gas pipelines and pose potential problems to the oil and natural gas industry, particularly during deep-sea exploration and production. This review summarizes the recent advances in IL research as dual-function gas hydrate inhibitors. Almost all of the available thermodynamic and kinetic inhibition data in the presence of ILs have been systematically reviewed to evaluate the efficiency of ILs in gas hydrate inhibition, compared to other conventional thermodynamic and kinetic gas hydrate inhibitors. The principles of natural gas hydrate formation, types of gas hydrates and their inhibitors, apparatuses and methods used, reported experimental data, and theoretical methods are thoroughly and critically discussed. The studies in this field will facilitate the design of advanced ILs for energy savings through the development of efficient low-dosage gas hydrate inhibitors.
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
This book argues for novel strategies to integrate engineering design procedures and structural analysis data into architectural design. Algorithmic procedures that have recently migrated into architectural practice are utilized to improve the interface of both disciplines. Architectural design is predominantly conducted as a negotiation process of various factors but often lacks the rigor and data structures to link it to quantitative procedures. Numerical structural design, on the other hand, could act as a role model for handling data and robust optimization, but it often lacks the complexity of architectural design. The goal of this research is to bring together robust methods from structural design and complex dependency networks from architectural design processes. The book presents three case studies of tools and methods that are developed to exemplify, analyze and evaluate a collaborative workflow.
Abstract:
This report outlines the problem of intelligent failure recovery in a problem-solver for electrical design. We want our problem solver to learn as much as it can from its mistakes. Thus we cast the engineering design process in terms of Problem Solving by Debugging Almost-Right Plans (PSBDARP), a paradigm for automatic problem solving based on the belief that the creation and removal of "bugs" is an unavoidable part of the process of solving a complex problem. The process of localization and removal of bugs called for by the PSBDARP theory requires an approach to engineering analysis in which every result has a justification describing the exact set of assumptions it depends upon. We have developed a program based on Analysis by Propagation of Constraints which can explain the basis of its deductions. In addition to being useful to a PSBDARP designer, these justifications are used in Dependency-Directed Backtracking to limit the combinatorial search in the analysis routines. Although the research we describe is explicitly about electrical circuits, we believe that similar principles and methods are employed by other kinds of engineers, including computer programmers.
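A toy version of Analysis by Propagation of Constraints, with each deduced value carrying a justification naming its premises, can be sketched as follows (illustrative only, not the original program; the cell and constraint names are invented):

```python
class Cell:
    """A quantity in the circuit; records its value and why it holds."""
    def __init__(self, name):
        self.name, self.value, self.why = name, None, None

    def set(self, value, why):
        self.value, self.why = value, why

def ohm(v, i, r):
    """Propagate V = I*R in whichever direction is determined.
    The justification list is what Dependency-Directed Backtracking
    would use to find which premises to retract on a contradiction."""
    if i.value is not None and r.value is not None and v.value is None:
        v.set(i.value * r.value, [i.name, r.name, "Ohm's law"])
    elif v.value is not None and r.value is not None and i.value is None:
        i.set(v.value / r.value, [v.name, r.name, "Ohm's law"])
    elif v.value is not None and i.value is not None and r.value is None:
        r.set(v.value / i.value, [v.name, i.name, "Ohm's law"])

# Premises: a known current and resistance; propagation deduces the voltage
v, i, r = Cell("V1"), Cell("I1"), Cell("R1")
i.set(2.0, ["premise"])
r.set(5.0, ["premise"])
ohm(v, i, r)
```

Asked to explain V1, this sketch can answer "10.0, because of I1, R1 and Ohm's law", which is exactly the kind of justification the report describes.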
Abstract:
This research project starts from the dynamics of the third-party distribution model of a mass-consumption company in Colombia specialised in dairy products, referred to in this study as "Lactosa". Using panel data in a case-study setting, two demand models per product category and distributor are built, and stochastic simulation is used to identify the relevant variables affecting their cost structures. The problem is modelled from the income statement of each of the four distributors analysed in the central region of the country. The cost structure and sales behaviour are analysed for a given logistics distribution margin (%), as a function of the relevant independent variables relating to the business, the market and the macroeconomic environment described in the object of study. Among other findings, notable gaps in distribution costs and sales-force costs stand out, despite the homogeneity of the segments. The study identifies value drivers and the costs with the greatest individual dispersion, and suggests strategic alliances among some groups of distributors. The panel-data modelling identifies the relevant management variables affecting sales volume by category and distributor, which focuses management's efforts. It is recommended to narrow the gaps and to promote, from the producer's side, strategies focused on standardising the distributors' internal processes, and to promote and replicate the analysis models without attempting to replace expert knowledge. Scenario building jointly and reliably strengthens the competitive position of the company and its distributors.
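A fixed-effects panel regression of the kind described above can be sketched with synthetic data (the study's variables and estimates are not public); distributor-level fixed effects are removed by the within transformation:

```python
import numpy as np

# Synthetic panel: 4 distributors observed over 20 periods.
# y = sales volume, x = a single demand driver (e.g. sales-force effort);
# both the fixed effects and the slope below are invented for illustration.
rng = np.random.default_rng(0)
n_dist, n_t = 4, 20
alpha = np.array([10.0, 12.0, 9.0, 11.0])   # distributor fixed effects
beta = 0.8                                   # true common slope
x = rng.normal(size=(n_dist, n_t))
y = alpha[:, None] + beta * x                # noise-free for a clean check

# Within transformation: demean each distributor's series, which
# absorbs the fixed effects without estimating them
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
beta_hat = (xd * yd).sum() / (xd**2).sum()   # pooled OLS on demeaned data
```

With real, noisy data the same estimator would recover the common demand elasticity while letting each distributor keep its own cost/sales level, mirroring the per-distributor heterogeneity the study reports.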