928 results for cost-aware process design
Abstract:
Split-plot design (SPD) and near-infrared chemical imaging were used to study the homogeneity of the drug paracetamol loaded in films prepared from mixtures of the biocompatible polymers hydroxypropyl methylcellulose, polyvinylpyrrolidone, and polyethylene glycol. The study was split into two parts: first, a partial least-squares (PLS) model was developed for pixel-to-pixel quantification of the drug loaded into the films; afterwards, an SPD was developed to study the influence of the polymeric composition of the films and of two process conditions related to their preparation (percentage of the drug in the formulations and curing temperature) on the homogeneity of the drug dispersed in the polymeric matrix. Chemical images of each formulation of the SPD were obtained by pixel-to-pixel prediction of the drug using the PLS model from the first part, and macropixel analyses were performed on each image to obtain the y-responses (homogeneity parameter). The design was modeled using PLS regression, allowing only the most relevant factors to remain in the final model. The interpretation of the SPD was enhanced by using the orthogonal PLS algorithm, whereby the y-orthogonal variation in the design was separated from the y-correlated variation.
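The pixel-to-pixel prediction and macropixel analysis described in this abstract can be sketched in a few lines; the regression vector, image dimensions, and macropixel size below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def predict_image(spectra, b, intercept=0.0):
    """Pixel-by-pixel prediction with a (pre-computed) PLS regression
    vector b; spectra is a (rows, cols, wavelengths) hyperspectral cube."""
    return spectra @ b + intercept

def macropixel_std(conc_map, size):
    """Homogeneity parameter: standard deviation of macropixel means.
    The image is tiled into non-overlapping size x size blocks; a low
    spread of block means indicates a homogeneous film."""
    r, c = conc_map.shape
    blocks = (conc_map[:r - r % size, :c - c % size]
              .reshape(r // size, size, c // size, size)
              .mean(axis=(1, 3)))
    return blocks.std()

# Illustrative use on synthetic data (64 x 64 image, 100 wavelengths).
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 100))
b = rng.random(100) / 100.0
homogeneity = macropixel_std(predict_image(cube, b), size=8)
```

In the paper's workflow, the regression vector would come from the PLS model calibrated in the first part of the study, and the macropixel standard deviation would serve as the y-response of the split-plot design.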
Abstract:
This paper presents a rational approach to the design of a catamaran's hydrofoil within a modern context of multidisciplinary optimization. The approach includes response surfaces represented by neural networks and a distributed programming environment that increases optimization speed. A rational treatment of the problem simplifies the complex optimization model; combined with the distributed dynamic training used for the response surfaces, this increases the efficiency of the process. The results achieved with this approach justify this publication.
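The response-surface idea — replace an expensive hydrodynamic evaluation with a cheap surrogate fitted to a few sample points — can be sketched briefly. Here a quadratic polynomial stands in for the paper's neural networks, and the "drag" function is a made-up analytic stand-in for a real flow solver:

```python
import numpy as np

# Hypothetical "expensive" evaluation (a stand-in for a flow solver).
def drag(angle):
    return (angle - 2.0) ** 2 + 1.0

# Sample a handful of design points and fit the surrogate...
angles = np.linspace(0.0, 4.0, 5)
coeffs = np.polyfit(angles, drag(angles), deg=2)
surrogate = np.poly1d(coeffs)

# ...then search the cheap surrogate instead of the expensive model.
fine = np.linspace(0.0, 4.0, 401)
best_angle = fine[np.argmin(surrogate(fine))]
```

In the paper's setting, new sample points would be fed back to retrain the surrogate (the "distributed dynamic training"), with the expensive evaluations distributed across machines.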
Abstract:
This investigation presents a comprehensive characterization of the magnetic and transport properties of an interesting superconducting wire, Nb-Ti-Ta, obtained through solid-state diffusion between a Nb-12 at.% Ta alloy and pure Ti. The physical properties obtained from magnetic and transport measurements, related to the microstructure, unambiguously confirmed a previous proposition that the superconducting currents flow in the center of the diffusion layer, which has a steep composition variation. The determination of the critical field also confirmed that the flux-line core size is not constant; in addition, it was possible to determine that the flux-line core is smaller in the center of the layer than at its borders. A possible core shape design is proposed. Among the wires studied, the best critical current density was achieved for a diffusion layer with a composition of about Nb-32% Ti-10% Ta, obtained with a heat treatment at 700 °C for 120 h, in agreement with previous studies. This wire was also found to have the highest upper critical field, indicating that the optimization of the superconducting behavior is related to an intrinsic property of the ternary alloy.
Abstract:
Enzyme production is a growing field in biotechnology, and increasing attention has been devoted to the solid-state fermentation (SSF) of lignocellulosic biomass for the production of industrially relevant lignocellulose-deconstruction enzymes, especially manganese peroxidase (MnP), which plays a crucial role in lignin degradation. However, there is a scarcity of studies on the extraction of the secreted metabolites, which are commonly bound to the fermented solids, preventing their accurate detection and limiting recovery efficiency. In the present work, we assessed the effect of extraction process variables (pH, stirring rate, temperature, and extraction time) on the recovery efficiency of MnP obtained by SSF of eucalyptus residues with Lentinula edodes, using statistical design of experiments. The results indicated that, of the variables studied, pH was the most significant (p < 0.05) parameter affecting MnP recovery yield, while temperature, extraction time, and stirring rate presented no statistically significant effects in the studied range. The optimum pH for extraction of MnP was 4.0-5.0, which yielded 1500-1700 IU kg⁻¹ of enzyme activity at an extraction time of 4-5 h, under static conditions at room temperature. (C) 2011 Elsevier Ltd. All rights reserved.
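The kind of factor screening the paper performs with a statistical design of experiments can be illustrated with a coded two-level factorial; the runs and yields below are invented for illustration and are not the study's data:

```python
import numpy as np

# Coded 2^2 factorial in pH (-1/+1) and temperature (-1/+1);
# yields (IU kg^-1) are made up for illustration.
runs = np.array([[-1, -1],
                 [ 1, -1],
                 [-1,  1],
                 [ 1,  1]])
yields = np.array([900.0, 1600.0, 950.0, 1650.0])

def main_effect(factor):
    """Main effect of a factor: mean response at the +1 level
    minus mean response at the -1 level."""
    col = runs[:, factor]
    return yields[col == 1].mean() - yields[col == -1].mean()

ph_effect = main_effect(0)    # large -> dominant factor
temp_effect = main_effect(1)  # small -> comparatively negligible
```

Effects much larger than the run-to-run noise (here, pH) are the ones a significance test such as the paper's (p < 0.05) would flag as statistically significant.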
Abstract:
The design of supplementary damping controllers to mitigate the effects of electromechanical oscillations in power systems is a highly complex and time-consuming process, which requires a significant amount of knowledge on the part of the designer. In this study, the authors propose an automatic technique that takes the burden of tuning the controller parameters away from the power engineer and places it on the computer. Unlike other approaches that do the same based on robust control theory or evolutionary computing techniques, the proposed procedure uses an optimisation algorithm that works over a formulation of the classical tuning problem in terms of bilinear matrix inequalities. Using this formulation, it is possible to apply linear matrix inequality solvers to find a solution to the tuning problem via an iterative process, with the advantage that these solvers are widely available and have well-known convergence properties. The proposed algorithm is applied to tune the parameters of supplementary controllers for thyristor-controlled series capacitors placed in the New England/New York benchmark test system, aiming at the improvement of the damping factor of inter-area modes under several different operating conditions. The results of the linear analysis are validated by non-linear simulation and demonstrate the effectiveness of the proposed procedure.
Abstract:
The productivity of commonly available disassembly methods today seldom makes disassembly the preferred end-of-life solution for massive take-back product streams. Systematic reuse of parts or components, or recycling of pure material fractions, are often not achievable in an economically sustainable way. In this paper, a case-based review of current disassembly practices is used to analyse the factors influencing disassembly feasibility. Data mining techniques were used to identify the major factors influencing the profitability of disassembly operations. Case characteristics such as involvement of the product manufacturer in the end-of-life treatment and continuous ownership are among the important dimensions. Economic models demonstrate that the efficiency of disassembly operations must increase by an order of magnitude to ensure the competitiveness of ecologically preferred, disassembly-oriented end-of-life scenarios for large waste electrical and electronic equipment (WEEE) streams. Technological means available to increase the productivity of disassembly operations are summarized. Automated disassembly techniques can contribute to the robustness of the process, but cannot close the efficiency gap unless combined with appropriate product design measures. Innovative, reversible joints, collectively activated by external trigger signals, form a promising approach to low-cost mass disassembly in this context. A short overview of the state of the art in the development of such self-disassembling joints is included. (c) 2008 CIRP.
Abstract:
This paper discusses the integrated design of parallel manipulators, which exhibit varying dynamics. This characteristic affects machine stability and performance. The design methodology consists of four main steps: (i) system modeling using the flexible multibody technique, (ii) synthesis of reduced-order models suitable for control design, (iii) systematic flexible-model-based input signal design, and (iv) evaluation of several possible machine designs. The novelty of this methodology is that structural flexibilities are taken into consideration during the input signal design, thereby enhancing the standard design process, which mainly considers rigid-body dynamics. The potential of the proposed strategy is exploited for the design evaluation of a two-degree-of-freedom high-speed parallel manipulator. The results are experimentally validated. (C) 2010 Elsevier Ltd. All rights reserved.
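Flexible-model-based input signal design (step iii) can take many forms; one classical example, used here purely as an illustration and not necessarily the authors' method, is a zero-vibration (ZV) input shaper that avoids exciting a known flexible mode:

```python
import numpy as np

def zv_shaper(wn, zeta):
    """Impulse times and amplitudes of a ZV shaper for a flexible mode
    with natural frequency wn (rad/s) and damping ratio zeta."""
    K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta ** 2))
    wd = wn * np.sqrt(1 - zeta ** 2)          # damped frequency
    times = np.array([0.0, np.pi / wd])       # second impulse at half a period
    amps = np.array([1.0, K]) / (1.0 + K)     # amplitudes sum to one
    return times, amps

# Illustrative mode: 10 Hz, 2% damping (values assumed, not the paper's).
times, amps = zv_shaper(wn=2 * np.pi * 10, zeta=0.02)
```

Convolving any reference command with these two impulses cancels the residual vibration of that mode, at the cost of a small time delay (half a damped period).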
Abstract:
The main objective of this research was to evaluate the potential use of a bench-scale anaerobic sequencing batch biofilm reactor (ASBBR) containing mineral coal as inert support for the removal of sulfide and organic matter from the effluent of an ASBBR (1.2 m³) used for the treatment of sulfate-rich wastewater. The cycle time was 48 h, comprising the steps of feeding (2 h), reaction with continuous liquid recirculation (44 h), and discharge (2 h). COD removal efficiency was up to 90%, and the effluent total sulfide concentrations (H₂S, HS⁻, S²⁻) remained in the range of 1.5 to 7.5 mg·L⁻¹ during the 50 days of operation (25 cycles). Un-ionized and ionized sulfides were biologically converted to elemental sulfur (S⁰) under oxygen-limited conditions. The results obtained in the bench-scale reactor were used to design a pilot-scale ASBBR for use in post-treatment, to achieve the emission standards (sulfide and COD) for sulfate reduction. In the pilot-scale reactor, with a total volume of 0.43 m³, COD and total sulfide removal reached 88% and 57%, respectively, for a cycle time of 48 h (70 days of operation, or 35 cycles).
Abstract:
Purpose - The purpose of this paper is to identify the key elements of a new rapid prototyping process, which involves layer-by-layer deposition of liquid-state material while using an ultraviolet line source to cure the deposited material. This paper reports studies on the behaviour of filaments, deposition accuracy, filament interaction, and the functional feasibility of the system. Additionally, the author describes the proposed process, the equipment used for these studies, and the material developed for this application. Design/methodology/approach - The research was separated into three study areas in accordance with their goals. In the first, both the behaviour of the filament and the deposition accuracy were studied. The design of the experiment is described with a focus on four response factors (bead width, filament quality, deposition accuracy, and deposition continuity) as functions of three control factors (deposition height, deposition velocity, and extrusion velocity). The author also studied the interaction between filaments as a function of the bead-centre distance. In addition, two test samples were prepared to serve as a proof of the methodology and to verify the functional feasibility of the process under study. Findings - The results show that the proposed process is functionally feasible, and that it is possible to identify the main effects of the control factors on the response factors. That analysis is used to predict the process condition as a function of the parameters that control the process. Bead-centre distances that result in specific behaviours were also identified. The types of interaction between filaments were analysed and sorted into three categories: union, separation, and indeterminate. Finally, the functional feasibility of the process was proved by building two test parts.
Originality/value - This paper proposes a new rapid prototyping process and presents test studies related to this proposition. The author has focused on filament behaviour, deposition accuracy, and the interaction between filaments, and has studied the functional feasibility of the process to provide new information about it, which is also useful for the development of other rapid prototyping processes.
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models describing the probabilistic distribution of species. The process of generating an ecological niche model is complex: it requires dealing with a large amount of data and using different software packages for data conversion, model generation, and different types of processing and analysis, among other functionalities. A software platform integrating all requirements under a single, seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms, and data formats, and any software intended for use in this area must keep pace with this evolution. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a larger work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps performed while developing a model are described, highlighting important aspects based on the knowledge of modelling experts. To illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller.
As a consequence of the knowledge gained with this work, many desirable improvements to modelling software packages have been identified and are presented. A discussion of the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results are very important for those involved in the development of modelling tools and systems, for requirements analysis, and for providing insight into new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the example experiment as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
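One of the simplest algorithms in this family, which openModeller implements among others, is a BIOCLIM-style environmental envelope; the occurrence and environment values below are invented for illustration:

```python
import numpy as np

# Environmental values at known occurrence points of a species:
# columns are temperature (°C) and annual rainfall (mm), made up here.
occ_env = np.array([[24.0, 1300.0],
                    [26.0, 1500.0],
                    [25.0, 1400.0]])

# The envelope is the per-variable min/max over the occurrences.
lo, hi = occ_env.min(axis=0), occ_env.max(axis=0)

def suitable(cell_env):
    """A raster cell is 'suitable' if every environmental variable
    lies within the envelope spanned by the occurrence points."""
    return bool(np.all((cell_env >= lo) & (cell_env <= hi)))
```

Applying `suitable` to every cell of the environmental raster layers yields a binary distribution map; the probabilistic models discussed in the paper refine this basic idea.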
Abstract:
The roots of swarm intelligence are deeply embedded in the biological study of self-organized behaviours in social insects. Particle swarm optimization (PSO) is one of the modern metaheuristics of swarm intelligence, which can be effectively used to solve nonlinear and non-continuous optimization problems. The basic principle of the PSO algorithm rests on the assumption that potential solutions (particles) are flown through hyperspace with acceleration towards more optimal solutions. Each particle adjusts its flight according to the flying experience of both itself and its companions, using equations for position and velocity. During the process, the coordinates in hyperspace associated with each particle's previous best fitness and the overall best value attained so far by any particle in the group are tracked and recorded in memory. In recent years, PSO approaches have been successfully applied to different problem domains with multiple objectives. In this paper, a multiobjective PSO approach, based on the concepts of Pareto optimality, dominance, external archiving with elite particles, and the truncated Cauchy distribution, is proposed and applied to the constrained design of a brushless DC (direct current) wheel motor. Promising results in terms of convergence and spacing performance metrics indicate that the proposed multiobjective PSO scheme is capable of producing good solutions.
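The position/velocity update described above is easy to state in code. The sketch below is a minimal single-objective PSO on a toy quadratic; the paper's multiobjective machinery (Pareto dominance, external archive of elite particles, truncated Cauchy distribution) is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(f, dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO: each particle accelerates towards its own best
    position (pbest) and the swarm's best position (g)."""
    x = rng.uniform(-5, 5, (n, dim))            # positions
    v = np.zeros((n, dim))                      # velocities
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()             # global best
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

# Toy objective: the sphere function, minimized at the origin.
best, fbest = pso(lambda p: (p ** 2).sum())
```

In the multiobjective version, the single global best `g` is replaced by a leader drawn from an external archive of non-dominated (elite) particles.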
Abstract:
This work shows the application of the analytic hierarchy process (AHP) to full cost accounting (FCA) within the integrated resource planning (IRP) process. For this purpose, a pioneering case study was developed in which different supply- and demand-side energy solutions for a metropolitan airport (Congonhas) were considered [Moreira, E.M., 2005. Modelamento energetico para o desenvolvimento limpo de aeroporto metropolitano baseado na filosofia do PIR-O caso da metropole de Sao Paulo. Dissertacao de mestrado, GEPEA/USP]. These solutions were compared and analyzed using the software "Decision Lens", which implements the AHP. The final part of this work presents a classification of the resources that can be considered initial targets as energy resources, thus supporting the constraints of the airport's IRP and setting parameters aimed at sustainable development. (C) 2007 Elsevier Ltd. All rights reserved.
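The core AHP step — deriving priority weights from a pairwise comparison matrix via its principal eigenvector — can be sketched as follows; the matrix is an invented Saaty-scale example for three hypothetical energy resources, not the Congonhas data:

```python
import numpy as np

# Pairwise comparison matrix (Saaty 1-9 scale): A[i, j] says how much
# more important resource i is than resource j. Values are invented.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])

# The principal eigenvector of A, normalized, gives the AHP weights.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency ratio CR = (lambda_max - n) / (n - 1) / RI, with the
# random index RI = 0.58 for n = 3; CR < 0.1 means acceptably consistent.
n = A.shape[0]
lam = np.max(np.real(vals))
cr = (lam - n) / (n - 1) / 0.58
```

Tools such as Decision Lens perform this computation for each criterion of the hierarchy and aggregate the weights up the tree.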
Abstract:
Concrete offshore platforms, which are subjected to several loading combinations and thus require as general an analysis as possible, can be designed using the concepts adopted for shell elements, but their resistance to shear forces must be verified at particular cross-sections. This work on the design of shell elements uses the three-layer shell theory. The elements are subjected to combined membrane and plate loading, totalling eight components of internal forces: three membrane forces, three moments (two out-of-plane bending moments and one in-plane, or torsion, moment), and two shear forces. The design method adopted, which uses the iterative process proposed by Lourenco & Figueiras (1993) based on the equilibrium equations developed by Gupta (1986), is compared with results for experimentally tested shell elements found in the literature using the program DIANA.
Abstract:
This paper presents both the theoretical and the experimental approaches to the development of a mathematical model to be used in multi-variable control system designs of an active suspension for a sport utility vehicle (SUV), in this case a light pickup truck. A complete seven-degree-of-freedom model is identified quickly and successfully, with very satisfactory results in simulations and in real experiments conducted with the pickup truck. The novelty of the proposed methodology is the use of commercial software in the early stages of the identification to speed up the process and to minimize the need for a large number of costly experiments. The paper also presents major contributions to the identification of uncertainties in vehicle suspension models and to the development of identification methods using sequential quadratic programming, in which an innovation regarding the calculation of the objective function is proposed and implemented. Results from simulations and practical experiments with the real SUV are presented, analysed, and compared, showing the potential of the method.
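Identification via sequential quadratic programming can be sketched with SciPy's SLSQP routine on a toy one-degree-of-freedom suspension model; the mass, "measured" response, starting point, and bounds below are all assumptions for illustration, far simpler than the paper's seven-degree-of-freedom model:

```python
import numpy as np
from scipy.optimize import minimize

m = 250.0                        # sprung mass, kg (assumed)
t = np.linspace(0.0, 2.0, 200)   # time grid, s

def response(params):
    """Normalized free response of a 1-DOF mass-spring-damper."""
    k, c = params
    wn = np.sqrt(k / m)                       # natural frequency
    zeta = c / (2.0 * np.sqrt(k * m))         # damping ratio
    wd = wn * np.sqrt(1.0 - zeta ** 2)        # damped frequency
    return np.exp(-zeta * wn * t) * np.cos(wd * t)

# Synthetic "measurement" generated from assumed true parameters.
true = np.array([16000.0, 1000.0])            # k (N/m), c (N s/m)
measured = response(true)

# Objective: sum of squared residuals, minimized with SLSQP
# (SciPy's sequential quadratic programming method).
obj = lambda p: np.sum((response(p) - measured) ** 2)
res = minimize(obj, x0=[10000.0, 500.0], method="SLSQP",
               bounds=[(5000.0, 50000.0), (100.0, 2000.0)])
k_hat, c_hat = res.x
```

In the paper's setting, the residual would compare a full vehicle simulation against experimental data, and the innovation concerns how that objective function is computed.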
Abstract:
Compliant mechanisms can achieve a specified motion as a mechanism without relying on the use of joints and pins. They have broad application in precision mechanical devices and Micro-Electro-Mechanical Systems (MEMS), but may lose accuracy and produce undesirable displacements when subjected to temperature changes. These undesirable effects can be reduced by using sensors in combination with control techniques and/or by applying special design techniques that address them at the design stage, a process generally termed "design for precision". This paper describes a design-for-precision method based on a topology optimization method (TOM) for compliant mechanisms that includes thermal compensation features. The optimization problem emphasizes actuator accuracy and is formulated to yield optimal compliant mechanism configurations that maximize the desired output displacement when a force is applied, while minimizing undesirable thermal effects. To demonstrate the effectiveness of the method, two-dimensional compliant mechanisms are designed considering thermal compensation, and their performance is compared with that of compliant mechanism designs that do not consider thermal compensation. (C) 2010 Elsevier B.V. All rights reserved.