84 results for design III


Relevance: 20.00%

Publisher:

Abstract:

A Box–Behnken factorial design coupled with response surface methodology was used to evaluate the effects of temperature, pH and initial concentration on the Cu(II) sorption process onto the marine macroalga Ascophyllum nodosum. The effect of the operating variables on metal uptake capacity was studied in a batch system, and a mathematical model showing the influence of each variable and their interactions was obtained. Study ranges were 10–40 °C for temperature, 3.0–5.0 for pH and 50–150 mg L−1 for initial Cu(II) concentration. Within these ranges, the biosorption capacity is only slightly dependent on temperature but increases markedly with pH and initial Cu(II) concentration. The uptake capacities predicted by the model are in good agreement with the experimental values. The maximum biosorption capacity of Cu(II) by A. nodosum is 70 mg g−1 and corresponds to the following values of those variables: temperature = 40 °C, pH = 5.0 and initial Cu(II) concentration = 150 mg L−1.
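
As a rough illustration of the response surface step described above, the sketch below fits a second-order polynomial to a three-factor Box–Behnken design and evaluates it at the reported optimum. The uptake values in the code are hypothetical placeholders; only the factor ranges and the optimum settings come from the abstract.

```python
# Minimal sketch of fitting a second-order response surface to Box-Behnken
# data; the uptake values are hypothetical placeholders, only the factor
# ranges follow the abstract.
import numpy as np

# Coded levels (-1, 0, +1) for a 3-factor Box-Behnken design:
# temperature (10-40 C), pH (3.0-5.0), initial Cu(II) (50-150 mg/L).
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)
y = np.array([30, 32, 55, 60, 25, 27, 48, 52,
              28, 50, 45, 68, 47, 46, 48], dtype=float)  # hypothetical uptake (mg/g)

def quadratic_terms(x):
    t, ph, c = x
    return [1, t, ph, c, t*ph, t*c, ph*c, t*t, ph*ph, c*c]

A = np.array([quadratic_terms(row) for row in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares fit of the surface

# Predict uptake at the reported optimum (coded +1, +1, +1):
q_opt = np.dot(quadratic_terms([1, 1, 1]), coef)
print(f"predicted uptake at 40 C, pH 5.0, 150 mg/L: {q_opt:.1f} mg/g")
```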

Relevance: 20.00%

Publisher:

Abstract:

Solvent extraction is treated as a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet, and Pareto curves for different flow-sheets are compared. The positive weighted-sum approach, linked to the sequential quadratic programming method, is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the optimal organic flow-rate.
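
The weighted-sum construction of the Pareto set mentioned above can be sketched as follows, using SciPy's SLSQP solver as the sequential quadratic programming step. The recovery and purity expressions are illustrative toy models, not the mixer–settler model used in the study.

```python
# Sketch of the positive weighted-sum approach for a two-objective
# (recovery vs. purity) problem, solved with SLSQP; the analytic models
# below are placeholders.
import numpy as np
from scipy.optimize import minimize

def recovery(x):                    # x = (hold-up, residence time), scaled to [0, 1]
    return 1.0 - np.exp(-3.0 * x[0] * x[1])

def purity(x):                      # purity degrades as the recovery-driving variables grow
    return 1.0 / (1.0 + 2.0 * x[0] + x[1])

pareto = []
for w in np.linspace(0.05, 0.95, 10):            # positive weights only
    obj = lambda x, w=w: -(w * recovery(x) + (1.0 - w) * purity(x))
    res = minimize(obj, x0=[0.5, 0.5], method="SLSQP",
                   bounds=[(0.0, 1.0), (0.0, 1.0)])
    pareto.append((recovery(res.x), purity(res.x)))

for r, p in pareto:
    print(f"recovery={r:.3f}  purity={p:.3f}")
```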

Relevance: 20.00%

Publisher:

Abstract:

This study uses the process simulator ASPEN Plus and Life Cycle Assessment (LCA) to compare three process design alternatives for biodiesel production from waste vegetable oils: the conventional alkali-catalyzed process including a free fatty acids (FFAs) pre-treatment, the acid-catalyzed process, and the supercritical methanol process using propane as co-solvent. Results show that the supercritical methanol process using propane as co-solvent is the most environmentally favorable alternative. Its lower steam consumption compared with the other process design alternatives leads to a lower contribution to the potential environmental impacts (PEIs). The acid-catalyzed process generally shows the highest PEIs, in particular due to the high energy requirements associated with methanol recovery operations.

Relevance: 20.00%

Publisher:

Abstract:

An analytical method using microwave-assisted extraction (MAE) and liquid chromatography (LC) with fluorescence detection (FD) for the determination of ochratoxin A (OTA) in bread samples is described. A 2⁴ orthogonal composite design coupled with response surface methodology was used to study the influence of the MAE parameters (extraction time, temperature, solvent volume, and stirring speed) in order to maximize OTA recovery. The optimized MAE conditions were the following: 25 mL of acetonitrile, 10 min of extraction at 80 °C, and maximum stirring speed. Validation of the overall methodology was performed by spiking assays at five levels (0.1–3.00 ng/g). The quantification limit was 0.005 ng/g. The established method was then applied to 64 bread samples (wheat, maize, and wheat/maize bread) collected in the Oporto region (Northern Portugal). OTA was detected in 84% of the samples, with a maximum value of 2.87 ng/g, below the European maximum limit of 3 ng/g established for OTA in cereal products.
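
A hedged sketch of the optimization step follows: locating the maximum predicted OTA recovery on a fitted quadratic (response surface) model of the four MAE factors. The model coefficients and grid ranges are assumed for illustration; only the factor names come from the abstract.

```python
# Sketch of a grid search over a hypothetical quadratic response-surface
# model of the four MAE factors; all coefficients and ranges are assumed.
import itertools
import numpy as np

def predicted_recovery(time_min, temp_c, vol_ml, stir):
    # Hypothetical second-order model in roughly coded units.
    t = (time_min - 10) / 5.0
    T = (temp_c - 70) / 20.0
    v = (vol_ml - 20) / 10.0
    s = stir                                  # 0 = off, 1 = maximum stirring
    return (80 + 4*t + 6*T + 3*v + 5*s
            - 3*t*t - 4*T*T - 2*v*v + 1.5*T*v)

grid = itertools.product(np.linspace(5, 20, 4),      # extraction time (min)
                         np.linspace(50, 90, 5),     # temperature (C)
                         np.linspace(10, 30, 5),     # solvent volume (mL)
                         [0.0, 1.0])                 # stirring speed
best = max(grid, key=lambda g: predicted_recovery(*g))
print("best settings (time, temp, volume, stirring):", best)
print("predicted recovery (%):", round(predicted_recovery(*best), 1))
```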

Relevance: 20.00%

Publisher:

Abstract:

In this study, the effect of incorporating recycled glass fibre reinforced plastic (GFRP) waste materials, obtained by means of shredding and milling processes, on the mechanical behaviour of polyester polymer mortars (PM) was assessed. For this purpose, different contents of GFRP recyclates, between 4% and 12% by weight, were incorporated into polyester PM materials as replacements for sand aggregates and filler. The effect of the addition of a silane coupling agent to the resin binder was also evaluated. The applied waste material came from the shredding of leftovers resulting from the cutting and assembly of GFRP pultrusion profiles. Currently, these leftovers, as well as non-conforming products and scrap from the pultrusion manufacturing process, are landfilled, with additional costs to producers and suppliers. Hence, besides the evident environmental benefits, a viable and feasible solution for these wastes would also lead to significant economic advantages. Design of experiments and data treatment were accomplished by means of a full factorial design approach and analysis of variance (ANOVA). The experimental results were promising regarding the recyclability of GFRP waste materials as a partial replacement of aggregates and reinforcement for PM materials, with significant improvements in the mechanical properties of the resulting mortars compared with waste-free formulations.
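
The factorial design and ANOVA workflow can be outlined roughly as below; the strength values are made up, and only the factor names (GFRP waste content and silane coupling agent) follow the abstract.

```python
# Sketch of a two-factor full-factorial analysis with ANOVA; the
# flexural-strength values are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.DataFrame({
    "waste":    ["0%", "0%", "4%", "4%", "8%", "8%", "12%", "12%"] * 2,
    "silane":   (["no"] * 8) + (["yes"] * 8),
    "strength": [21.0, 20.5, 23.1, 22.8, 24.0, 24.4, 23.5, 23.2,
                 21.8, 21.3, 24.2, 24.6, 25.5, 25.1, 24.3, 24.8],  # MPa, made up
})

model = ols("strength ~ C(waste) * C(silane)", data=data).fit()
anova = sm.stats.anova_lm(model, typ=2)   # which effects and interactions are significant?
print(anova)
```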

Relevance: 20.00%

Publisher:

Abstract:

Our day-to-day life depends on several embedded devices, and in the near future many more objects will have computation and communication capabilities, enabling an Internet of Things. Correspondingly, with the increasing interaction of these devices around us, developing novel applications is set to become challenging with current software infrastructures. In this paper, we argue that a new paradigm for operating systems needs to be conceptualized to provide a conducive base for application development on cyber-physical systems. We demonstrate its need and importance using a few use-case scenarios and provide the design principles behind, and an architecture of, a co-operating system (CoS) that can serve as an example of this new paradigm.

Relevance: 20.00%

Publisher:

Abstract:

Stringent cost and energy constraints impose the use of low-cost and low-power radio transceivers in large-scale wireless sensor networks (WSNs). This fact, together with the harsh characteristics of the physical environment, requires a rigorous WSN design. Mechanisms for WSN deployment and topology control, MAC and routing, and resource and mobility management greatly depend on reliable link quality estimators (LQEs). This paper describes the RadiaLE framework, which enables the experimental assessment, design and optimization of LQEs. RadiaLE comprises (i) the hardware components of the WSN testbed and (ii) a software tool for setting up and controlling the experiments, automating link-measurement gathering through packet-statistics collection, and analyzing the collected data, allowing for LQE evaluation. We also propose a methodology that allows (i) properly setting up different types of links and different types of traffic, (ii) collecting rich link measurements, and (iii) validating LQEs using a holistic and unified approach. To demonstrate the validity and usefulness of RadiaLE, we present two case studies: the characterization of low-power links and a comparison of six representative LQEs. We also extend the second study to evaluate the accuracy of the TOSSIM 2 channel model.
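
Two elementary link quality estimators of the kind such a testbed evaluates can be sketched from per-window packet statistics; the received/lost trace below is a hypothetical example, not RadiaLE data.

```python
# Sketch of two basic LQEs computed from packet statistics: the packet
# reception ratio (PRR) over a window, and a smoothed (EWMA-based) variant.
def prr(window):
    """Packet reception ratio over a window of 1 (received) / 0 (lost)."""
    return sum(window) / len(window)

def smoothed_prr(windows, alpha=0.6):
    """Exponentially weighted moving average of successive per-window PRRs."""
    estimate = prr(windows[0])
    for w in windows[1:]:
        estimate = alpha * estimate + (1.0 - alpha) * prr(w)
    return estimate

trace = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 1]       # hypothetical trace
windows = [trace[i:i + 8] for i in range(0, len(trace), 8)]
print("PRR per window:", [round(prr(w), 2) for w in windows])
print("smoothed estimate:", round(smoothed_prr(windows), 2))
```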

Relevance: 20.00%

Publisher:

Abstract:

Variations in manufacturing process parameters and environmental aspects may affect the quality and performance of composite materials, which consequently affects their structural behaviour. Reliability-based design optimisation (RBDO) and robust design optimisation (RDO) search for safe structural systems with minimal variability of response when subjected to uncertainties in material design parameters. An approach that simultaneously considers reliability and robustness is proposed in this paper. Depending on a given reliability index imposed on composite structures, a trade-off is established between the performance targets and robustness. Robustness is expressed in terms of the coefficient of variation of the constrained structural response weighted by its nominal value. The normed Pareto front is built and the point nearest to the origin is selected as the best solution of the bi-objective optimisation problem.
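
The selection step described above (normalising the Pareto front and taking the point nearest to the origin) can be sketched as follows, with hypothetical front values.

```python
# Sketch of picking the best trade-off on a normed bi-objective Pareto
# front; the front points are made up (objective 1 = performance loss,
# objective 2 = coefficient of variation of the constrained response).
import numpy as np

front = np.array([[0.10, 0.90],
                  [0.25, 0.55],
                  [0.40, 0.35],
                  [0.60, 0.20],
                  [0.90, 0.10]])          # hypothetical non-dominated points

normed = (front - front.min(axis=0)) / (front.max(axis=0) - front.min(axis=0))
distances = np.linalg.norm(normed, axis=1)    # distance of each normed point to the origin
best = int(np.argmin(distances))
print("best trade-off (original scale):", front[best])
```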

Relevance: 20.00%

Publisher:

Abstract:

An approach for the analysis of uncertainty propagation in reliability-based design optimization of composite laminate structures is presented. Using the Uniform Design Method (UDM), a set of design points is generated over a domain centered on the mean reference values of the random variables. A methodology based on the inverse optimal design of composite structures to achieve a specified reliability level is proposed, and the corresponding maximum load is obtained as a function of the ply angle. Using the generated UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on an evolutionary learning process. A Monte Carlo simulation using the developed ANN is then performed to simulate the behavior of the critical Tsai number, the structural reliability index, and their relative sensitivities as functions of the ply angle of the laminates. The results are generated for uniformly distributed random variables on a domain centered on the mean values. The statistical analysis of the results enables the study of the variability of the reliability index and its sensitivity relative to the ply angle. Numerical examples showing the utility of the approach for the robust design of angle-ply laminates are presented.
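
A rough sketch of the Monte Carlo stage follows: uniformly distributed variables are sampled around their means, a surrogate of the critical Tsai number is evaluated (a simple placeholder function stands in for the trained ANN), and the reliability index is derived from the failure probability. All numerical values are illustrative.

```python
# Sketch of Monte Carlo reliability estimation with a surrogate model;
# the surrogate and all numbers are illustrative placeholders.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def tsai_surrogate(E1, E2, theta_deg):
    """Placeholder for the ANN surrogate of the critical Tsai number."""
    theta = np.radians(theta_deg)
    return 1.2 + 0.01 * (E1 - 140) - 0.02 * (E2 - 10) - 0.3 * np.sin(theta) ** 2

n = 100_000
E1 = rng.uniform(130, 150, n)        # longitudinal modulus (GPa), uniform about its mean
E2 = rng.uniform(9, 11, n)           # transverse modulus (GPa)
R = tsai_surrogate(E1, E2, theta_deg=45.0)

p_fail = np.mean(R < 1.0)            # Tsai number below 1 is taken as failure
beta = -norm.ppf(p_fail) if p_fail > 0 else np.inf
print(f"failure probability: {p_fail:.4f}, reliability index beta: {beta:.2f}")
```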

Relevance: 20.00%

Publisher:

Abstract:

Structural health monitoring has long been identified as a prominent application of Wireless Sensor Networks (WSNs), as traditional wired solutions present inherent limitations such as installation/maintenance cost, scalability and visual impact. Nevertheless, there is a lack of ready-to-use, off-the-shelf WSN technologies able to fulfill some of the most demanding requirements of these applications, which can span from critical physical infrastructures (e.g. bridges, tunnels, mines, the energy grid) to historical buildings or even industrial machinery and vehicles. Low-power and low-cost yet extremely sensitive and accurate accelerometer and signal-acquisition hardware, and stringent time synchronization of all sensor data, are just examples of the requirements imposed by most of these applications. This paper presents a prototype system for the health monitoring of civil engineering structures that has been jointly conceived by a team of civil, electrical and computer engineers. It merges the benefits of standard commercial off-the-shelf (COTS) hardware and communication technologies with a minimum set of custom-designed signal-acquisition hardware that is mandatory to fulfill all application requirements.

Relevance: 20.00%

Publisher:

Abstract:

In the last twenty years, genetic algorithms (GAs) have been applied in a plethora of fields, such as control, system identification, robotics, planning and scheduling, image processing, and pattern and speech recognition (Bäck et al., 1997). In robotics, the problems of trajectory planning, collision avoidance and manipulator structure design considering a single criterion have been solved using several techniques (Alander, 2003). Most engineering applications, however, require the optimization of several criteria simultaneously. Often the problems are complex, include discrete and continuous variables, and there is no prior knowledge about the search space. Such problems are considerably more complex, since they consider multiple design criteria simultaneously within the optimization procedure. This is known as multi-criteria (or multi-objective) optimization, which has been addressed successfully through GAs (Deb, 2001). The overall aim of multi-criteria evolutionary algorithms is to achieve a set of non-dominated optimal solutions known as the Pareto front. At the end of the optimization procedure, instead of a single optimal (or near-optimal) solution, the decision maker can select a solution from the Pareto front. Some of the key issues in multi-criteria GAs are: i) the number of objectives, ii) obtaining a Pareto front as wide as possible and iii) achieving a uniformly spread Pareto front. Indeed, multi-objective techniques using GAs have been growing in relevance as a research area. In 1989, Goldberg suggested the use of a GA to solve multi-objective problems, and since then other researchers have developed new methods, such as the multi-objective genetic algorithm (MOGA) (Fonseca & Fleming, 1995), the non-dominated sorting genetic algorithm (NSGA) (Deb, 2001), and the niched Pareto genetic algorithm (NPGA) (Horn et al., 1994), among several other variants (Coello, 1998). In this work the trajectory planning problem considers: i) robots with 2 and 3 degrees of freedom (dof), ii) the inclusion of obstacles in the workspace and iii) up to five criteria used to qualify the evolving trajectory, namely: joint traveling distance, joint velocity, end-effector/Cartesian distance, end-effector/Cartesian velocity and the energy involved. These criteria are used to minimize the joint and end-effector traveled distance, trajectory ripple and the energy required by the manipulator to reach the destination point. Bearing these ideas in mind, this chapter addresses the planning of robot trajectories, meaning the development of an algorithm to find a continuous motion that takes the manipulator from a given starting configuration to a desired end position without colliding with any obstacle in the workspace. The chapter is organized as follows. Section 2 describes trajectory planning and several approaches proposed in the literature. Section 3 formulates the problem, namely the representation adopted to solve the trajectory planning problem and the objectives considered in the optimization. Section 4 studies the convergence of the algorithm. Section 5 studies a 2R manipulator (i.e., a robot with two rotational joints/links) when the trajectory optimization considers two and five objectives. Sections 6 and 7 present the results for the 3R redundant manipulator with five objectives and for other complementary experiments, respectively. Finally, Section 8 draws the main conclusions.
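
The non-dominance test at the core of the multi-objective GAs cited above (MOGA, NSGA, NPGA) can be sketched in a few lines; the objective vectors are made-up examples of two minimised criteria, such as joint traveled distance and energy.

```python
# Sketch of Pareto-front extraction via the standard dominance relation
# (all objectives to be minimised); the population values are made up.
def dominates(a, b):
    """True if a is no worse than b in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    return [p for p in population
            if not any(dominates(q, p) for q in population)]

# Hypothetical (distance, energy) pairs for five candidate trajectories.
population = [(3.1, 40.0), (2.8, 55.0), (4.0, 30.0), (3.0, 42.0), (5.0, 60.0)]
print(pareto_front(population))      # the dominated (5.0, 60.0) is excluded
```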

Relevance: 20.00%

Publisher:

Abstract:

This work reports on an experimental and finite element method (FEM) parametric study of adhesively bonded single- and double-strap repairs on carbon-epoxy structures under buckling-unrestrained compression. The influence of the overlap length and patch thickness was evaluated. This loading is particularly significant because of the additional failure mechanisms characteristic of structures under compression, such as fibre microbuckling for buckling-restrained structures, or global buckling of the assembly if no transverse restriction exists. The FEM analysis is based on the use of cohesive elements, including mixed-mode criteria, to simulate cohesive fracture of the adhesive layer. Trapezoidal laws in pure modes I and II were used to account for the ductility of most structural adhesives. These laws were estimated for the adhesive used from double cantilever beam (DCB) and end-notched flexure (ENF) tests, respectively, using an inverse technique. The pure mode III cohesive law was assumed equal to the pure mode II law. Compression failure in the laminates was predicted using a stress-based criterion. The accurate FEM predictions open a good prospect for reducing the extensive experimentation required in the design of carbon-epoxy repairs. Design principles were also established for these repairs under buckling.
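
A trapezoidal traction–separation law of the kind mentioned for modes I and II can be sketched as a simple function; all parameter values below are illustrative, not the identified adhesive properties.

```python
# Sketch of a trapezoidal cohesive law: linear rise to the cohesive
# strength, a plateau representing adhesive ductility, then linear
# softening to zero.  Parameter values are illustrative only.
def trapezoidal_traction(delta, delta1=0.01, delta2=0.05, delta_f=0.12, t_max=20.0):
    """Traction (MPa) as a function of separation delta (mm)."""
    if delta <= 0.0:
        return 0.0
    if delta < delta1:                     # elastic rise
        return t_max * delta / delta1
    if delta < delta2:                     # plateau (ductile behaviour)
        return t_max
    if delta < delta_f:                    # linear softening
        return t_max * (delta_f - delta) / (delta_f - delta2)
    return 0.0                             # fully debonded

for d in (0.005, 0.03, 0.08, 0.15):
    print(d, "mm ->", round(trapezoidal_traction(d), 2), "MPa")
```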

Relevance: 20.00%

Publisher:

Abstract:

Fractional calculus (FC) is currently being applied in many areas of science and technology. In fact, this mathematical concept helps researchers gain a deeper insight into several phenomena that integer-order models overlook. Genetic algorithms (GAs) are an important tool for solving optimization problems that occur in engineering. This methodology applies the concepts that describe biological evolution to obtain optimal solutions in many different applications. In this line of thought, in this work we use FC and GA concepts to implement the electrical fractional-order potential. The performance of the GA scheme and the convergence of the resulting approximation are analyzed. The results are analyzed for different numbers of charges and several fractional orders.
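
As a loose illustration, the sketch below runs a small GA-style evolutionary loop that adjusts a few point charges so that their combined integer-order (1/r) potential approximates a fractional-order target 1/r^α at sample points; the setup and all parameters are assumed for illustration and are not the authors' formulation.

```python
# Sketch of a simple evolutionary (GA-style) search that fits charge
# values and positions to a fractional-order potential; everything here
# is an illustrative toy setup.
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.5                                   # fractional order of the target
r = np.linspace(1.0, 5.0, 40)                 # observation points
target = 1.0 / r**alpha

n_charges, pop_size, gens = 4, 60, 200

def potential(genome):
    q = genome[:n_charges]                    # charge values
    x = genome[n_charges:]                    # charge positions (behind the origin)
    return sum(qi / np.abs(r - xi) for qi, xi in zip(q, x))

def fitness(genome):
    return -np.mean((potential(genome) - target) ** 2)   # minimise squared error

pop = np.column_stack([rng.uniform(0, 2, (pop_size, n_charges)),
                       rng.uniform(-3, 0, (pop_size, n_charges))])
for _ in range(gens):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]        # keep the best half
    children = parents + rng.normal(0, 0.05, parents.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("best mean-squared error:", -fitness(best))
```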

Relevance: 20.00%

Publisher:

Abstract:

A genetic algorithm used to design radio-frequency binary-weighted differential switched capacitor arrays (RFDSCAs) is presented in this article. The algorithm provides a set of circuits, all having the same maximum performance. This article also describes the design, implementation, and measurement results of a 0.25 μm BiCMOS 3-bit RFDSCA. The experimental results show that the circuit presents the expected performance up to 40 GHz. The similarity between the evolutionary solutions, circuit simulations, and measured results indicates that the genetic synthesis method is a very useful tool for designing optimum-performance RFDSCAs.

Relevance: 20.00%

Publisher:

Abstract:

The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.