61 results for Key cutting algorithm
Abstract:
The enzymatic kinetic resolution of tert-butyl 2-(1-hydroxyethyl) phenylcarbamate via lipase-catalyzed transesterification reaction was studied. We investigated several reaction conditions, and the carbamate was resolved by Candida antarctica lipase B (CAL-B), leading to the optically pure (R)- and (S)-enantiomers. The enzymatic process showed excellent enantioselectivity (E > 200). (R)- and (S)-tert-butyl 2-(1-hydroxyethyl) phenylcarbamate were easily transformed into the corresponding (R)- and (S)-1-(2-aminophenyl)ethanols.
Abstract:
Estimates of greenhouse-gas emissions from deforestation are highly uncertain because of high variability in key parameters and because of the limited number of studies providing field measurements of these parameters. One such parameter is burning efficiency, which determines how much of the original forest's aboveground carbon stock will be released in the burn, as well as how much will later be released by decay and how much will remain as charcoal. In this paper we examined the fate of biomass from a semideciduous tropical forest in the "arc of deforestation," where clearing activity is concentrated along the southern edge of the Amazon forest. We estimated carbon content, charcoal formation and burning efficiency by direct measurements (cutting and weighing) and by line-intersect sampling (LIS) done along the axis of each plot before and after burning of felled vegetation. The total aboveground dry biomass found here (219.3 Mg ha(-1)) is lower than the values found in studies that have been done in other parts of the Amazon region. Values for burning efficiency (65%) and charcoal formation (6.0%, or 5.98 Mg C ha(-1)) were much higher than those found in past studies in tropical areas. The percentage of trunk biomass lost in burning (49%) was substantially higher than has been found in previous studies. This difference may be explained by the concentration of more stems in the smaller diameter classes and the low humidity of the fuel (the dry season was unusually long in 2007, the year of the burn). This study provides the first measurements of forest burning parameters for a group of forest types that is now undergoing rapid deforestation. The burning parameters estimated here indicate substantially higher burning efficiency than has been found in other Amazonian forest types. Quantification of burning efficiency is critical to estimates of trace-gas emissions from deforestation. (C) 2009 Elsevier B.V. All rights reserved.
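For illustration only, the sketch below computes burning efficiency and charcoal formation as fractions of the pre-burn aboveground carbon stock, following the definitions summarized above; the plot values used are hypothetical, not data from the study.

```python
# Hedged sketch: burning efficiency and charcoal formation expressed as fractions
# of the pre-burn aboveground carbon stock. The numbers below are hypothetical.

def burning_efficiency(carbon_pre_burn, carbon_post_burn):
    """Fraction of the pre-burn carbon stock released by the fire."""
    return (carbon_pre_burn - carbon_post_burn) / carbon_pre_burn

def charcoal_fraction(charcoal_carbon, carbon_pre_burn):
    """Fraction of the pre-burn carbon stock converted to charcoal."""
    return charcoal_carbon / carbon_pre_burn

c_pre, c_post, c_charcoal = 100.0, 35.0, 6.0   # Mg C / ha, hypothetical plot
print(f"burning efficiency: {burning_efficiency(c_pre, c_post):.0%}")
print(f"charcoal formation: {charcoal_fraction(c_charcoal, c_pre):.1%}")
```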
Abstract:
The effect of hexose and pentose in the pre-cultivation of the yeast Candida guilliermondii FTI 20037 on xylose reductase (XR) and xylitol dehydrogenase (XDH) enzyme activities was evaluated during fermentation of sugarcane bagasse hemicellulosic hydrolysate. Xylitol production was evaluated using cells previously grown in 30.0 g l(-1) xylose, 30.0 g l(-1) glucose, and in a mixture of both sugars (30.0 g l(-1) xylose and 2.0 g l(-1) glucose). The vacuum-evaporated hydrolysate (80 g l(-1)) was detoxified by ion-exchange resins (A-860S, A500PS and C-150, Purolite®). The resin treatment removed 93.0% of the total phenolic compounds and 64.9% of the acetic acid from the hydrolysate. All experiments were carried out in Erlenmeyer flasks at 200 rpm and 30 °C. The maximum XR (0.618 U mg protein(-1)) and XDH (0.783 U mg protein(-1)) activities were obtained with an inoculum previously grown in the mixture of both sugars. The highest cell concentration (10.6 g l(-1)) was obtained with the inoculum pre-cultivated in glucose. However, xylitol yield and xylitol volumetric productivity were favored when xylose was used as the carbon source; in this case, maximum xylose (81%) and acetic acid (100%) consumption were observed. It is important to point out that the maximum enzymatic activities were obtained when the mixture of sugars was used as the carbon source for the inoculum, whereas the highest fermentative parameters were obtained when xylose was used.
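As a side note, the fermentative parameters mentioned above follow their standard definitions; the minimal sketch below (with placeholder numbers, not values from the study) shows how xylitol yield and volumetric productivity are typically computed.

```python
# Hedged sketch of the standard fermentation performance metrics; the numbers
# passed in are placeholders, not results from the study.

def xylitol_yield(xylitol_produced_g_l, xylose_consumed_g_l):
    """Y_P/S: grams of xylitol produced per gram of xylose consumed."""
    return xylitol_produced_g_l / xylose_consumed_g_l

def volumetric_productivity(xylitol_produced_g_l, time_h):
    """Q_P: grams of xylitol produced per litre per hour."""
    return xylitol_produced_g_l / time_h

print(xylitol_yield(40.0, 65.0))          # hypothetical: ~0.62 g/g
print(volumetric_productivity(40.0, 72))  # hypothetical: ~0.56 g/(L.h)
```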
Abstract:
The power loss reduction in distribution systems (DSs) is a nonlinear and multiobjective problem. Service restoration in DSs is even harder computationally, since it additionally requires a solution in real time. Both DS problems are computationally complex. For large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. On the other hand, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with NDE (MEAN) results in the proposed approach for solving DS problems in large-scale networks. Simulation results show that MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, MEAN exhibits sublinear running time as a function of system size. Tests with networks ranging from 632 to 5166 switches indicate that MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
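As an illustration of the representation the abstract refers to, the following minimal sketch builds a node-depth encoding for a small, hypothetical radial feeder; the actual MEAN operators over this encoding are more elaborate.

```python
# Minimal sketch of a node-depth encoding (NDE): the tree is stored as a list of
# (node, depth) pairs in depth-first order, so radiality and connectivity are
# implicit in the representation and need no extra constraint equations.
# The 6-bus feeder below is hypothetical.

def node_depth_encoding(tree, root):
    """Return [(node, depth), ...] in DFS order for an adjacency-list tree."""
    encoding, stack, visited = [], [(root, 0)], set()
    while stack:
        node, depth = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        encoding.append((node, depth))
        for child in reversed(tree.get(node, [])):
            if child not in visited:
                stack.append((child, depth + 1))
    return encoding

feeder = {0: [1, 2], 1: [3], 2: [4, 5]}   # substation 0 feeding 5 buses
print(node_depth_encoding(feeder, root=0))
# [(0, 0), (1, 1), (3, 2), (2, 1), (4, 2), (5, 2)]
```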
Abstract:
The main objective of this paper is to relieve power system engineers of the burden of the complex and time-consuming process of power system stabilizer (PSS) tuning. To achieve this goal, the paper proposes an automatic process for computerized tuning of PSSs, based on an iterative process that uses a linear matrix inequality (LMI) solver to find the PSS parameters. It is shown in the paper that PSS tuning can be written as a search problem over a non-convex feasible set. The proposed algorithm solves this feasibility problem using an iterative LMI approach and a suitable initial condition, corresponding to a PSS designed for nominal operating conditions only (which is quite a simple task, since the required phase compensation is uniquely defined). Some knowledge about PSS tuning is also incorporated in the algorithm through the specification of bounds defining the allowable PSS parameters. The application of the proposed algorithm to a benchmark test system and the nonlinear simulation of the resulting closed-loop models demonstrate the efficiency of this algorithm. (C) 2009 Elsevier Ltd. All rights reserved.
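To make the LMI machinery concrete, the sketch below shows the kind of feasibility test such an iterative loop relies on: a Lyapunov-certificate check over a candidate gain, written with cvxpy (assumed available). The toy second-order plant, the gain sweep and the function names are hypothetical; the paper formulates its LMIs over the PSS parameters themselves.

```python
# Hedged sketch: for a candidate stabilizer gain k, test whether the closed-loop
# state matrix A(k) admits a Lyapunov certificate P > 0 with A(k)^T P + P A(k) < 0.
import numpy as np
import cvxpy as cp

def lyapunov_feasible(A, margin=1e-6):
    """Solve the LMI feasibility problem: find P >> 0 with A'P + PA << 0."""
    n = A.shape[0]
    P = cp.Variable((n, n), symmetric=True)
    constraints = [P >> margin * np.eye(n),
                   A.T @ P + P @ A << -margin * np.eye(n)]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

def closed_loop(k):
    # toy second-order plant with a simple damping channel scaled by gain k
    return np.array([[0.0, 1.0],
                     [-1.0, 0.1 - k]])

for k in np.linspace(0.0, 0.5, 6):          # crude outer search over the gain
    print(f"k = {k:.2f}: stable = {lyapunov_feasible(closed_loop(k))}")
```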
Abstract:
In this article, a novel algorithm based on the chemotaxis process of Escherichia coli is developed to solve multiobjective optimization problems. The algorithm uses a fast nondominated sorting procedure, communication between colony members, and a simple chemotactic strategy to change the bacterial positions in order to explore the search space and find several optimal solutions. The proposed algorithm is validated using 11 benchmark problems, with three different performance measures used to compare its performance with the NSGA-II genetic algorithm and with the particle swarm-based algorithm NSPSO. (C) 2009 Elsevier Ltd. All rights reserved.
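For reference, a minimal sketch of the fast nondominated sorting step mentioned above (the same procedure popularized by NSGA-II), assuming all objectives are minimized and using made-up bi-objective values:

```python
# Hedged sketch of fast nondominated sorting: split a population into Pareto
# fronts by counting, for each solution, how many others dominate it.

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(objectives):
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    domination_count = [0] * n              # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objectives[i], objectives[j]):
                dominated_by[i].append(j)
            elif dominates(objectives[j], objectives[i]):
                domination_count[i] += 1
        if domination_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        next_front = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                domination_count[j] -= 1
                if domination_count[j] == 0:
                    next_front.append(j)
        fronts.append(next_front)
        k += 1
    return fronts[:-1]

pop = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]   # hypothetical bi-objective values
print(fast_nondominated_sort(pop))               # [[0, 1, 2], [3], [4]]
```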
Abstract:
The machining of hardened steels has always been a great challenge in metal cutting, particularly for drilling operations. Generally, drilling is the machining process that is most difficult to cool due to the tool's geometry. The aim of this work is to determine the heat flux and the coefficient of convection in drilling using the inverse heat conduction method. Temperature was assessed during the drilling of hardened AISI H13 steel using the embedded thermocouple technique. Dry machining and two cooling/lubrication systems were used, and thermocouples were fixed at distances very close to the hole's wall. Tests were replicated for each condition and were carried out with new and worn drills. An analytical heat conduction model was used to calculate the temperature at the tool-workpiece interface and to define the heat flux and the coefficient of convection. In all tests with new and worn drills, the lowest temperatures and the greatest decrease in heat flux were observed with the flooded system, followed by the MQL (minimum quantity lubrication) system, taking the dry condition as reference. The decrease in temperature was directly proportional to the amount of lubricant applied and was significant for the MQL system when compared to dry cutting. (C) 2011 Elsevier Ltd. All rights reserved.
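A much simplified illustration of the inverse idea follows: assuming the workpiece behaves as a semi-infinite solid under a constant surface heat flux, the flux is fitted by least squares to temperatures "measured" near the hole wall. The material properties, sensor depth and data below are hypothetical; the paper's analytical model and experiments are more elaborate.

```python
# Hedged sketch of a heat-flux fit with the classical semi-infinite-solid,
# constant-surface-flux solution; all numbers are illustrative.
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

k, alpha, T0, x = 24.3, 6.4e-6, 25.0, 1.0e-3   # W/m.K, m^2/s, degC, sensor depth (m)

def temperature(t, q):
    """Temperature at depth x and time t for constant surface heat flux q."""
    s = 2.0 * np.sqrt(alpha * t)
    return (T0
            + (2.0 * q / k) * np.sqrt(alpha * t / np.pi) * np.exp(-(x / s) ** 2)
            - (q * x / k) * erfc(x / s))

t_meas = np.linspace(0.5, 5.0, 10)                                    # s
T_meas = temperature(t_meas, 2.0e5) + np.random.normal(0, 0.2, 10)    # synthetic data

q_hat, _ = curve_fit(temperature, t_meas, T_meas, p0=[1.0e5])
print(f"estimated heat flux: {q_hat[0]:.3g} W/m^2")
```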
Abstract:
The general flowshop scheduling problem is a production problem where a set of n jobs has to be processed with an identical flow pattern on m machines. In permutation flowshops the sequence of jobs is the same on all machines. A significant research effort has been devoted to sequencing jobs in a flowshop to minimize the makespan. This paper describes the application of a Constructive Genetic Algorithm (CGA) to makespan minimization in flowshop scheduling. The CGA was proposed recently as an alternative to traditional GA approaches, particularly for evaluating schemata directly. The population, initially formed only by schemata, evolves under recombination into a population of well-adapted structures (schemata instantiation). The implemented CGA is based on the classic NEH heuristic and on a local search heuristic used to define the fitness functions. The parameters of the CGA are calibrated using a Design of Experiments (DOE) approach. The computational results are compared against other successful algorithms from the literature on Taillard's well-known standard benchmark. The computational experience shows that this innovative CGA approach provides competitive results for flowshop scheduling problems. (C) 2007 Elsevier Ltd. All rights reserved.
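For context, a minimal sketch of the classic NEH heuristic the CGA builds on: jobs are ordered by decreasing total processing time, and each job is inserted at the position of the partial sequence that minimizes the makespan. The processing-time matrix is made up for illustration.

```python
# Hedged sketch of the NEH constructive heuristic for permutation flowshops.

def makespan(seq, p):
    """Makespan of permutation `seq` for processing times p[job][machine]."""
    m = len(p[0])
    completion = [0.0] * m
    for job in seq:
        for mach in range(m):
            start = max(completion[mach], completion[mach - 1] if mach else 0.0)
            completion[mach] = start + p[job][mach]
    return completion[-1]

def neh(p):
    jobs = sorted(range(len(p)), key=lambda j: -sum(p[j]))   # decreasing total time
    seq = [jobs[0]]
    for job in jobs[1:]:
        candidates = [seq[:i] + [job] + seq[i:] for i in range(len(seq) + 1)]
        seq = min(candidates, key=lambda s: makespan(s, p))  # best insertion point
    return seq

p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8], [3, 5, 6]]  # 5 jobs x 3 machines
best = neh(p)
print(best, makespan(best, p))
```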
A hybrid Particle Swarm Optimization - Simplex algorithm (PSOS) for structural damage identification
Abstract:
This study proposes a new PSOS model-based damage identification procedure using frequency domain data. The formulation of the objective function for the minimization problem is based on the Frequency Response Functions (FRFs) of the system. A novel strategy for the control of the Particle Swarm Optimization (PSO) parameters based on the Nelder-Mead algorithm (Simplex method) is presented; consequently, the convergence of the PSOS becomes independent of the heuristic constants, and its stability and confidence are enhanced. The formulated hybrid method performs better on different benchmark functions than Simulated Annealing (SA) and the basic PSO (PSO(b)). Two damage identification problems, taking into consideration the effects of noisy and incomplete data, were studied: first, a 10-bar truss and, second, a cracked free-free beam, both modeled with finite elements. In these cases, the damage location and extent were successfully determined. Finally, a non-linear oscillator (Duffing oscillator) was identified by PSOS, providing good results. (C) 2009 Elsevier Ltd. All rights reserved.
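As a rough illustration of coupling PSO with the Nelder-Mead simplex, the sketch below periodically refines the global best of a plain PSO with scipy's Nelder-Mead on the Rosenbrock test function. This is a simpler hybrid than the PSOS above, in which the simplex adapts the PSO parameters rather than polishing the best solution; all settings here are illustrative.

```python
# Hedged sketch of a PSO loop with periodic Nelder-Mead refinement of the best.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(z):
    return sum(100.0 * (z[1:] - z[:-1] ** 2) ** 2 + (1.0 - z[:-1]) ** 2)

def hybrid_pso(f, dim=4, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-2.0, 2.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for it in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
        if (it + 1) % 50 == 0:                      # periodic simplex refinement
            res = minimize(f, g, method="Nelder-Mead")
            i_best = np.argmin(pbest_f)
            if res.fun < pbest_f[i_best]:
                pbest[i_best], pbest_f[i_best] = res.x, res.fun
                g = res.x.copy()
    return g, f(g)

best_x, best_f = hybrid_pso(rosenbrock)
print(best_x, best_f)
```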
Abstract:
Wireless Sensor Networks (WSNs) have a vast field of applications, including deployment in hostile environments. Thus, the adoption of security mechanisms is fundamental. However, the extremely constrained nature of sensors and the potentially dynamic behavior of WSNs hinder the use of key management mechanisms commonly applied in modern networks. For this reason, many lightweight key management solutions have been proposed to overcome these constraints. In this paper, we review the state of the art of these solutions and evaluate them based on metrics adequate for WSNs. We focus on pre-distribution schemes well-adapted for homogeneous networks (since this is a more general network organization), thus identifying generic features that can improve some of these metrics. We also discuss some challenges in the area and future research directions. (C) 2010 Elsevier B.V. All rights reserved.
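As an example of the pre-distribution schemes surveyed, the sketch below implements random key pre-distribution in the style of Eschenauer and Gligor: each sensor is loaded with a random key ring drawn from a large pool before deployment, and two neighbours can secure a link only if their rings intersect. Pool and ring sizes are illustrative.

```python
# Hedged sketch of random key pre-distribution; sizes below are illustrative.
import random

POOL_SIZE, RING_SIZE, N_NODES = 10_000, 100, 50

pool = list(range(POOL_SIZE))                       # key identifiers
rings = {node: set(random.sample(pool, RING_SIZE))  # key ring loaded per node
         for node in range(N_NODES)}

def shared_key(a, b):
    """Return one common key id if nodes a and b can secure a link, else None."""
    common = rings[a] & rings[b]
    return min(common) if common else None

connected = sum(1 for a in range(N_NODES) for b in range(a + 1, N_NODES)
                if shared_key(a, b) is not None)
total = N_NODES * (N_NODES - 1) // 2
print(f"directly securable links: {connected}/{total}")
```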
Abstract:
This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. Preliminary experiments showed improvements both in accuracy and in execution time. Comparisons with other algorithms are beyond the scope of this paper. Some important research directions are proposed as future work.
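The sketch below illustrates the "insert or remove a single feature per iteration" idea on synthetic data, using scikit-learn's logistic regression purely as a stand-in for a MaxEnt model; the real MaxEnt feature classes, presence-only data and convergence criteria differ.

```python
# Hedged sketch: greedy single-feature insertion/removal around a stand-in model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))                        # hypothetical env. covariates
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 300) > 0).astype(int)

def score(features):
    if not features:
        return 0.0
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, sorted(features)], y, cv=5).mean()

selected, current = set(), 0.0
for _ in range(10):                                  # one insert or remove per step
    candidates = [(score(selected | {f}), selected | {f})
                  for f in range(X.shape[1]) if f not in selected]
    candidates += [(score(selected - {f}), selected - {f}) for f in selected]
    best_score, best_set = max(candidates, key=lambda c: c[0])
    if best_score <= current:
        break                                        # no single change helps: stop
    selected, current = best_set, best_score
print(sorted(selected), round(current, 3))
```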
Abstract:
This paper presents a free software tool that supports next-generation mobile communications through the automatic generation of neural-network models of components and electronic devices. The tool enables the creation, training, validation and simulation of the model directly from measurements made on the devices of interest, using an interface designed for non-experts in neural models. The resulting model can be exported automatically to a traditional circuit simulator to test different scenarios.
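A minimal sketch of the underlying idea, assuming scikit-learn is available: fit a small neural network to measured device data so the behaviour can be reproduced in a circuit simulator. The "measurements" here are a synthetic diode-like I-V curve; the actual tool works from instrument data and handles model export automatically.

```python
# Hedged sketch: neural-network regression of a device characteristic.
import numpy as np
from sklearn.neural_network import MLPRegressor

v = np.linspace(0.0, 0.8, 200).reshape(-1, 1)                  # bias voltage (V)
i = 1e-12 * (np.exp(v / 0.026) - 1.0).ravel()                  # ideal diode current
i_meas = i * (1.0 + np.random.default_rng(1).normal(0, 0.02, i.size))  # noisy data

model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=1)
model.fit(v, np.log10(i_meas + 1e-15))           # log scale tames the dynamic range

v_test = np.array([[0.65]])
print("predicted current:", 10 ** model.predict(v_test)[0], "A")
```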
Abstract:
This paper presents a new methodology to estimate unbalanced harmonic distortions in a power system based on measurements from a limited number of sites. The algorithm utilizes evolutionary strategies (ES), a branch of evolutionary algorithms. The problem-solving algorithm proposed herein makes use of data from various power quality meters, which can be synchronized either by high-technology GPS devices or by using information from a fundamental-frequency load flow, which makes the overall power quality monitoring system much less costly. The ES-based harmonic estimation model is applied to a 14-bus network to compare its performance with a conventional Monte Carlo approach. It is also applied to a 50-bus subtransmission network in order to compare the three-phase and single-phase approaches, as well as to assess the robustness of the proposed method. (C) 2010 Elsevier B.V. All rights reserved.
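In the same spirit, the sketch below evolves unknown harmonic current injections with a simple (mu, lambda) evolution strategy so that the voltages they would produce through a known harmonic impedance matrix match the meter readings. The 4-bus matrix and "measurements" are fabricated; the paper's network model and ES variant are more detailed.

```python
# Hedged sketch of an ES-based harmonic estimator on a fabricated 4-bus example.
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))   # harmonic impedances
I_true = rng.normal(size=4) + 1j * rng.normal(size=4)        # "actual" injections
V_meas = Z @ I_true                                          # meter readings

def mismatch(I):
    """Distance between measured voltages and those produced by injections I."""
    return np.linalg.norm(V_meas - Z @ I)

mu, lam, sigma = 5, 30, 0.5            # parents, offspring, mutation step size
parents = [rng.normal(size=8) for _ in range(mu)]            # 4 complex unknowns (re, im)
for gen in range(300):
    offspring = [parents[rng.integers(mu)] + sigma * rng.normal(size=8)
                 for _ in range(lam)]
    offspring.sort(key=lambda x: mismatch(x[:4] + 1j * x[4:]))
    parents = offspring[:mu]                                 # (mu, lambda) selection
    sigma *= 0.99                                            # slow step-size decay

best = parents[0]
print("residual mismatch:", mismatch(best[:4] + 1j * best[4:]))
```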
Abstract:
An improvement to a quality two-dimensional Delaunay mesh generation algorithm, which combines the mesh refinement strategies of Ruppert and Shewchuk, is proposed in this research. The developed technique uses the diametral lens criterion, introduced by L. P. Chew, with the purpose of eliminating extremely obtuse triangles in the boundary mesh. The method splits boundary segments to obtain an initial pre-refinement, thus reducing the number of iterations needed to generate a high-quality sequential triangulation. Moreover, it decreases the amount of communication and synchronization between subdomains in parallel mesh refinement.
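For illustration, the sketch below shows the encroachment test that drives this kind of boundary pre-refinement: a vertex encroaches a boundary segment when the angle it subtends over the segment reaches a threshold, with 90 degrees reproducing the diametral-circle rule and a larger threshold (120 degrees is used here purely as an illustrative value) giving a lens-style test that flags fewer vertices and triggers fewer splits.

```python
# Hedged sketch of segment encroachment and midpoint splitting.
import math

def subtended_angle(p, a, b):
    """Angle a-p-b in degrees."""
    ax, ay = a[0] - p[0], a[1] - p[1]
    bx, by = b[0] - p[0], b[1] - p[1]
    cosang = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

def encroaches(p, a, b, threshold_deg=90.0):
    """90 deg reproduces the diametral-circle test; larger values give a lens-style test."""
    return subtended_angle(p, a, b) >= threshold_deg

def split(a, b):
    """Basic refinement action: insert the midpoint of an encroached segment."""
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

a, b, p = (0.0, 0.0), (2.0, 0.0), (1.0, 0.8)
print(encroaches(p, a, b, 90.0))     # True: angle a-p-b is about 103 degrees
print(encroaches(p, a, b, 120.0))    # False under the wider, lens-style threshold
print(split(a, b))                   # midpoint insertion -> (1.0, 0.0)
```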