906 results for Normalization-based optimization
Abstract:
A synbiotic yoghurt based on a combination of soymilk and yacon water extract (from yacon root tubers) was developed as a novel food product fermented with a probiotic culture of Enterococcus faecium CRL 183 and Lactobacillus helveticus ssp. jugurti 4l6. Response surface methodology (RSM) was used to optimize the independent variables soymilk protein concentration and percentage of yacon extract in the formulation through a Central Composite Rotatable Design (CCRD), consisting of a 2² factorial design with two levels (-1, +1), two central points (0, 0) and four axial points (±α, 0) and (0, ±α). The responses were assessed by consumer acceptance tests. The optimization indicated that a formulation with a soymilk protein concentration of 1.74 g/L and 25.86% yacon extract gave the best average scores: 5.91 for taste and 6.00 for overall impression. The formulation with 40% yacon extract and the same soymilk protein concentration achieved similar acceptance scores for taste (5.94) and overall impression (5.87); however, with the extra yacon, it probably had a greater content of prebiotic fructooligosaccharides. Consequently, both formulations may yield useful functional foods with sensory properties comparable to those of soy yoghurt (the control formulation). Copyright © 2010 by New Century Health Publishers.
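The CCRD layout described above (2² factorial points, axial points and centre replicates) can be sketched in coded units. This is a minimal illustration: α = √2 is the standard rotatability value for two factors and is assumed here, as is the use of two centre replicates.

```python
import itertools

def ccrd_points(alpha=2 ** 0.5, n_center=2):
    """Generate a two-factor Central Composite Rotatable Design in coded units."""
    factorial = list(itertools.product([-1.0, 1.0], repeat=2))  # 2^2 factorial points
    axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]  # star points
    center = [(0.0, 0.0)] * n_center                            # replicated centre point
    return factorial + axial + center

def decode(coded, centre, step):
    """Map a coded level back to a natural unit (e.g. % yacon extract)."""
    return centre + coded * step

design = ccrd_points()  # 4 factorial + 4 axial + 2 centre = 10 runs
```

Each coded run is then decoded into actual protein concentrations and extract percentages before the sensory trials.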
Abstract:
A significant amount of information stored in different databases around the world can be shared through peer-to-peer databases. This yields a large knowledge base without the need for large investments, since existing databases and the infrastructure already in place are reused. However, the structural characteristics of peer-to-peer networks make the process of finding such information complex. Moreover, these databases are often heterogeneous in their schemas but semantically similar in their content. A good peer-to-peer database system should allow the user to access information from databases scattered across the network and receive only the information that is really related to the topic of interest. This paper proposes the use of ontologies in peer-to-peer database queries to represent the semantics inherent in the data. The main contributions of this work are enabling the integration of heterogeneous databases, improving the performance of such queries, and using the Ant Colony optimization algorithm to solve the problem of locating information on peer-to-peer networks, which yields an 18% improvement in the results. © 2011 IEEE.
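The abstract does not give the details of the Ant Colony routing scheme, so the following is only a generic sketch of how ACO can locate a short path to a peer holding the requested data: ants walk the overlay probabilistically, pheromone evaporates, and edges on successful paths are reinforced. The graph and all parameter values are illustrative assumptions.

```python
import random

def ant_colony_shortest_path(graph, source, target, n_ants=20, n_iter=50,
                             evaporation=0.5, seed=1):
    """Minimal ACO: ants walk from source toward target; pheromone on edges
    of completed paths is reinforced inversely to path length."""
    random.seed(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best = None
    for _ in range(n_iter):
        completed = []
        for _ in range(n_ants):
            node, path = source, [source]
            while node != target and len(path) <= len(graph):
                choices = [v for v in graph[node] if v not in path]
                if not choices:
                    break
                weights = [pheromone[(node, v)] for v in choices]
                node = random.choices(choices, weights=weights)[0]
                path.append(node)
            if node == target:
                completed.append(path)
                if best is None or len(path) < len(best):
                    best = path
        for edge in pheromone:                      # evaporation
            pheromone[edge] *= (1.0 - evaporation)
        for path in completed:                      # reinforcement
            for u, v in zip(path, path[1:]):
                pheromone[(u, v)] += 1.0 / len(path)
    return best
```

Over iterations the pheromone concentrates on shorter routes, which is the effect the paper exploits to speed up query routing.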
Abstract:
This work develops two approaches based on fuzzy set theory to solve a class of fuzzy mathematical optimization problems with uncertainties in the objective function and in the set of constraints. The first approach is an adaptation of an iterative method that obtains cut levels and then maximizes the membership function of the fuzzy decision using the bound search method. The second is a metaheuristic approach that adapts a standard genetic algorithm to operate on fuzzy numbers. Both approaches use a decision criterion called the satisfaction level, which identifies the best solution in the uncertain environment. Selected examples from the literature are presented to compare and validate the efficiency of the methods addressed, with emphasis on a fuzzy optimization problem involving import-export companies in the south of Spain. © 2012 Brazilian Operations Research Society.
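The satisfaction-level criterion follows the Bellman-Zadeh notion of a fuzzy decision: the degree to which a candidate solution satisfies all fuzzy goals and constraints is the minimum of their membership values, and the best solution maximizes that minimum. A minimal sketch with illustrative triangular membership functions follows; the paper's actual membership shapes and search methods are not reproduced here.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def satisfaction_level(x, memberships):
    """Bellman-Zadeh fuzzy decision: the satisfaction of x is the minimum
    membership over the fuzzy objective and fuzzy constraints."""
    return min(mu(x) for mu in memberships)

def maximize_satisfaction(memberships, lo, hi, steps=1000):
    """Grid search for the decision with the highest satisfaction level
    (a stand-in for the paper's bound search / genetic algorithm)."""
    best_x, best_s = lo, -1.0
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        s = satisfaction_level(x, memberships)
        if s > best_s:
            best_x, best_s = x, s
    return best_x, best_s
```

The optimum sits where the fuzzy objective and the tightest fuzzy constraint intersect, which is exactly what the satisfaction level formalizes.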
Abstract:
Feathers are rich in amino acids and can be employed as a dietary protein supplement in animal feed. Microbial degradation is an alternative technology for improving the nutritional value of feathers. Other potential applications of keratinase include uses in the leather industry, in detergents, in medicine and in pharmaceuticals for the treatment of acne, psoriasis and calluses. A new keratinolytic-enzyme-producing bacterium was isolated from a poultry processing plant. To improve keratinase yield, statistically based experimental designs were applied to optimize three significant variables: temperature, substrate (feather) concentration and agitation speed. Response surface methodology demonstrated an increase in keratinolytic activity at a temperature of 26.6°C, an agitation speed of 150 rpm and a substrate concentration of 2%. Liquid chromatography revealed the release of amino acids in the Bacillus amyloliquefaciens culture broth, thereby demonstrating the potential of feather meal in the animal feed industry. © Global Science Publications.
Abstract:
The growing demand for steels with tighter compositional specifications has led Companhia Siderúrgica Nacional (CSN) to develop more efficient processes. To address this problem, this paper aims to identify the operational variables with the greatest impact on the desulfurization process, specifically in the torpedo car, as well as their causes and solutions, and then to select and test, through laboratory and industrial trials, desulfurizing agents based on CaC2, CaO, CaCO3 and Mg in order to assess the cost per quantity of product desulfurized. The mixture with the best results was not the one with the highest content of CaC2. It is believed that this mixture showed better efficiency because of the increased agitation of the bath produced by the release of gas from the CaCO3 present in the mixture. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Abstract:
This paper tackles a Nurse Scheduling Problem which consists of generating work schedules for a set of nurses while considering their shift preferences and other requirements. The objective is to maximize the satisfaction of nurses' preferences and minimize the violation of soft constraints. This paper presents a new deterministic heuristic algorithm, called MAPA (multi-assignment problem-based algorithm), which is based on successive resolutions of the assignment problem. The algorithm has two phases: a constructive phase and an improvement phase. The constructive phase builds a full schedule by solving successive assignment problems, one for each day in the planning period. The improvement phase uses a couple of procedures that re-solve assignment problems to produce a better schedule. Given the deterministic nature of this algorithm, the same schedule is obtained each time the algorithm is applied to the same problem instance. The performance of MAPA is benchmarked against published results for almost 250,000 instances from the NSPLib dataset. In most cases, particularly on large instances of the problem, the results produced by MAPA are better than the best-known solutions from the literature. The experiments reported here also show that the MAPA algorithm finds more feasible solutions than other algorithms in the literature, which suggests that the proposed approach is effective and robust. © 2013 Springer Science+Business Media New York.
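The constructive phase can be illustrated with a toy version: one assignment problem per day, nurses as rows and shifts as columns, with costs encoding preference violations. The brute-force solver below stands in for an exact method such as the Hungarian algorithm, and the cost matrices are invented for illustration.

```python
from itertools import permutations

def solve_assignment(cost):
    """Exact assignment solver for small instances: assign one column (shift)
    per row (nurse) so that total cost is minimal."""
    n = len(cost)
    best_perm, best_cost = None, float('inf')
    for perm in permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if c < best_cost:
            best_perm, best_cost = perm, c
    return list(best_perm), best_cost

def build_schedule(daily_costs):
    """Constructive-phase sketch: solve one assignment problem per day;
    lower cost means the shift better matches the nurse's preferences."""
    return [solve_assignment(day)[0] for day in daily_costs]
```

The improvement phase described in the abstract would then perturb and re-solve these daily assignment problems to lower the overall penalty.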
Abstract:
The optimal reactive dispatch problem is a nonlinear programming problem containing continuous and discrete control variables. Owing to the difficulty caused by discrete variables, this problem is usually solved assuming all variables as continuous variables, therefore the original discrete variables are rounded off to the closest discrete value. This approach may provide solutions far from optimal or even unfeasible solutions. This paper presents an efficient handling of discrete variables by penalty function so that the problem becomes continuous and differentiable. Simulations with the IEEE test systems were performed showing the efficiency of the proposed approach. © 1969-2012 IEEE.
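The abstract does not give the paper's exact penalty function, so the sinusoidal form below is only one common, illustrative choice: a smooth, differentiable term that vanishes exactly on the allowed discrete grid (e.g. transformer tap positions) and pushes the continuous solver toward those settings.

```python
import math

def discreteness_penalty(x, allowed_step, weight):
    """Smooth penalty that is zero exactly at integer multiples of
    allowed_step (the permitted discrete settings) and positive in between."""
    return weight * math.sin(math.pi * x / allowed_step) ** 2

def penalized_objective(f, xs, discrete_idx, step, weight):
    """Augment a continuous objective f with penalties on the variables
    that must end up on the discrete grid, keeping the problem differentiable."""
    return f(xs) + sum(discreteness_penalty(xs[i], step, weight)
                       for i in discrete_idx)
```

Gradually increasing the weight drives the flagged variables onto valid discrete values without the rounding step that can produce infeasible solutions.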
Abstract:
Goal Programming (GP) is an important analytical approach devised to solve many real-world problems. The first GP model is known as Weighted Goal Programming (WGP). However, Multi-Choice Aspiration Level (MCAL) problems cannot be solved by current GP techniques. In this paper, we propose a Multi-Choice Mixed Integer Goal Programming (MCMI-GP) model for the aggregate production planning of a Brazilian sugar and ethanol milling company. The MCMI-GP model was based on traditional selection and process methods for the design of lots, representing the production system of sugar, alcohol, molasses and derivatives. The research covers decisions on the agricultural and cutting stages, sugarcane loading and transportation by suppliers and, especially, energy cogeneration decisions; that is, the choice of production process, including storage and distribution stages. The MCMI-GP model allows decision-makers to set multiple aspiration levels for their problems, addressing both "the more/higher, the better" and "the less/lower, the better" aspiration levels. An application of the proposed model to real problems in a Brazilian sugar and ethanol mill was conducted, producing interesting results that are herein reported and commented upon. A comparison between the MCMI-GP and WGP models was also made using these real cases. © 2013 Elsevier Inc.
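The weighted part of the comparison can be illustrated by how WGP scores a candidate plan: each goal contributes weighted under- and over-achievement deviation variables measured against its aspiration level. This is a generic sketch; the goals, weights and achievement functions are invented, and the paper's MCMI-GP model additionally chooses among multiple aspiration levels with integer variables, which is not shown here.

```python
def wgp_deviations(achieved, target):
    """Split the gap from an aspiration level into the two non-negative
    deviation variables used by goal programming."""
    d_minus = max(target - achieved, 0.0)  # under-achievement
    d_plus = max(achieved - target, 0.0)   # over-achievement
    return d_minus, d_plus

def wgp_score(plan, goals):
    """goals: list of (target, w_minus, w_plus, achieve_fn); lower is better."""
    score = 0.0
    for target, w_minus, w_plus, achieve in goals:
        d_minus, d_plus = wgp_deviations(achieve(plan), target)
        score += w_minus * d_minus + w_plus * d_plus
    return score

def best_plan(candidates, goals):
    """Pick the candidate plan with the lowest total weighted deviation."""
    return min(candidates, key=lambda p: wgp_score(p, goals))
```

Setting `w_minus` high and `w_plus` low expresses "the more, the better", and vice versa, which is the asymmetry the MCMI-GP model generalizes.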
Abstract:
The by-products generated from the industrial filleting of tilapia can be used for the manufacture of surimi. Surimi production uses large amounts of water, which generates a wastewater rich in organic compounds (lipids, soluble proteins and blood). Optimizing the number of washing cycles will contribute to a more sustainable production. A mathematical model of mechanically recovered tilapia (Oreochromis niloticus) meat for the processing of surimi (minced fish washing cycles and tapioca starch addition), based on two quality parameters (texture and moisture), was constructed by applying the response surface methodology (RSM). Each factor had an important effect on the moisture and texture of the surimi. This study found that the optimal formulation for producing the best surimi from the by-products of tilapia filleting for use in fish burgers was the addition of 10% tapioca starch and three minced fish washing cycles. A microstructural evaluation supported the findings of the mathematical model. Practical Applications: The use of mechanically recovered fish meat (MRFM) for the production of surimi enables the utilization of the by-products of fish filleting. However, the inferior quality of surimi produced from MRFM relative to that produced from fillets necessitates the addition of starch; moreover, surimi production consumes a large volume of water. RSM provides a valuable means for optimizing the number of washing cycles and the amount of starch utilized in fish burger production. Tapioca starch, widely produced in Brazil, has desirable characteristics (surface sheen, smooth texture, neutral taste and clarity in solution) for use in MRFM-produced surimi. © 2013 Wiley Periodicals, Inc.
Abstract:
The effects of soybean and castorbean meals were evaluated separately, and in combination at different ratios, as substrates for lipase production by Botryosphaeria ribis EC-01 in submerged fermentation using only distilled water. The addition of analytical-grade glycerol (AG) and crude glycerol (CG) to soybean and castorbean meals, separately and in combination, was also examined for lipase production. Glycerol-AG increased enzyme production, whereas glycerol-CG decreased it. A 2⁴ factorial design was developed to determine the best concentrations of soybean meal, castorbean meal, glycerol-AG and KH2PO4 to optimize lipase production by B. ribis EC-01. Soybean meal and glycerol-AG had a significant effect on lipase production, whereas castorbean meal did not. A second treatment (a 2² central composite factorial design) was developed, and optimal lipase production (4,820 U/g of dry solids content (ds)) was obtained when B. ribis EC-01 was grown on 0.5% (w/v) soybean meal and 5.2% (v/v) glycerol in distilled water, in agreement with the value predicted by the model (4,892 U/g ds). The unit cost of lipase production under the optimized conditions ranged from US$0.42 to 0.44 based on nutrient costs. The fungal lipase was immobilized onto Celite, showed high thermal stability, and was used for the transesterification of soybean oil in methanol (1:3), resulting in a fatty acyl alkyl ester content of 36%. The apparent Km and Vmax were determined to be 1.86 mM and 14.29 μmol min⁻¹ mg⁻¹, respectively. © 2013 Springer Science+Business Media New York.
Abstract:
Image restoration is a research field that attempts to recover a blurred and noisy image. Since the problem can be modeled as a linear system, we propose in this paper to use the meta-heuristic optimization algorithm Harmony Search (HS) to find near-optimal solutions in a Projections Onto Convex Sets-based formulation. The experiments using HS and four of its variants have shown that we can obtain near-optimal restored images faster than with other evolutionary optimization approaches. © 2013 IEEE.
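Harmony Search itself is simple to sketch. The minimizer below uses the standard memory-considering, pitch-adjusting and random-consideration moves; the parameter values are typical defaults rather than those of the paper, and a plain sphere function stands in for the restoration objective.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   n_iter=2000, seed=0):
    """Minimal Harmony Search minimizing f over box bounds.
    hms: harmony memory size; hmcr: memory-considering rate;
    par: pitch-adjusting rate; bw: pitch bandwidth."""
    random.seed(seed)
    rand_x = lambda: [random.uniform(lo, hi) for lo, hi in bounds]
    memory = sorted((rand_x() for _ in range(hms)), key=f)
    for _ in range(n_iter):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:
                x = random.choice(memory)[d]      # draw from harmony memory
                if random.random() < par:
                    x += random.uniform(-bw, bw)  # pitch adjustment
            else:
                x = random.uniform(lo, hi)        # random consideration
            new.append(min(max(x, lo), hi))
        if f(new) < f(memory[-1]):                # replace the worst harmony
            memory[-1] = new
            memory.sort(key=f)
    return memory[0]
```

In the paper's setting, `f` would score a candidate restoration inside the POCS formulation instead of a toy benchmark function.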
Abstract:
Strut-and-tie models are widely used for certain types of reinforced concrete structural elements and for regions with a complex stress state, called D-regions, where the distribution of strains across the cross section is not linear. This paper introduces a numerical technique to determine strut-and-tie models using a variant of classical Evolutionary Structural Optimization called Smooth Evolutionary Structural Optimization. The basic idea of this technique is to identify the numerical flow of stresses generated in the structure, defining the strut-and-tie members in a more technical and rational way, and to quantify their values for subsequent structural design. This paper presents a performance index based on the evolutionary topology optimization method for automatically generating optimal strut-and-tie models in reinforced concrete structures with stress constraints. In the proposed approach, the element with the lowest Von Mises stress is selected for removal, while a performance index is used to monitor the evolutionary optimization process. A comparative analysis of strut-and-tie models for beams is then carried out on examples from the literature, demonstrating the efficiency of this formulation. © 2013 Elsevier Ltd.
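The removal rule and the monitoring index can be sketched in isolation from the finite element analysis that would supply the stresses. The stress values and the index form below are illustrative only; the literature uses several variants of the performance index, and this one is a hypothetical stress-volume ratio.

```python
def eso_removal(stresses, removal_ratio=0.02):
    """One evolutionary step: return the indices of the elements with the
    lowest Von Mises stress, flagged for removal from the ground structure."""
    n_remove = max(1, int(len(stresses) * removal_ratio))
    ranked = sorted(range(len(stresses)), key=lambda i: stresses[i])
    return set(ranked[:n_remove])

def performance_index(stress0, volume0, stress_i, volume_i):
    """Illustrative monitoring index: ratio of the initial stress-volume
    product to the current one; a rising value signals that the remaining
    material is being used more efficiently."""
    return (stress0 * volume0) / (stress_i * volume_i)
```

Iterating removal while the index improves leaves behind the load paths that the strut-and-tie members are then fitted to.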
Abstract:
The femtocell concept aims to combine fixed-line broadband access with mobile telephony through the deployment of low-cost, low-power third- and fourth-generation base stations in subscribers' homes. While the self-configuration of femtocells is a plus, it can limit the quality of service (QoS) for users and reduce the efficiency of the network when based on outdated allocation parameters such as signal power level. To this end, this paper presents a proposal for the optimized allocation of users on a co-channel macro-femto network that enables self-configuration and public access, aiming to maximize the quality of service of applications and to use the available energy more efficiently, in line with the concept of green networking. Thus, when the user needs to make a voice or data call, the mobile phone has to decide which network to connect to, using information on the number of connections, the QoS parameters (packet loss and throughput) and the signal power level of each network. For this purpose, the system is modeled as a Markov Decision Process, which is formulated to obtain an optimal policy that can be applied on the mobile phone. The resulting policy is flexible, allowing different analyses, and adaptive to the specific characteristics defined by the telephone company. The results show that, compared to traditional QoS approaches, the policy proposed here can improve energy efficiency by up to 10%.
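The decision model can be sketched with textbook value iteration. The two-state network-selection instance below is invented for illustration (rewards and transitions are not from the paper); it shows how an optimal connect/stay policy falls out of the MDP formulation.

```python
def value_iteration(states, actions, P, R, gamma=0.9, eps=1e-6):
    """Standard value iteration. P[s][a] is a list of (prob, next_state),
    R[s][a] the immediate reward; returns the value function and policy."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            break
    policy = {s: max(actions,
                     key=lambda a: R[s][a] + gamma *
                     sum(p * V[s2] for p, s2 in P[s][a]))
              for s in states}
    return V, policy

# Hypothetical instance: action 0 = stay on the current network, 1 = switch.
# Rewards loosely stand in for QoS/energy utility on each network.
states, actions = ['macro', 'femto'], [0, 1]
R = {'macro': {0: 0.0, 1: 1.0}, 'femto': {0: 2.0, 1: 0.0}}
P = {'macro': {0: [(1.0, 'macro')], 1: [(1.0, 'femto')]},
     'femto': {0: [(1.0, 'femto')], 1: [(1.0, 'macro')]}}
V, policy = value_iteration(states, actions, P, R)
```

In this toy instance the optimal policy switches to the femtocell and stays there, which mirrors the paper's idea of precomputing a policy the handset simply looks up at call time.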
Abstract:
The Common Reflection Surface (CRS) stacking method simulates zero-offset (ZO) sections from multi-coverage data. For 2D media, the CRS stacking operator depends on three parameters: the emergence angle of the zero-offset central ray (β0), the radius of curvature of the normal-incidence-point wave (RNIP) and the radius of curvature of the normal wave (RN). The crucial problem in implementing the CRS stacking method is determining, from the seismic data, the three optimal parameters associated with each sampling point of the ZO section to be simulated. In the present work, a new processing sequence was developed for simulating ZO sections by means of the CRS stacking method. In this new algorithm, the three optimal parameters defining the CRS stacking operator are determined in three steps: in the first step, two parameters (β°0 and R°NIP) are estimated through a two-dimensional global search in the multi-coverage data. In the second step, the estimated value of β°0 is used to determine the third parameter (R°N) through a one-dimensional global search in the ZO section resulting from the first step. In both steps, the global searches are carried out using the Simulated Annealing (SA) optimization method. In the third step, the three final parameters (β0, RNIP and RN) are determined through a three-dimensional local search, applying the Variable Metric (VM) optimization method to the multi-coverage data and using the parameter triplet (β°0, R°NIP, R°N) estimated in the previous two steps as the initial approximation. In order to correctly simulate events with conflicting dips, this new algorithm provides for the determination of two parameter triplets at sampling points of the ZO section where events intersect.
In other words, at points of the ZO section where two seismic events cross, two CRS parameter triplets are determined and used jointly to simulate the events with conflicting dips. To assess the accuracy and efficiency of the new algorithm, it was applied to synthetic data from two models: one with continuous interfaces and another with a discontinuous interface. The simulated ZO sections have a high signal-to-noise ratio and show a clear definition of the reflected and diffracted events. Comparison of the simulated ZO sections with their counterparts obtained by forward modeling shows a correct simulation of reflections and diffractions. Moreover, comparison of the three optimized parameter values with their exact counterparts computed by forward modeling also reveals a high degree of accuracy. Using the hyperbolic traveltime approximation, but under the condition RNIP = RN, a new algorithm was developed for simulating ZO sections containing predominantly diffracted wavefields. Similarly to the CRS stacking algorithm, this algorithm, called Common Diffraction Surface (CDS) stacking, also uses the SA and VM optimization methods to determine the pair of optimal parameters (β0, RNIP) that define the best CDS stacking operator. In the first step, the SA optimization method is used to determine the initial parameters β°0 and R°NIP using the stacking operator with a large aperture. In the second step, using the estimated values of β°0 and R°NIP, the estimate of the parameter RNIP is improved by applying the VM algorithm to the ZO section resulting from the first step. In the third step, the best values of β°0 and R°NIP are determined by applying the VM algorithm to the multi-coverage data. It is worth noting that the apparent repetition of processes has the effect of progressively attenuating the reflected events.
The application of the CDS stacking algorithm to synthetic data containing reflected and diffracted wavefields produces, as its main result, a simulated ZO section with clearly defined diffracted events. As a direct application of this result to seismic data interpretation, post-stack depth migration of the simulated ZO section produces a section with the correct location of the diffraction points associated with the discontinuities of the model.
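The two-stage search strategy (a Simulated Annealing global stage followed by a local refinement, for which the Variable Metric method is used in the text) can be sketched generically. The objective below is a toy stand-in for the coherence measure evaluated on the seismic data, and the local stage is a simple coordinate descent rather than an actual quasi-Newton update.

```python
import math
import random

def simulated_annealing(f, x0, step, t0=1.0, cooling=0.995, n_iter=2000, seed=0):
    """Global stage: random perturbations accepted with the Metropolis
    criterion at a geometrically cooling temperature; tracks the best point."""
    random.seed(seed)
    x, fx, t = list(x0), f(x0), t0
    best, fbest = list(x0), f(x0)
    for _ in range(n_iter):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = list(cand), fc
        t *= cooling
    return best

def local_refine(f, x0, step=1e-2, n_iter=500):
    """Local stage (stand-in for the Variable Metric method): coordinate-wise
    descent with a shrinking step, started from the SA estimate."""
    x = list(x0)
    for _ in range(n_iter):
        for d in range(len(x)):
            for delta in (step, -step):
                cand = x[:]
                cand[d] += delta
                if f(cand) < f(x):
                    x = cand
        step *= 0.99
    return x
```

In the algorithm described above, the SA stage would search over (β0, RNIP) or (β0, RNIP, RN) against a stacking coherence objective, and the refined triplet would seed the final VM search on the multi-coverage data.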
Abstract:
Graduate Program in Electrical Engineering - FEIS