Abstract:
This paper proposes a novel method for controlling the convergence rate of a particle swarm optimization algorithm using fractional calculus (FC) concepts. The optimization is tested on several well-known functions, and the relationship between the fractional-order velocity and the convergence of the algorithm is observed. FC demonstrates potential for interpreting the evolution of the algorithm and for controlling its convergence.
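As an illustration of the mechanism described, the sketch below folds a fractional-order velocity term of order α into an otherwise standard PSO update, using a Grünwald–Letnikov expansion truncated to the four most recent velocity samples. The coefficient values, swarm settings and sphere benchmark are assumptions of this sketch, not parameters reported in the paper.

```python
import numpy as np

def fractional_pso(f, dim=2, n_particles=20, iters=200, alpha=0.6,
                   c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """PSO sketch where the usual inertia term is replaced by a truncated
    Grunwald-Letnikov fractional derivative of order `alpha` acting on the
    four most recent velocity samples (assumption of this illustration)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((4, n_particles, dim))          # v[0]=v_t, v[1]=v_{t-1}, ...
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()

    # Truncated fractional-difference weights applied to the last four velocities.
    w = np.array([alpha,
                  0.5 * alpha * (1 - alpha),
                  alpha * (1 - alpha) * (2 - alpha) / 6,
                  alpha * (1 - alpha) * (2 - alpha) * (3 - alpha) / 24])

    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        memory = sum(w[k] * v[k] for k in range(4))          # fractional "inertia"
        v_new = memory + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.roll(v, 1, axis=0)                            # shift velocity history
        v[0] = v_new
        x = np.clip(x + v_new, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Sphere function as a stand-in for the paper's benchmark set.
print(fractional_pso(lambda z: float(np.sum(z ** 2))))
```

Lowering α weakens the velocity memory and accelerates convergence, while values closer to 1 preserve exploration, which is the trade-off the paper studies.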
Abstract:
We present the modeling efforts on antenna design and frequency selection to monitor brain temperature during prolonged surgery using noninvasive microwave radiometry. A tapered log-spiral antenna design is chosen for its wideband characteristics that allow higher power collection from deep brain. Parametric analysis with the software HFSS is used to optimize antenna performance for deep brain temperature sensing. Radiometric antenna efficiency (η) is evaluated in terms of the ratio of power collected from the brain to the total power received by the antenna. Anatomical information extracted from several adult computed tomography scans is used to establish design parameters for constructing an accurate layered 3-D tissue phantom. This head phantom includes separate brain and scalp regions, with tissue-equivalent liquids circulating at independent temperatures on either side of an intact skull. The optimized frequency band is 1.1-1.6 GHz, producing an average antenna efficiency of 50.3% from a two-turn log-spiral antenna. The entire sensor package is contained in a lightweight and low-profile 2.8 cm diameter by 1.5 cm high assembly that can be held in place over the skin with an electromagnetic interference shielding adhesive patch. The calculated radiometric equivalent brain temperature tracks within 0.4 °C of the measured brain phantom temperature when the brain phantom is lowered 10 °C and then returned to the original temperature (37 °C) over a 4.6-h experiment. The numerical and experimental results demonstrate that the optimized 2.5-cm log-spiral antenna is well suited for the noninvasive radiometric sensing of deep brain temperature.
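Written out as a formula (notation assumed here, consistent with the description above), the radiometric antenna efficiency is simply the fraction of the received power that originates in the brain:

```latex
\[
  \eta \;=\; \frac{P_{\text{brain}}}{P_{\text{total received}}},
  \qquad
  \bar{\eta} \approx 0.503 \ \text{averaged over the optimized } 1.1\text{--}1.6\ \text{GHz band.}
\]
```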
Abstract:
In order to correctly assess the biaxial fatigue material properties, one must experimentally test different load conditions and stress levels. With the rise of new in-plane biaxial fatigue testing machines, using smaller and more efficient electrical motors instead of the conventional hydraulic machines, it is necessary to reduce the specimen size and to ensure that the specimen geometry is appropriate for the installed load capacity. At present there are no standard specimen geometries, and the indications in the literature on how to design an efficient test specimen are insufficient. The main goal of this paper is to present a methodology for obtaining an optimal cruciform specimen geometry, with thickness reduction in the gauge area, appropriate for fatigue crack initiation, as a function of the base material sheet thickness used to build the specimen. The geometry is optimized for maximum stress using several parameters, ensuring that in the gauge area the stress is uniform and maximum under two limit phase-shift loading conditions. Therefore, fatigue damage will always initiate at the center of the specimen, avoiding failure outside this region. Using the Renard series of preferred numbers for the base material sheet thickness as a reference, the remaining geometry parameters are optimized using a derivative-free methodology, the direct multi-search (DMS) method. The final optimal geometry as a function of the base material sheet thickness is proposed as a guideline for cruciform specimen design, and as a possible contribution to a future standard on in-plane biaxial fatigue tests. © 2014, Gruppo Italiano Frattura. All rights reserved.
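The work relies on the direct multi-search (DMS) method; its full multi-objective machinery is beyond a short excerpt, but the sketch below shows the derivative-free poll-and-contract step on which this family of methods is built, applied to a hypothetical two-parameter stand-in for the geometry objective (the objective and parameter names are illustrative, not the authors' formulation).

```python
import numpy as np

def pattern_search(f, x0, step=0.5, tol=1e-6, max_iter=500):
    """Minimal derivative-free poll step: try +/- step along each coordinate,
    accept the first improvement, and halve the step when no direction improves.
    (Illustrates the poll step shared by direct-search methods such as DMS;
    this is not the authors' multi-objective implementation.)"""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:                      # accept an improving poll point
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                          # contract the mesh
        it += 1
    return x, fx

# Hypothetical 2-parameter geometry objective (e.g. fillet radius, gauge thickness).
obj = lambda p: (p[0] - 1.2) ** 2 + 10 * (p[1] - 0.4) ** 2
print(pattern_search(obj, x0=[2.0, 1.0]))
```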
Abstract:
Dissertation presented in partial fulfillment of the requirements for the degree of Master in Biotechnology
Abstract:
As is well known, competitive electricity markets require new computing tools for power companies operating in retail markets, in order to enhance the management of their energy resources. In recent years, renewable penetration in micro-generation has increased and begun to coexist with the other existing power generation, giving rise to a new type of consumer. This paper develops a methodology to be applied to the management of all the aggregators. The aggregator establishes bilateral contracts with its clients in which the energy purchase and selling conditions are negotiated, not only in terms of prices but also of other conditions that allow more flexibility in the way generation and consumption are addressed. The aggregator agent needs a tool to support decision making in order to compose and select its customers' portfolio in an optimal way, for a given level of profitability and risk.
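As a toy illustration of the portfolio decision described, the sketch below enumerates customer subsets and keeps the most profitable one whose aggregated risk stays below a target level; the client names, profit figures and additive risk model are hypothetical, not taken from the paper.

```python
from itertools import combinations

# Hypothetical candidate clients: (name, expected_profit, risk_score).
clients = [("A", 120.0, 0.30), ("B", 80.0, 0.10),
           ("C", 200.0, 0.55), ("D", 60.0, 0.05)]

def best_portfolio(clients, risk_cap):
    """Brute-force selection: maximize total expected profit subject to a cap
    on the summed risk scores (a deliberately simple risk model)."""
    best, best_profit = (), 0.0
    for r in range(1, len(clients) + 1):
        for subset in combinations(clients, r):
            profit = sum(c[1] for c in subset)
            risk = sum(c[2] for c in subset)
            if risk <= risk_cap and profit > best_profit:
                best, best_profit = subset, profit
    return [c[0] for c in best], best_profit

print(best_portfolio(clients, risk_cap=0.5))   # -> (['A', 'B', 'D'], 260.0)
```

In practice the enumeration would be replaced by a scalable optimization model, but the trade-off between expected profit and a risk limit is the same.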
Abstract:
Electricity markets are nowadays a reality throughout much of the world. However, there is no consensus on the regulatory model to adopt, which leads to the use of different models in the various countries that have started the liberalization and restructuring of the electricity sector. Since electric energy cannot be stored, at least in large quantities, these countries face important questions related to the actual management of their electric power systems. Those questions require the adoption of rules, imposed by the regulator, that allow them to be overcome. This work presents a study of the electricity markets existing around the world that the author considered most important. A study of optimization tools was also carried out, essentially based on meta-heuristics applied to problems related to market operation and to electric power systems, such as the Economic Dispatch problem. An application was developed that simulates the operation of a market under the Symmetric Pool model, in which selling and buying offers of electric energy are submitted by producers, on one side, and by retailers, eligible consumers or financial intermediaries, on the other, and the technical feasibility of the Provisional Dispatch is analyzed. The technical feasibility of the Provisional Dispatch is verified through the DC power flow model. When the Provisional Dispatch is infeasible, due to violation of the problem constraints, corrective measures to that dispatch are determined, based on the submitted offers, by resorting to an Optimal Dispatch. The Genetic Algorithms meta-heuristic was used to determine the Optimal Dispatch. The application was developed in MATLAB using the Graphical User Interfaces tool. The test network used was the 14-bus network of the Institute of Electrical and Electronics Engineers (IEEE). The application proves competent in simulating a market operating as a Symmetric Pool, where simple offers are submitted and transactions occur in the day-ahead market; however, it does not reflect the full real-world problem associated with this type of market. It is therefore a basic simulator of an energy market whose operating model is based on the Symmetric Pool.
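As a minimal numerical illustration of the DC power flow check used to test the feasibility of the provisional dispatch, the sketch below solves B'θ = P with the slack angle fixed at zero and compares line flows with thermal limits on a toy 3-bus network; the data are placeholders, not the IEEE 14-bus case used in the work.

```python
import numpy as np

# Toy 3-bus network: lines as (from, to, reactance_pu, thermal_limit_pu).
lines = [(0, 1, 0.1, 1.0), (1, 2, 0.2, 1.0), (0, 2, 0.2, 1.0)]
p_injection = np.array([1.5, -0.5, -1.0])   # provisional dispatch: generation at bus 0

def dc_power_flow(lines, p, slack=0):
    """DC model: solve B' * theta = P with the slack angle fixed at 0,
    then compute line flows F_ij = (theta_i - theta_j) / x_ij."""
    n = len(p)
    B = np.zeros((n, n))
    for i, j, x, _ in lines:
        B[i, i] += 1 / x; B[j, j] += 1 / x
        B[i, j] -= 1 / x; B[j, i] -= 1 / x
    keep = [k for k in range(n) if k != slack]
    theta = np.zeros(n)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], p[keep])
    return [((i, j), (theta[i] - theta[j]) / x, limit) for i, j, x, limit in lines]

for (i, j), flow, limit in dc_power_flow(lines, p_injection):
    status = "OK" if abs(flow) <= limit else "OVERLOAD -> corrective (optimal) dispatch"
    print(f"line {i}-{j}: flow {flow:+.3f} pu ({status})")
```

When a flow exceeds its limit, the simulator described above would discard the provisional dispatch and search for a corrective optimal dispatch with the Genetic Algorithm.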
Abstract:
The aim of this study is to optimize the heat flow through the pultrusion die assembly system in the manufacturing process of a specific glass-fiber reinforced polymer (GFRP) pultrusion profile. The control of heat flow and its distribution through the whole die assembly system is of vital importance in optimizing the actual GFRP pultrusion process. Through mathematical modeling of the heating-die process by means of a Finite Element Analysis (FEA) program, an optimum heater selection, die position and temperature control were achieved. The thermal environment within the die was critically modeled relative not only to the applied heat sources, but also to the conductive and convective losses, as well as the thermal contribution arising from the exothermic reaction of the resin matrix as it cures or polymerizes from the liquid to the solid condition. The numerical simulation was validated against thermographic measurements carried out at key points along the die during the pultrusion process.
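The full 3-D FEA die model cannot be reproduced in a short excerpt, but the one-dimensional explicit finite-difference sketch below illustrates the kind of thermal balance it solves: conduction from an imposed heater temperature plus a simplified exothermic source from the curing resin. All material values are placeholders, not the actual die or resin properties.

```python
import numpy as np

# Placeholder material data (not the actual die/resin properties).
L, n = 0.05, 51                   # 5 cm section discretized with 51 nodes
k, rho, cp = 0.8, 1800.0, 1200.0  # conductivity (W/mK), density (kg/m3), specific heat (J/kgK)
q_exo = 2.0e4                     # simplified exothermic heat release, W/m3
dx = L / (n - 1)
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / alpha          # respects the explicit stability limit

T = np.full(n, 25.0)              # initial temperature, degC
T_heater = 160.0                  # imposed heater-side temperature

for _ in range(2000):
    T[0] = T_heater                               # heater boundary condition
    T[-1] = T[-2]                                 # insulated far end
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2  # interior second difference
    T[1:-1] += dt * (alpha * lap + q_exo / (rho * cp))

print(f"peak temperature in the section: {T.max():.1f} degC")
```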
Abstract:
Glass fibre-reinforced plastics (GFRP), nowadays commonly used in the construction, transportation and automobile sectors, have been considered inherently difficult to recycle due both to the cross-linked nature of thermoset resins, which cannot be remolded, and to the complex composition of the composite itself, which includes glass fibres, matrix and different types of inorganic fillers. Presently, most GFRP waste is landfilled, leading to negative environmental impacts and supplementary added costs. With increasing awareness of environmental matters and the subsequent desire to save resources, recycling would convert an expensive waste disposal into a profitable reusable material. There are several methods to recycle GFRP thermoset materials: (a) incineration, with partial energy recovery due to the heat generated during combustion of the organic part; (b) thermal and/or chemical recycling, such as solvolysis, pyrolysis and similar thermal decomposition processes, with glass fibre recovery; and (c) mechanical recycling or size reduction, in which the material is subjected to a milling process in order to obtain a specific grain size that makes the material suitable as reinforcement in new formulations. This last method has important advantages over the previous ones: there is no atmospheric pollution by gas emission, much simpler equipment is required compared with the ovens necessary for thermal recycling processes, and it does not require the use of chemical solvents with their subsequent environmental impacts. In this study, the effect of incorporating recycled GFRP waste materials, obtained by means of milling processes, on the mechanical behavior of polyester polymer mortars was assessed. For this purpose, different contents of recycled GFRP waste materials, with distinct size gradings, were incorporated into polyester polymer mortars as sand aggregate and filler replacements. The effect of GFRP waste treatment with a silane coupling agent was also assessed. Design of experiments and data treatment were accomplished by means of factorial design and analysis of variance (ANOVA). The use of factorial experimental design, instead of the one-factor-at-a-time method, is efficient in allowing the evaluation of the effects and possible interactions of the different material factors involved. Experimental results were promising regarding the recyclability of GFRP waste materials as polymer mortar aggregates, without significant loss of mechanical properties with respect to non-modified polymer mortars.
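As a minimal illustration of the ANOVA step in the analysis described, the sketch below tests whether the GFRP waste-content level has a significant effect on a mechanical response; the strength values and content levels are placeholders, not the study's measurements.

```python
from scipy.stats import f_oneway

# Placeholder flexural-strength results (MPa) for three hypothetical waste-content levels.
strength_0pct = [22.1, 21.8, 23.0, 22.4]
strength_4pct = [21.6, 22.0, 21.2, 21.9]
strength_8pct = [20.3, 19.8, 20.9, 20.5]

# One-way ANOVA: does waste content change the mean response?
F, p = f_oneway(strength_0pct, strength_4pct, strength_8pct)
print(f"F = {F:.2f}, p = {p:.4f}")   # a small p-value indicates a significant effect
```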
Abstract:
IEEE International Symposium on Circuits and Systems, pp. 724-727, Seattle, USA
Abstract:
Manufacturing processes permanently need to innovate and optimize, since all of them are susceptible to continuous improvement. Innovation and commitment to the development of these new solutions result from existing expertise and from the continuing need to increase productivity and flexibility while ensuring the necessary quality of the manufactured products. To increase flexibility, it is necessary to significantly reduce set-up times and lead time in order to ensure ever faster delivery of products. This objective can be achieved through the standardization of the pultrusion line elements, which implicitly also increases productivity. This work is intended to optimize the pultrusion process of structural profiles. We consider all elements of the system, from the fiber storehouse (rack) to the pultrusion die. Particular attention was devoted to (a) the guidance system of the fibers and webs, (b) the resin container where the fibers are impregnated, (c) the standard plates positioning the fibers towards the entrance of the spinneret, and (d) the whole process of assembling and fixing the die, as well as its heating system, which was also reviewed. With the implementation of these new systems, a significant saving in set-up time was achieved and the unit costs of production were clearly reduced. Quality assurance was also improved.
Abstract:
The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach to energy resource management. Energy resource management aims to obtain the optimal scheduling of the available resources, considering distributed generators, storage units, demand response and EVs. The large number of resources increases the complexity of the energy resource management, which can take several hours to reach the optimal solution, whereas a quick solution is required for the next day. Therefore, it is necessary to use adequate optimization techniques to determine the best solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines simulated annealing (SA) and ant colony optimization (ACO) techniques. The case study concerns different EV penetration levels. Comparisons with a previous SA approach and a deterministic technique are also presented. For the 2000 EVs scenario, the proposed hybrid approach found a better solution than the previous SA version, resulting in a cost reduction of 1.94%. For this scenario, the proposed approach is approximately 94 times faster than the deterministic approach.
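A compact sketch of how an SA loop can be hybridized with ACO-style pheromone memory is given below: candidate moves are sampled with probabilities proportional to pheromone deposited on previously accepted choices, and accepted by the usual Metropolis rule. The toy unit-to-hour assignment problem and every parameter are illustrative; this is not the paper's EV scheduling model.

```python
import math, random

random.seed(1)
n_hours, n_units = 6, 3
cost = [[random.uniform(10, 50) for _ in range(n_units)] for _ in range(n_hours)]

def total_cost(sched):
    return sum(cost[h][u] for h, u in enumerate(sched))

# Pheromone matrix: tau[h][u] grows whenever unit u is accepted for hour h.
tau = [[1.0] * n_units for _ in range(n_hours)]

def pheromone_biased_neighbor(sched):
    """ACO-style move: pick an hour, then redraw its unit with probability
    proportional to the pheromone on that (hour, unit) pair."""
    h = random.randrange(n_hours)
    r, acc = random.uniform(0, sum(tau[h])), 0.0
    new = sched[:]
    for u in range(n_units):
        acc += tau[h][u]
        if r <= acc:
            new[h] = u
            break
    return new

current = [random.randrange(n_units) for _ in range(n_hours)]
best, temp = current[:], 50.0
for _ in range(3000):
    cand = pheromone_biased_neighbor(current)
    delta = total_cost(cand) - total_cost(current)
    if delta < 0 or random.random() < math.exp(-delta / temp):   # SA acceptance rule
        current = cand
        if total_cost(current) < total_cost(best):
            best = current[:]
        for h, u in enumerate(current):                          # reinforce pheromone
            tau[h][u] += 1.0 / (1.0 + total_cost(current))
    temp *= 0.999                                                # geometric cooling
print("best schedule:", best, "cost:", round(total_cost(best), 2))
```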
Abstract:
Smart grids with an intensive penetration of distributed energy resources will play an important role in future power system scenarios. The intermittent nature of renewable energy sources brings new challenges, requiring an efficient management of those sources. Additional storage resources can be beneficially used to address this problem; the massive use of electric vehicles, particularly vehicle-to-grid (usually referred to as gridable vehicles or V2G), becomes a very relevant issue. This paper addresses the impact of Electric Vehicles (EVs) on system operation costs and on the power demand curve for a distribution network with large penetration of Distributed Generation (DG) units. An efficient management methodology for EV charging and discharging is proposed, considering a multi-objective optimization problem. The main goals of the proposed methodology are to minimize the system operation costs and to minimize the difference between the minimum and maximum system demand (leveling the power demand curve). The proposed methodology performs the day-ahead scheduling of distributed energy resources in a distribution network with high penetration of DG and a large number of electric vehicles. A 32-bus distribution network is used in the case study, considering different scenarios of EV penetration to analyze their impact on the network and on the management of the other energy resources.
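As an illustrative (not the authors') treatment of the two objectives, the sketch below greedily places EV charging blocks hour by hour, scoring each candidate hour with a weighted sum of the energy cost and the resulting peak-to-valley spread of the demand curve; the load profile, prices and weights are placeholders.

```python
import numpy as np

hours = 24
base_load = 50 + 20 * np.sin(np.linspace(0, 2 * np.pi, hours, endpoint=False))  # MW, illustrative
price = 30 + 25 * (base_load - base_load.min()) / np.ptp(base_load)             # EUR/MWh, follows load
ev_blocks = [1.0] * 40        # forty 1-MW charging blocks to place over the day
w_cost, w_level = 0.5, 0.5    # weights absorb the different units of the two terms

load = base_load.copy()
for block in ev_blocks:
    scores = []
    for h in range(hours):
        trial = load.copy()
        trial[h] += block
        cost_term = price[h] * block      # objective 1: energy cost of this placement
        level_term = np.ptp(trial)        # objective 2: peak-to-valley spread
        scores.append(w_cost * cost_term + w_level * level_term)
    load[int(np.argmin(scores))] += block  # place the block in the best-scoring hour

print(f"demand spread: {np.ptp(base_load):.1f} MW before, {np.ptp(load):.1f} MW after EV placement")
```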
Abstract:
Trihalomethanes (THMs) are widely referred to and studied as disinfection by-products (DBPs). The THMs most commonly detected are chloroform (TCM), bromodichloromethane (BDCM), chlorodibromomethane (CDBM), and bromoform (TBM). Several studies regarding the determination of THMs in swimming pool water and air samples have been published. This paper reviews the most recent work in this field, with a special focus on water and air sampling, sample preparation and analytical determination methods. An experimental study was developed in order to optimize the headspace solid-phase microextraction (HS-SPME) conditions of TCM, BDCM, CDBM and TBM from water samples using a 2³ factorial design. An extraction temperature of 45 °C, for 25 min, and a desorption time of 5 min were found to be the best conditions. Analysis was performed by gas chromatography with an electron capture detector (GC-ECD). The method was successfully applied to a set of 27 swimming pool water samples collected in the Oporto area (Portugal). TCM was the only THM detected, with levels between 4.5 and 406.5 μg L−1. Four of the samples exceeded the guideline value for total THMs in swimming pool water (100 μg L−1) indicated by the Portuguese Health Authority.
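The 2³ full-factorial design behind the HS-SPME optimization can be generated and analysed as in the sketch below; the factor names follow the abstract, while the coded levels and the response values are placeholders, not measured data.

```python
from itertools import product

# Factors of the HS-SPME optimization (levels are placeholder values; the
# reported optimum was 45 degC, 25 min extraction, 5 min desorption).
factors = {
    "extraction_temp_C": (35, 45),
    "extraction_time_min": (15, 25),
    "desorption_time_min": (5, 10),
}

# The eight runs of the 2^3 design, coded as -1/+1.
runs = list(product((-1, +1), repeat=3))

# Placeholder peak-area responses, one per run (NOT measured data).
response = [102, 118, 110, 131, 98, 140, 120, 155]

def main_effect(factor_index):
    """Average response at the +1 level minus average response at the -1 level."""
    hi = [r for run, r in zip(runs, response) if run[factor_index] == +1]
    lo = [r for run, r in zip(runs, response) if run[factor_index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for i, name in enumerate(factors):
    print(f"main effect of {name}: {main_effect(i):+.1f}")
```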
Abstract:
This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological and ecotoxicological data (with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) obtained from the analysis of water samples collected monthly at seven monitoring sites during five campaigns (February, May, June, August, and September 2006). The results of some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to classifying the Leça River water quality as "bad" or "very bad". PCA and CA identified monitoring sites with similar pollution patterns, distinguishing site 1 (located in the upstream stretch of the river) from all the other sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful to optimize water quality monitoring networks, defining the minimum number of sites and their location. Thus, these tools can support appropriate management decisions.
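The PCA-plus-cluster-analysis workflow described can be sketched as below, on a made-up sites-by-variables matrix: the data are standardized, two principal components are extracted with scikit-learn, and the sites are grouped by Ward hierarchical clustering with SciPy (all values are placeholders, not the Leça River data).

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
# Placeholder matrix: 7 monitoring sites x 6 physicochemical/bacteriological variables.
X = rng.normal(size=(7, 6))
X[0] -= 2.0          # make "site 1" deliberately distinct, as reported in the study

Xs = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)
print("variance explained by PC1/PC2:", np.round(pca.explained_variance_ratio_, 2))

# Agglomerative (Ward) clustering of the sites on the standardized data.
Z = linkage(Xs, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
for site, lab in enumerate(labels, start=1):
    print(f"site {site}: cluster {lab}")
```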
Abstract:
Coffee silverskin is a major roasting by-product that could be valued as a source of antioxidant compounds. The effect of the major variables (solvent polarity, temperature and extraction time) affecting the extraction yields of bioactive compounds and the antioxidant activity of silverskin extracts was evaluated. The extract composition varied significantly with the extraction conditions used. A factorial experimental design showed that the use of a hydroalcoholic solvent (50%:50%) at 40 °C for 60 min is a sustainable option to maximize the extraction yield of bioactive compounds and the antioxidant capacity of extracts. Using this set of conditions it was possible to obtain extracts containing total phenolics (302.5 ± 7.1 mg GAE/L), tannins (0.43 ± 0.06 mg TAE/L), and flavonoids (83.0 ± 1.4 mg ECE/L), exhibiting DPPH radical scavenging activity (326.0 ± 5.7 mg TE/L) and ferric reducing antioxidant power (1791.9 ± 126.3 mg SFE/L). Compared with other conditions that were "more effective" for some individual parameters, this set of conditions allowed a cost reduction, saving time and energy.