965 results for Process optimization
Abstract:
This paper proposes a novel method for controlling the convergence rate of a particle swarm optimization algorithm using fractional calculus (FC) concepts. The optimization is tested on several well-known functions, and the relationship between the fractional-order velocity and the convergence of the algorithm is observed. FC shows potential for interpreting the evolution of the algorithm and for controlling its convergence.
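As a rough illustration of the idea, the sketch below combines the standard PSO update with a fractional-order velocity term approximated by a truncated Grünwald–Letnikov expansion over the last few iterations. The truncation depth, the test function (sphere) and all coefficients are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def fractional_pso(f, dim=5, n_particles=20, iters=200, alpha=0.6,
                   c1=1.5, c2=1.5, seed=0):
    """Minimize f with PSO whose inertia term is replaced by a truncated
    Grunwald-Letnikov expansion of order alpha over the last four velocities."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    hist = [np.zeros((n_particles, dim)) for _ in range(4)]   # v[t], v[t-1], ...
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()

    # Binomial weights of the truncated fractional derivative (order alpha)
    w = [alpha,
         0.5 * alpha * (1 - alpha),
         (1 / 6) * alpha * (1 - alpha) * (2 - alpha),
         (1 / 24) * alpha * (1 - alpha) * (2 - alpha) * (3 - alpha)]

    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        memory = sum(wk * vk for wk, vk in zip(w, hist))       # fractional "inertia"
        v = memory + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        hist = [v] + hist[:-1]
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

if __name__ == "__main__":
    best, val = fractional_pso(lambda z: float(np.sum(z ** 2)))
    print("best value found on the sphere function:", val)
```

Larger values of alpha weight recent velocities more heavily; in this sketch the order therefore acts as the knob that trades exploration against convergence speed.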
Abstract:
We present modeling efforts on antenna design and frequency selection to monitor brain temperature during prolonged surgery using noninvasive microwave radiometry. A tapered log-spiral antenna design is chosen for its wideband characteristics, which allow higher power collection from the deep brain. Parametric analysis with the software HFSS is used to optimize antenna performance for deep brain temperature sensing. Radiometric antenna efficiency (eta) is evaluated as the ratio of power collected from the brain to the total power received by the antenna. Anatomical information extracted from several adult computed tomography scans is used to establish design parameters for constructing an accurate layered 3-D tissue phantom. This head phantom includes separate brain and scalp regions, with tissue-equivalent liquids circulating at independent temperatures on either side of an intact skull. The optimized frequency band is 1.1-1.6 GHz, producing an average antenna efficiency of 50.3% from a two-turn log-spiral antenna. The entire sensor package is contained in a lightweight and low-profile assembly, 2.8 cm in diameter and 1.5 cm high, that can be held in place over the skin with an electromagnetic interference shielding adhesive patch. The calculated radiometric equivalent brain temperature tracks within 0.4 degrees C of the measured brain phantom temperature when the brain phantom is lowered 10 degrees C and then returned to the original temperature (37 degrees C) over a 4.6-h experiment. The numerical and experimental results demonstrate that the optimized 2.5-cm log-spiral antenna is well suited for noninvasive radiometric sensing of deep brain temperature.
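As a back-of-the-envelope illustration of how the reported efficiency figure could be used, the snippet below assumes a simple two-compartment weighting model in which the radiometric temperature is eta times the brain temperature plus (1 - eta) times the superficial tissue temperature; this weighting model is an assumption for illustration and is not the weighting used in the paper.

```python
# Illustrative two-compartment radiometric model (an assumption, not the paper's model):
#   T_radiometric = eta * T_brain + (1 - eta) * T_superficial
def brain_temperature(t_radiometric, t_superficial, eta=0.503):
    """Invert the simple weighting model to estimate deep-brain temperature."""
    return (t_radiometric - (1.0 - eta) * t_superficial) / eta

# Example: radiometer reads 35.2 C while the scalp-side liquid is at 33.0 C
print(round(brain_temperature(35.2, 33.0), 2), "degrees C (estimated brain)")
```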
Abstract:
In order to correctly assess biaxial fatigue material properties, one must experimentally test different load conditions and stress levels. With the rise of new in-plane biaxial fatigue testing machines, which use smaller and more efficient electrical motors instead of the conventional hydraulic machines, it is necessary to reduce the specimen size and to ensure that the specimen geometry is appropriate for the installed load capacity. At present there are no standard specimen geometries, and the indications in the literature on how to design an efficient test specimen are insufficient. The main goal of this paper is to present a methodology for obtaining an optimal cruciform specimen geometry, with thickness reduction in the gauge area, appropriate for fatigue crack initiation, as a function of the base material sheet thickness used to build the specimen. The geometry is optimized for maximum stress using several parameters, ensuring that the stress in the gauge area is uniform and maximum under two limit phase-shift loading conditions. Therefore, fatigue damage will always initiate at the center of the specimen, avoiding failure outside this region. Using the Renard series of preferred numbers for the base material sheet thickness as a reference, the remaining geometry parameters are optimized using a derivative-free methodology, the direct multisearch (DMS) method. The final optimal geometry as a function of the base material sheet thickness is proposed as a guideline for cruciform specimen design and as a possible contribution to a future standard on in-plane biaxial fatigue tests. © 2014, Gruppo Italiano Frattura. All rights reserved.
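The paper couples DMS with finite-element evaluations of the specimen, which cannot be reproduced here; the sketch below only illustrates the derivative-free flavour of such a search, using a simple coordinate (pattern) search over a few geometry parameters and a hypothetical `gauge_stress` stub standing in for the FE model. The parameter names, bounds and surrogate objective are invented for illustration, and the real DMS method is multiobjective and considerably more elaborate.

```python
import itertools

def gauge_stress(params):
    """Hypothetical stand-in for the FE evaluation of stress in the gauge area.
    In the real workflow this would run a finite-element analysis."""
    fillet_r, gauge_d, gauge_t = params
    # Arbitrary smooth surrogate with an interior optimum (illustration only).
    return -(fillet_r - 4.0) ** 2 - (gauge_d - 15.0) ** 2 - 50.0 * (gauge_t - 0.6) ** 2

def coordinate_search(x0, step=1.0, tol=1e-3, max_iter=200):
    """Derivative-free poll step: try +/- step along each parameter, keep improvements."""
    x, fx = list(x0), gauge_stress(x0)
    for _ in range(max_iter):
        improved = False
        for i, sign in itertools.product(range(len(x)), (+1, -1)):
            trial = list(x)
            trial[i] += sign * step
            ft = gauge_stress(trial)
            if ft > fx:
                x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5            # shrink the mesh, as pattern-search methods do
            if step < tol:
                break
    return x, fx

best, stress = coordinate_search([2.0, 10.0, 1.0])
print("optimized (fillet radius, gauge diameter, gauge thickness):", best)
```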
Abstract:
Meshless methods are used for their capability of producing excellent solutions without requiring a mesh, thus avoiding the mesh-related problems encountered in other numerical methods, such as finite elements. However, node placement is still an open question, especially in strong-form collocation meshless methods. The number of nodes used can have a large influence on matrix size and can therefore produce ill-conditioned matrices. In order to optimize node position and number, a direct multisearch technique for multiobjective optimization is used to optimize the node distribution in the global collocation method with radial basis functions. The optimization method is applied to the bending of isotropic simply supported plates. Using a uniformly distributed grid as the starting condition, the results show that the method is capable of reducing the number of nodes in the grid without compromising the accuracy of the solution. (C) 2013 Elsevier Ltd. All rights reserved.
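The plate-bending model itself is not reproduced here; as a minimal illustration of the kind of strong-form RBF collocation being optimized, and of the ill-conditioning that motivates limiting the number of nodes, the sketch below solves a 1-D Poisson problem with multiquadric RBFs and reports the condition number of the collocation matrix. The node counts and shape parameter are arbitrary choices for illustration.

```python
import numpy as np

def rbf_collocation_1d(n_nodes=25, c=0.2):
    """Solve u''(x) = -pi^2 sin(pi x), u(0)=u(1)=0, by global multiquadric collocation."""
    x = np.linspace(0.0, 1.0, n_nodes)
    r2 = (x[:, None] - x[None, :]) ** 2
    phi = np.sqrt(r2 + c ** 2)                # multiquadric RBF
    d2phi = c ** 2 / (r2 + c ** 2) ** 1.5     # its second derivative in x
    A = d2phi.copy()
    rhs = -np.pi ** 2 * np.sin(np.pi * x)
    A[0], A[-1] = phi[0], phi[-1]             # Dirichlet rows at the two boundary nodes
    rhs[0] = rhs[-1] = 0.0
    coeffs = np.linalg.solve(A, rhs)
    u = phi @ coeffs                          # solution at the nodes
    err = np.max(np.abs(u - np.sin(np.pi * x)))
    return err, np.linalg.cond(A)

for n in (15, 25, 50):
    err, cond = rbf_collocation_1d(n)
    print(f"{n:3d} nodes: max error {err:.2e}, condition number {cond:.1e}")
```

Increasing the node count enlarges the (dense) collocation matrix and tends to worsen its conditioning, which is exactly the trade-off the node-distribution optimization in the paper addresses.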
Abstract:
Dissertation presented in partial fulfillment of the requirements for the degree of Master in Biotechnology
Abstract:
This paper introduces a new unsupervised hyperspectral unmixing method conceived for linear but highly mixed hyperspectral data sets, in which the simplex of minimum volume, usually estimated by purely geometrically based algorithms, is far away from the true simplex associated with the endmembers. The proposed method, an extension of our previous studies, resorts to a statistical framework. The abundance fraction prior is a mixture of Dirichlet densities, thus automatically enforcing the constraints on the abundance fractions imposed by the acquisition process, namely nonnegativity and sum-to-one. A cyclic minimization algorithm is developed in which 1) the number of Dirichlet modes is inferred based on the minimum description length principle; 2) a generalized expectation-maximization algorithm is derived to infer the model parameters; and 3) a sequence of augmented Lagrangian-based optimizations is used to compute the signatures of the endmembers. Experiments on simulated and real data are presented to show the effectiveness of the proposed algorithm in unmixing problems beyond the reach of geometrically based state-of-the-art competitors.
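A minimal sketch of the generative model assumed by such an approach: abundance vectors drawn from a mixture of Dirichlet densities (so they are automatically nonnegative and sum to one) and mixed linearly with the endmember signatures plus noise. All dimensions, mode weights and Dirichlet parameters below are invented for illustration, and the inference machinery (MDL model selection, GEM, augmented Lagrangian steps) is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands, n_endmembers, n_pixels = 50, 3, 1000

M = rng.uniform(0.0, 1.0, (n_bands, n_endmembers))          # endmember signatures

# Mixture of two Dirichlet modes over the abundance simplex (illustrative parameters)
mode_weights = [0.7, 0.3]
dirichlet_params = [np.array([8.0, 2.0, 2.0]),               # mode concentrated near endmember 1
                    np.array([1.0, 1.0, 6.0])]               # mode concentrated near endmember 3

modes = rng.choice(len(mode_weights), size=n_pixels, p=mode_weights)
A = np.stack([rng.dirichlet(dirichlet_params[k]) for k in modes])   # abundances (pixels x p)

Y = A @ M.T + 0.005 * rng.standard_normal((n_pixels, n_bands))      # linear mixing + noise

# The Dirichlet prior guarantees the physical constraints by construction:
print("nonnegative:", bool(A.min() >= 0.0), "| sums to one:", bool(np.allclose(A.sum(axis=1), 1.0)))
```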
Abstract:
Informal learning is becoming more and more important: nowadays people learn more this way, through the Internet, than in schools or formal training. However, they do not receive any certificates attesting to this, so they cannot show an employer, teacher, etc. that they have learned something. The TRAILER project aims to solve this problem by developing a dedicated tool for managing all competences and skills acquired through informal learning experiences, both from the perspective of the user and from the perspective of an institution or company. We will present the IT tool to show how people can make their informal learning outcomes visible. TRAILER helps users gather all information about the process and outcomes of their informal learning. Users can share this with friends, colleagues, employers, teachers and so on, and can create an interactive e-portfolio which can be attached to their CV, cover letter, a knowledge management system, etc. After the presentation of the tool we will discuss possible areas and fields in which to use it, other ways participants could use it, and other needs in this area. We also want to discuss other problems in the informal learning process, ways to solve them, and ideas for other IT tools that could help with informal learning. During the discussion we will use an interactive response system that runs on mobile devices, making it possible for participants to share their opinions individually before seeing other participants' opinions.
Abstract:
Swarm intelligence (SI) is the property of a system whereby the collective behaviors of (unsophisticated) agents interacting locally with their environment cause coherent functional global patterns to emerge. Particle swarm optimization (PSO) is a form of SI and a population-based search algorithm that is initialized with a population of random solutions, called particles. These particles fly through hyperspace and have two essential reasoning capabilities: their memory of their own best position and knowledge of the swarm's best position. In a PSO scheme, each particle flies through the search space with a velocity that is adjusted dynamically according to its historical behavior; therefore, the particles tend to fly towards the best search area along the search process. This work proposes a PSO-based algorithm for logic circuit synthesis. The results show the statistical characteristics of this algorithm with respect to the number of generations required to achieve the solutions. A comparison with two other evolutionary algorithms, namely genetic and memetic algorithms, is also presented.
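For reference, the velocity and position updates described informally above are usually written as follows (standard PSO formulation; the inertia weight and acceleration coefficients are generic, not values taken from this work):

```latex
v_i^{t+1} = w\, v_i^{t} + c_1 r_1 \left(p_i - x_i^{t}\right) + c_2 r_2 \left(g - x_i^{t}\right),
\qquad
x_i^{t+1} = x_i^{t} + v_i^{t+1}
```

where p_i is particle i's best position so far, g is the swarm's best position, and r_1, r_2 are uniform random numbers in [0, 1] drawn at each iteration.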
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems
Abstract:
Dissertation submitted to the Faculdade de Ciências e Tecnologia of Universidade Nova de Lisboa to obtain the Integrated Master's degree in Industrial Management Engineering
Abstract:
Wythoff Queens is a classical combinatorial game related to very interesting mathematical results. An amazing one is the fact that the P-positions are given by (⌊φn⌋, ⌊φ²n⌋) and (⌊φ²n⌋, ⌊φn⌋), where φ = (1+√5)/2. In this paper, we analyze a different version where one player (Left) plays with a chess bishop and the other (Right) plays with a chess knight. The new game (call it Chessfights) lacks a Beatty sequence structure in its P-positions, unlike Wythoff Queens. However, it is possible to formulate and prove a general recursive law for the P-positions, which is a particular case of a Partizan Subtraction game.
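The Beatty-sequence characterization quoted above can be checked numerically; the short script below builds the first Wythoff pairs by the usual recursion (smallest unused natural number, offset by n) and compares them with (⌊φn⌋, ⌊φ²n⌋). This is a sanity check of the classical result, not part of the paper.

```python
import math

phi = (1 + math.sqrt(5)) / 2

def wythoff_pairs(count):
    """First `count` P-positions built recursively: a_n = smallest unused, b_n = a_n + n."""
    used, pairs = set(), []
    for n in range(1, count + 1):
        a = 1
        while a in used:
            a += 1
        b = a + n
        used.update((a, b))
        pairs.append((a, b))
    return pairs

recursive = wythoff_pairs(20)
formula = [(math.floor(phi * n), math.floor(phi ** 2 * n)) for n in range(1, 21)]
print("formula matches recursion for n = 1..20:", recursive == formula)
```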
Abstract:
The increasing integration of wind energy in power systems can be responsible for the occurrence of over-generation, especially during off-peak periods. This paper presents a dedicated methodology to identify and quantify the occurrence of this over-generation and to evaluate some of the solutions that can be adopted to mitigate this problem. The methodology is applied to the Portuguese power system, in which wind energy is expected to represent more than 25% of the installed capacity in the near future. The results show that the pumped-hydro units will not provide enough energy storage capacity and, therefore, wind curtailments are expected to occur in the Portuguese system. Additional energy storage devices can be installed to offset the wind energy curtailments. However, the investment analysis performed shows that they are not economically viable, due to the high capital costs currently involved.
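The paper's methodology is applied to the full Portuguese system; as a toy illustration of the kind of hourly balance involved, the snippet below flags over-generation whenever must-run plus wind production exceeds load plus the available pumping capacity. All figures are made up, and the real methodology accounts for many more constraints.

```python
# Hourly over-generation check (toy numbers in MW, not Portuguese system data)
load     = [4200, 3900, 3700, 3600, 3650, 3900]   # off-peak night hours
wind     = [2500, 2700, 2900, 3000, 2950, 2600]
must_run = [1800, 1800, 1800, 1800, 1800, 1800]   # generation that cannot be shut down
pump_cap = [1000, 1000, 1000, 1000, 1000, 1000]   # pumped-hydro absorption capacity

for h, (l, w, m, p) in enumerate(zip(load, wind, must_run, pump_cap)):
    surplus = max(0, m + w - l - p)
    if surplus > 0:
        print(f"hour {h}: over-generation of {surplus} MW -> curtailment or extra storage needed")
```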
Abstract:
As is well known, competitive electricity markets require new computing tools for power companies operating in retail markets in order to enhance the management of their energy resources. During the last years there has been an increase of renewable penetration in micro-generation, which begins to co-exist with the other existing power generation, giving rise to a new type of consumer. This paper develops a methodology to be applied to the management of aggregators. The aggregator establishes bilateral contracts with its clients in which the energy purchase and selling conditions are negotiated, not only in terms of prices but also of other conditions that allow more flexibility in the way generation and consumption are addressed. The aggregator agent needs a tool to support decision making in order to compose and select its customers' portfolio in an optimal way, for a given level of profitability and risk.
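A hypothetical, heavily simplified version of the portfolio decision described above: pick the subset of candidate clients that maximizes expected profit while keeping the profit standard deviation below a risk limit, assuming independent client profits. The client data, the independence assumption and the brute-force search are illustrative only and do not reflect the paper's decision-support tool.

```python
import itertools
import math

# Hypothetical candidate clients: (name, expected profit in k-euro/year, profit variance)
clients = [("A", 12.0, 9.0), ("B", 8.0, 1.0), ("C", 15.0, 25.0),
           ("D", 5.0, 0.5), ("E", 10.0, 4.0), ("F", 7.0, 2.0)]
risk_limit = 5.0   # maximum admissible standard deviation of portfolio profit

best_subset, best_profit = (), 0.0
for r in range(1, len(clients) + 1):
    for subset in itertools.combinations(clients, r):
        profit = sum(c[1] for c in subset)
        std = math.sqrt(sum(c[2] for c in subset))   # independence assumption
        if std <= risk_limit and profit > best_profit:
            best_subset, best_profit = subset, profit

print("selected clients:", [c[0] for c in best_subset], "expected profit:", best_profit)
```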
Abstract:
The high and growing share of wind energy in the generation mix poses major challenges to system operators in grid management and generation planning. The uncertainty associated with wind generation conditions the unit commitment and economic dispatch of thermal generators, since the actual wind production can differ considerably from the forecast. This work proposes two methodologies for optimizing the scheduling of thermal generators based on Mixed Integer Programming. The aim is to find scheduling solutions that minimize the negative effects of integrating wind energy into the power system. Initially, the unit commitment problem is formulated without considering wind energy; the penetration of wind energy is then introduced. In the first proposed model, the problem is formulated as a stochastic optimization problem, in which all wind production scenarios are taken into account in the optimization process. In the second model, the problem is formulated as a deterministic optimization problem, in which a schedule is computed for each wind production scenario and the best solution is then selected by means of evaluation indicators. Simulations were carried out for different spinning reserve levels, and the results show that a high share of wind energy in the generation mix compromises the security and guarantee of supply, due to the volatile and intermittent nature of wind production; to maintain the same security levels, the system must have enough spinning reserve capacity to compensate for the forecast errors.
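A minimal deterministic sketch of the kind of Mixed Integer Programming formulation mentioned above, written here with the PuLP modelling library as an assumed tool (the work does not state which solver or library is used): binary commitment variables, generation limits, and a demand-minus-wind balance per hour. The unit data, the wind profile and the omission of ramping, minimum up/down times, reserve constraints and multiple scenarios are all simplifications for illustration.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

hours = list(range(6))
units = ["G1", "G2", "G3"]
pmin  = {"G1": 100, "G2": 80,  "G3": 50}
pmax  = {"G1": 400, "G2": 300, "G3": 150}
cvar  = {"G1": 20,  "G2": 30,  "G3": 45}    # variable cost (euro/MWh)
cfix  = {"G1": 500, "G2": 300, "G3": 100}   # fixed cost per committed hour (euro)
demand = [600, 550, 500, 520, 650, 700]     # MW
wind   = [150, 220, 260, 240, 120, 80]      # MW, one (deterministic) wind scenario

prob = LpProblem("unit_commitment", LpMinimize)
u = LpVariable.dicts("u", (units, hours), cat="Binary")    # on/off decision
p = LpVariable.dicts("p", (units, hours), lowBound=0)      # dispatched power (MW)

prob += lpSum(cvar[g] * p[g][t] + cfix[g] * u[g][t] for g in units for t in hours)

for t in hours:
    prob += lpSum(p[g][t] for g in units) >= demand[t] - wind[t]   # balance (wind is free)
    for g in units:
        prob += p[g][t] <= pmax[g] * u[g][t]                       # capacity if committed
        prob += p[g][t] >= pmin[g] * u[g][t]                       # technical minimum

prob.solve()
for t in hours:
    schedule = {g: round(p[g][t].value()) for g in units if u[g][t].value() > 0.5}
    print(f"hour {t}: committed {schedule}")
```

The stochastic variant described in the work would replicate the dispatch variables per wind scenario and weight the cost terms by scenario probabilities, while keeping a single set of commitment variables.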
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Master's degree in Computer Science Engineering (Engenharia Informática).