759 results for Algorithm fusion
Abstract:
To maintain a power system within its operating limits, ahead-of-time planning requires competitive techniques for solving the optimal power flow (OPF). The OPF is a non-linear, large-scale combinatorial problem. The Ant Colony Search (ACS) optimization algorithm is inspired by the organized natural movement of real ants and has been successfully applied to several large combinatorial optimization problems. This paper presents an implementation of ant colony optimization to solve the OPF in an economic dispatch context. The proposed methodology was developed for maintenance and repair planning with 24 to 48 hours of anticipation. The main advantage of this method is its low execution time, which makes the OPF usable when a large set of scenarios has to be analyzed. The paper includes a case study using the IEEE 30-bus network. The results are compared with other well-known methodologies from the literature.
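The paper's implementation is not reproduced here; the following is a minimal, hypothetical sketch of how an ant colony search could drive an economic-dispatch problem. The generator count, cost coefficients, and demand value are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical 3-generator economic dispatch: minimize quadratic fuel cost
# while meeting a fixed demand (all data illustrative, not from the paper).
rng = np.random.default_rng(0)
levels = np.linspace(10, 100, 10)        # discrete output levels per generator (MW)
a = np.array([0.01, 0.02, 0.015])        # quadratic cost coefficients
b = np.array([2.0, 1.5, 1.8])            # linear cost coefficients
demand = 180.0                           # total load to serve (MW)
n_gen, n_lev = 3, len(levels)

pheromone = np.ones((n_gen, n_lev))      # pheromone trail per (generator, level)

def cost(p):
    fuel = np.sum(a * p**2 + b * p)              # fuel cost
    penalty = 50.0 * abs(np.sum(p) - demand)     # soft power-balance constraint
    return fuel + penalty

best_p, best_c = None, np.inf
for it in range(200):                    # colony iterations
    for ant in range(20):                # each ant builds one candidate dispatch
        probs = pheromone / pheromone.sum(axis=1, keepdims=True)
        choice = [rng.choice(n_lev, p=probs[g]) for g in range(n_gen)]
        p = levels[choice]
        c = cost(p)
        if c < best_c:
            best_p, best_c = p, c
        for g, l in enumerate(choice):   # deposit pheromone, stronger when cheaper
            pheromone[g, l] += 1.0 / c
    pheromone *= 0.9                     # evaporation

print("best dispatch (MW):", best_p, "cost:", round(best_c, 2))
```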
Abstract:
This paper presents a Unit Commitment model with reactive power compensation, solved by Genetic Algorithm (GA) optimization techniques. The GA was developed as a computational tool coded in MATLAB. The main objective is to find the best generation schedule that minimizes the active power losses and the reactive power to be compensated, subject to the power system's technical constraints, namely the full AC power flow equations and the active and reactive power generation limits. All constraints represented in the objective function are weighted with penalty factors. The IEEE 14-bus system is used as a test case to demonstrate the effectiveness of the proposed algorithm. Results and conclusions are duly drawn.
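As an illustration of the penalty-factor approach described above, here is a minimal GA skeleton in Python; the unit data, loss model, and GA parameters are illustrative assumptions, not the paper's MATLAB implementation.

```python
import numpy as np

# Minimal GA for a penalty-weighted unit commitment objective.
# Unit data, the loss model, and GA parameters are illustrative only.
rng = np.random.default_rng(1)
n_units, horizon = 4, 6                      # units and scheduling hours
p_max = 60.0                                 # MW per committed unit (crude)
demand = np.array([120, 150, 170, 160, 140, 110], dtype=float)

def fitness(chromosome):
    u = chromosome.reshape(n_units, horizon)         # on/off bits
    served = u.sum(axis=0) * p_max                   # committed capacity per hour
    losses = 0.02 * served                           # placeholder loss model
    shortfall = np.maximum(demand - served, 0.0)     # unmet load
    return losses.sum() + 100.0 * shortfall.sum()    # penalty-weighted objective

pop = rng.integers(0, 2, size=(30, n_units * horizon))
for generation in range(100):
    pop = pop[np.argsort([fitness(ind) for ind in pop])]  # elitist sort
    for i in range(15, 30):                               # rebuild worst half
        p1, p2 = pop[rng.integers(0, 15)], pop[rng.integers(0, 15)]
        cut = rng.integers(1, n_units * horizon)
        child = np.concatenate([p1[:cut], p2[cut:]])      # one-point crossover
        child[rng.random(child.size) < 0.02] ^= 1         # bit-flip mutation
        pop[i] = child

print("best penalty-weighted cost:", fitness(pop[0]))
```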
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool that allows them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players, so decision support tools must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
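A minimal sketch of the Linear Programming variant of such a dispatch, using scipy.optimize.linprog; the providers, prices, capacities, and service requirements below are invented for illustration and do not come from MASCEM.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical LP dispatch of four ancillary services across three providers.
services = ["Reg Down", "Reg Up", "Spin", "Non-Spin"]
price = np.array([[12, 15, 18, 10],      # provider A: price per MW, per service
                  [14, 13, 16, 11],      # provider B
                  [11, 17, 15, 12]])     # provider C
required = np.array([20, 25, 30, 15])    # MW needed per service
cap = np.array([40, 35, 45])             # MW cap per provider

c = price.ravel()                        # decision vars: x[provider, service]

A_eq = np.zeros((4, 12))                 # each service requirement met exactly
for s in range(4):
    A_eq[s, s::4] = 1.0

A_ub = np.zeros((3, 12))                 # provider totals within capacity
for p in range(3):
    A_ub[p, 4 * p:4 * p + 4] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=cap, A_eq=A_eq, b_eq=required,
              bounds=[(0, None)] * 12)
print("award (MW) per provider x", services)
print(res.x.reshape(3, 4).round(1))
```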
Resumo:
Although it is always weak between RFID Tag and Terminal in focus of the security, there are no security skills in RFID Tag. Recently there are a lot of studying in order to protect it, but because it has some physical limitation of RFID, that is it should be low electric power and high speed, it is impossible to protect with the skills. At present, the methods of RFID security are using a security server, a security policy and security. One of them the most famous skill is the security module, then they has an authentication skill and an encryption skill. In this paper, we designed and implemented after modification original SEED into 8 Round and 64 bits for Tag.
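The paper's modified SEED is not reproduced here. As a rough structural illustration only, the sketch below shows a generic 8-round Feistel network over a 64-bit block; the round function round_f is a placeholder, not SEED's F function, and the key schedule is omitted.

```python
MASK32 = 0xFFFFFFFF

def round_f(half, subkey):
    # Placeholder round function: NOT SEED's F; it only mixes half and key.
    x = (half ^ subkey) & MASK32
    return ((x * 0x9E3779B1) & MASK32) ^ (x >> 16)

def encrypt(block64, subkeys):
    # 8-round Feistel over a 64-bit block, mirroring the reduced design.
    left, right = block64 >> 32, block64 & MASK32
    for k in subkeys:                     # len(subkeys) == 8
        left, right = right, left ^ round_f(right, k)
    return (right << 32) | left           # output cancels the final swap

def decrypt(block64, subkeys):
    # Same structure with the subkeys reversed inverts the cipher.
    left, right = block64 >> 32, block64 & MASK32
    for k in reversed(subkeys):
        left, right = right, left ^ round_f(right, k)
    return (right << 32) | left

keys = [0x1234 + i for i in range(8)]     # toy subkeys, no real key schedule
ct = encrypt(0x0123456789ABCDEF, keys)
assert decrypt(ct, keys) == 0x0123456789ABCDEF
print(hex(ct))
```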
Abstract:
Master's degree in Radiotherapy.
Abstract:
Master's degree in Radiotherapy.
Abstract:
Master's degree in Electrical and Computer Engineering.
Abstract:
This dissertation presents a solution to the problem of three-dimensional modelling of underground galleries. The work employs techniques from mobile robotics to obtain an autonomous mobile modelling system, capable of operating in unstructured environments without access to global positioning systems, namely GPS. A mobile, autonomous modelling system can be quite advantageous, as it constitutes a fast and simple method for monitoring the structures and creating virtual representations of the galleries with a high level of detail. The modelling system moves inside the tunnels to collect sensory information about the geometry of the structure. Organizing these data into a coherent model requires exact knowledge of the path travelled by the system, so the problem of localizing the sensor platform must be solved. The formulation of an autonomous localization system has to overcome obstacles that are particularly pronounced in underground environments, such as structural monotony and the aforementioned absence of global positioning systems. In this context, the concept of SLAM (Simultaneous Localization and Mapping) was adopted to determine the localization of the sensor platform in six degrees of freedom. Following the traditional approach, the core of the SLAM algorithm is the Extended Kalman Filter (EKF). The proposed system incorporates advanced state-of-the-art methods, namely the Inverse Depth Parametrization and the 1-Point RANSAC outlier rejection method. The most important contribution of the proposed method to the state of the art is the fusion of visual information with inertial information. The localization algorithm was tested on real data acquired inside a road tunnel. The results show that, by fusing inertial measurements with visual information, we can avoid the scale-factor degeneration common in localization applications based on purely monocular systems. We also proved that correcting an inertial localization system with visual information is effective, as it suppresses the trajectory drift that characterizes dead-reckoning systems. Based on the estimated localization, the modelling algorithm organizes the acquired geometric data in three-dimensional space, producing a point-cloud model that is subsequently converted into a triangular mesh, thus achieving a more realistic representation of the original scene.
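A heavily simplified, one-dimensional sketch of the visual-inertial fusion idea: an EKF whose prediction step integrates an inertial (dead-reckoning) acceleration and whose update step corrects it with a visual position fix. The state, models, and noise levels are illustrative, not the dissertation's 6-DoF, inverse-depth formulation.

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition, state = [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])      # how acceleration enters the state
H = np.array([[1.0, 0.0]])               # vision observes position only
Q = 1e-3 * np.eye(2)                     # inertial process noise
R = np.array([[0.05]])                   # visual measurement noise

def ekf_step(x, P, accel, z_visual):
    x = F @ x + B * accel                # predict: inertial alone would drift
    P = F @ P @ F.T + Q
    y = z_visual - H @ x                 # update: visual fix suppresses drift
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(2) - K @ H) @ P

x, P = np.zeros((2, 1)), np.eye(2)
for t in range(50):
    true_pos = 0.5 * 0.2 * (t * dt) ** 2           # constant 0.2 m/s^2 motion
    accel = 0.2 + rng.normal(0, 0.05)              # noisy inertial reading
    z = true_pos + rng.normal(0, 0.2)              # noisy visual position
    x, P = ekf_step(x, P, accel, z)
print("estimated position:", round(float(x[0, 0]), 3))
```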
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
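A small sketch of the generative model DECA assumes, showing the simplex constraints (non-negativity and constant sum) that the Dirichlet mixture enforces on abundances; the sizes, mixture weights, and Dirichlet parameters are illustrative, and the GEM inference step is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(2)
n_bands, n_endmembers, n_pixels = 50, 3, 1000

# Endmember signature matrix (illustrative random reflectances).
M = rng.uniform(0.1, 0.9, size=(n_bands, n_endmembers))

# Abundances drawn from a two-component Dirichlet mixture over the simplex.
weights = [0.6, 0.4]
alphas = [np.array([8.0, 1.0, 1.0]), np.array([1.0, 5.0, 5.0])]
comp = rng.choice(2, size=n_pixels, p=weights)
S = np.stack([rng.dirichlet(alphas[c]) for c in comp])     # (n_pixels, 3)

# The constraints imposed by the acquisition process hold by construction.
assert np.all(S >= 0) and np.allclose(S.sum(axis=1), 1.0)

# Observed pixels: linear mixing plus additive noise.
Y = S @ M.T + 0.01 * rng.standard_normal((n_pixels, n_bands))
# DECA would now infer M and the Dirichlet-mixture parameters from Y
# via a generalized expectation-maximization (GEM) loop.
print("simulated data cube:", Y.shape)
```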
Abstract:
Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
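A simplified sketch of the geometric idea behind VCA, on synthetic data: each endmember is found as the pixel with the largest projection onto a direction orthogonal to the subspace spanned by the endmembers already extracted. The SNR-dependent subspace projection of the published algorithm is omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)
p, n_bands = 3, 20                               # endmembers and spectral bands
M_true = rng.uniform(0, 1, size=(n_bands, p))    # true signatures (synthetic)
S = rng.dirichlet(np.ones(p), size=2000)         # abundances on the simplex
Y = S @ M_true.T                                 # mixed pixels, (2000, n_bands)

E = np.zeros((n_bands, p))                       # extracted endmembers
A = np.zeros((n_bands, p))                       # auxiliary matrix, as in VCA
A[0, 0] = 1.0
for i in range(p):
    w = rng.standard_normal(n_bands)             # random direction
    # Project w orthogonally to the span of the endmembers found so far.
    proj = np.eye(n_bands) - A @ np.linalg.pinv(A)
    f = proj @ w
    idx = np.argmax(np.abs(Y @ f))               # most extreme pixel is a vertex
    E[:, i] = Y[idx]
    A[:, i] = E[:, i]

print("extracted endmember matrix shape:", E.shape)
```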
Abstract:
Dissertation for the degree of Master in Electrical Engineering, Automation and Industrial Electronics branch.
Abstract:
The calculation of the dose is one of the key steps in radiotherapy planning [1-5]. This calculation should be as accurate as possible, and over the years this became feasible through the implementation of new dose-calculation algorithms in the treatment planning systems used in radiotherapy. When a breast tumour is irradiated, a precise dose distribution is fundamental to ensure coverage of the planning target volume (PTV) and to prevent skin complications. Some investigations using breast cases showed that the pencil beam convolution (PBC) algorithm overestimates the dose in the PTV and in the proximal region of the ipsilateral lung, while underestimating the dose in the distal region of the ipsilateral lung, when compared with the analytical anisotropic algorithm (AAA). With this study we aim to compare the performance of the PBC and AAA algorithms for breast tumours.
Abstract:
Conference - 16th International Symposium on Wireless Personal Multimedia Communications (WPMC), June 24-27, 2013.
Abstract:
Objective of the study: to compare the performance of the Pencil Beam Convolution (PBC) algorithm and the Analytical Anisotropic Algorithm (AAA) in 3D conformal radiotherapy treatment planning for breast tumours.