968 results for Combinatorial Designs
Abstract:
The natural gas industry has been confronted with major challenges: strong growth in demand, investment in new GSUs (gas supply units), and efficient technical management of the system. Determining the right number of GSUs, their best location in the network and the optimal allocation of loads to them is a decision problem that can be formulated as a combinatorial programming problem with the objective of minimizing system expenses. Our emphasis is on the formulation, interpretation and development of a solution algorithm that analyzes the trade-off between infrastructure investment expenditure and system operating costs. The location model was applied to a 12-node natural gas network, and its effectiveness was tested in five different operating scenarios.
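As a rough illustration of the kind of decision problem described above, the following Python sketch casts GSU siting and load allocation as a tiny facility-location model solved by exhaustive enumeration; the site names, cost data and problem size are purely hypothetical, and a real instance would be handled by an integer-programming solver rather than brute force.

from itertools import combinations

# Hypothetical data: investment cost per candidate GSU site and the operating
# cost of serving each load node from each site.
FIXED = {"A": 120.0, "B": 90.0, "C": 150.0}
SUPPLY = {("A", 1): 10, ("A", 2): 25, ("A", 3): 40,
          ("B", 1): 30, ("B", 2): 12, ("B", 3): 20,
          ("C", 1): 35, ("C", 2): 28, ("C", 3): 8}
LOADS = [1, 2, 3]

def total_cost(open_sites):
    # Infrastructure investment plus operating cost, assigning each load to
    # the cheapest open GSU.
    if not open_sites:
        return float("inf")
    invest = sum(FIXED[s] for s in open_sites)
    operate = sum(min(SUPPLY[(s, j)] for s in open_sites) for j in LOADS)
    return invest + operate

best = min((frozenset(c) for r in range(1, len(FIXED) + 1)
            for c in combinations(FIXED, r)), key=total_cost)
print(sorted(best), total_cost(best))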
Abstract:
The optimal power flow (OPF) problem has been widely studied as a means of improving power system operation and planning. For real power systems, the problem is formulated as a non-linear, large-scale combinatorial problem. The first approaches used to solve it were based on mathematical methods that required huge computational effort. More recently, artificial intelligence techniques, such as metaheuristics inspired by biological processes, have been adopted. Metaheuristics require fewer computational resources, which is a clear advantage when addressing the problem in large power systems. This paper proposes a methodology to solve the optimal power flow, in an economic dispatch context, using a Simulated Annealing algorithm inspired by the controlled cooling process used in metallurgy. The main contribution of the proposed method is a neighborhood generation scheme tailored to the characteristics of the optimal power flow problem. The proposed methodology has been tested on the IEEE 6-bus and 30-bus networks. The results obtained are compared with other well-known methodologies presented in the literature, showing the effectiveness of the proposed method.
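To make the approach concrete, here is a minimal, self-contained sketch of a simulated-annealing loop applied to a toy single-period economic dispatch with quadratic generator costs; the cost coefficients, demand value and the simple one-unit perturbation neighbourhood are hypothetical stand-ins for the paper's problem-specific neighbourhood generation.

import math
import random

# Hypothetical generators: (a, b, c, Pmin, Pmax), with cost = a*P^2 + b*P + c.
GENS = [
    (0.004, 8.0, 100.0, 50.0, 200.0),
    (0.006, 7.0, 120.0, 40.0, 150.0),
    (0.009, 6.5,  80.0, 30.0, 120.0),
]
DEMAND = 300.0  # MW, hypothetical

def cost(p):
    # Generation cost plus a soft penalty on the power-balance constraint.
    gen_cost = sum(a * x * x + b * x + c for (a, b, c, _, _), x in zip(GENS, p))
    return gen_cost + 1e4 * abs(sum(p) - DEMAND)

def neighbour(p, step=5.0):
    # Perturb one unit's output and clip it to its limits (a simple stand-in
    # for the problem-specific neighbourhood described in the abstract).
    q = list(p)
    i = random.randrange(len(q))
    lo, hi = GENS[i][3], GENS[i][4]
    q[i] = min(hi, max(lo, q[i] + random.uniform(-step, step)))
    return q

def anneal(t0=1000.0, alpha=0.95, iters_per_t=200, t_min=1e-3):
    p = [(g[3] + g[4]) / 2 for g in GENS]   # start at mid-range output
    best, best_cost, t = p, cost(p), t0
    while t > t_min:
        for _ in range(iters_per_t):
            q = neighbour(p)
            delta = cost(q) - cost(p)
            if delta < 0 or random.random() < math.exp(-delta / t):
                p = q                        # accept improving or uphill move
                if cost(p) < best_cost:
                    best, best_cost = p, cost(p)
        t *= alpha                           # geometric cooling schedule
    return best, best_cost

if __name__ == "__main__":
    dispatch, total = anneal()
    print([round(x, 1) for x in dispatch], round(total, 2))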
Abstract:
To keep a power system within its operating limits, planning one level ahead requires competitive techniques for solving the optimal power flow (OPF), a non-linear, large combinatorial problem. The Ant Colony Search (ACS) optimization algorithm is inspired by the organized natural movement of real ants and has been successfully applied to different large combinatorial optimization problems. This paper presents an implementation of Ant Colony optimization to solve the OPF in an economic dispatch context. The proposed methodology was developed to support maintenance and repair planning 48 to 24 hours in advance. The main advantage of this method is its low execution time, which allows the OPF to be used when a large set of scenarios has to be analyzed. The paper includes a case study using the IEEE 30-bus network. The results are compared with other well-known methodologies presented in the literature.
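The sketch below illustrates the general ant-colony mechanism (probabilistic solution construction guided by pheromone, evaporation, and reinforcement of the best solution) on a toy dispatch with discretised generator outputs; the output levels, cost functions and parameters are hypothetical and do not reproduce the paper's OPF implementation.

import random

# Hypothetical candidate output levels (MW) and quadratic cost functions.
LEVELS = [
    [50, 100, 150, 200],
    [40,  80, 120, 150],
    [30,  60,  90, 120],
]
COST = [lambda p: 0.004 * p * p + 8.0 * p,
        lambda p: 0.006 * p * p + 7.0 * p,
        lambda p: 0.009 * p * p + 6.5 * p]
DEMAND = 300.0

def dispatch_cost(choice):
    # Generation cost plus a penalty on the power-balance mismatch.
    total = sum(COST[g](LEVELS[g][k]) for g, k in enumerate(choice))
    return total + 1e3 * abs(sum(LEVELS[g][k] for g, k in enumerate(choice)) - DEMAND)

def aco(n_ants=20, n_iter=100, rho=0.1, q=1000.0):
    tau = [[1.0] * len(lv) for lv in LEVELS]   # pheromone per (generator, level)
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            # Each ant picks one level per generator with pheromone-weighted probability.
            choice = [random.choices(range(len(lv)), weights=tau[g])[0]
                      for g, lv in enumerate(LEVELS)]
            c = dispatch_cost(choice)
            if c < best_cost:
                best, best_cost = choice, c
        tau = [[(1 - rho) * t for t in row] for row in tau]   # evaporation
        for g, k in enumerate(best):
            tau[g][k] += q / best_cost                        # reinforce best solution
    return [LEVELS[g][k] for g, k in enumerate(best)], best_cost

if __name__ == "__main__":
    print(aco())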
Abstract:
The scheduling problem is considered in complexity theory to be an NP-hard combinatorial optimization problem. Meta-heuristics have proved very useful in solving this class of problems. However, these techniques require parameter tuning, which is a very hard task to perform. A Case-based Reasoning module is proposed to solve the parameter-tuning problem in a Multi-Agent Scheduling System. A computational study is carried out to evaluate the performance of the proposed CBR module.
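A minimal sketch of the retrieval-and-reuse step at the heart of such a CBR module is shown below: the most similar previously solved scheduling problem is retrieved by a nearest-neighbour comparison of problem features and its meta-heuristic parameters are reused; the feature names and the stored cases are hypothetical.

from math import sqrt

# Each case pairs problem features with meta-heuristic parameters that worked well.
CASE_BASE = [
    ({"jobs": 20,  "machines": 5,  "due_date_tightness": 0.3},
     {"pop_size": 50,  "mutation": 0.10}),
    ({"jobs": 100, "machines": 10, "due_date_tightness": 0.7},
     {"pop_size": 200, "mutation": 0.05}),
]

def similarity(a, b):
    # Inverse Euclidean distance over the shared numeric features.
    d = sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return 1.0 / (1.0 + d)

def retrieve_parameters(new_problem):
    # Retrieve the most similar case and reuse its parameters; revision and
    # retention of the new case would follow after solving the problem.
    best_case = max(CASE_BASE, key=lambda case: similarity(new_problem, case[0]))
    return best_case[1]

print(retrieve_parameters({"jobs": 90, "machines": 12, "due_date_tightness": 0.6}))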
Abstract:
Individual cancer susceptibility seems to be related to factors such as changes in the expression of oncogenes and tumor suppressor genes, and to differences in the action of metabolic enzymes and of DNA repair regulated by specific genes. Epidemiological studies on genetic polymorphisms of human xenobiotic-metabolizing enzymes and cancer have revealed low relative risks. Research that considers the prevalence of genetic polymorphisms jointly with environmental exposures could be relevant for a better understanding of cancer etiology and the mechanisms of carcinogenesis, as well as for new insights into cancer prognosis. This study reviews the approaches of molecular epidemiology in cancer research, stressing case-control and cohort designs involving genetic polymorphisms, and the factors that could introduce bias and confounding in these studies. As in classical epidemiological research, studies of genetic polymorphisms require attention to precision and accuracy in the study design.
Abstract:
A novel high-throughput and scalable unified architecture for the computation of the transform operations in video codecs for advanced standards is presented in this paper. This structure can be used as a hardware accelerator in modern embedded systems to efficiently compute all the two-dimensional 4 x 4 and 2 x 2 transforms of the H.264/AVC standard. Moreover, its highly flexible design and hardware efficiency allow it to be easily scaled, in terms of performance and hardware cost, to meet the specific requirements of any given video coding application. Experimental results obtained using a Xilinx Virtex-5 FPGA demonstrated the superior performance and hardware efficiency of the proposed structure, whose throughput per unit of area is higher than that of other similar recently published designs targeting the H.264/AVC standard. The results also showed that, when integrated in a multi-core embedded system, this architecture provides speedup factors of about 120x over pure software implementations of the transform algorithms, therefore allowing all the above-mentioned transforms to be computed in real time for Ultra High Definition Video (UHDV) sequences (4,320 x 7,680 @ 30 fps).
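For reference, the 4x4 forward integer transform of H.264/AVC that such an accelerator computes can be modelled in software as Y = C * X * C_transposed (scaling and quantisation omitted); the short pure-Python model below is only an illustration of the transform kernel itself, not of the proposed hardware architecture.

# The 4x4 forward core transform matrix defined by the H.264/AVC standard.
C = [
    [1,  1,  1,  1],
    [2,  1, -1, -2],
    [1, -1, -1,  1],
    [1, -2,  2, -1],
]

def matmul4(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def transpose4(m):
    return [list(row) for row in zip(*m)]

def forward_transform_4x4(block):
    # block: 4x4 list of residual samples; returns the transformed coefficients
    # before the scaling/quantisation stage.
    return matmul4(matmul4(C, block), transpose4(C))

residual = [[5, 11, 8, 10], [9, 8, 4, 12], [1, 10, 11, 4], [19, 6, 15, 7]]  # example residual block
print(forward_transform_4x4(residual))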
Abstract:
Master's dissertation in Water Systems Engineering and Management.
Abstract:
The demand for swimming pools for sports, recreational and/or therapeutic activities has been increasing gradually over time. However, several hazards are associated with their use. Regarding chemical hazards, the use of disinfectants based on chlorine, bromine or derived compounds inactivates pathogenic microorganisms on the one hand but, on the other, gives rise to by-products when these disinfectants react with organic compounds present in the water. Trihalomethanes are one example of the by-products that may be formed; the main compounds include chloroform (TCM), bromodichloromethane (BDCM), chlorodibromomethane (CDBM) and bromoform (TBM). The aim of this work was to develop an analytical methodology for the determination of trihalomethanes in swimming pool water and air and to apply it to a set of samples. The compounds were analysed by headspace solid-phase microextraction (HS-SPME) followed by quantification by gas chromatography with electron capture detection (GC-ECD). The extraction conditions for the compounds under study in water samples were optimised through two experimental designs. The optimal conditions were obtained for an extraction temperature of 45 ºC, an extraction time of 25 min and a desorption time of 5 min. Swimming pool water samples provided by the Centro de Estudos de Águas were analysed, and the applicability of the HS-SPME technique and the matrix effect were evaluated. The way in which solutions containing the compounds under study are handled influences the results, since these compounds are highly volatile. It was also concluded that a matrix effect exists, so sample concentrations should be determined by the standard addition method. The characterisation of indoor swimming pool water provided the trihalomethane (THM) concentrations. TCM concentrations between 4.5 and 406.5 μg/L were obtained, with only 4 of the 27 samples analysed exceeding the limit value imposed by Decreto-Lei nº306/2007 (100 μg/L) for drinking water, which is commonly used as an indicative value for swimming pool water quality. Regarding the concentration obtained in the air of an indoor swimming pool, an average concentration of 224 μg/m3 of TCM was detected, a value well below the 10,000 μg/m3 limit for occupational exposure to chemical agents set by Decreto-Lei nº24/2012.
Abstract:
Optimisation and learning in Multi-Agent Systems are considered two promising but relatively unexplored areas. Optimisation in these environments must be able to cope with dynamism. Agents may change their behaviour based on recent learning or on optimisation objectives. Learning strategies can improve system performance by giving agents the ability to learn, for example, which optimisation technique is best suited to solving a particular class of problems, or which parameterisation is most appropriate in a given scenario. This dissertation studies techniques for solving Combinatorial Optimisation problems, especially Meta-heuristics, and reviews the state of the art of Learning in Multi-Agent Systems. A learning module for solving new scheduling problems on the basis of previous experience is also proposed. The Self-Optimisation module developed, inspired by Autonomic Computing, allows the system to automatically select the Meta-heuristic to be used in the optimisation process, as well as its parameterisation. To this end, Case-based Reasoning was used, so that the resulting system is able to learn from the experience acquired in solving similar problems. The results obtained show the advantage of its use and its ability to adapt to new scenarios.
Abstract:
A new high-performance architecture for the computation of all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable and unified structure that is able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16 and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrated the superior performance and hardware efficiency of the proposed structure, which outperforms its most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the real-time computation of all the transforms mentioned above for resolutions as high as 8k Ultra High Definition Television (UHDTV) (7680×4320 @ 30fps).
Abstract:
A flow-spectrophotometric method is proposed for the routine determination of tartaric acid in wines. The reaction between tartaric acid and vanadate in acetic medium is carried out under flow conditions and the resulting colored complex is monitored at 475 nm. The stability of the complex and the corresponding formation constant are presented. The effect of wavelength and pH was evaluated by batch experiments, and the selected conditions were transposed to a flow-injection analytical system. Several flow parameters, such as reactor lengths, flow rate and injection volume, were optimized. Under the optimized conditions, a linear response was observed up to 1000 µg mL-1 of tartaric acid, with a molar extinction coefficient of 450 L mg-1 cm-1 and a repeatability of ± 1 %. Sample throughput was 25 samples per hour. The flow-spectrophotometric method was satisfactorily applied to the quantification of tartaric acid (TA) in wines from different sources. Its accuracy was confirmed by statistical comparison with the conventional Rebelein procedure and with a certified analytical method carried out in a routine laboratory.
Abstract:
This paper presents a complete quadratic programming formulation of the standard thermal unit commitment problem in power generation planning, together with a novel iterative optimisation algorithm for its solution. The algorithm, based on a mixed-integer formulation of the problem, considers piecewise linear approximations of the quadratic fuel cost function that are dynamically updated in an iterative way, converging to the optimum; this avoids resorting to quadratic programming and makes the solution process much quicker. Extensive computational tests on a broad set of benchmark instances showed the algorithm to be flexible and capable of easily incorporating different problem constraints. In particular, it is able to handle ramp constraints, which, although very important in practice, were rarely considered in previous publications. Most importantly, optimal solutions were obtained for several well-known benchmark instances, including instances of practical relevance, that are not known to have been solved to optimality before. The computational experiments and their results show that the proposed method is both simple and extremely effective.
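The core device behind such an algorithm can be sketched as follows: a convex quadratic fuel-cost curve is under-estimated by tangent cuts, and a new cut is added wherever the current incumbent reveals a gap between the true cost and its piecewise-linear model. In the sketch below the cost coefficients and the fixed incumbent are hypothetical, and the full unit-commitment MILP and solver calls are omitted.

def quad_cost(p, a=0.004, b=8.0, c=100.0):
    # Hypothetical quadratic fuel-cost curve: a*p^2 + b*p + c.
    return a * p * p + b * p + c

def tangent_cut(p0, a=0.004, b=8.0, c=100.0):
    # Tangent (first-order Taylor) line at p0: a valid under-estimator of a convex quadratic.
    slope = 2 * a * p0 + b
    return slope, quad_cost(p0, a, b, c) - slope * p0

def pw_linear_estimate(p, cuts):
    # The piecewise-linear model is the maximum over all tangent cuts.
    return max(s * p + t for s, t in cuts)

cuts = [tangent_cut(p0) for p0 in (50.0, 200.0)]  # coarse initial approximation
p_incumbent = 125.0  # stand-in for the output level returned by the MILP in one iteration
for _ in range(5):
    gap = quad_cost(p_incumbent) - pw_linear_estimate(p_incumbent, cuts)
    if gap < 1e-6:                            # model already tight at the incumbent
        break
    cuts.append(tangent_cut(p_incumbent))     # tighten the model where it was loose
    # (in the real algorithm the MILP would be re-solved here, yielding a new incumbent)
print(len(cuts), pw_linear_estimate(p_incumbent, cuts), quad_cost(p_incumbent))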
Abstract:
In recent years several countries have set up policies that allow the exchange of kidneys between two or more incompatible patient–donor pairs. These policies lead to what is commonly known as kidney exchange programs. The underlying optimization problems can be formulated as integer programming models. Previously proposed models for kidney exchange programs have exponential numbers of constraints or variables, which makes them fairly difficult to solve when the problem size is large. In this work we propose two compact formulations for the problem, explain how these formulations can be adapted to address some problem variants, and provide results on the dominance of some models over others. Finally, we present a systematic comparison between our models and two previously proposed ones through a thorough computational analysis. The results show that compact formulations have advantages over non-compact ones when the problem size is large.
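For intuition, the classic (non-compact) cycle formulation that such compact models are compared against can be illustrated on a toy compatibility graph: enumerate all exchange cycles up to a maximum length and select a vertex-disjoint subset covering as many pairs as possible. The brute-force search below replaces the integer-programming solver, and the arcs are hypothetical.

from itertools import combinations

# Arc (i, j) means the donor of pair i is compatible with the patient of pair j.
ARCS = {(0, 1), (1, 0), (1, 2), (2, 3), (3, 1), (2, 0), (0, 2)}
PAIRS = range(4)
MAX_LEN = 3  # maximum allowed exchange cycle length

def find_cycles(max_len=MAX_LEN):
    # Enumerate each directed cycle once, starting from its smallest vertex.
    result = []
    def extend(path):
        last = path[-1]
        for nxt in PAIRS:
            if (last, nxt) in ARCS:
                if nxt == path[0] and len(path) >= 2:
                    result.append(list(path))
                elif nxt > path[0] and nxt not in path and len(path) < max_len:
                    extend(path + [nxt])
    for start in PAIRS:
        extend([start])
    return result

def best_exchange(cycles_list):
    # Choose a set of vertex-disjoint cycles maximising the number of transplanted pairs.
    best, best_cover = [], 0
    for r in range(len(cycles_list) + 1):
        for subset in combinations(cycles_list, r):
            used = [v for c in subset for v in c]
            if len(used) == len(set(used)) and len(used) > best_cover:
                best, best_cover = list(subset), len(used)
    return best, best_cover

print(best_exchange(find_cycles()))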
Abstract:
Master's degree in Management and Business Control.
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resource management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its adequacy for solving the mentioned problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness, convergence characteristics and constraint handling than the tested PSO variants. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about one percent of difference in the objective function cost value.
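A minimal sketch of a PSO with iteration-dependent velocity limits, a simplified stand-in for the intelligent velocity-limit adjustment and self-parameterization described above, is given below; the toy quadratic objective, bounds and parameters are hypothetical and unrelated to the day-ahead scheduling model.

import random

DIM, BOUNDS = 5, (-10.0, 10.0)

def cost(x):
    # Placeholder objective (hypothetical); the real problem minimises operation costs.
    return sum(v * v for v in x)

def pso(n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = BOUNDS
    pos = [[random.uniform(lo, hi) for _ in range(DIM)] for _ in range(n_particles)]
    vel = [[0.0] * DIM for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    for it in range(n_iter):
        # Velocity limit that shrinks as the search progresses.
        vmax = (hi - lo) * (0.2 * (1 - it / n_iter) + 0.02)
        for i in range(n_particles):
            for d in range(DIM):
                r1, r2 = random.random(), random.random()
                v = (w * vel[i][d]
                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                     + c2 * r2 * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, v))           # clamp velocity
                pos[i][d] = max(lo, min(hi, pos[i][d] + vel[i][d]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest, cost(gbest)

print(pso())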