74 results for Algoritmos - conceitos
Abstract:
This work performs an algorithmic study of the optimization of conformal radiotherapy treatment planning. Initially we give an overview of cancer, radiotherapy and the physics of the interaction of ionizing radiation with matter. A proposal for the optimization of a radiotherapy treatment plan is then developed in a systematic way. We present the paradigm of multicriteria problems and the concepts of Pareto optimality and Pareto dominance. A generic optimization model for radiotherapy treatment is proposed. We construct the input of the model, estimate the dose delivered by the radiation using the dose matrix, and present the objective function of the model. The complexity of optimization models in radiotherapy treatment is typically NP-hard, which justifies the use of heuristic methods. We propose three distinct methods: MOGA, MOSA and MOTS. The design of these three metaheuristic procedures is presented; for each procedure we give a brief motivation, the algorithm itself and the method for tuning its parameters. The three methods are applied to a concrete case and their performances are compared. Finally, for each method we analyze the quality of the Pareto sets, some solutions and the respective Pareto curves.
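Since the abstract leans on Pareto dominance and Pareto-optimal sets, a minimal sketch of the standard dominance test for minimization problems may help; the function names are illustrative, not taken from the thesis.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated objective vectors."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# made-up dose objectives: (tumor underdose, healthy-tissue overdose)
points = [(1.0, 5.0), (2.0, 3.0), (3.0, 3.0), (4.0, 1.0)]
print(pareto_front(points))  # (3.0, 3.0) is dominated by (2.0, 3.0)
```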
Abstract:
The use of clustering methods for the discovery of cancer subtypes has drawn a great deal of attention in the scientific community. While bioinformaticians have proposed new clustering methods that take advantage of characteristics of gene expression data, the medical community has a preference for classic clustering methods. There have been no studies thus far performing a large-scale evaluation of different clustering methods in this context. This work presents the first large-scale analysis of seven different clustering methods and four proximity measures for the analysis of 35 cancer gene expression data sets. Results reveal that the finite mixture of Gaussians, followed closely by k-means, exhibited the best performance in terms of recovering the true structure of the data sets. These methods also exhibited, on average, the smallest difference between the actual number of classes in the data sets and the best number of clusters as indicated by our validation criteria. Furthermore, hierarchical methods, which have been widely used by the medical community, exhibited a poorer recovery performance than the other methods evaluated. Moreover, as a stable basis for the assessment and comparison of different clustering methods for cancer gene expression data, this study provides a common group of data sets (benchmark data sets) to be shared among researchers and used for comparisons with new methods.
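As a rough illustration of the kind of comparison reported here (not the study's actual pipeline or data), the sketch below clusters labeled data with a finite mixture of Gaussians and with k-means, and scores recovery of the known classes with the adjusted Rand index; scikit-learn and the synthetic data set are assumptions.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

# synthetic labeled data standing in for a gene expression data set
X, y_true = make_blobs(n_samples=300, centers=4, random_state=0)

km_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
gm_labels = GaussianMixture(n_components=4, random_state=0).fit_predict(X)

# higher ARI = better recovery of the true class structure
print("k-means ARI:", adjusted_rand_score(y_true, km_labels))
print("mixture ARI:", adjusted_rand_score(y_true, gm_labels))
```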
Abstract:
A 3D binary image is considered well-composed if, and only if, the union of the faces shared by the foreground and background voxels of the image is a surface in R³. Well-composed images have some desirable topological properties, which allow us to simplify and optimize algorithms that are widely used in computer graphics, computer vision and image processing. These advantages have fostered the development of algorithms to repair two-dimensional (2D) and three-dimensional (3D) images that are not well-composed. These algorithms are known as repairing algorithms. In this dissertation, we propose two repairing algorithms, one randomized and one deterministic. Both algorithms are capable of making topological repairs in 3D binary images, producing well-composed images similar to the original images. The key idea behind both algorithms is to iteratively change the assigned color of some points in the input image from 0 (background) to 1 (foreground) until the image becomes well-composed. The points whose colors are changed by the algorithms are chosen according to their values in the fuzzy connectivity map resulting from the image segmentation process. The use of the fuzzy connectivity map ensures that the subset of points chosen by the algorithm at any given iteration is the one with the least affinity with the background among all possible choices.
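The flipping strategy can be illustrated with a minimal 2D analogue (the dissertation itself handles 3D): repeatedly locate the 2×2 critical configurations that violate well-composedness and flip the background point whose fuzzy-connectivity value marks it as having the least affinity with the background. The helper names and the interpretation of the connectivity map are assumptions.

```python
import numpy as np

def critical_blocks(img):
    """Yield top-left corners of 2x2 blocks with a diagonal pattern."""
    for i in range(img.shape[0] - 1):
        for j in range(img.shape[1] - 1):
            b = img[i:i+2, j:j+2]
            if b[0, 0] == b[1, 1] != b[0, 1] == b[1, 0]:
                yield i, j

def repair(img, conn_map):
    """Flip 0 -> 1 until no critical configuration remains."""
    img = img.copy()
    while True:
        zeros = {(i + di, j + dj)
                 for i, j in critical_blocks(img)
                 for di in (0, 1) for dj in (0, 1)
                 if img[i + di, j + dj] == 0}
        if not zeros:
            return img
        # highest connectivity value = least affinity with the background
        img[max(zeros, key=lambda p: conn_map[p])] = 1

img = np.array([[1, 0], [0, 1]])
conn = np.array([[0.0, 0.9], [0.2, 0.0]])
print(repair(img, conn))  # flips (0, 1), the zero least tied to the background
```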
Abstract:
Classifier ensembles are systems composed of a set of individual classifiers and a combination module, which is responsible for providing the final output of the system. In the design of these systems, diversity is considered one of the main aspects to be taken into account, since there is no gain in combining identical classification methods. The ideal situation is a set of individual classifiers with uncorrelated errors; in other words, the individual classifiers should be diverse among themselves. One way of increasing diversity is to provide different datasets (patterns and/or attributes) to the individual classifiers. Diversity is increased because the individual classifiers perform the same task (classification of the same input patterns) but are built using different subsets of patterns and/or attributes. The majority of papers using feature selection for ensembles address homogeneous ensemble structures, i.e., ensembles composed of only one type of classifier. In this investigation, two genetic algorithm approaches (single- and multi-objective) will be used to guide the distribution of features among the classifiers in the context of homogeneous and heterogeneous ensembles. The experiments will be divided into two phases that use a filter approach to feature selection guided by a genetic algorithm.
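The mechanism being exploited can be sketched simply: give each ensemble member a different attribute subset so that their errors decorrelate. The thesis guides this distribution with single- and multi-objective genetic algorithms; the sketch below substitutes plain random subsets just to show the idea, and assumes scikit-learn.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

members = []
for _ in range(5):
    feats = rng.choice(X.shape[1], size=2, replace=False)  # one attribute subset
    clf = DecisionTreeClassifier(random_state=0).fit(X[:, feats], y)
    members.append((feats, clf))

# combination module: majority vote over the members' outputs
votes = np.array([clf.predict(X[:, feats]) for feats, clf in members])
majority = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("training accuracy:", (majority == y).mean())
```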
An experimental analysis of exact algorithms applied to the multiobjective spanning tree problem
Abstract:
The Multiobjective Spanning Tree Problem is NP-hard and models applications in several areas. This research presents an experimental analysis of the different strategies used in the literature to develop exact algorithms to solve the problem. Initially, the algorithms are classified according to the approaches used to solve the problem; features of two or more approaches can be found in some of these algorithms. The approaches investigated here are: the two-stage method, branch-and-bound, k-best and the preference-based approach. The main contribution of this research lies in the fact that no study to date has reported a systematic experimental analysis of exact algorithms for the Multiobjective Spanning Tree Problem; therefore, this work can serve as a basis for other research dealing with the same problem. The computational experiments compare the performance of the algorithms regarding processing time, efficiency as a function of the number of objectives, and the number of solutions found in a controlled time interval. The analysis was performed on known instances of the problem, as well as on instances obtained from a generator commonly used in the literature.
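For intuition about the problem itself (this is not one of the surveyed exact methods), a brute-force sketch can enumerate every spanning tree of a tiny bi-objective graph and keep the non-dominated cost vectors; the instance is made up.

```python
from itertools import combinations

def pareto(points):
    """Non-dominated tuples under componentwise minimization."""
    dom = lambda a, b: all(x <= y for x, y in zip(a, b)) and a != b
    return [p for p in points if not any(dom(q, p) for q in points)]

def is_spanning_tree(n, edges):
    """Union-find check: n - 1 edges with no cycle span all n vertices."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v, _ in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False
        parent[ru] = rv
    return len(edges) == n - 1

# a 4-vertex graph; each edge carries a bi-objective cost (c1, c2)
E = [(0, 1, (1, 4)), (1, 2, (2, 3)), (2, 3, (3, 1)), (0, 3, (4, 2)), (0, 2, (2, 2))]
costs = [tuple(map(sum, zip(*(c for _, _, c in t))))
         for t in combinations(E, 3) if is_spanning_tree(4, t)]
print(sorted(set(pareto(costs))))
```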
Abstract:
Nonogram is a logic puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition problems and data compression, among others. The puzzle consists of determining an assignment of colors to pixels distributed in an N × M matrix that satisfies line and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solve Nonograms. Depth-first search was chosen as one of the exact approaches because it is a typical example of a brute-force search algorithm that is easy to implement. Another exact approach implemented was based on the Las Vegas algorithm, with which we intended to investigate whether the randomness introduced by the Las Vegas-based algorithm would be an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way to calculate the objective function is also proposed. The approaches are applied to 234 instances, with sizes ranging from 5 × 5 to 100 × 100, including both logical and random Nonograms.
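The row and column constraints at the heart of the puzzle reduce to comparing run lengths of consecutive filled pixels against the clue, as in this minimal sketch (names are illustrative):

```python
from itertools import groupby

def runs(row):
    """Run lengths of consecutive 1s, e.g. [1, 1, 0, 1] -> [2, 1]."""
    return [len(list(g)) for v, g in groupby(row) if v == 1]

def satisfies(row, clue):
    """Does this row assignment match its Nonogram clue?"""
    return runs(row) == clue

print(satisfies([1, 1, 0, 0, 1], [2, 1]))   # True
print(satisfies([1, 0, 1, 0, 1], [2, 1]))   # False
```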
Abstract:
Scientific Algorithms are a new metaheuristic inspired by the scientific research process. The new method introduces the idea of a theme to guide the search of the solution space of hard problems. The inspiration for this class of algorithms comes from the act of doing research, which comprises thinking, knowledge sharing and disclosing new ideas. The ideas of the new method are illustrated on the Traveling Salesman Problem. A computational experiment applies the proposed approach to a new variant of the Traveling Salesman Problem named the Car Renter Salesman Problem. The results are compared to state-of-the-art algorithms for the latter problem.
Abstract:
Symbolic Data Analysis (SDA) mainly aims to provide tools for reducing large databases in order to extract knowledge, and to provide techniques for describing the units of such data as complex units, such as intervals or histograms. The objective of this work is to extend classical clustering methods to symbolic interval data using interval-based distances. The main advantage of using an interval-based distance for interval-based data lies in the fact that it preserves the underlying imprecision of the intervals, which is usually lost when real-valued distances are applied. This work also includes an approach that allows existing indices to be adapted to the interval context. The proposed methods with interval-based distances are compared with the point-based distances existing in the literature through experiments with simulated and real interval data.
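One concrete choice of interval-based distance, assumed here for illustration (the work may define others), is the Hausdorff distance between intervals, which keeps the width information that a distance between midpoints would discard:

```python
def interval_distance(a, b):
    """Hausdorff distance between intervals a = [a1, a2] and b = [b1, b2]."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

# same midpoints, different widths: a midpoint distance would report 0
print(interval_distance((2, 4), (1, 5)))  # 1
```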
Abstract:
This work presents a hybrid transgenetic algorithm for solving a Natural Gas Distribution Network Configuration Problem. Configuring such networks requires defining a layout along which the pipes must be placed in order to serve the customers. This work studies a way of connecting the customers through a network with a tree-shaped architecture. The objective is to minimize the construction cost of the network, even if this means that some customers who do not yield a profit are left unserved. This problem can be formulated computationally as the Prize Collecting Steiner Tree Problem, a combinatorial optimization problem in the NP-hard class. This work presents a heuristic algorithm for solving the problem. The approach used is called Transgenetic Algorithms, which fall into the category of evolutionary algorithms. A primal-dual algorithm is used to generate initial solutions, and path-relinking is used as an intensifier.
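For intuition about the underlying model (illustrative only; the thesis solves it with a transgenetic heuristic, not brute force), the Prize Collecting Steiner Tree trade-off can be sketched on a tiny made-up instance: connect a chosen subset of customers to the source at minimum tree cost, and pay a penalty for each customer left out.

```python
from itertools import combinations

def mst_cost(nodes, edges):
    """Cost of a minimum spanning tree over `nodes`, or None if disconnected."""
    nodes = set(nodes)
    usable = sorted((c, u, v) for u, v, c in edges if u in nodes and v in nodes)
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    cost, joined = 0, 1
    for c, u, v in usable:   # Kruskal's algorithm
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            cost += c
            joined += 1
    return cost if joined == len(nodes) else None

# graph: (u, v, cost); node 0 is the source, the others are customers
edges = [(0, 1, 3), (0, 2, 8), (1, 2, 2), (1, 3, 9), (2, 3, 4)]
penalty = {1: 5, 2: 6, 3: 2}   # cost of leaving each customer unserved

best = min(
    (mst_cost({0, *s}, edges) + sum(p for n, p in penalty.items() if n not in s), s)
    for r in range(len(penalty) + 1)
    for s in combinations(penalty, r)
    if mst_cost({0, *s}, edges) is not None
)
print(best)  # (7, (1, 2)): serve customers 1 and 2, pay customer 3's penalty
```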
Abstract:
The parameters of social analysis provided by the foundations of Weber's work point to certain determinations necessary for the analysis of social phenomena. Advocating a sociology capable of comprehending the meanings and connections present in social actions, Weber proposes a specific conception of method and object in sociology, one grounded in the explanation of individual social actions under determinate conditions while seeking, at the same time, to make explicit the meanings of the social institutions in which individuals act, as themselves also products of human action. Based on these ideas, the brief essay presented here sets out to discuss some elements worked out in the Weberian concepts of method and sociology.
Abstract:
The use of Multiple Input Multiple Output (MIMO) systems has permitted the recent evolution of wireless communication standards. The Spatial Multiplexing MIMO technique, in particular, provides a capacity gain that grows linearly with the minimum of the numbers of transmit and receive antennas. To obtain near-capacity performance in SM-MIMO systems, a soft-decision Maximum A Posteriori probability (MAP) MIMO detector is necessary. However, such a detector is too complex for practical solutions. Hence, the goal of a MIMO detection algorithm intended for implementation is to approximate the ideal detector well while keeping an acceptable complexity. Moreover, the algorithm needs to be mapped to a VLSI architecture with small area and high data rate. Since Spatial Multiplexing is a recent technique, it is argued that there is still much room for the development of related algorithms and architectures. Therefore, this thesis focused on the study of sub-optimal algorithms and VLSI architectures for broadband soft-decision MIMO detectors. As a result, novel algorithms have been developed, starting from proposed optimizations of already established algorithms. Based on these results, new MIMO detector architectures with configurable modulation and competitive area, performance and data-rate parameters are proposed here. The developed algorithms have been extensively simulated and the architectures were synthesized, so that the results can serve as a reference for other works in the area.
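As a point of reference for what a low-complexity sub-optimal detector does (a zero-forcing baseline, far simpler than the soft-decision detectors the thesis develops), here is a numpy sketch of 2×2 spatial-multiplexing detection with QPSK on made-up channel values:

```python
import numpy as np

rng = np.random.default_rng(1)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# random Rayleigh-like channel, two transmit and two receive antennas
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x = rng.choice(qpsk, size=2)                       # transmitted symbol vector
noise = 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ x + noise                                  # received vector

x_zf = np.linalg.pinv(H) @ y                       # zero-forcing equalization
x_hat = qpsk[np.argmin(np.abs(x_zf[:, None] - qpsk), axis=1)]  # hard slicing
print("detected correctly:", np.allclose(x, x_hat))
```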
Abstract:
In this work we study the method for solving systems of linear equations presented in the book entitled "The Nine Chapters on the Mathematical Art", which was written in the first century of the common era. This work intends to show how the history of mathematics can be used to motivate the introduction of some topics in high school. Through observation of the patterns that repeat themselves in the presented method, we were able to introduce, in a very natural way, the concepts of linear equations, systems of linear equations, solutions of linear equations, determinants and matrices, as well as the Laplace expansion for calculating determinants of square matrices of order greater than 3, together with some of their general applications.
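The fangcheng procedure of the Nine Chapters corresponds, in modern terms, to elimination on the augmented matrix; a minimal sketch, applied to the well-known grain problem of chapter eight:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with back substitution."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]     # augmented matrix
    for k in range(n):
        p = next(i for i in range(k, n) if M[i][k] != 0)  # nonzero pivot
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):                    # eliminate below the pivot
            f = M[i][k] / M[k][k]
            M[i] = [a - f * c for a, c in zip(M[i], M[k])]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                   # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# three grades of grain, three combined-yield measurements
print(solve([[3, 2, 1], [2, 3, 1], [1, 2, 3]], [39, 34, 26]))  # [9.25, 4.25, 2.75]
```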
Abstract:
Humans, as well as some animals, are born with the ability to perceive quantities. The needs arising from the evolution of societies and technological resources have made the optimization of counting methods necessary. Although necessary and useful, such methods present many difficulties in their teaching. In order to broaden the range of tools available for teaching Combinatorial Analysis, this work presents a flowchart with the goal of helping students fix the initial concepts of the subject through practical exercises.
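A flowchart of this kind typically branches on two questions, whether order matters and whether repetition is allowed; the sketch below encodes that decision flow (the branch structure is an assumption about the thesis's chart, and the names are illustrative):

```python
from math import comb, perm

def count_arrangements(n, k, ordered, repetition):
    """Count ways to pick k of n elements under the flowchart's two questions."""
    if ordered and repetition:
        return n ** k                  # sequences with repetition
    if ordered:
        return perm(n, k)              # simple arrangements (permutations)
    if repetition:
        return comb(n + k - 1, k)      # combinations with repetition
    return comb(n, k)                  # simple combinations

print(count_arrangements(4, 2, ordered=True, repetition=False))   # 12
```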
Abstract:
Optimization techniques known as metaheuristics have succeeded in solving well-known problems satisfactorily, but the development of metaheuristics is characterized by the choice of parameters for their execution, and the appropriate choice of these parameter values is essential. Parameter tuning usually means testing parameters until viable results are obtained, a task normally carried out by the developer implementing the metaheuristic. The quality of the results obtained on one test instance does not carry over to other instances, and the feedback may require a slow process of trial and error in which the algorithm has to be adjusted for each specific application. In this context of metaheuristics, Reactive Search emerged, advocating the integration of machine learning into heuristic search for solving complex optimization problems. From the integration that Reactive Search proposes between machine learning and metaheuristics came the idea of using Reinforcement Learning, specifically the Q-learning algorithm, in a reactive fashion to select which local search is the most suitable at a given moment of the search, succeeding another local search that can no longer improve the current solution in the VNS metaheuristic. Thus, in this work we propose a reactive implementation that uses reinforcement learning for the self-tuning of the implemented algorithm, applied to the symmetric traveling salesman problem and to the scheduling problem of workover rigs for well maintenance.
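A minimal sketch of the reactive idea, with an illustrative reward scheme and names that are assumptions rather than the thesis's design: a tabular Q-learning agent chooses which local search to apply next and is rewarded by the improvement it obtains.

```python
import random

def q_learning_vns(solution, cost, local_searches, episodes=100,
                   alpha=0.5, gamma=0.9, epsilon=0.2):
    """Single-state Q-learning over a set of local searches."""
    q = [0.0] * len(local_searches)      # one Q-value per local search
    for _ in range(episodes):
        # epsilon-greedy selection of the next local search
        a = (random.randrange(len(local_searches)) if random.random() < epsilon
             else max(range(len(local_searches)), key=q.__getitem__))
        new_solution = local_searches[a](solution)
        reward = cost(solution) - cost(new_solution)   # improvement obtained
        q[a] += alpha * (reward + gamma * max(q) - q[a])
        if reward > 0:                    # keep only improving moves
            solution = new_solution
    return solution, q

# toy usage: minimize x^2 with two "local searches" (step up / step down)
sol, q = q_learning_vns(10, lambda x: x * x,
                        [lambda x: x + 1, lambda x: x - 1])
print(sol, q)
```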
Abstract:
This work consists of the conception, development and implementation of a CAE computational routine with algorithms suitable for stress and strain analysis. The system was integrated into an academic software package named OrtoCAD. The expansion algorithms for the CAE interface generated by this work were developed in FORTRAN with the objective of extending the applications of two former works of PPGEM-UFRN: the design and fabrication of an electromechanical reader and the OrtoCAD software. The OrtoCAD software is an interface that originally included the visualization of prosthetic sockets from data obtained with the electromechanical reader (LEM). The LEM is basically a three-dimensional scanner based on reverse engineering. First, the geometry of a residual limb (i.e., the remaining part of an amputated leg to which the prosthesis is attached) is obtained from the data generated by the LEM through the use of reverse engineering concepts. The proposed FEA core uses shell theory, in which a 2D surface is generated from a 3D piece coming from OrtoCAD. The shell analysis program uses the well-known Finite Element Method to describe the geometry and the behavior of the material. The program is based on nine-node quadrilateral Lagrangian elements with a higher-order displacement field for a better description of the stress field across the thickness. As a result, the new FEA routine provides clear advantages by adding new features to OrtoCAD: independence from high-cost commercial software; new routines added to the OrtoCAD library that handle more realistic problems by using failure criteria for composite materials; improved FEA performance through a specific mesh element with a higher number of nodes; and, finally, the advantages of an open-source project, offering the intrinsic versatility and wide possibilities of editing and/or optimization that may be necessary in the future.
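The nine-node Lagrangian element mentioned above has shape functions that are tensor products of 1D quadratic Lagrange polynomials on the reference square; a minimal sketch (numpy assumed):

```python
import numpy as np

def lagrange_1d(t):
    """Quadratic Lagrange basis on the nodes -1, 0, 1."""
    return np.array([t * (t - 1) / 2, (1 - t) * (1 + t), t * (t + 1) / 2])

def shape_functions(xi, eta):
    """The 9 biquadratic shape functions N_i(xi, eta) on [-1, 1]^2."""
    return np.outer(lagrange_1d(xi), lagrange_1d(eta)).ravel()

N = shape_functions(0.3, -0.7)
print(N.sum())   # partition of unity: the shape functions sum to 1 anywhere
```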