995 results for Hiker Dice. Exact Algorithm. Heuristic Algorithms
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
A phantom offers many advantages for evaluating mammographic images: among them, it allows testing the visibility of the distribution of test objects that compose it and thereby tuning the image to obtain a safe medical diagnosis, seeking the best risk-benefit ratio for the patient. Typically, the quality of a mammographic image is assessed through subjective and quantitative evaluations. This study developed algorithms to quantitatively evaluate digital (DICOM) images obtained from a mammographic phantom composed of test objects. The results were adjusted against subjective evaluations performed by experts in radiology. The procedure aims to make the daily quality-control tests of routine clinical diagnostic radiology independent of those experts
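The abstract does not specify which quantitative metrics its algorithms compute; one common score for phantom test objects is the contrast-to-noise ratio (CNR). The sketch below is only illustrative, with made-up pixel values:

```python
import statistics

def cnr(roi, background):
    """Contrast-to-noise ratio between a test-object ROI and the background.
    A hypothetical stand-in for one quantitative phantom score."""
    mu_s = statistics.mean(roi)
    mu_b = statistics.mean(background)
    sigma_b = statistics.stdev(background)
    return (mu_s - mu_b) / sigma_b

# Made-up pixel samples: a bright disc ROI against the phantom background.
roi_pixels = [210, 212, 208, 211, 209, 213]
bg_pixels = [100, 102, 98, 101, 99, 100]
score = cnr(roi_pixels, bg_pixels)
```

A visible object yields a high CNR; scores near zero suggest the object is lost in the noise.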
Abstract:
The representation of real objects in virtual environments has applications in many areas, such as cartography, mixed reality, and reverse engineering. These objects can be generated in two ways: manually, with CAD (Computer-Aided Design) tools, or automatically, by means of surface reconstruction techniques. The simpler the 3D model, the easier it is to process and store. However, reconstruction methods can generate very detailed virtual elements, which cause problems when processing the resulting mesh, because it contains many edges and polygons that must be checked during visualization. In this context, simplification algorithms can be applied to remove polygons from the resulting mesh, without changing its topology, producing a lighter mesh with fewer irrelevant details. This project carried out the study, implementation, and comparative testing of simplification algorithms applied to meshes generated through a reconstruction pipeline based on point clouds. The simplification step is proposed as a complement to the pipeline developed by ONO et al. (2012), which performs reconstruction from point clouds acquired with a Microsoft Kinect and then applies the Poisson algorithm
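Mesh simplification pipelines of this kind are typically built on the edge-collapse primitive: merge one endpoint of an edge into the other and discard triangles that become degenerate. The sketch below illustrates that primitive on a toy quad mesh; the data layout and the midpoint placement rule are assumptions for illustration, not details from the project:

```python
def collapse_edge(verts, tris, u, v):
    """Edge-collapse primitive: merge vertex v into u (placed at the edge
    midpoint) and drop triangles that become degenerate."""
    verts = list(verts)
    verts[u] = tuple((a + b) / 2 for a, b in zip(verts[u], verts[v]))
    kept = []
    for t in tris:
        t = tuple(u if i == v else i for i in t)
        if len(set(t)) == 3:          # degenerate triangles disappear
            kept.append(t)
    return verts, kept

# Toy quad made of two triangles; collapsing edge (2, 3) removes one face.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 2), (0, 2, 3)]
new_verts, new_tris = collapse_edge(verts, tris, 2, 3)
```

Full simplifiers repeat this step, ordering candidate edges by an error metric so that irrelevant detail is removed first.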
Abstract:
This work aims to visualize weather information by building isosurfaces, exploiting the advantages of three-dimensional geometric models to communicate the meaning of the data clearly and efficiently. Advances in data-processing technology make it possible to interpret ever-growing masses of data through robust algorithms. Meteorology, in particular, can benefit from this, given the large amount of data required for analysis and statistics. The choice of algorithm and tools in this work makes data manipulation easier for users from other areas. The project was developed as distinct modules, increasing its flexibility and reusability for future studies
Abstract:
In radiotherapy, computational systems are used to determine the radiation dose in the treatment volume and to analyze the quality of radiometric parameters of the equipment and the irradiated field. Driven by ongoing technological advances, much research in brachytherapy has been devoted to developing computational algorithms that can be incorporated into treatment planning systems, providing greater accuracy and confidence in dose calculation. Informatics and information technology undergo constant updating and refinement, allowing the Monte Carlo method to be used to simulate the dose distribution of brachytherapy sources. The dosimetric methodology adopted here is based mainly on studies by the American Association of Physicists in Medicine (AAPM), namely Task Group 43 (TG-43), and on protocols aimed at the dosimetry of these types of radiation sources. This work analyzes the feasibility of using the MCNP-5C (Monte Carlo N-Particle) code to obtain radiometric parameters of brachytherapy sources and thus to study the variation of radiation dose in treatment planning. Simulations of the dose variation in the source plane were performed, and the dosimetric parameters required by the TG-43 formalism were determined for the characterization of two high-dose-rate iridium-192 sources. The calculated values were compared with those in the literature, which were obtained with different Monte Carlo simulation codes. The results showed excellent agreement with the compared codes, confirming the capacity and viability of the MCNP-5C code for the dosimetry of sources employed in HDR brachytherapy. The method may suggest incorporating this code into the treatment planning systems supplied by manufacturers together with the equipment, since besides reducing acquisition cost it can make the computational routines used more comprehensive, facilitating the brachytherapy ...
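The actual simulations use the MCNP-5C code, but the underlying idea of Monte Carlo dosimetry can be illustrated with a toy model: sample isotropic photon path lengths from an exponential attenuation law around a point source and tally absorption density in spherical shells. Everything below (geometry, attenuation coefficient, shell edges) is an illustrative assumption, not TG-43 dosimetry:

```python
import math
import random

random.seed(0)

def radial_dose(n_photons=20000, mu=0.1, shell_edges=(1.0, 2.0, 3.0)):
    """Toy Monte Carlo: each photon from an isotropic point source travels
    an exponentially distributed distance (attenuation coefficient mu) and
    deposits its energy where it is absorbed. Returns absorptions per unit
    volume in each spherical shell."""
    counts = [0] * len(shell_edges)
    for _ in range(n_photons):
        r = random.expovariate(mu)          # distance to absorption
        for k, hi in enumerate(shell_edges):
            lo = shell_edges[k - 1] if k else 0.0
            if lo <= r < hi:
                counts[k] += 1
                break
    dens = []
    for k, hi in enumerate(shell_edges):
        lo = shell_edges[k - 1] if k else 0.0
        vol = 4.0 / 3.0 * math.pi * (hi ** 3 - lo ** 3)
        dens.append(counts[k] / vol)
    return dens

dose = radial_dose()
```

Even this crude tally reproduces the qualitative behavior a planning system must model: dose density falls off steeply with distance from the source.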
Abstract:
Communities are present in physical, chemical, and biological systems, and their identification is fundamental for understanding the behavior of these systems. Recently, the available data related to complex networks have grown exponentially, demanding more computational power. The Graphics Processing Unit (GPU) is a cost-effective alternative suited to this purpose. We investigate its convenience for network science by proposing a GPU-based implementation of Newman's community detection algorithm. We show that the processing time of matrix multiplications on GPUs grows more slowly with matrix size than on CPUs. This demonstrates that GPU processing power is a viable solution for community identification simulations that demand high computational power. Our implementation was tested on an integrated biological network for the bacterium Escherichia coli
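As a point of reference, Newman's spectral method splits a network by the sign pattern of the leading eigenvector of the modularity matrix B = A - k k^T / 2m; the dense matrix-vector products involved are exactly the kind of work a GPU accelerates. The pure-Python sketch below (power iteration on a toy graph) is illustrative only, not the GPU implementation described above:

```python
def modularity_matrix(adj):
    """B[i][j] = A[i][j] - k_i * k_j / (2m) for an undirected graph."""
    n = len(adj)
    deg = [sum(row) for row in adj]
    two_m = float(sum(deg))
    return [[adj[i][j] - deg[i] * deg[j] / two_m for j in range(n)]
            for i in range(n)]

def leading_eigenvector(mat, iters=300):
    """Power iteration on mat + shift*I so that the most positive
    eigenvalue of mat dominates."""
    n = len(mat)
    shift = max(sum(abs(x) for x in row) for row in mat)
    v = [1.0 + 0.01 * i for i in range(n)]      # symmetry-breaking start
    for _ in range(iters):
        w = [shift * v[i] + sum(mat[i][j] * v[j] for j in range(n))
             for i in range(n)]
        norm = max(abs(x) for x in w) or 1.0
        v = [x / norm for x in w]
    return v

def two_communities(adj):
    v = leading_eigenvector(modularity_matrix(adj))
    return [0 if x >= 0 else 1 for x in v]

# Two triangles joined by one edge; the split should separate them.
adj = [[0, 1, 1, 0, 0, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 1, 0, 0],
       [0, 0, 1, 0, 1, 1],
       [0, 0, 0, 1, 0, 1],
       [0, 0, 0, 1, 1, 0]]
labels = two_communities(adj)
```

On a GPU, the inner loop becomes a single dense matrix-vector kernel, which is why the runtime scales so much better with matrix size.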
Abstract:
This monograph studies the thinning problem, also known as image skeletonization, and explores its applications in areas such as biometrics, medicine, engineering, and cartography. Thinning algorithms can be classified into two major groups: iterative and non-iterative. Iterative algorithms are subdivided into sequential and parallel algorithms. To develop a computer system able to extract the skeleton of an image, different algorithms for this problem were studied, analyzed, and implemented, namely those of Stentiford, Zhang-Suen, and Holt
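Of the three, Zhang-Suen is the best-known parallel iterative scheme: two alternating sub-iterations simultaneously delete boundary pixels that pass neighbour-count and connectivity tests. A minimal sketch, assuming a binary image as a 0/1 list of lists with background borders:

```python
def neighbours(img, y, x):
    """P2..P9: the 8 neighbours of (y, x), clockwise from the pixel above."""
    return [img[y - 1][x], img[y - 1][x + 1], img[y][x + 1], img[y + 1][x + 1],
            img[y + 1][x], img[y + 1][x - 1], img[y][x - 1], img[y - 1][x - 1]]

def transitions(n):
    """Number of 0 -> 1 transitions in the circular sequence P2..P9."""
    seq = n + n[:1]
    return sum(1 for i in range(8) if seq[i] == 0 and seq[i + 1] == 1)

def zhang_suen(img):
    """Thin a 0/1 image (borders assumed 0) to a one-pixel skeleton."""
    img = [row[:] for row in img]
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            marks = []
            for y in range(1, len(img) - 1):
                for x in range(1, len(img[0]) - 1):
                    if img[y][x] != 1:
                        continue
                    n = neighbours(img, y, x)
                    p2, p4, p6, p8 = n[0], n[2], n[4], n[6]
                    if not (2 <= sum(n) <= 6 and transitions(n) == 1):
                        continue
                    if step == 0 and (p2 * p4 * p6 or p4 * p6 * p8):
                        continue
                    if step == 1 and (p2 * p4 * p8 or p2 * p6 * p8):
                        continue
                    marks.append((y, x))
            for y, x in marks:           # delete all marked pixels at once
                img[y][x] = 0
            changed = changed or bool(marks)
    return img

# A 3-pixel-thick horizontal bar thins down to a single-pixel line.
bar = [[0] * 12 for _ in range(7)]
for y in range(2, 5):
    for x in range(1, 11):
        bar[y][x] = 1
skeleton = zhang_suen(bar)
```

Because all deletions within a sub-iteration are applied simultaneously, the scheme is "parallel" in the sense used in the classification above.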
Abstract:
This project aims to develop methods for data classification in a Data Warehouse for decision-making purposes. Another goal is the reduction of an attribute set in a Data Warehouse, such that the reduced set keeps the same properties as the original one. With a reduced set, the computational cost of processing is lower, attributes that are not relevant to certain kinds of situations can be identified, and patterns that support decision making can be recognized in the database. To achieve these objectives, the Rough Sets algorithm will be implemented. We chose PostgreSQL as the database management system because it is efficient, consolidated, and open source (freely distributed)
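The core of rough-set attribute reduction is checking whether a subset of condition attributes still discerns the decision classes; attributes whose removal preserves that consistency are redundant and can be dropped. A hedged sketch on a made-up decision table (the actual Data Warehouse schema is not given in the abstract):

```python
# Hypothetical decision table: (outlook, temp, humidity, wind) -> decision.
ROWS = [
    (("sunny", "hot",  "high",   "weak"),   "no"),
    (("sunny", "hot",  "high",   "strong"), "no"),
    (("rain",  "mild", "high",   "weak"),   "yes"),
    (("rain",  "cool", "normal", "weak"),   "yes"),
    (("sunny", "mild", "normal", "weak"),   "yes"),
    (("sunny", "mild", "high",   "strong"), "no"),
]

def consistent(attrs):
    """True if objects indiscernible on `attrs` always share a decision,
    i.e. the attribute subset preserves the classification."""
    seen = {}
    for cond, dec in ROWS:
        key = tuple(cond[i] for i in attrs)
        if seen.setdefault(key, dec) != dec:
            return False
    return True

def reduct():
    """Greedy reduct: drop each attribute whose removal keeps consistency."""
    attrs = [0, 1, 2, 3]
    for a in list(attrs):
        trial = [x for x in attrs if x != a]
        if consistent(trial):
            attrs = trial
    return attrs

red = reduct()
```

In a Data Warehouse setting the same check runs as GROUP BY queries over condition-attribute combinations, which is one reason a mature DBMS such as PostgreSQL is a natural host.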
Abstract:
This work started from the study of traditional mathematical models that describe the epidemiology of infectious diseases with direct or indirect transmission. We took the classical approach of searching for equilibrium solutions and analyzing their stability, both analytically and through numerical solutions. We then applied these techniques to a compartmental model of dengue transmission that considers the mosquito population (susceptible vectors Vs and infected vectors Vi), the human population (susceptible humans S, infected humans I, and recovered humans R), and a single serotype circulating in the population. We found the equilibrium solutions and, from their analysis, derived the reproduction rate of the disease, which determines whether the disease will become endemic in the population. Next, we used the method described in [1] to study the influence of seasonality on virus transmission when it acts on only one of the rates related to the vector. Lastly, we carried out the modeling considering the periodicity of all rates, thereby building a model with temporal dependence that permits the study of transmission periodicity through parametric resonance and a genetic algorithm
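A compartmental host-vector model of the kind described can be sketched as a system of ODEs integrated with explicit Euler steps; the parameter values and the specific rate structure below are illustrative assumptions, not those of the cited work:

```python
def simulate(beta_h=0.8, beta_v=0.6, gamma=0.2, mu=0.1,
             dt=0.01, steps=20000):
    """Euler integration of a minimal host-vector model:
    humans S/I/R (fractions), vectors Vs/Vi with birth/death rate mu.
    Parameter values are illustrative, not fitted to dengue data."""
    S, I, R = 0.99, 0.01, 0.0
    Vs, Vi = 0.99, 0.01
    for _ in range(steps):
        new_h = beta_h * S * Vi          # humans infected by vectors
        new_v = beta_v * Vs * I          # vectors infected by humans
        dS, dI, dR = -new_h, new_h - gamma * I, gamma * I
        dVs = mu - new_v - mu * Vs       # births replace dying vectors
        dVi = new_v - mu * Vi
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
        Vs, Vi = Vs + dt * dVs, Vi + dt * dVi
    return S, I, R, Vs, Vi

S, I, R, Vs, Vi = simulate()
```

Seasonality enters such a model by making one or more rates periodic functions of time, e.g. beta_v(t) = beta_v * (1 + eps * cos(omega * t)), which is the setting where parametric resonance can be studied.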
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The Set Covering Problem (SCP) plays an important role in Operational Research, since it appears as part of several real-world problems. In this work we report the use of a genetic algorithm to solve the SCP. The algorithm starts with a population built by a randomized greedy algorithm. A new crossover operator and a new adaptive mutation operator were incorporated into the algorithm to intensify the search. Our algorithm was tested on a class of non-unicost SCP instances obtained from the OR-Library, without applying reduction techniques. It found good solutions in terms of quality and computational time. The results reveal that the proposed algorithm is able to find high-quality solutions and is faster than recently published approaches using the OR-Library.
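The overall structure described (randomized greedy initialization, crossover and mutation, with feasibility maintained by a repair step) can be sketched as follows; the tiny instance, the operators, and all parameters are illustrative assumptions, not the authors' actual operators:

```python
import random

random.seed(1)

# Tiny illustrative instance: cover elements 0..5 at minimum total cost.
COSTS = [3, 2, 2, 3, 2, 3]
SETS = [{0, 1, 2}, {2, 3}, {0, 4}, {3, 4, 5}, {1, 5}, {0, 2, 4}]
UNIVERSE = set(range(6))

def covered_by(sol):
    return set().union(*(SETS[i] for i in sol)) if sol else set()

def cost(sol):
    return sum(COSTS[i] for i in sol)

def repair(sol):
    """Make a chromosome feasible (greedy cover), then drop redundant sets."""
    covered = covered_by(sol)
    while covered != UNIVERSE:
        best = min((i for i in range(len(SETS)) if SETS[i] - covered),
                   key=lambda i: COSTS[i] / len(SETS[i] - covered))
        sol.add(best)
        covered |= SETS[best]
    for i in sorted(sol, key=lambda i: -COSTS[i]):
        if covered_by(sol - {i}) == UNIVERSE:
            sol.remove(i)
    return sol

def randomized_greedy():
    """Seed solutions: pick among the two best cost/coverage ratios."""
    sol, covered = set(), set()
    while covered != UNIVERSE:
        cands = sorted((i for i in range(len(SETS)) if SETS[i] - covered),
                       key=lambda i: COSTS[i] / len(SETS[i] - covered))
        pick = random.choice(cands[:2])
        sol.add(pick)
        covered |= SETS[pick]
    return repair(sol)

def crossover(a, b):
    """Keep shared sets; inherit the remainder with probability 1/2."""
    child = {i for i in a | b if (i in a and i in b) or random.random() < 0.5}
    return repair(child)

def mutate(sol, rate=0.2):
    out = {i for i in range(len(SETS)) if (i in sol) ^ (random.random() < rate)}
    return repair(out)

pop = [randomized_greedy() for _ in range(20)]
for _ in range(30):
    pop.sort(key=cost)
    parents = pop[:10]                       # elitist selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(10)]
best = min(pop, key=cost)
```

On this toy instance the optimum cost is 6 (e.g. sets {0, 1, 2} and {3, 4, 5}); real non-unicost OR-Library instances have hundreds of rows and thousands of columns, which is where the adaptive operators matter.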
Abstract:
Algorithms for recognizing 3-manifolds rely on the concept of normal surface, which allows problems in 3-manifold theory to be treated as linear programming problems. Examples include the Rubinstein-Thompson algorithm for recognizing the triangulated 3-sphere, implemented in the Regina software suite, as well as the connected-sum decomposition of 3-manifolds. The complete classification of 3-manifolds can be carried out algorithmically, making it relevant to Thurston's Geometrization Program for obtaining results via computational topology. The aim of this work is to discuss an application of the Regina software. In the course of this work, we obtained the result of comparing the Poincaré homology 3-sphere with the 3-sphere, an important step toward understanding the Poincaré Conjecture and the Geometrization Program.
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
Graduate Program in Electrical Engineering - FEIS