Abstract:
This work addresses the Scheduling Workover Rigs Problem (SWRP), which concerns the maintenance of the wells of an oil field and, although difficult to solve, is of great economic, technical, and environmental importance. A mathematical formulation of the problem is presented, and algorithmic approaches are developed. The problem consists in finding the best schedule of service to the wells by the workover rigs, minimizing a composite objective that combines the costs of the workover rigs and the total oil loss suffered by the wells. The problem is similar to the Vehicle Routing Problem (VRP), which belongs to the NP-hard class. The goal of this research is to develop algorithmic approaches to the SWRP based on metaheuristics such as the Memetic Algorithm and GRASP. Test instances with data close to reality are generated to analyze the computational performance of the approaches mentioned above. A comparison of the performance and of the quality of the results obtained by each technique is then carried out.
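As an illustration of the GRASP template mentioned above, the sketch below applies greedy randomized construction plus local search to a toy single-rig version of the problem, in which each waiting well loses oil at a constant rate. The instance data, the loss/service-ratio greedy criterion, and the swap neighborhood are illustrative assumptions, not the thesis's actual operators.

```python
import random

# Toy single-rig SWRP: one workover rig must service all wells; while a
# well waits, it loses oil at a constant rate. Data are illustrative.
service = [5, 3, 8, 2, 6]            # service time of each well
loss = [4.0, 1.5, 6.0, 2.5, 3.0]     # oil-loss rate of each well

def total_loss(order):
    """Total oil lost: each well pays its loss rate times completion time."""
    t, cost = 0, 0.0
    for w in order:
        t += service[w]
        cost += loss[w] * t
    return cost

def construct(rng, alpha=0.5):
    """Greedy randomized construction: draw from the alpha-best candidates."""
    remaining, order = list(range(len(service))), []
    while remaining:
        remaining.sort(key=lambda w: -loss[w] / service[w])    # most urgent first
        rcl = remaining[:max(1, int(alpha * len(remaining)))]  # restricted list
        w = rng.choice(rcl)
        remaining.remove(w)
        order.append(w)
    return order

def local_search(order):
    """First-improvement pairwise swaps until no swap reduces the loss."""
    best = total_loss(order)
    improved = True
    while improved:
        improved = False
        for i in range(len(order) - 1):
            for j in range(i + 1, len(order)):
                order[i], order[j] = order[j], order[i]
                cost = total_loss(order)
                if cost < best:
                    best, improved = cost, True
                else:
                    order[i], order[j] = order[j], order[i]   # undo the swap
    return order, best

def grasp(iterations=50, seed=1):
    rng, best, best_cost = random.Random(seed), None, float("inf")
    for _ in range(iterations):
        order, cost = local_search(construct(rng))
        if cost < best_cost:
            best, best_cost = order[:], cost
    return best, best_cost

print(grasp())   # best visiting order found and its total oil loss
```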
Abstract:
In everyday life we constantly perform two frequent and important actions: classifying (sorting into classes) and making decisions. When we face problems of relatively high complexity, we tend to seek other opinions, usually from people who have some knowledge of the problem domain or, whenever possible, are experts in it, to help us in the decision-making process. In both classification and decision making we are guided by the characteristics involved in the specific problem, and the characterization of a set of objects is part of decision making in general. In Machine Learning this classification happens through a learning algorithm, and the characterization is applied to databases. Classification algorithms can be employed individually or in machine committees. Choosing the best methods for building a committee is a very arduous task. This work investigates meta-learning techniques for selecting the best configuration parameters of homogeneous committees in several classification problems. These parameters are: the base classifier, the architecture, and the size of the architecture. Nine candidate inducers for the base classifier, two methods for generating the architecture, and nine architecture sizes were investigated. Dimensionality reduction techniques were applied to the meta-databases in search of improvement. Five classification methods are investigated as meta-learners in the process of choosing the best parameters of a homogeneous committee.
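For concreteness, the sketch below shows what one "homogeneous committee configuration" looks like in scikit-learn terms: a base inducer, a generation method (bagging, here), and a committee size. It is a minimal assumed illustration of the kind of configuration a meta-learner would rank per dataset, not the thesis's meta-learning system.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

committee = BaggingClassifier(
    estimator=DecisionTreeClassifier(),  # base classifier (named base_estimator
                                         # in scikit-learn versions before 1.2)
    n_estimators=25,                     # committee size
    random_state=0,
)
# Cross-validated accuracy of this single configuration; a meta-learner
# would compare many such configurations across datasets.
print(cross_val_score(committee, X, y, cv=5).mean())
```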
Abstract:
Committees of classifiers can be used to improve the accuracy of classification systems: different classifiers applied to the same problem are combined into a system of greater accuracy, called a committee of classifiers. For this to succeed, the classifiers must make mistakes on different objects of the problem, so that the errors of one classifier are compensated by the correct answers of the others when the committee's combination method is applied. This tendency of classifiers to err on different objects is called diversity. However, most diversity measures fail to capture its importance. Recently, two diversity measures (good and bad diversity) were proposed with the aim of helping to generate more accurate committees. This work performs an experimental analysis of these measures applied directly to the construction of committees of classifiers. The construction method is modeled as a search, over the sets of features of the problem databases and over the sets of committee members, for the committee of classifiers that produces the most accurate classification. This problem is solved by metaheuristic optimization techniques, in their mono- and multi-objective versions. Analyses are performed to verify whether using or adding the measures of good and bad diversity to the optimization objectives produces more accurate committees. Thus, the contribution of this study is to determine whether good and bad diversity can be used as objectives in mono-objective and multi-objective optimization techniques to build committees of classifiers more accurate than those built by the same process using only classification accuracy as the optimization objective.
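The good and bad diversity measures referred to above decompose the majority-vote error as E_maj = E_avg - good + bad (Brown and Kuncheva's formulation). A minimal sketch of that computation, assuming hard label votes:

```python
import numpy as np

def good_bad_diversity(votes, y):
    """Good/bad diversity from the decomposition E_maj = E_avg - good + bad.

    votes: integer array (n_classifiers, n_samples) of predicted labels
    y:     integer array (n_samples,) of true labels
    """
    n_clf, n_samp = votes.shape
    maj = np.array([np.bincount(votes[:, i]).argmax() for i in range(n_samp)])
    wrong_frac = (votes != y).mean(axis=0)     # per-sample individual error
    maj_correct = maj == y
    # Disagreement on samples the committee gets right is harmless ("good");
    # correct votes on samples the committee gets wrong are wasted ("bad").
    good = np.where(maj_correct, wrong_frac, 0.0).mean()
    bad = np.where(~maj_correct, 1.0 - wrong_frac, 0.0).mean()
    e_avg, e_maj = wrong_frac.mean(), (~maj_correct).mean()
    assert np.isclose(e_maj, e_avg - good + bad)   # decomposition holds exactly
    return good, bad

votes = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 1],
                  [1, 0, 1, 0]])
y = np.array([0, 0, 1, 1])
print(good_bad_diversity(votes, y))   # (1/6, 1/12) for this toy committee
```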
Abstract:
The segmentation of an image aims to subdivide it into constituent regions or objects that have some relevant semantic content. This subdivision can also be applied to videos, where the objects appear in the various frames that compose them. The task of segmenting an image becomes more complex when it is composed of objects defined by textural features, for which color information alone is not a good descriptor. Fuzzy Segmentation is a region-growing segmentation algorithm that uses affinity functions to assign to each element in an image a grade of membership (between 0 and 1) to each object. This work presents a modification of the Fuzzy Segmentation algorithm intended to improve its time and space complexity. The algorithm was adapted to segment color videos, treating them as 3D volumes. To perform segmentation of videos, either a conventional color model or a hybrid model obtained by a method for choosing the best channels was used. The Fuzzy Segmentation algorithm was also applied to texture segmentation by using adaptive affinity functions defined for the texture of each object. Two types of affinity functions were used: one based on the normal (Gaussian) probability distribution and the other on the Skew Divergence, a variation of the Kullback-Leibler Divergence that measures the difference between two probability distributions. Finally, the algorithm was tested on several videos and on texture mosaic images composed of images from the Brodatz album.
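As a sketch of the second affinity ingredient, one common form of the Skew Divergence smooths the second distribution with a little of the first, which keeps the Kullback-Leibler divergence finite on zero-probability bins. The histograms below are illustrative:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def skew_divergence(p, q, alpha=0.99):
    """Skew divergence: KL(p || alpha*q + (1-alpha)*p). Mixing in a little
    of p guarantees the second argument is nonzero wherever p is."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return kl(p, alpha * q + (1 - alpha) * p)

# e.g. comparing two local histograms of texture responses (illustrative)
p = np.array([0.5, 0.3, 0.2, 0.0])
q = np.array([0.1, 0.2, 0.0, 0.7])
print(skew_divergence(p, q))
```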
Abstract:
This work presents an algorithmic study of the Multicast Packing Problem under a multiobjective approach. The first step was an extensive review of the problem, which served as a reference point for the definition of the multiobjective mathematical model. Then the instances used in the experimentation process were defined, created from the main characteristics found in the literature. Once both the mathematical model and the instances were defined, several algorithms were created, based on classical approaches to multiobjective optimization: NSGA-II (3 versions) and SPEA2 (3 versions). In addition, GRASP procedures were adapted to work with multiple objectives, producing two versions. These algorithms employ three recombination operators (C1, C2, and C3), two construction operators, a mutation operator, and a local search procedure. Finally, an extensive experimentation process was performed in three stages: the first consisted of adjusting the parameters; the second identified the best version of each algorithm; afterwards, the best versions of each algorithm were compared in order to identify the best algorithm among all. The algorithms were evaluated using the Hypervolume and Multiplicative Epsilon quality indicators.
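All of the algorithms above maintain a set of mutually non-dominated solutions. A minimal sketch of the Pareto filter underlying that bookkeeping (minimization, with hypothetical cost/congestion objective pairs):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(front):
    """Keep only the non-dominated vectors: the Pareto front approximation
    that NSGA-II / SPEA2 style algorithms carry between generations."""
    return [a for a in front
            if not any(dominates(b, a) for b in front if b is not a)]

# e.g. (total cost, maximum congestion) pairs for multicast configurations
points = [(4, 9), (5, 5), (7, 3), (6, 6), (9, 2)]
print(nondominated(points))   # (6, 6) is dominated by (5, 5) and drops out
```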
Abstract:
Nonogram is a logic puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition and data compression, among others. The puzzle consists in determining an assignment of colors to pixels distributed in an N x M matrix that satisfies row and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solving Nonograms. Depth-first search was chosen as one exact approach because it is a typical example of a brute-force algorithm that is easy to implement. Another exact approach was based on the Las Vegas algorithm, with which we intend to investigate whether the randomness introduced by a Las Vegas-based algorithm is an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way to calculate the objective function is also proposed. The approaches are applied to 234 instances, ranging in size from 5 x 5 to 100 x 100 and including logical and random Nonograms.
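A minimal sketch of the basic consistency test every such solver needs: whether a fully assigned line matches its clue, i.e., whether the maximal blocks of filled cells have the prescribed lengths.

```python
def runs(line):
    """Lengths of the maximal blocks of filled cells (1s) in a line."""
    blocks, count = [], 0
    for cell in line:
        if cell:
            count += 1
        elif count:
            blocks.append(count)
            count = 0
    if count:
        blocks.append(count)
    return blocks

def line_satisfied(line, clue):
    """True if a fully assigned row/column matches its Nonogram clue."""
    return runs(line) == list(clue)

print(line_satisfied([1, 1, 0, 1, 0, 1, 1, 1], [2, 1, 3]))  # True
```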
Abstract:
This work studies two important problems arising from the operations of the petroleum and natural gas industry. The first, the pipe dimensioning problem on constrained gas distribution networks, consists in finding the least-cost combination of diameters, from a discrete set of commercially available ones, for the pipes of a given gas network, such that minimum pressure requirements at each demand node and upstream pipe conditions are respected. The second, the piston pump unit routing problem, comes from the need to define routes for a piston pump unit visiting a number of non-emergent wells in onshore fields, i.e., wells that do not have enough pressure to make the oil emerge to the surface. The periodic version of this problem takes into account the wells' refilling equation to provide more accurate long-term planning. Besides mathematical formulations of both problems, an exact algorithm and a tabu search were developed for the solution of the first problem, and a theoretical bound and a ProtoGene transgenetic algorithm were developed for the solution of the second. The main concepts of the metaheuristics are presented along with the details of their application to the cited problems. The results obtained for both applications are promising when compared to theoretical bounds and alternative solutions, with respect both to solution quality and to running time.
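The sketch below illustrates the tabu search idea on a toy version of the first problem: one catalogue diameter per pipe, a construction cost to minimize, and a penalized pressure-drop budget. The catalogue, the linearized pressure model, and the tenure value are illustrative assumptions only.

```python
import random

catalogue = [(100, 1.0), (150, 1.8), (200, 3.0)]   # (diameter mm, cost per m)
lengths = [500, 300, 400]                          # pipe lengths (m)
max_drop = 60.0                                    # allowed total pressure drop

def evaluate(sol):
    """Cost plus a heavy penalty when the pressure-drop budget is exceeded."""
    cost = sum(lengths[i] * catalogue[d][1] for i, d in enumerate(sol))
    drop = sum(lengths[i] / catalogue[d][0] for i, d in enumerate(sol)) * 10
    return cost + (1e4 if drop > max_drop else 0.0)

def tabu_search(iters=200, tenure=5, seed=0):
    rng = random.Random(seed)
    sol = [rng.randrange(len(catalogue)) for _ in lengths]
    best, best_val, tabu = sol[:], evaluate(sol), {}
    for it in range(iters):
        # Best non-tabu move: change one pipe's diameter.
        moves = [(i, d) for i in range(len(sol)) for d in range(len(catalogue))
                 if d != sol[i] and tabu.get((i, d), -1) < it]
        if not moves:
            continue
        i, d = min(moves,
                   key=lambda m: evaluate(sol[:m[0]] + [m[1]] + sol[m[0]+1:]))
        tabu[(i, sol[i])] = it + tenure   # forbid undoing this move for a while
        sol[i] = d
        val = evaluate(sol)
        if val < best_val:
            best, best_val = sol[:], val
    return best, best_val

print(tabu_search())   # diameter index per pipe and penalized cost
```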
Abstract:
Hiker Dice is a game recently proposed in software designed by Mara Kuzmich and Leonardo Goldbarg. In the game, a die builds a trail on an n x m board: as the die rests on a cell of the board, it prints there the face that touches the surface. The game gives rise to the Hiker Dice Simple Maximum Hamiltonian Path Problem (Hidi-CHS) on compact boards, characterized by the search for a Hamiltonian path that maximizes the sum of the faces printed on the board. The present research models the problem through graphs and proposes two classes of solution algorithms. The first class, of exact algorithms, is formed by a backtracking algorithm with pruning based on logical rules and on bounding by the best solution found. The second class is composed of metaheuristics: Evolutionary Computation, Randomized Local Search, and GRASP (Greedy Randomized Adaptive Search Procedure). Three specific operators were created for the algorithms: restructuring, recombination of two solutions, and randomized greedy construction. The exact algorithm was tested on boards from 4x4 to 8x8, beyond which the explosion in processing time prevented computational treatment of larger cases. The heuristic algorithms were tested on boards from 5x5 to 14x14. According to the evaluation methodology applied, the results achieved by the heuristic algorithms suggest a better performance for the GRASP algorithm.
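A minimal model of the die mechanics underlying the problem: track the (top, north, east) faces, with opposite faces summing to 7. Tipping the die to a neighbouring cell permutes the faces, and the value printed on a cell is the face-down side, 7 - top.

```python
def roll(top, north, east, direction):
    """Return the (top, north, east) faces after tipping the die one cell."""
    if direction == "N":            # tips over its north edge
        return 7 - north, top, east
    if direction == "S":
        return north, 7 - top, east
    if direction == "E":            # tips over its east edge
        return 7 - east, north, top
    if direction == "W":
        return east, north, 7 - top
    raise ValueError(direction)

state = (1, 2, 3)                   # an initial die orientation
trail = [7 - state[0]]              # value printed on the starting cell
for move in "EENN":                 # an example walk, not an optimal path
    state = roll(*state, move)
    trail.append(7 - state[0])
print(trail, sum(trail))            # printed faces along the trail, and score
```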
Abstract:
Scientific Algorithms are a new class of metaheuristics inspired by the scientific research process. The new method introduces the idea of a theme to guide the search of the solution space of hard problems. The inspiration for this class of algorithms comes from the act of researching, which comprises thinking, knowledge sharing, and disclosing new ideas. The ideas of the new method are illustrated on the Traveling Salesman Problem. A computational experiment applies the proposed approach to a new variant of the Traveling Salesman Problem named the Car Renter Salesman Problem. The results are compared to state-of-the-art algorithms for the latter problem.
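The abstract does not detail the new metaheuristic itself, so as a neutral illustration of the Traveling Salesman setting on which it is demonstrated, here is the classic 2-opt refinement step that such methods commonly embed (toy distance matrix, not the thesis's algorithm):

```python
def tour_length(tour, dist):
    """Cyclic tour length."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def two_opt(tour, dist):
    """Repeatedly reverse a segment while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, dist) < tour_length(tour, dist):
                    tour, improved = cand, True
    return tour

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
best = two_opt([0, 1, 2, 3], dist)
print(best, tour_length(best, dist))   # [0, 1, 3, 2] with length 18
```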
Abstract:
Image segmentation is the process of subdividing an image into constituent regions or objects that have similar features. In video segmentation, besides subdividing the frames into objects with similar features, there is a consistency requirement among the segmentations of successive frames of the video. Fuzzy segmentation is a region-growing technique that assigns to each element in an image (which may have been corrupted by noise and/or shading) a grade of membership between 0 and 1 to an object. In this work we present an application that uses a fuzzy segmentation algorithm to identify and select particles in micrographs, and an extension of the algorithm to perform video segmentation. Here, a video shot is treated as a three-dimensional volume whose different z slices are occupied by different frames of the shot. The volume is interactively segmented from selected seed elements, which determine the affinity functions based on their motion and color properties. The color information can be extracted from a specific color space or from three channels of a set of color models that are selected based on the correlation of the information from all channels. The motion information is provided in the form of dense optical flow maps. Finally, segmentations of real and synthetic videos and their application in a non-photorealistic rendering (NPR) tool are presented.
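Below is a sketch of producing the dense optical-flow maps that feed the motion affinities, using OpenCV's Farneback estimator. The estimator choice, the parameter values, and the frame file names are assumptions of this example, not taken from the dissertation.

```python
import cv2

# Two consecutive frames of a shot (hypothetical file names).
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Dense flow: pyr_scale=0.5, levels=3, winsize=15, iterations=3,
# poly_n=5, poly_sigma=1.2, flags=0 (typical default-ish settings).
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# flow[y, x] = (dx, dy): one motion vector per pixel of the frame, i.e. per
# element of one z slice of the 3D video volume.
mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print(flow.shape, float(mag.mean()))
```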
Abstract:
This work presents a hybrid transgenetic algorithm for solving a Natural Gas Distribution Network Configuration Problem. Configuring such networks requires defining a layout along which the pipes must be placed to serve the customers. This work studies a way of connecting the customers in a network with a tree-shaped architecture. The objective is to minimize the construction cost of the network, even if some customers who do not yield profit are left unserved. The problem can be formulated computationally as the Prize-Collecting Steiner Tree Problem, a combinatorial optimization problem of the NP-hard class. This work presents a heuristic algorithm for its solution, based on the approach called Transgenetic Algorithms, which belong to the category of evolutionary algorithms. A primal-dual algorithm is used to generate initial solutions, and path-relinking is used as an intensifier.
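A sketch of the objective that the Prize-Collecting Steiner formulation minimizes: the construction cost of the chosen tree plus the penalties (lost prizes) of the customers left unserved. The tiny network and its numbers are hypothetical.

```python
# Edges available for laying pipe, with construction costs (illustrative).
edge_cost = {("s", "a"): 4, ("a", "b"): 3, ("s", "c"): 10}
penalty = {"a": 2, "b": 6, "c": 5}   # cost of NOT serving each customer

def pcst_value(tree_edges, served):
    """Construction cost of the tree plus penalties of unserved customers."""
    building = sum(edge_cost[e] for e in tree_edges)
    lost = sum(p for node, p in penalty.items() if node not in served)
    return building + lost

# Serving only a and b is cheaper than laying the long pipe to c.
print(pcst_value([("s", "a"), ("a", "b")], {"a", "b"}))             # 4+3+5 = 12
print(pcst_value([("s", "a"), ("a", "b"), ("s", "c")], {"a", "b", "c"}))  # 17
```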
Abstract:
Non-Photorealistic Rendering (NPR) is a class of techniques that aims to reproduce artistic techniques, trying to express feelings and moods in the rendered scenes and giving them the appearance of having been made "by hand". Another way of defining NPR is as the processing of scenes, images, or videos into artwork, generating scenes, images, or videos that can have the visual appeal of pieces of art, expressing the visual and emotional characteristics of artistic styles. This dissertation presents a new NPR method for the stylization of images and videos, based on a typical artistic expression of the Northeast region of Brazil that uses colored sand to compose landscape images on the inner surface of glass bottles. The method comprises one technique for generating 2D procedural sand textures and two techniques that mimic effects created by the artists using their tools. It also presents a method for generating 2.5D animations in the sandbox style from the stylized video. The temporal coherence within these stylized videos can be enforced on individual objects with the aid of a video segmentation algorithm. The techniques presented in this work were used in the stylization of synthetic and real videos, something close to impossible for an artist to produce in real life.
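As a sketch of the procedural-texture ingredient, the code below layers smoothed random grids (value noise) and per-pixel grain jitter into a normalized 2D "sand" map. The construction and its parameters are assumptions of this example, not the dissertation's actual technique.

```python
import numpy as np

def sand_texture(size=256, octaves=(4, 8, 16), grain=0.15, seed=0):
    """Grayscale sand-like texture in [0, 1]: coarse-to-fine noise layers
    (blocky value noise via nearest-neighbour upsampling) plus fine grain."""
    rng = np.random.default_rng(seed)
    tex = np.zeros((size, size))
    for cells in octaves:                            # coarse-to-fine layers
        grid = rng.random((cells, cells))
        reps = size // cells
        tex += np.kron(grid, np.ones((reps, reps))) / len(octaves)
    tex += grain * rng.random((size, size))          # high-frequency grains
    tex -= tex.min()
    return tex / tex.max()

tex = sand_texture()
print(tex.shape, float(tex.min()), float(tex.max()))
```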
Abstract:
The Curimataú estuary is located on the eastern coast of Rio Grande do Norte State, Brazil. Its importance resides in the fact that this region holds one of the last portions of preserved mangrove in the state. Nevertheless, it has been severely affected by many anthropogenic activities, such as sugarcane monoculture and shrimp farming. Previous works demonstrated that heavy metals are accumulating in oysters in this estuary, which could perhaps be explained by the input of metals derived from shrimp farming. To better understand the origin of these metals, bottom sediment samples, cores, and suspended particulate matter were collected to characterize metal concentrations (Al, Ba, Cd, Cu, Cr, Fe, Mn, Ni, Pb, Zn) and to determine the potentially bioavailable metals. Additionally, the enrichment ratio of each element analyzed was calculated. The mineralogical composition of the sediment samples and cores was obtained by X-ray diffraction. Moreover, orbital remote sensing data were used to detect and quantify suspended matter by applying a logarithmic algorithm. Geochemical data from bottom sediments and cores revealed that, except for Ba and Pb, the elements analyzed presented concentrations characteristic of an unpolluted ecosystem (Al: 0.25-8.76%; Ba: 3.03-870 µg/g; Cd: < 0.25 µg/g; Cr: 1.72-82.4 µg/g; Cu: 0.12-25.3 µg/g; Pb: 0.38-23.7 µg/g; Fe: 0.10-5.82%; Mn: 15.1-815 µg/g; Ni: 0.14-36.1 µg/g; Zn: 1.37-113 µg/g). During the dry season a distribution pattern was observed, with higher metal concentrations at the margins, decreasing toward the central portion of the channel. These metal concentrations correlated well with the mineralogical composition, with clay minerals prevailing at the margins and quartz and feldspar in the center. This pattern was not observed during the wet season, probably because the high water flux disturbed the bottom sediments; however, as in the dry season, a good correlation between metal concentrations and mineralogical composition was also observed, with high metal concentrations where clay minerals were abundant. Low enrichment ratios were obtained for most of the elements analyzed, except for Mn, Ba, and Pb. Manganese presented the highest ratios downstream in both seasons, which may be evidence of anthropogenic impact from shrimp farming. As the barium and lead concentrations in the sediment samples presented analytical problems during total sample digestion, one cannot be sure that the ratios obtained correspond to reality. The highest metal concentrations in particulate matter were obtained in the portion dominated by fluvial transport for all metals analyzed except copper. Barium and zinc were the only elements whose elevated concentrations are not typical of unpolluted ecosystems (Ba: 5730-8355 µg/g; Zn: 3899-4348 µg/g). However, these high concentrations could not be related to shrimp farming or to the waste waters of the town of Canguaretama, since they were obtained from fluvial particulate matter collected upstream from those activities. The application of the logarithmic algorithm to the processed LANDSAT image was successful, although the acquired image does not correspond exactly to the field campaigns. The IKONOS image provided very detailed views of the suspended sediment concentration in the estuary, such as the mixture of distinct water flows at the confluence of the Cunhaú and Curimataú rivers, with more turbid waters coming from the Cunhaú river, which is directly affected by effluents from shrimp farming and urban waste waters from the town of Canguaretama.
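The enrichment ratio mentioned above is commonly computed by normalizing each metal to Al (which corrects for grain size and clay content) and dividing by a background reference. A minimal sketch with placeholder background values, not those of the study:

```python
def enrichment_ratio(metal_sample, al_sample, metal_background, al_background):
    """(Me/Al) of the sample divided by (Me/Al) of the background."""
    return (metal_sample / al_sample) / (metal_background / al_background)

# e.g. Mn at a downstream station (illustrative numbers: µg/g for Mn, % for Al)
print(enrichment_ratio(metal_sample=815, al_sample=5.0,
                       metal_background=480, al_background=7.2))  # ~2.4
```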
Abstract:
In this work an algorithm for fault location is proposed, comprising the following functions: fault detection, fault classification, and fault location. Mathematical Morphology is used to process the currents obtained at the monitored terminals. Unlike the Fourier and Wavelet transforms usually applied to fault location, Mathematical Morphology is a non-linear operation that uses only basic operations (sum, subtraction, maximum, and minimum), and is therefore computationally very efficient. For the detection and classification functions, the Morphological Wavelet was used. In the fault location module, the Multiresolution Morphological Gradient was used to detect the traveling waves and their polarities. Hence, by recording the arrival of the first two traveling waves incident at the measured terminal and knowing the propagation velocity, the fault location can be estimated. The algorithm was applied to a 440 kV power transmission system simulated in ATP. Several fault conditions were studied, and the following parameters were evaluated: fault location, fault type, fault resistance, fault inception angle, noise level, and sampling rate. The results show that the application of Mathematical Morphology to fault location is very promising.
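The location step described above reduces, in the single-ended case, to timing the first two traveling waves at the monitored terminal. A minimal sketch, assuming the second arrival is the reflection from the fault point (which holds for faults on the near half of the line); the numbers are illustrative:

```python
def fault_distance(t1, t2, v):
    """Distance from the terminal: d = v * (t2 - t1) / 2 (wave travels to
    the fault and back between the two recorded arrivals)."""
    return v * (t2 - t1) / 2.0

v = 2.95e5       # wave propagation velocity, km/s (close to light speed)
t1 = 100.0e-6    # arrival of the first incident wave, s
t2 = 440.0e-6    # arrival of its reflection from the fault, s
print(fault_distance(t1, t2, v), "km")   # 0.5 * 2.95e5 * 340e-6 ~= 50.15 km
```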
Abstract:
The robustness and performance of the Variable Structure Adaptive Pole Placement Controller are evaluated in this work, where the controller is applied to a synchronous generator connected to an infinite bus. The robustness of the controller is evaluated through simulations in which the control algorithm is subjected to adverse conditions such as disturbances, parametric variations, and unmodeled dynamics. This control strategy is also compared with another one based on classical controllers. The simulations use a coupled model of the synchronous generator whose variables have a high degree of coupling; in other words, a change in the input variables of the generator changes all outputs simultaneously. The simulation results show which control strategy performs better and is more robust to disturbances, parametric variations, and unmodeled dynamics in the control of the synchronous generator.
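For reference, the non-adaptive core of pole placement computes a state-feedback gain K so that A - BK has chosen closed-loop poles. A minimal sketch with a toy second-order model, not the coupled generator model and without the variable-structure adaptive layer:

```python
import numpy as np
from scipy.signal import place_poles

# Toy open-loop state-space model (illustrative, not the generator).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
desired = np.array([-5.0, -6.0])      # chosen closed-loop poles

K = place_poles(A, B, desired).gain_matrix
print(K)                              # state-feedback gain
print(np.linalg.eigvals(A - B @ K))   # ~ [-5, -6], as requested
```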