56 results for Genetic Algorithm
Abstract:
Vulvovaginal candidiasis (VVC) is one of the most common causes of vaginitis and affects about 75% of women of reproductive age. The majority of cases (80 to 90%) are due to C. albicans, the most virulent species of the genus Candida. Its virulence attributes are scarcely investigated and the source of infection remains uncertain. Objective: This study aimed to evaluate the virulence factors and genotypes of clinical isolates of C. albicans sequentially obtained from the anus and vagina of patients with sporadic and recurrent VVC. Materials and methods: We analyzed 62 clinical isolates of C. albicans (36 vaginal and 26 anal strains). Direct examination of vaginal and anal samples and colony forming unit (CFU) counts were performed. Yeasts were identified using the chromogenic medium CHROMagar Candida® and classical methodology, and phenotypically characterized with regard to virulence factors, including the ability to adhere to epithelial cells, proteinase activity, morphogenesis and biofilm formation. The genotypes of the strains were investigated by ABC genotyping, microsatellite genotyping with primer M13, and RAPD. Results: We found 100% agreement between direct examination and culture of vaginal samples. Filamentous forms were present in most of the vaginal secretion samples, which presented CFU counts significantly higher than those of the anal secretion samples. There was no statistically significant difference between the virulence factors of infecting vaginal isolates and those of colonizing anal isolates, nor between the vaginal isolates from patients with different clinical conditions (sporadic or recurrent VVC). There was a decrease in the ability to adhere to HBEC, in morphogenesis and in biofilm formation of the vaginal isolates during the progress of infection. There was an association between the ability to express different virulence factors and the clinical manifestations presented by the patients. Genotype A was the most prevalent (93.6%), followed by genotype C (6.4%). We found maintenance of the same ABC genotype and a greater prevalence of microevolution for the sequentially obtained vaginal strains of C. albicans. Vaginal and anal isolates of C. albicans obtained simultaneously from the same patient presented the same ABC genotype and high genetic relatedness. Conclusion: It is noteworthy that yeast proliferation and the bud-to-hypha transition are important for the establishment of VVC. The expression of virulence factors is important for the pathogenesis of VVC, although it does not seem to be determinant in the transition from colonization to infection or in the establishment of the recurrent condition. Genotype A seems to be dominant over the others in both vaginal and anal isolates of patients with VVC. The most common scenario was microevolution of the C. albicans strains in the vaginal environment. It is suggested that the anal reservoir constituted a possible source of vaginal infection in most of the cases assessed.
Parallel evolutionary algorithm for the problem of assigning locations to rings in SONET/SDH networks
Abstract:
Telecommunications play a fundamental role in contemporary society, one of their main purposes being to connect people and integrate them into the society in which they live and, thereby, to accelerate development through knowledge. As new technologies are introduced to the market, the demand for new products and services that depend on the available infrastructure increases, making telecommunication network planning problems increasingly large and complex. Many of these problems, however, can be formulated as combinatorial optimization models, and heuristic algorithms can help solve them in the planning phase. This work proposes a Parallel Evolutionary Algorithm for the telecommunications problem known in the literature as the SONET Ring Assignment Problem (SRAP). This problem belongs to the NP-hard class and arises during the physical planning of a telecommunication network; it consists of determining the connections between locations (customers), satisfying a series of constraints, at the lowest possible cost. Experimental results illustrate the effectiveness of the parallel Evolutionary Algorithm, compared with other methods, in obtaining solutions that are either optimal or very close to optimal.
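To make the optimization model concrete, the sketch below is a minimal, assumed formulation of an SRAP fitness evaluation over a ring-assignment encoding (one ring index per customer): intra-ring demand loads its ring, inter-ring demand loads both rings plus a federal ring, and capacity violations are penalized. The traffic matrix, capacity and penalty weight are hypothetical, and the thesis's actual operators and parallelization scheme are not reproduced.

```python
import random

def srap_fitness(assign, traffic, capacity):
    """Simplified SRAP evaluation (assumed formulation): each node is assigned
    to a ring; intra-ring demand loads that ring, inter-ring demand loads both
    rings and a federal ring. Cost = rings used + penalty for capacity violations."""
    rings = set(assign)
    load = {r: 0.0 for r in rings}
    federal = 0.0
    n = len(assign)
    for i in range(n):
        for j in range(i + 1, n):
            d = traffic[i][j]
            if assign[i] == assign[j]:
                load[assign[i]] += d
            else:
                load[assign[i]] += d
                load[assign[j]] += d
                federal += d
    penalty = sum(max(0.0, l - capacity) for l in load.values())
    penalty += max(0.0, federal - capacity)
    return len(rings) + 10.0 * penalty  # hypothetical penalty weight; lower is better

# Toy usage with a hypothetical 4-node traffic matrix and 2 candidate rings
traffic = [[0, 3, 1, 0], [3, 0, 2, 1], [1, 2, 0, 4], [0, 1, 4, 0]]
population = [[random.randrange(2) for _ in range(4)] for _ in range(6)]
best = min(population, key=lambda a: srap_fitness(a, traffic, capacity=8))
print(best, srap_fitness(best, traffic, capacity=8))
```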
Abstract:
This work proposes a hybrid metaheuristic for the Heterogeneous Fleet Vehicle Routing Problem (HVRP), an NP-hard combinatorial optimization problem characterized by the use of a limited fleet composed of vehicles with different capacities. The hybrid method combines a memetic algorithm with the Vocabulary Building optimizer component. The resulting hybrid metaheuristic was implemented in the C++ programming language, and computational experiments produced good results compared with the metaheuristic applied in isolation, demonstrating the efficiency of the proposed method.
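As the abstract notes, a memetic algorithm couples an evolutionary cycle with local search applied to offspring. The sketch below illustrates only that coupling on a generic permutation encoding, with a simple order-preserving crossover and a 2-opt-style local search; the HVRP route structure, fleet constraints and the Vocabulary Building component of the actual method are not reproduced, and the toy data are hypothetical.

```python
import random

def memetic(evaluate, n, pop_size=20, generations=50, seed=0):
    """Generic memetic skeleton (illustrative only): GA operators plus a 2-opt
    style local search applied to each offspring, on permutation-encoded solutions."""
    rng = random.Random(seed)

    def local_search(perm):
        # first-improvement pass: reverse a segment and keep it if it is better
        best = evaluate(perm)
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                cand = perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]
                c = evaluate(cand)
                if c < best:
                    return cand, c
        return perm, best

    def crossover(a, b):
        # simple order-preserving crossover: copy a slice of a, fill the rest from b
        i, j = sorted(rng.sample(range(n), 2))
        child = [None] * n
        child[i:j + 1] = a[i:j + 1]
        fill = [g for g in b if g not in child]
        k = 0
        for p in range(n):
            if child[p] is None:
                child[p] = fill[k]
                k += 1
        return child

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    pop = [local_search(p)[0] for p in pop]
    for _ in range(generations):
        a, b = rng.sample(pop, 2)
        child, _ = local_search(crossover(a, b))
        pop.sort(key=evaluate)
        pop[-1] = child  # replace the worst individual
    return min(pop, key=evaluate)

# Toy usage: minimize tour length on a random (hypothetical) distance matrix
rng = random.Random(1)
n = 8
D = [[0 if i == j else rng.randint(1, 9) for j in range(n)] for i in range(n)]
tour_len = lambda p: sum(D[p[k]][p[(k + 1) % n]] for k in range(n))
print(memetic(tour_len, n))
```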
Abstract:
Combinatorial Optimization is a fundamental area for companies seeking competitive advantages in diverse productive sectors, and the Asymmetric Travelling Salesman Problem, regarded as one of the most important problems in this area because it belongs to the NP-hard class and has diverse practical applications, has increased researchers' interest in developing ever more efficient metaheuristics to assist in its resolution. One such case is Memetic Algorithms, which are evolutionary algorithms that combine genetic operators with a local search procedure. This work explores the Viral Infection technique in a Memetic Algorithm, in which the infection replaces the mutation operator in order to obtain fast evolution or extinction of species (KANOH et al., 1996), providing a means of accelerating and improving the solution. To this end, four variants of Viral Infection applied to the Memetic Algorithm were developed for the resolution of the Asymmetric Travelling Salesman Problem, in which the agent and the virus undergo a symbiosis process that favored the attainment of a computationally viable hybrid evolutionary algorithm.
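A minimal sketch of the general idea of replacing mutation with a viral infection operator is shown below: a short "virus" subsequence (assumed to be harvested from a good individual) is spliced into a host permutation, which is repaired so it remains a valid tour. This is only an assumed illustration of the mechanism; the four variants developed in the work are not reproduced.

```python
import random

def infect(host, virus, rng):
    """Viral-infection-style operator (illustrative): splice the virus subsequence
    into the host permutation at a random position, removing its genes from their
    old positions so the result is still a valid permutation."""
    remaining = [g for g in host if g not in virus]
    start = rng.randrange(len(remaining) + 1)
    return remaining[:start] + list(virus) + remaining[start:]

# Toy usage: a hypothetical virus harvested from a good individual
rng = random.Random(0)
host = list(range(10))
rng.shuffle(host)
virus = [3, 7, 2]
print(host)
print(infect(host, virus, rng))
```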
Abstract:
Most algorithms for state estimation based on the classical model are only adequate for use in transmission networks. Few algorithms have been developed specifically for distribution systems, probably because of the small amount of data available in real time. Most overhead feeders have only current and voltage measurements at the medium-voltage bus-bar of the substation. Thus, classical algorithms are difficult to implement, even considering off-line acquired data as pseudo-measurements. However, the need to automate the operation of distribution networks, mainly with regard to the selectivity of protection systems, as well as to enable load transfer maneuvers, is changing network planning policy. Accordingly, equipment incorporating telemetry and command modules has been installed in order to improve operational features, thus increasing the amount of measurement data available in real time at the System Operation Center (SOC). This encourages the development of a state estimator model involving real-time information and load pseudo-measurements, built from typical power factors and utilization factors (demand factors) of distribution transformers. This work reports the development of a new state estimation method specific to radial distribution systems. The main algorithm of the method is based on the power summation load flow. The estimation is carried out piecewise, section by section of the feeder, going from the substation to the terminal nodes. For each section, a measurement model is built, resulting in a nonlinear overdetermined set of equations whose solution is obtained through the Gaussian normal equations. The estimated variables of a section are used as pseudo-measurements for the next section. In general, the measurement set for a generic section consists of pseudo-measurements of power flows and nodal voltages obtained from the previous section (or real-time measurements, where they exist), besides pseudo-measurements of injected powers for the power summations, whose functions are the load flow equations, assuming that the network can be represented by its single-phase equivalent. The great advantage of the algorithm is its simplicity and low computational effort. Moreover, the algorithm is very accurate in the estimated values. Besides the power summation state estimator, this work shows how other algorithms could be adapted to provide state estimation of medium-voltage substations and networks, namely Schweppe's method and an algorithm based on current proportionality that is usually adopted for network planning tasks. Both estimators were implemented not only as alternatives to the proposed method, but also to obtain results that support its validation. Since, in most cases, no power measurement is performed at the beginning of the feeder, and this is required by the power summation estimation method, a new algorithm for estimating the network variables at the medium-voltage bus-bar was also developed.
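The section-by-section estimation reduces, at its core, to a small overdetermined weighted least-squares problem solved through the normal equations. The sketch below shows only that numerical step for an assumed linear measurement model z = Hx + e; the actual power-summation measurement functions and the iterative treatment of their nonlinearity are not reproduced, and the numbers are hypothetical.

```python
import numpy as np

def wls_normal_equations(H, z, sigma):
    """Weighted least squares via the (Gaussian) normal equations:
    x = (H^T W H)^{-1} H^T W z, with W = diag(1/sigma^2)."""
    W = np.diag(1.0 / np.asarray(sigma) ** 2)
    G = H.T @ W @ H                      # gain matrix
    return np.linalg.solve(G, H.T @ W @ z)

# Toy overdetermined example (hypothetical numbers): 3 measurements, 2 states
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
z = np.array([1.02, 0.50, 1.49])
sigma = np.array([0.01, 0.01, 0.02])     # measurement standard deviations
print(wls_normal_equations(H, z, sigma))
```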
Abstract:
The main objective of this work is to present the particularities of the Three-phase Power Summation Method, used for load flow calculation, with respect to the influence of the magnetic coupling among the phases, as well as the losses of all transformers present in the feeder to be analyzed. In addition, its application to the study of short circuits involving high impedance values is detailed; such faults are difficult to detect and, consequently, to clear with common protection devices, because the short-circuit current is generally of the same order of magnitude as the load currents. Results of simulations carried out in several situations are presented, aiming at a complete analysis of the behavior of the proposed method for several types of short circuits. The results obtained by the method are compared with results from other works to verify its effectiveness.
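For reference, the sketch below outlines a single-phase power summation sweep on a radial feeder (backward accumulation of powers and losses, forward voltage update). It is an assumed simplification: the three-phase formulation with mutual magnetic coupling and transformer losses addressed in the work is not reproduced, and the feeder data are hypothetical.

```python
import numpy as np

def power_summation_load_flow(parent, z, s_load, v_source=1.0, tol=1e-8, it_max=50):
    """Single-phase power summation sweep on a radial feeder (illustrative only).
    Nodes are numbered so that parent[k] < k and node 0 is the source.
    parent[k]: upstream node of k; z[k]: impedance of the branch into k (pu);
    s_load[k]: complex load at node k (pu)."""
    n = len(s_load)
    v = np.full(n, complex(v_source))
    for _ in range(it_max):
        s_flow = np.array(s_load, dtype=complex)      # power flowing into each node
        # backward sweep: accumulate downstream powers plus branch losses
        for k in range(n - 1, 0, -1):
            i = np.conj(s_flow[k] / v[k])
            loss = z[k] * abs(i) ** 2
            s_flow[parent[k]] += s_flow[k] + loss
        # forward sweep: update voltages from the source
        v_new = v.copy()
        v_new[0] = v_source
        for k in range(1, n):
            i = np.conj(s_flow[k] / v_new[parent[k]])
            v_new[k] = v_new[parent[k]] - z[k] * i
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

# Toy 4-node feeder (hypothetical data): 0 -- 1 -- 2, and 1 -- 3
parent = [0, 0, 1, 1]
z = [0, 0.01 + 0.02j, 0.02 + 0.04j, 0.015 + 0.03j]
s_load = [0, 0.1 + 0.05j, 0.2 + 0.1j, 0.15 + 0.07j]
print(np.abs(power_summation_load_flow(parent, z, s_load)))
```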
Abstract:
In this work, the implementation of the SOM (Self-Organizing Map) algorithm, or Kohonen neural network, in the form of hierarchical structures applied to image compression is presented. The main objective of this approach is to develop a Hierarchical SOM algorithm with a static structure and another with a dynamic structure to generate codebooks in the image Vector Quantization (VQ) process, reducing processing time and obtaining a good image compression rate with minimal quality degradation relative to the original image. The two self-organizing neural networks developed here were named HSOM, for the static case, and DHSOM, for the dynamic case. In the first, the hierarchical structure is defined in advance; in the latter, the structure grows automatically according to heuristic rules that explore the data of the training set without the use of external parameters. The heuristic rules determine the growth dynamics, the branch pruning criteria, the flexibility and the size of the child maps. The LBG (Linde-Buzo-Gray) algorithm, or K-means, one of the most used algorithms for generating codebooks for Vector Quantization, was used together with the Kohonen algorithm in its basic, non-hierarchical form as a reference for comparing the performance of the algorithms proposed here. A performance analysis between the two hierarchical structures is also carried out in this work. The efficiency of the proposed processing is verified by the reduction in computational complexity compared with the traditional algorithms, as well as by the quantitative analysis of the reconstructed images as a function of the parameters peak signal-to-noise ratio (PSNR) and mean squared error (MSE).
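For context, the sketch below trains a flat (non-hierarchical) 1-D Kohonen SOM as a vector-quantization codebook over image blocks and reports MSE and PSNR; the hierarchical HSOM/DHSOM structures and their growth and pruning rules are not reproduced, and the block data, map size and learning schedule are hypothetical.

```python
import numpy as np

def train_som_codebook(blocks, k=16, epochs=10, lr0=0.5, seed=0):
    """Flat 1-D SOM (Kohonen) used as a vector quantizer codebook trainer
    (illustrative; the thesis builds hierarchical static/dynamic SOMs)."""
    rng = np.random.default_rng(seed)
    codebook = blocks[rng.choice(len(blocks), k, replace=False)].astype(float)
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)
        radius = max(1.0, k / 4 * (1 - e / epochs))
        for x in blocks[rng.permutation(len(blocks))]:
            bmu = np.argmin(((codebook - x) ** 2).sum(axis=1))   # best matching unit
            d = np.abs(np.arange(k) - bmu)                        # 1-D neighborhood distance
            h = np.exp(-(d ** 2) / (2 * radius ** 2))
            codebook += lr * h[:, None] * (x - codebook)
    return codebook

def mse_psnr(original, reconstructed, peak=255.0):
    mse = np.mean((original - reconstructed) ** 2)
    return mse, 10 * np.log10(peak ** 2 / mse)

# Toy usage on random 4x4 "image blocks" (hypothetical data)
blocks = np.random.default_rng(1).integers(0, 256, size=(200, 16)).astype(float)
cb = train_som_codebook(blocks, k=8, epochs=5)
idx = np.argmin(((blocks[:, None, :] - cb[None]) ** 2).sum(-1), axis=1)
print(mse_psnr(blocks, cb[idx]))
```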
Abstract:
Pipeline leak detection is a matter of great interest for companies that transport petroleum and its derivatives, in the face of the rising demands of environmental policies in industrialized and industrializing countries. However, existing technologies are not yet fully consolidated, and many studies have been carried out in order to achieve better levels of sensitivity and reliability for pipeline leak detection over a wide range of flow conditions. In this sense, this study presents the results obtained from frequency spectrum analysis of pressure signals from pipelines under several flow conditions, such as normal flow, leakages, pump switching, etc. The results show that it is possible to distinguish between the frequency spectra of those different flow conditions, allowing liquid pipeline leakages to be recognized and announced from pressure monitoring. Based upon these results, a pipeline leak detection algorithm employing frequency analysis of pressure signals is proposed, along with a methodology for its tuning and calibration. The proposed algorithm and its tuning methodology are evaluated with data obtained from real leakage tests performed on pipelines transferring crude oil and water, in order to evaluate its sensitivity, reliability and applicability to different flow conditions.
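A minimal sketch of the kind of analysis described is given below: the spectrum of the pressure signal is computed with an FFT and the energy in a chosen frequency band is compared against a threshold calibrated on normal-flow records. The band limits and threshold here are hypothetical placeholders for the tuning parameters that the calibration methodology would provide.

```python
import numpy as np

def band_energy(pressure, fs, f_lo, f_hi):
    """Average spectral energy of the (detrended) pressure signal in [f_lo, f_hi] Hz."""
    spectrum = np.abs(np.fft.rfft(pressure - np.mean(pressure))) ** 2
    freqs = np.fft.rfftfreq(len(pressure), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[mask].sum() / len(pressure)

def leak_alarm(pressure, fs, f_lo=0.5, f_hi=5.0, threshold=100.0):
    """Illustrative detector: alarm when the band energy exceeds a threshold that
    would be calibrated on normal-flow records (band and threshold are hypothetical)."""
    return band_energy(pressure, fs, f_lo, f_hi) > threshold

# Toy usage: synthetic pressure traces, one with an added low-frequency transient
fs = 100.0
t = np.arange(0.0, 60.0, 1.0 / fs)
normal = 50.0 + 0.1 * np.random.default_rng(0).standard_normal(t.size)
leaking = normal + 2.0 * np.sin(2 * np.pi * 1.5 * t) * (t > 30.0)
print(leak_alarm(normal, fs), leak_alarm(leaking, fs))
```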
Abstract:
This work develops a methodology for defining the maximum active power that can be injected into predefined nodes of the studied distribution networks, considering the possibility of multiple access points for generating units. These maximum values are obtained from an optimization study in which the resulting losses must not exceed those of the base case, i.e., without the presence of distributed generation. Constraints on branch loading and system voltages are respected. To tackle the problem, an algorithm based on the numerical method known as particle swarm optimization is proposed, applied to conventional AC load flow studies and to an optimal load flow for maximizing the penetration of distributed generation. Alternatively, the Newton-Raphson method was incorporated for solving the load flow. The computer program was implemented in the SCILAB software. The proposed algorithm is tested with data from the 14-node IEEE network and from a 25-node, high-voltage (69 kV) network from the state of Rio Grande do Norte. The algorithm defines the allowed values of nominal active power of distributed generation, in percentage terms relative to the network demand, from reference values.
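The sketch below is a minimal particle swarm optimizer with inertia, cognitive and social terms; a toy objective with a quadratic "loss" penalty stands in for the load-flow-based objective and network constraints of the actual methodology, and all numerical parameters are hypothetical.

```python
import numpy as np

def pso_maximize(objective, dim, bounds, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimization (illustrative): velocity update with
    inertia, cognitive and social terms. In the thesis the objective is the DG
    injection subject to loss and voltage constraints evaluated via load flow."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# Toy stand-in objective: maximize total injection with a quadratic "loss" penalty
def toy_objective(p):
    losses = 0.02 * np.sum(p ** 2)          # hypothetical loss model
    return np.sum(p) - 50.0 * max(0.0, losses - 0.5)

print(pso_maximize(toy_objective, dim=3, bounds=(0.0, 5.0)))
```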
Abstract:
This work proposes a collaborative system for marking dangerous points along transport routes and generating alerts for drivers. It consists of a proximity warning system for danger points, fed by drivers via GPS-equipped mobile devices. The system consolidates the data provided by several different drivers and generates a set of common points to be used in the warning system. Although the application is designed to protect drivers, the data it generates can also serve as input for the responsible authorities to improve the signage and recovery of public roads.
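The proximity check at the heart of such a warning system can be illustrated with a great-circle (haversine) distance test against the consolidated danger points, as in the sketch below; the alert radius and coordinates are hypothetical, and the data consolidation and mobile-device layers are not represented.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_danger_points(position, danger_points, radius_m=200.0):
    """Return the consolidated danger points within radius_m of the driver
    (the radius is a hypothetical tuning value)."""
    lat, lon = position
    return [p for p in danger_points if haversine_m(lat, lon, p[0], p[1]) <= radius_m]

# Toy usage with hypothetical coordinates
points = [(-5.795, -35.210), (-5.812, -35.255)]
print(nearby_danger_points((-5.7955, -35.2105), points))
```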
Abstract:
The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. The acquisition, processing and interpretation of seismic data are the stages that make up a seismic study. Seismic processing, in particular, is focused on producing images that represent the geological structures in the subsurface. Seismic processing has evolved significantly in recent decades due to the demands of the oil industry, and also due to technological advances in hardware that provided higher storage and digital information processing capabilities, enabling the development of more sophisticated processing algorithms, such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section image that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest, such as potential hydrocarbon reservoirs. However, performing a migration with quality and accuracy can be a very time-consuming process, due to the mathematics of the algorithms and the extensive amount of input and output data involved, and it may take days, weeks or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods unviable. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique. Furthermore, speedup and efficiency analyses were performed and, ultimately, the degree of algorithmic scalability was identified with respect to the technological advances expected in future processors.
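The computational core of RTM is the repeated application of a wave-propagation kernel, forward in time for the source wavefield and in reverse time for the receiver wavefield. The sketch below shows a 2-D acoustic finite-difference time step of that kind; in the thesis the spatial loops of such a kernel are the part parallelized with OpenMP, whereas here NumPy vectorization stands in for that, and the grid, velocity model and source are hypothetical.

```python
import numpy as np

def acoustic_step(p_prev, p_curr, vel, dt, dx):
    """One finite-difference time step of the 2-D acoustic wave equation
    (second order in time and space). RTM propagates the source wavefield forward
    and the receiver wavefield in reverse time with kernels like this one."""
    lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
           np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4 * p_curr) / dx ** 2
    return 2 * p_curr - p_prev + (vel * dt) ** 2 * lap

# Toy usage: homogeneous model with a point source (all values hypothetical)
nx = nz = 200
dx, dt, nt = 10.0, 1e-3, 300
vel = np.full((nz, nx), 2000.0)           # m/s
p_prev = np.zeros((nz, nx))
p_curr = np.zeros((nz, nx))
for it in range(nt):
    p_next = acoustic_step(p_prev, p_curr, vel, dt, dx)
    p_next[nz // 2, nx // 2] += np.exp(-((it * dt - 0.05) / 0.01) ** 2)  # Gaussian source pulse
    p_prev, p_curr = p_curr, p_next
print(float(np.abs(p_curr).max()))
```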
Abstract:
This work presents a scalable and efficient parallel implementation of the standard Simplex algorithm on a multicore architecture to solve large-scale linear programming problems. We present a general scheme explaining how each step of the standard Simplex algorithm was parallelized, indicating some important points of the parallel implementation. Performance analyses were conducted by comparing against the sequential time of the Simplex tableau and of the IBM CPLEX® Simplex. The experiments were executed on a shared-memory machine with 24 cores. The scalability analysis was performed with problems of different dimensions, finding evidence that our parallel standard Simplex algorithm has better parallel efficiency for problems with more variables than constraints. In comparison with CPLEX®, the proposed parallel algorithm achieved an efficiency up to 16 times better.
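In a tableau-based standard Simplex, the elementwise row updates of each pivot are naturally independent, which is what a multicore implementation can distribute across threads. The sketch below is a minimal sequential tableau Simplex with that row-update loop marked; it illustrates the method being parallelized, not the parallel implementation itself, and the toy LP is hypothetical.

```python
import numpy as np

def simplex_tableau(c, A, b):
    """Dense standard Simplex on the tableau for max c^T x, Ax <= b, x >= 0, b >= 0
    (illustrative). The row-update loop marked below is the step that a multicore
    implementation can distribute across threads, one block of rows per core."""
    m, n = A.shape
    T = np.zeros((m + 1, n + m + 1))
    T[:m, :n], T[:m, n:n + m], T[:m, -1] = A, np.eye(m), b
    T[-1, :n] = -c
    basis = list(range(n, n + m))
    while True:
        col = int(np.argmin(T[-1, :-1]))
        if T[-1, col] >= -1e-12:
            break                                     # optimal tableau
        ratios = np.where(T[:m, col] > 1e-12, T[:m, -1] / T[:m, col], np.inf)
        row = int(np.argmin(ratios))
        if not np.isfinite(ratios[row]):
            raise ValueError("unbounded problem")
        T[row] /= T[row, col]
        for r in range(m + 1):                        # <- parallelizable row updates
            if r != row:
                T[r] -= T[r, col] * T[row]
        basis[row] = col
    x = np.zeros(n + m)
    x[basis] = T[:m, -1]
    return x[:n], T[-1, -1]

# Toy LP (hypothetical): max 3x1 + 2x2 s.t. x1 + x2 <= 4, x1 + 3x2 <= 6
print(simplex_tableau(np.array([3.0, 2.0]),
                      np.array([[1.0, 1.0], [1.0, 3.0]]),
                      np.array([4.0, 6.0])))
```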
Abstract:
This work presents improvement strategies for a successful evolutionary metaheuristic for solving the Asymmetric Traveling Salesman Problem, namely a Memetic Algorithm designed mainly for this problem. Basically, the improvement applies the optimization techniques known as Path-Relinking and Vocabulary Building, the latter being used in two different ways in order to evaluate the effects of the improvement on the evolutionary metaheuristic. These methods were implemented in C++ and the experiments were carried out on instances from the TSPLIB library, and it was possible to observe that the proposed procedures were successful in the tests performed.
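Path-Relinking explores the trajectory between an initiating and a guiding solution. The sketch below shows a minimal version for tours: at each step the city that the guide has at a given position is swapped into place, and the best intermediate tour visited is kept. The distance matrix is hypothetical, and the way the work combines this with Vocabulary Building inside the Memetic Algorithm is not reproduced.

```python
def path_relinking(start, guide, cost):
    """Path-Relinking between two tours (illustrative): move the initiating tour
    toward the guiding tour one position at a time, by swapping into place the
    city the guide has at that position, and keep the best tour visited."""
    current = list(start)
    best, best_cost = list(current), cost(current)
    for pos in range(len(guide)):
        if current[pos] != guide[pos]:
            j = current.index(guide[pos])
            current[pos], current[j] = current[j], current[pos]
            c = cost(current)
            if c < best_cost:
                best, best_cost = list(current), c
    return best, best_cost

# Toy usage on a small asymmetric distance matrix (hypothetical data)
D = [[0, 2, 9, 10],
     [1, 0, 6, 4],
     [15, 7, 0, 8],
     [6, 3, 12, 0]]
tour_cost = lambda t: sum(D[t[i]][t[(i + 1) % len(t)]] for i in range(len(t)))
print(path_relinking([0, 2, 1, 3], [0, 1, 3, 2], tour_cost))
```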
Abstract:
The Multiobjective Spanning Tree is an NP-hard Combinatorial Optimization problem whose applications arise in several areas, especially network design. In this work, we propose a solution to the biobjective version of the problem through a Transgenetic Algorithm named ATIS-NP. Computational Transgenetics is a metaheuristic technique from Evolutionary Computation whose inspiration lies in the conception of cooperation (rather than competition) as the main factor of influence on evolution. The algorithm outlined is the evolution of a work that has already yielded two other transgenetic algorithms; in this sense, the previously developed algorithms are also presented. This research also comprises an experimental analysis aimed at obtaining information on the performance of ATIS-NP when compared with other approaches. Thus, ATIS-NP is compared with the algorithms previously implemented and with another transgenetic algorithm already presented for the problem under consideration. The computational experiments also address the comparison with two recent approaches from the literature that present good results, a GRASP and a genetic algorithm. The efficiency of the described method is evaluated based on metrics of solution quality and computational time. Since the problem lies within the context of Multiobjective Optimization, quality indicators are adopted to assess solution quality. Statistical tests evaluate the significance of the results obtained from the computational experiments.
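Since the comparison relies on quality indicators computed over sets of nondominated solutions, the sketch below shows the basic biobjective Pareto filtering step such indicators start from; it is not the ATIS-NP algorithm itself, and the objective vectors are hypothetical.

```python
def pareto_front(solutions):
    """Keep the nondominated points of a set of biobjective (minimization) vectors.
    Quality indicators for multiobjective algorithms are computed over such fronts."""
    front = []
    for s in solutions:
        dominated = False
        for t in solutions:
            if t != s and t[0] <= s[0] and t[1] <= s[1] and (t[0] < s[0] or t[1] < s[1]):
                dominated = True
                break
        if not dominated and s not in front:
            front.append(s)
    return sorted(front)

# Toy usage with hypothetical (cost1, cost2) pairs
print(pareto_front([(3, 5), (2, 7), (4, 4), (3, 5), (5, 3), (6, 6)]))
```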
Abstract:
This work proposes and evaluates a modification to Ant Colony Optimization based on the results of experiments performed on the Selective Ride Robot problem (PRS), a new problem also proposed in this work. Four metaheuristics are implemented, GRASP, VNS and two versions of Ant Colony Optimization, and their results are analyzed by running the algorithms on 32 instances created during this work. The results of the metaheuristics are also compared with an exact approach. The results show that the algorithm implemented with the GRASP metaheuristic performs well, while the version of the multi-colony ant colony algorithm proposed and evaluated in this work achieves the best results.
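For reference, the sketch below is a basic single-colony Ant Colony Optimization cycle for a tour problem: probabilistic construction from pheromone and heuristic information, evaporation, and deposit proportional to tour quality. The multi-colony variant proposed in the work and the PRS-specific constraints are not reproduced, and all parameters and the distance matrix are hypothetical.

```python
import random

def aco_tour(dist, n_ants=10, iters=50, alpha=1.0, beta=2.0, rho=0.5, q=1.0, seed=0):
    """Basic single-colony Ant Colony Optimization for a tour problem (illustrative)."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]               # pheromone matrix
    best, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                cand = list(unvisited)
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                nxt = rng.choices(cand, weights=weights)[0]
                tour.append(nxt)
                unvisited.remove(nxt)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best, best_len = tour, length
        # evaporation followed by pheromone deposit proportional to tour quality
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for tour, length in tours:
            for k in range(n):
                tau[tour[k]][tour[(k + 1) % n]] += q / length
    return best, best_len

# Toy usage with a hypothetical asymmetric distance matrix
D = [[0, 2, 9, 10], [1, 0, 6, 4], [15, 7, 0, 8], [6, 3, 12, 0]]
print(aco_tour(D))
```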