154 results for Exact algorithms
Abstract:
The main goal of Regression Testing (RT) is to reuse the test suite of the latest version of a software system in its current version, in order to maximize the value of the tests already developed and to ensure that old features continue working after new changes. Even with reuse, it is common that not all tests need to be executed again. For this reason, Regression Test Selection (RTS) techniques are encouraged; they aim to select, from the full suite, only those tests that reveal faults, which reduces costs and makes this an attractive practice for testing teams. Several recent research works evaluate the quality of the selections performed by RTS techniques, identifying which one presents the best results as measured by metrics such as inclusion and precision. Ideally, RTS techniques would search the System Under Test (SUT) for tests that reveal faults. However, because this problem has no viable solution, they alternatively search for tests that reveal changes, where faults may occur. Nevertheless, these changes may modify the execution flow of the program itself, so that some tests no longer exercise the same stretch of code. In this context, this dissertation investigates whether changes performed in a SUT affect the quality of the test selection performed by an RTS technique and, if so, which characteristics of those changes cause errors, leading the RTS technique to wrongly include or exclude tests. For this purpose, a tool was developed in Java to automate the measurement of the inclusion and precision averages achieved by a regression test selection technique for a particular change characteristic. To validate this tool, an empirical study was conducted to evaluate the RTS technique Pythia, based on textual differencing, on a large web information system, analyzing as the change characteristic the types of tasks performed to evolve the SUT.
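As a rough illustration of the two metrics this abstract relies on, here is a minimal Python sketch of inclusion and precision as they are commonly defined in the RTS literature (the fraction of fault-revealing tests that were selected, and the fraction of selected tests that reveal faults). The set names are hypothetical; this is not the dissertation's tool.

```python
def inclusion(selected, fault_revealing):
    """Fraction of the fault-revealing tests that the technique selected."""
    if not fault_revealing:
        return 1.0
    return len(fault_revealing & selected) / len(fault_revealing)

def precision(selected, fault_revealing):
    """Fraction of the selected tests that actually reveal a fault."""
    if not selected:
        return 1.0
    return len(selected & fault_revealing) / len(selected)

# Hypothetical usage with plain string test identifiers:
selected = {"t1", "t2", "t3"}
fault_revealing = {"t2", "t3", "t4"}
print(inclusion(selected, fault_revealing))  # 0.666...
print(precision(selected, fault_revealing))  # 0.666...
```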
Abstract:
Data clustering is applied in various fields, such as data mining, image processing, and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means (FCM) algorithm is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so the choice of a good set of initial centers is very important for the performance of the algorithm. However, in FCM the initial centers are chosen randomly, making it difficult to find a good set. This work proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm, which can also be used in FCM variants; here, these initialization methods were applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers. These new initialization approaches aim to reduce the number of iterations these algorithms need to converge, and their processing time, without affecting cluster quality, or even improving it in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods when applied to various data sets.
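For context, the iteration whose convergence the proposed initializations try to shorten is the standard textbook FCM update sketched below in numpy (fuzzifier m = 2 is a conventional default, and the random initialization this loop would normally start from is exactly what the dissertation's deterministic methods replace; the sketch is not those methods themselves).

```python
import numpy as np

def fcm_step(X, centers, m=2.0, eps=1e-10):
    """One standard FCM iteration: update memberships, then centers.
    X: (n, d) data; centers: (c, d). Returns (new_centers, U)."""
    # distance from each center to each point, shape (c, n)
    d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + eps
    # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
    ratio = (d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0))
    U = 1.0 / ratio.sum(axis=1)                     # shape (c, n)
    # center update: weighted mean with weights u_ik^m
    Um = U ** m
    new_centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
    return new_centers, U
```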
Abstract:
Reconfigurable architectures have appeared as an alternative to ASICs and general-purpose processors (GPPs), keeping a balance between flexibility and performance. This work presents a proposal for modeling reconfigurable architectures with Chu spaces, describing the main subjects of this theme. The proposed solution consists of a modeling that uses a generalization of Chu spaces, called Chu nets, to model the configurations of a reconfigurable architecture. To validate the models, three algorithms were developed and implemented: composition of configurable logic blocks, and detection of controllability and of observability in applications for reconfigurable architectures modeled by Chu nets.
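For readers unfamiliar with the formalism: a Chu space over a set K is a triple (A, r, X) of points, states, and a matrix r : A × X → K. The Python sketch below shows only this basic structure with hypothetical data; the Chu nets used in the dissertation generalize it further.

```python
from dataclasses import dataclass

@dataclass
class ChuSpace:
    """A Chu space (A, r, X) over an alphabet K: points A, states X,
    and a matrix r indexed by (point, state) with entries in K."""
    points: list
    states: list
    r: dict  # (point, state) -> value in K

    def column(self, state):
        """The 'behaviour' of a state: its value at every point."""
        return tuple(self.r[(a, state)] for a in self.points)

# Hypothetical two-point space over K = {0, 1}
cs = ChuSpace(points=["a", "b"], states=["s0", "s1"],
              r={("a", "s0"): 0, ("b", "s0"): 1,
                 ("a", "s1"): 1, ("b", "s1"): 1})
print(cs.column("s0"))  # (0, 1)
```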
Abstract:
In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses frequently occur, associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms generating these collapses. The main tool used was Ground Penetrating Radar (GPR). The work developed along two lines: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. This second line was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst in Fazenda Belém. An adequate flow for processing GPR data was designed and tested, adapted from a usual flow for processing seismic data. The changes were introduced to take into account important differences between the GPR and reflection seismic methods, in particular: poor coupling between source and ground, mixed phase of the wavelet, low signal-to-noise ratio, monochannel acquisition, and strong influence of wave propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in radargrams. In Fazenda Belém, a suitable GPR processing flow is all the more necessary because of both the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flow was an improved correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of GPR signals. In dielectric media with low to moderate losses, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel-time ranges. Based on this fact, it is shown with real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied in heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, judicious spectral balancing promotes a partial deconvolution of the data without changing its phase; in other words, spectral balancing acts similarly to a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike or predictive deconvolution. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform, during the Turonian, Santonian, and Campanian. In the Fazenda Belém region, during the mid-Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures; therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
In Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap thick enough to cover the karst completely, but with a sediment volume lower than the available space associated with the dissolution structures in the karst; (ii) the existence of important structural alignments trending SW-NE and NW-SE, which promote a localized increase in hydraulic connectivity, allowing the channeling of groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanisms in Fazenda Belém follow this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and remobilizes it into the void space associated with the dissolution structures in the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where abrasion increases due to a change from laminar to turbulent flow regime as the groundwater flow reaches the open karst structures. The remobilized sediments progressively fill the karst void space from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, normally located in the karst, may be temporarily located in the unconsolidated sedimentary cap.
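As a schematic illustration of the two processing tools named above, the numpy sketch below applies an exponential time gain and a crude whole-trace spectral balancing (amplitude whitening that preserves phase, hence the zero-phase-deconvolution-like behavior). The exponential gain form, the whole-trace whitening, and the stabilization constant are simplifying assumptions; the dissertation's actual flow is more elaborate.

```python
import numpy as np

def time_gain(trace, dt, alpha):
    """Exponential time gain g(t) = exp(alpha * t) to compensate
    attenuation; alpha is an empirical, medium-dependent constant."""
    t = np.arange(len(trace)) * dt
    return trace * np.exp(alpha * t)

def spectral_balance(trace, eps=1e-3):
    """Crude spectral balancing: whiten the amplitude spectrum while
    keeping the phase, acting like a zero-phase deconvolution."""
    F = np.fft.rfft(trace)
    amp = np.abs(F)
    F_white = F / (amp + eps * amp.max())  # unit amplitude, original phase
    return np.fft.irfft(F_white, n=len(trace))
```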
Abstract:
In February 2011, the Brazilian National Agency of Petroleum, Natural Gas and Biofuels (ANP) published a new Technical Regulation for Onshore Pipelines for the Transport of Petroleum, its Derivatives and Natural Gas (RTDT). Among other things, the RTDT made the use of monitoring and leak detection systems compulsory for all onshore pipelines in the country. This work provides a study of a leak detection method based on pressure transients. The study was conducted on an industrial pipeline 16" in diameter and 9.8 km long, which is fully pressurized and carries a multiphase mixture of crude oil, water, and natural gas. For the study, an infrastructure for data acquisition and validation of detection algorithms was built. The system was designed with a SCADA architecture: piezoresistive sensors were installed at the ends of the pipeline, and Digital Signal Processors (DSPs) were used for sampling, storing, and processing the data. The study was based on simulating leaks through valves and searching for patterns that characterize the occurrence of such phenomena.
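One common transient-based criterion flags the sharp pressure drop that the negative pressure wave of a sudden leak produces at the sensors. The sketch below only illustrates this simplest detector; the threshold, sampling, and the detector itself are illustrative assumptions, not the study's actual pattern-search.

```python
import numpy as np

def leak_candidates(p, dt, drop_rate):
    """Flag samples where pressure falls faster than `drop_rate` (Pa/s),
    the signature of the negative pressure wave a sudden leak launches.
    p: sampled pressure array, dt: sampling period in seconds."""
    dp = np.gradient(p, dt)           # numerical time derivative
    return np.where(dp < -drop_rate)[0]
```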
Abstract:
Separation methods see limited application as a result of operational costs, low throughput, and the long time needed to separate the fluids. Yet these treatment methods are important because of the need to extract unwanted contaminants from the produced oil. The water must be treated, and the concentration of oil in water should be minimal (around 20 to 40 ppm) before it can be discharged to the sea. Given the need for primary treatment, the objective of this project is to study and implement algorithms for identification of polynomial NARX (Nonlinear Auto-Regressive with eXogenous input) models in closed loop, to implement structural identification, and to compare control strategies, using PI control and online-updated NARX predictive models, on a three-phase separator in series with three hydrocyclone batteries. The main goals of this project are: to obtain an optimized phase separation process that keeps the system regulated even in the presence of oil gushes; to show that it is possible to obtain optimized controller tunings by analyzing the loop as a whole; and to evaluate and compare the PI and predictive control strategies applied to the process. To accomplish these goals, a simulator was used to represent the three-phase separator and the hydrocyclones. Algorithms were developed for system identification (NARX) using RLS (Recursive Least Squares), along with methods for model structure detection. Predictive control algorithms were also implemented with the NARX model updated online, together with optimization algorithms using PSO (Particle Swarm Optimization). The project ends with a comparison of the results obtained with PI and predictive controllers (both tuned through the particle swarm algorithm) on the simulated system. We conclude that the optimizations performed make the system less sensitive to external perturbations, and that, once optimized, the two controllers show similar results, with predictive control somewhat less sensitive to disturbances.
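Since a polynomial NARX model is linear in its parameters, online identification reduces to recursive least squares over a regressor vector of lagged inputs, outputs, and their products. Below is a generic RLS sketch in Python; the forgetting factor, initialization, and the example regressor are conventional assumptions, not the project's specific model structure.

```python
import numpy as np

class RLS:
    """Recursive least squares for a linear-in-parameters model
    y(k) = phi(k)^T theta, e.g. a polynomial NARX regressor vector."""
    def __init__(self, n_params, lam=0.99, p0=1e4):
        self.theta = np.zeros(n_params)     # parameter estimates
        self.P = np.eye(n_params) * p0      # inverse-covariance proxy
        self.lam = lam                      # forgetting factor

    def update(self, phi, y):
        P_phi = self.P @ phi
        k = P_phi / (self.lam + phi @ P_phi)   # gain vector
        err = y - phi @ self.theta             # prediction error
        self.theta += k * err
        self.P = (self.P - np.outer(k, P_phi)) / self.lam
        return err

# Hypothetical NARX regressor for y(k) = f(y(k-1), u(k-1), u(k-1)^2):
# phi = np.array([y_prev, u_prev, u_prev**2])
```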
Abstract:
This research aims at developing a variable structure adaptive backstepping controller (VS-ABC) using state observers for SISO (Single Input Single Output), linear, time-invariant systems with relative degree one. For that purpose, the K-filters were replaced by an adaptive Luenberger observer, and the control algorithm uses switching laws. The simulations presented compare the controller performance when the state variables are estimated by an observer with the case in which the variables are available for measurement. Even with their numerous performance advantages, adaptive backstepping controllers still have very complex algorithms, especially when the system state variables are not measured, since the use of filters on the plant input and output is not trivial. As an attempt to make the controller design more intuitive, an adaptive observer can be used as an alternative to the commonly used K-filters. Furthermore, since the state variables are then treated as known, the design depends less on the unknown plant parameters. Finally, switching laws can be used in the controller instead of the traditional integral adaptive laws, because they improve the transient performance of the system and increase robustness against external disturbances at the plant input.
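For reference, the non-adaptive core of the observer idea is the classical Luenberger correction shown below; in the dissertation's adaptive version the plant matrices are not known and are estimated along with the state, which this sketch deliberately omits. Matrix names and the Euler discretization are assumptions.

```python
import numpy as np

def observer_step(x_hat, u, y, A, B, C, L, dt):
    """One Euler step of a classical Luenberger observer for a SISO plant:
        x_hat_dot = A x_hat + B u + L (y - C x_hat)
    x_hat, B, L: (n,) arrays; A: (n, n); C: (n,); u, y: scalars."""
    innovation = y - C @ x_hat               # output estimation error
    x_hat_dot = A @ x_hat + B * u + L * innovation
    return x_hat + dt * x_hat_dot
```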
Abstract:
This work presents a formal verification technique for Procedural Reasoning Systems (PRS), a programming language that adopts the procedural reasoning approach. The technique is based on conversion rules between PRS programs and Coloured Petri Nets (CPN). To that end, conversion rules are presented for a highly expressive subset of the syntax used in the PRS language. To formally verify a specified PRS program, once the Petri net equivalent to the PRS program is available, we use the CPN formalism (verification of structural and behavioral properties) to formally analyze the equivalent PRS program. An available computational tool was used to draw, simulate, and analyze the generated coloured Petri nets. Once the PRS-CPN conversion rules are in hand, one may be tempted to perform the conversion strictly by hand. However, the probability of introducing errors in the conversion is high, so the effort needed to guarantee the correctness of a manual conversion is of the same order of magnitude as eliminating eventual errors directly in the original PRS program. Automated conversion is therefore of utmost importance to prevent manual conversion from introducing undesirable errors that could invalidate the entire conversion process. The main contribution of this research is the development of an automated formal verification technique consisting of two distinct, though interrelated, stages: the first concerns the conversion rules from PRS to CPN; the second concerns the development of a converter that transforms PRS programs into CPNs automatically. Automatic conversion is possible because all the conversion rules presented follow generic formation laws that can be embedded in algorithms.
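To make the target formalism concrete, here is a minimal Python sketch of the enabling/firing semantics of an ordinary place/transition Petri net; coloured nets additionally type the tokens, which is omitted here. The PRS-flavored place names are hypothetical illustrations, not the dissertation's conversion rules.

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume `pre` tokens, produce `post` tokens."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m[p] - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical fragment: a PRS act consuming a goal token.
m0 = {"goal_g1": 1, "belief_b1": 1}
pre, post = {"goal_g1": 1}, {"act_a1_running": 1}
print(fire(m0, pre, post) if enabled(m0, pre) else "not enabled")
```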
Abstract:
The development of computers and algorithms capable of increasingly accurate and rapid calculations, together with the theoretical foundation provided by quantum mechanics, has turned computer simulation into a valuable research tool, owing to its success in describing the physical and chemical properties of materials. One way of modifying the electronic properties of a given material is to apply an electric field. These effects are interesting in nanocones because their stability and geometric structure make them promising candidates for electron emission devices. In our study we performed first-principles calculations based on density functional theory, as implemented in the SIESTA code. We investigated aluminum nitride (AlN), boron nitride (BN), and carbon (C) nanocones subjected to external electric fields parallel and perpendicular to their main axis. We discuss stability in terms of formation energy, using the chemical potential approach. We also analyze the electronic properties of these nanocones and show that in some cases the perpendicular electric field causes a greater gap reduction than the parallel field.
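For reference, the chemical potential approach mentioned above commonly defines the formation energy in the standard textbook form below; this is not necessarily the exact expression used in the dissertation.

```latex
E_{\text{form}} = E_{\text{tot}} - \sum_i n_i \,\mu_i
```

where E_tot is the total energy of the nanocone, n_i the number of atoms of species i, and mu_i the chemical potential of that species.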
Abstract:
In this work we study Hidden Markov Models with finite as well as general state space. In the finite case, the forward and backward algorithms are considered, and the probability of a given observed sequence is computed; next, we use the EM algorithm to estimate the model parameters. In the general case, kernel estimators are used to build a sequence of estimators that converges in the L1-norm to the density function of the observable process.
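As a concrete instance of the finite-state machinery mentioned above, here is a minimal numpy sketch of the forward algorithm, which computes the probability of an observed sequence (the backward pass and the EM re-estimation follow the same recursive pattern). The model numbers are hypothetical.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: P(observed sequence) for a finite-state HMM.
    pi: (n,) initial distribution; A: (n, n) transition matrix;
    B: (n, m) emission probabilities; obs: list of symbol indices."""
    alpha = pi * B[:, obs[0]]                 # alpha_1(i) = pi_i b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]         # standard recursion
    return alpha.sum()

# Hypothetical 2-state, 2-symbol model
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward(pi, A, B, [0, 1, 0]))
```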
Abstract:
Genetic Algorithms (GA) and Simulated Annealing (SA) are algorithms built to find the maximum or minimum of a function that represents some characteristic of the process being modeled. Both algorithms have mechanisms that allow them to escape local optima; however, they evolve over time in completely different ways. In its search process, SA works with a single point, always generating from it a new solution that is tested and may or may not be accepted, whereas GA works with a set of points, called a population, from which it generates another population that is always accepted. What the two algorithms have in common is that the way the next point or the next population is generated obeys stochastic properties. In this work we show that the mathematical theory describing the evolution of these algorithms is the theory of Markov chains: GA is described by a homogeneous Markov chain, while SA is described by a non-homogeneous Markov chain. Finally, some computational examples are presented comparing the performance of the two algorithms.
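To make the single-point, non-homogeneous dynamics concrete, here is a minimal generic SA sketch in Python; the objective, neighborhood move, and geometric cooling schedule are all hypothetical placeholders, not the dissertation's examples.

```python
import math, random

def simulated_annealing(f, x0, neighbor, T0=1.0, cooling=0.995, steps=10000):
    """Minimize f starting from the single point x0."""
    x, fx, T = x0, f(x0), T0
    for _ in range(steps):
        y = neighbor(x)
        fy = f(y)
        # Worse moves are still accepted with probability exp(-delta/T):
        # this is the mechanism that lets SA escape local minima.
        if fy <= fx or random.random() < math.exp((fx - fy) / T):
            x, fx = y, fy
        T *= cooling  # the decaying T makes the chain non-homogeneous
    return x, fx

# Hypothetical usage: minimize a 1-D function with Gaussian moves
print(simulated_annealing(lambda x: (x * x - 4) ** 2, 3.0,
                          lambda x: x + random.gauss(0, 0.1)))
```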
Abstract:
In the work reported here we present theoretical and numerical results about a risk model with interest rate and proportional reinsurance, based on the article "Inequalities for the ruin probability in a controlled discrete-time risk process" by Rosário Romera and Maikol Diasparra (see [5]). Recursive and integral equations, as well as upper bounds for the ruin probability, are given under three different approaches, namely the classical Lundberg inequality, the inductive approach, and the martingale approach. Non-parametric density estimation techniques are used to derive upper bounds for the ruin probability, and the algorithms used in the simulation are presented.
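Alongside the analytic bounds, a crude Monte Carlo estimate of the finite-horizon ruin probability for a discrete-time surplus with interest can serve as a sanity check. The sketch below ignores the reinsurance control, and the surplus recursion and all parameters are illustrative assumptions, not the model of [5].

```python
import random

def ruin_probability(u0, premium, claim, rate, horizon, n_sims=20_000):
    """Monte Carlo estimate of P(ruin before `horizon`) for the surplus
    U_{t+1} = U_t (1 + r) + c - X_t, where `claim` draws one claim X_t."""
    ruins = 0
    for _ in range(n_sims):
        u = u0
        for _ in range(horizon):
            u = u * (1 + rate) + premium - claim()
            if u < 0:
                ruins += 1
                break
    return ruins / n_sims

# Example: exponential claims with mean 1, premium 1.1, 5% interest
print(ruin_probability(5.0, 1.1, lambda: random.expovariate(1.0), 0.05, 50))
```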
Abstract:
This work discusses the application of ensemble techniques to the development of multimodal recognition systems with revocable (cancellable) biometrics. Biometric systems are the future of identification and user access control, as evidenced by their constantly increasing adoption in today's society. However, much remains to be developed, mainly with regard to the accuracy, security, and processing time of such systems. In the search for more efficient techniques, multimodal systems and the use of revocable biometrics are promising, and can address many of the problems of traditional biometric recognition. A multimodal system is characterized by combining different biometric security techniques, overcoming many limitations, such as failures in the extraction or processing of the data. Among the various ways to develop a multimodal system, the use of ensembles is quite promising, motivated by the performance and flexibility they have demonstrated over the years in their many applications. Regarding security, one of the biggest problems is that biometric data is permanently tied to the user and cannot be changed if compromised. This problem has been addressed by techniques known as revocable biometrics, which consist of applying a transformation to the biometric data in order to protect the original characteristics, allowing cancellation and replacement. In order to contribute to this important subject, this work compares the performance of individual classifiers, as well as ensembles of classifiers, both on the original data and in biometric spaces transformed by different functions. Another highlighted factor is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency. One of the motivations of this work is to evaluate the gain that ensembles maximized by different GAs can bring to data in the transformed space. Another relevant goal is to generate even more efficient revocable systems by combining two or more transformation functions, demonstrating that it is possible to extract similar information by applying different transformation functions. All this makes clear the importance of revocable biometrics, ensembles, and GAs in the development of more efficient biometric systems, something increasingly important in the present day.
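To illustrate the two building blocks named above, the sketch below pairs one well-known cancellable transform (a key-dependent random projection, BioHashing-like: revoking a template just means issuing a new key) with simple majority voting over an ensemble. Both choices are generic assumptions; the dissertation compares several transformation functions and GA-tuned ensembles.

```python
import numpy as np

def cancellable_template(features, key):
    """Toy cancellable transform: key-dependent random projection,
    binarized so the original features are not directly recoverable."""
    rng = np.random.default_rng(key)
    P = rng.standard_normal((len(features) // 2, len(features)))
    return (P @ features > 0).astype(int)

def majority_vote(classifiers, x):
    """Combine base classifiers (an ensemble) by simple majority voting."""
    votes = [clf(x) for clf in classifiers]
    return max(set(votes), key=votes.count)
```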
Abstract:
This work introduces a new variant of the Traveling Car Renter Problem, named the Prize-collecting Traveling Car Renter Problem. In this problem, a set of vertices, each associated with a bonus, and a set of vehicles are given. The objective is to determine a cycle that visits some of the vertices, collecting at least a predefined bonus while minimizing the cost of the tour, which can be traveled with different vehicles. A mathematical formulation is presented and implemented in a solver to produce results for sixty-two instances. The proposed problem is also the subject of an experimental study based on the application of four metaheuristics representing the best adaptations of the state of the art in heuristic programming. We also provide new local search operators that exploit the neighborhoods of the problem, along with construction procedures and adjustments created specifically for the problem addressed. Comparative computational experiments and performance tests are performed on a sample of 80 instances, aiming to offer a competitive algorithm for the problem. We conclude that memetic algorithms, transgenetic algorithms, and a hybrid evolutionary algorithm are competitive in the tests performed.
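Any metaheuristic for the problem needs an evaluation function that checks the bonus constraint and prices a tour under vehicle changes. The sketch below is a deliberately simplified illustration: it charges a flat fee per vehicle change, whereas in the actual car-renter model the return fee depends on where each car is rented and delivered. All data structures are hypothetical.

```python
def tour_value(tour, cars, bonus, cost, swap_fee, min_bonus):
    """Evaluate a prize-collecting car-renter tour. `tour` is the vertex
    cycle, `cars[i]` the car used on leg i; infeasible tours score inf."""
    collected = sum(bonus[v] for v in tour)
    total = 0.0
    for i in range(len(tour)):
        u, v = tour[i], tour[(i + 1) % len(tour)]
        total += cost[cars[i]][(u, v)]        # per-car edge cost
        if i > 0 and cars[i] != cars[i - 1]:
            total += swap_fee                  # simplified change fee
    return total if collected >= min_bonus else float("inf")
```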
Abstract:
This work presents an integrated study of a modern analogue of the fluvial reservoirs of the Açu Formation (Unit 3). The modern analogue studied was the Assu River, located in the city of the same name, Rio Grande do Norte State, Northeast Brazil. A new methodology was developed to parameterize the fluvial geological bodies from GPR profiles (acquired with antennas of 50, 100, and 200 MHz central frequency); the main parameters obtained were width and thickness. Also as part of the parameterization, orthophotomaps were used to calculate the channel sinuosity and braiding parameters of the Assu River. This information was integrated into a database to supply input data for 3D geological models of fluvial reservoirs. An architectural characterization of the deposit was carried out through trench descriptions, GPR profile interpretation, and the study of natural exposures, in order to recognize and describe the facies and their associations, external and internal geometries, bounding surfaces, and architectural elements. Finally, a three-dimensional model was built using all the acquired data in association with real well data from a reservoir for which the Assu River is considered an analogue. Facies simulations used simple kriging (a deterministic algorithm), SIS, and Boolean (object-based) methods, the latter two stochastic; porosity was modeled with the stochastic SGS algorithm.
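For context on the deterministic interpolator named above, simple kriging estimates a value at an unsampled location by solving a linear system built from a covariance model; SGS then draws stochastic realizations around such estimates. The numpy sketch below assumes the known mean has already been subtracted from the data, and the exponential covariance model with its range and sill is an illustrative assumption.

```python
import numpy as np

def simple_kriging(coords, values, target, cov):
    """Simple kriging estimate at `target` (known mean pre-subtracted).
    cov(h): covariance as a function of separation distance h.
    Weights solve C w = c0; the estimate is w . values."""
    n = len(coords)
    C = np.array([[cov(np.linalg.norm(coords[i] - coords[j]))
                   for j in range(n)] for i in range(n)])
    c0 = np.array([cov(np.linalg.norm(coords[i] - target)) for i in range(n)])
    w = np.linalg.solve(C, c0)
    return w @ values

# Example with an exponential covariance (sill 1.0, range 10 m assumed)
cov = lambda h: 1.0 * np.exp(-h / 10.0)
pts = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
vals = np.array([0.12, 0.18, 0.15])   # e.g. mean-centered porosities
print(simple_kriging(pts, vals, np.array([2.0, 2.0]), cov))
```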