147 results for computational algorithms


Relevance:

10.00%

Publisher:

Abstract:

This research aims at developing a variable structure adaptive backstepping controller (VS-ABC) using state observers for SISO (Single Input, Single Output), linear, time-invariant systems with relative degree one. To this end, the filters were replaced by a Luenberger adaptive observer, and the control algorithm uses switching laws. The simulations presented compare the performance of the controller when the state variables are estimated by the observer with the case in which the variables are available for measurement. Even with numerous performance advantages, adaptive backstepping controllers still have very complex algorithms, especially when the system state variables are not measured, since the use of filters on the plant input and output is far from trivial. As an attempt to make the controller design more intuitive, an adaptive observer can be used as an alternative to the commonly used K-filters. Furthermore, since the state variables are considered known, the design becomes less dependent on the unknown plant parameters. Finally, switching laws can be used in the controller instead of the traditional integral adaptive laws, because they improve the transient performance of the system and increase robustness against external disturbances at the plant input.
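
The contrast between the traditional integral adaptive laws and the switching laws mentioned above can be illustrated with a minimal scalar model-reference example. The sketch below is not the VS-ABC of the thesis: it assumes a hypothetical first-order plant, arbitrary gains and bounds (`gamma`, `th1_bar`, `th2_bar`) and a square-wave reference, and only shows how the two adaptation mechanisms differ.

```python
# Illustrative sketch (not the thesis algorithm): scalar model-reference adaptive
# control of a first-order plant, comparing the traditional integral adaptive law
# with a switching (variable-structure) law. All gains and bounds are hypothetical.
import numpy as np

dt, T = 1e-3, 10.0
t = np.arange(0.0, T, dt)
r = np.sign(np.sin(0.5 * t))             # square-wave reference

a_p, k_p = 1.0, 2.0                      # "unknown" plant: dy = a_p*y + k_p*u
a_m, k_m = -3.0, 3.0                     # reference model: dym = a_m*ym + k_m*r

def simulate(switching):
    y = ym = 0.0
    th1 = th2 = 0.0                      # adaptive gains
    gamma = 5.0                          # integral adaptation gain (assumed)
    th1_bar, th2_bar = 3.0, 2.0          # bounds for the switching law (assumed)
    e_hist = np.zeros_like(t)
    for k, rk in enumerate(r):
        e = y - ym                       # tracking error
        if switching:
            # switching (variable-structure) law: relay driven by the error
            th1 = -th1_bar * np.sign(e * y)
            th2 = -th2_bar * np.sign(e * rk)
        else:
            # traditional integral adaptive law
            th1 += dt * (-gamma * e * y)
            th2 += dt * (-gamma * e * rk)
        u = th1 * y + th2 * rk
        y += dt * (a_p * y + k_p * u)    # Euler step of the plant
        ym += dt * (a_m * ym + k_m * rk) # Euler step of the reference model
        e_hist[k] = e
    return e_hist

e_int, e_vs = simulate(False), simulate(True)
print("final |e|, integral law: %.4f  switching law: %.4f"
      % (abs(e_int[-1]), abs(e_vs[-1])))
```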

Relevance:

10.00%

Publisher:

Abstract:

This work presents a formal verification technique for Procedural Reasoning Systems (PRS), a programming language that follows the procedural reasoning approach. The technique is based on conversion rules between PRS programs and Colored Petri Nets (CPN). To that end, conversion rules are presented for a highly expressive subset of the syntax used in the PRS language. In order to carry out the formal verification of the specified PRS program, once the Petri net equivalent to the PRS program is available, the CPN formalism (verification of structural and behavioral properties) is used to formally analyze the equivalent PRS program. An available computational tool is used to draw, simulate and analyze the generated colored Petri nets. Given the PRS-CPN conversion rules, one might be tempted to perform the conversion strictly by hand. However, the probability of introducing errors in the conversion is high, so the effort required to guarantee the correctness of a manual conversion is of the same order of magnitude as eliminating eventual errors directly in the original PRS program. Automated conversion is therefore essential, preventing the manual conversion from introducing undesirable errors that could invalidate the whole process. The main contribution of this research is the development of an automated formal verification technique consisting of two distinct but interrelated stages. The first stage concerns the conversion rules from PRS to CPN. The second stage concerns the development of a converter that transforms PRS programs into CPNs automatically. The automatic conversion is possible because all the conversion rules presented follow generic formation laws that can be encoded as algorithms.

Relevance:

10.00%

Publisher:

Abstract:

The development of computers and algorithms capable of increasingly accurate and rapid calculations, together with the theoretical foundation provided by quantum mechanics, has turned computer simulation into a valuable research tool. The importance of such a tool is due to its success in describing the physical and chemical properties of materials. One way of modifying the electronic properties of a given material is by applying an electric field. These effects are interesting in nanocones because their stability and geometric structure make them promising candidates for electron emission devices. In our study we performed first-principles calculations based on density functional theory, as implemented in the SIESTA code. We investigated aluminum nitride (AlN), boron nitride (BN) and carbon (C) nanocones subjected to external electric fields parallel and perpendicular to their main axis. We discuss stability in terms of the formation energy, using the chemical potential approach. We also analyze the electronic properties of these nanocones and show that, in some cases, the perpendicular electric field produces a larger gap reduction than the parallel field.
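
The chemical potential approach to the formation energy mentioned above is commonly written as below; this is a generic textbook expression, not a formula quoted from the work, with $n_i$ atoms of species $i$ and chemical potentials $\mu_i$.

```latex
% Generic formation-energy expression (standard chemical-potential approach;
% not quoted from the work): n_i atoms of species i, chemical potential mu_i.
\begin{equation}
  E_{\mathrm{form}} = E_{\mathrm{tot}}[\text{nanocone}] - \sum_{i} n_i \, \mu_i
\end{equation}
```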

Relevance:

10.00%

Publisher:

Abstract:

In this work we study Hidden Markov Models with both finite and general state spaces. In the finite case, the forward and backward algorithms are considered and the probability of a given observed sequence is computed. Next, we use the EM algorithm to estimate the model parameters. In the general case, kernel estimators are used to build a sequence of estimators that converges in L1-norm to the density function of the observable process.
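
As an illustration of the finite-state case, a minimal sketch of the forward algorithm is shown below (illustrative code, not taken from the work); the transition matrix `A`, emission matrix `B` and initial distribution `pi` are hypothetical.

```python
# Minimal forward algorithm for a finite-state HMM (illustrative sketch).
# A[i, j] : transition probability from state i to state j
# B[i, k] : probability of emitting symbol k from state i
# pi[i]   : initial state distribution
import numpy as np

def forward_probability(A, B, pi, obs):
    """Return P(obs | model) by summing the final forward variables."""
    alpha = pi * B[:, obs[0]]                 # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]         # induction step
    return alpha.sum()                        # termination

# Toy two-state example with hypothetical parameters.
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
pi = np.array([0.6, 0.4])
print(forward_probability(A, B, pi, obs=[0, 1, 0]))
```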

Relevance:

10.00%

Publisher:

Abstract:

Genetic Algorithms (GA) and Simulated Annealing (SA) are algorithms designed to find the maximum or minimum of a function that represents some characteristic of the process being modeled. Both algorithms have mechanisms that allow them to escape from local optima; however, their evolution in time is completely different. In its search process, SA works with a single point, from which it always generates a new candidate solution that is tested and may or may not be accepted, whereas the GA works with a set of points, called a population, from which it generates another population that is always accepted. What the two algorithms have in common is that the way the next point or the next population is generated obeys stochastic properties. In this work we show that the mathematical theory describing the evolution of these algorithms is the theory of Markov chains: the GA is described by a homogeneous Markov chain, whereas the SA is described by a non-homogeneous Markov chain. Finally, some computational examples are presented comparing the performance of the two algorithms.
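
A minimal sketch of the single-point SA search described above is given below, with a hypothetical objective function and an arbitrary geometric cooling schedule (illustrative only, not the computational examples of the work).

```python
# Minimal simulated annealing sketch (illustrative, not the thesis code).
# Minimizes a hypothetical 1-D multimodal objective; the cooling schedule and
# proposal width are arbitrary choices.
import math
import random

def objective(x):
    return x * x + 10.0 * math.sin(x)       # hypothetical multimodal function

def simulated_annealing(x0, t0=10.0, alpha=0.995, steps=10_000):
    x, fx, t = x0, objective(x0), t0
    for _ in range(steps):
        cand = x + random.gauss(0.0, 1.0)    # single new candidate point
        fc = objective(cand)
        # accept if better, or with Boltzmann probability if worse
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        t *= alpha                           # non-homogeneous: temperature decays
    return x, fx

print(simulated_annealing(x0=5.0))
```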

Relevance:

10.00%

Publisher:

Abstract:

In the work reported here we present theoretical and numerical results on a risk model with interest rate and proportional reinsurance, based on the article "Inequalities for the ruin probability in a controlled discrete-time risk process" by Rosario Romera and Maikol Diasparra (see [5]). Recursive and integral equations, as well as upper bounds for the ruin probability, are given considering three different approaches, namely the classical Lundberg inequality, the inductive approach and the martingale approach. Non-parametric density estimation techniques are used to derive upper bounds for the ruin probability, and the algorithms used in the simulation are presented.
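
To fix ideas, the sketch below estimates a finite-horizon ruin probability by simulation for a generic discrete-time surplus process with interest and proportional reinsurance; the premium rule, the Exp(1) claim law and the retention level are assumptions for illustration, not the model of [5].

```python
# Monte Carlo sketch of a finite-horizon ruin probability for a discrete-time
# risk process with interest and proportional reinsurance (illustrative only;
# the premium rule, claim law and retention level are hypothetical).
import random

def ruin_probability(u0, b=0.7, r=0.02, c=1.2, horizon=50, n_paths=20_000):
    """b: retained proportion of each claim, r: interest rate, c: gross premium."""
    ruined = 0
    for _ in range(n_paths):
        u = u0
        for _ in range(horizon):
            claim = random.expovariate(1.0)          # hypothetical Exp(1) claims
            premium = c - (1.0 - b) * 1.2 * 1.0      # cede (1-b) with 20% loading
            u = u * (1.0 + r) + premium - b * claim  # surplus recursion
            if u < 0:
                ruined += 1
                break
    return ruined / n_paths

print(ruin_probability(u0=5.0))
```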

Relevance:

10.00%

Publisher:

Abstract:

This work discusses the application of ensemble techniques to the development of multimodal recognition systems based on revocable (cancellable) biometrics. Biometric systems are the identification and access control techniques of the future, as evidenced by their constant growth in today's society. However, much progress is still needed, mainly with regard to the accuracy, security and processing time of such systems. In the search for more efficient techniques, multimodal systems and revocable biometrics are promising, and can address many of the problems of traditional biometric recognition. A multimodal system combines different biometric security techniques and thereby overcomes many limitations, such as failures in the extraction or processing of the data. Among the many ways to build a multimodal system, the use of ensembles is particularly promising, motivated by the performance and flexibility they have demonstrated over the years in their many applications. With regard to security, one of the biggest problems is that biometric traits are permanently tied to the user and cannot be changed if compromised. This problem has been addressed by techniques known as revocable biometrics, which apply a transformation to the biometric data in order to protect the original characteristics, allowing cancellation and replacement. In order to contribute to this important subject, this work compares the performance of individual classifiers, as well as ensembles of classifiers, both on the original data and in biometric spaces transformed by different functions. Another highlighted factor is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency. One of the motivations is to evaluate the gain that ensembles optimized by different GAs can bring to data in the transformed space. Another relevant factor is the creation of even more efficient revocable systems by combining two or more transformation functions, demonstrating that it is possible to extract information of a similar standard by applying different transformation functions. All of this makes clear the importance of revocable biometrics, ensembles and GAs in the development of more efficient biometric systems, something increasingly important today.
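
One common family of revocable transformations is a keyed random projection of the feature vector, where revoking a compromised template amounts to issuing a new key. The sketch below is a generic illustration of this idea; it is not one of the transformation functions evaluated in the work, and the feature vector is synthetic.

```python
# Generic sketch of a revocable (cancellable) biometric transform via a keyed
# random projection: the stored template depends on a user key, so a
# compromised template can be revoked by issuing a new key. Illustrative only;
# these are not the transformation functions evaluated in the work.
import numpy as np

def revocable_template(features, key, out_dim=32):
    rng = np.random.default_rng(key)                  # key-dependent projection
    proj = rng.standard_normal((out_dim, features.size))
    return np.sign(proj @ features)                   # binarized, non-invertible

features = np.random.rand(128)                        # hypothetical feature vector
old = revocable_template(features, key=1234)
new = revocable_template(features, key=9999)          # revocation: new key, new template
print("templates agree on %.0f%% of bits" % (100 * np.mean(old == new)))
```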

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces a new variant of the Traveling Car Renter Problem, named the Prize-collecting Traveling Car Renter Problem. In this problem, a set of vertices, each associated with a bonus, and a set of vehicles are given. The objective is to determine a cycle that visits some of the vertices, collecting at least a pre-defined bonus, while minimizing the cost of the tour, which can be traveled with different vehicles. A mathematical formulation is presented and implemented in a solver to produce results for sixty-two instances. The proposed problem is also the subject of an experimental study based on four metaheuristics representing the best adaptations of the state of the art in heuristic programming. We also provide new local search operators that exploit the neighborhoods of the problem, as well as construction procedures and adjustments created specifically for the addressed problem. Comparative computational experiments and performance tests are carried out on a sample of 80 instances, aiming to offer a competitive algorithm for the problem. We conclude that the memetic algorithm, the computational transgenetic algorithm and a hybrid evolutionary algorithm are competitive in the tests performed.
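
As a rough illustration of a construction procedure for a prize-collecting tour, the sketch below greedily inserts vertices by bonus-to-cost ratio until the minimum bonus is reached. It is a deliberate simplification: it ignores the car-rental (vehicle-swap) component of the problem, uses a single hypothetical cost matrix, and is not one of the procedures proposed in the paper.

```python
# Illustrative greedy construction for a prize-collecting tour (simplified:
# no vehicle swaps, single cost matrix; not the paper's constructive procedure).
def greedy_prize_tour(cost, bonus, min_bonus, depot=0):
    """Append the best bonus-per-cost vertex until min_bonus is collected."""
    tour, collected = [depot], bonus[depot]
    unvisited = set(range(len(bonus))) - {depot}
    while collected < min_bonus and unvisited:
        last = tour[-1]
        # pick the vertex with the best bonus/cost ratio from the current end
        nxt = max(unvisited, key=lambda v: bonus[v] / max(cost[last][v], 1e-9))
        tour.append(nxt)
        collected += bonus[nxt]
        unvisited.remove(nxt)
    tour.append(depot)                                   # close the cycle
    length = sum(cost[a][b] for a, b in zip(tour, tour[1:]))
    return tour, collected, length

# Toy instance with hypothetical costs and bonuses.
cost = [[0, 4, 9, 7], [4, 0, 3, 8], [9, 3, 0, 2], [7, 8, 2, 0]]
bonus = [0, 5, 6, 4]
print(greedy_prize_tour(cost, bonus, min_bonus=10))
```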

Relevance:

10.00%

Publisher:

Abstract:

This work presents an integrated study of a modern analog of the fluvial reservoirs of the Açu Formation (Unit 3). The modern analog studied was the Assu River, located in the city of the same name, Rio Grande do Norte State, Northeast Brazil. A new methodology was developed to parameterize the fluvial geological bodies from GPR profiles (acquired with central-frequency antennas of 50, 100 and 200 MHz). The main parameters obtained were width and thickness. Still as part of the parameterization, orthophotomaps were used to calculate the channel sinuosity and braiding parameters of the Assu River. This information is integrated into a database to supply input data for 3D geological models of fluvial reservoirs. An architectural characterization of the deposit was carried out through trench description, GPR profile interpretation and the study of natural exposures, in order to recognize and describe the facies and their associations, the external and internal geometries, the bounding surfaces and the architectural elements. Finally, a three-dimensional model was built using all the acquired data in association with real well data from a reservoir for which the Assu River is considered an analog. Facies were simulated with simple kriging (a deterministic algorithm) and with SIS and Boolean object-based techniques (both stochastic), and porosity was modeled with the stochastic SGS algorithm.
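
Channel sinuosity, one of the parameters extracted from the orthophotomaps, is conventionally computed as the along-channel length divided by the straight-line distance between the endpoints of the reach. The sketch below is a generic illustration with hypothetical centerline coordinates, not Assu River data.

```python
# Generic sinuosity calculation from channel centerline coordinates
# (illustrative sketch; the coordinates below are hypothetical).
import math

def sinuosity(centerline):
    """Along-channel length divided by straight-line distance between endpoints."""
    channel_len = sum(math.dist(a, b) for a, b in zip(centerline, centerline[1:]))
    straight_len = math.dist(centerline[0], centerline[-1])
    return channel_len / straight_len

centerline = [(0, 0), (120, 60), (260, 20), (400, 90), (520, 40)]  # meters (hypothetical)
print(round(sinuosity(centerline), 2))
```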

Relevance:

10.00%

Publisher:

Abstract:

The objective is to establish a methodology for monitoring oil spills on the sea surface in the Submerged Exploration Area of the Guamaré Pole Region, in the state of Rio Grande do Norte, using orbital Synthetic Aperture Radar (SAR) images integrated with meteo-oceanographic products. The methodology was applied in the following stages: (1) creation of a base map of the Exploration Area; (2) processing of NOAA/AVHRR and ERS-2 images to generate meteo-oceanographic products; (3) processing of RADARSAT-1 images for oil spill monitoring; (4) integration of the RADARSAT-1 images with the NOAA/AVHRR and ERS-2 image products; and (5) structuring of a database. The integration of the RADARSAT-1 image of the Potiguar Basin of 21.05.99 with the base map of the Exploration Area of the Guamaré Pole Region, used to identify the probable sources of the oil slicks, successfully detected a probable oil slick near the outlet of the submarine outfall in the Exploration Area. To support the integration of RADARSAT-1 images with NOAA/AVHRR and ERS-2 image products, a methodology was developed for classifying the oil spills identified in the RADARSAT-1 images. For this, the following unsupervised classification algorithms were tested: K-means, Fuzzy K-means and Isodata. These algorithms are part of the PCI Geomatics software, which was also used to filter the RADARSAT-1 images. To validate the results, the oil spills submitted to unsupervised classification were compared with the results of the Semivariogram Textural Classifier (STC). That classifier was developed specifically for oil spill classification and requires the PCI software for the whole processing of the RADARSAT-1 images. Finally, the classification results were analyzed through visual analysis, calculation of size proportionality and statistical analysis. Among the three classification algorithms tested, no significant differences were observed relative to the spills classified with the STC in any of the analyses considered. Therefore, considering all the procedures, the described methodology can be successfully applied using the unsupervised classifiers tested, resulting in a shorter identification and classification time for oil spills when compared with the use of the STC classifier.
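
A minimal K-means clustering over pixel intensities, of the kind used for unsupervised dark-spot classification, is sketched below. It is illustrative only: the synthetic image stands in for a filtered RADARSAT-1 scene, and this is not the PCI Geomatics implementation used in the work.

```python
# Minimal K-means over pixel intensities, illustrating unsupervised dark-spot
# classification on a SAR-like image (synthetic data; not the PCI Geomatics
# implementation used in the work).
import numpy as np

def kmeans_image(img, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    pixels = img.reshape(-1, 1).astype(float)
    centroids = rng.choice(pixels.ravel(), size=k, replace=False).reshape(-1, 1)
    for _ in range(iters):
        labels = np.argmin(np.abs(pixels - centroids.T), axis=1)    # assign step
        for j in range(k):                                          # update step
            if np.any(labels == j):
                centroids[j] = pixels[labels == j].mean()
    return labels.reshape(img.shape), centroids.ravel()

# Synthetic "scene": bright sea clutter with a darker slick-like patch.
rng = np.random.default_rng(1)
img = rng.normal(120, 15, size=(64, 64))
img[20:40, 20:45] = rng.normal(60, 10, size=(20, 25))
labels, centroids = kmeans_image(img, k=2)
print("class centroids:", np.sort(centroids))
```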

Relevance:

10.00%

Publisher:

Abstract:

Orbital remote sensing has been used as a beneficial tool for improving knowledge of the oceanographic and hydrodynamic aspects of the northern portion of the continental shelf of Rio Grande do Norte, in the offshore Potiguar Basin. Aspects such as geographic coverage and temporal and spatial resolution, combined with a consistent methodology, provide a substantial economic advantage over traditional in situ data collection methods. Images from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor on board NASA's AQUA satellite were obtained to support the systematic data collection of the campaign for environmental monitoring and characterization of the Potiguar Basin, held in May 2004. Images of total suspended matter (TSM) and standard radiance values were generated for the calculation of concentrations of total suspended matter, chlorophyll-a and sea surface temperature (SST). These data sets were used for statistical comparisons between in situ measurements and satellite estimates, aiming to validate the algorithms or to develop an empirical regional approach. The AQUA-MODIS images allowed the simultaneous two-dimensional comparison of the variability of water quality (total suspended matter), phytoplankton biomass (chlorophyll-a) and physical conditions (temperature). For the total suspended matter images, the generated models showed a good correlation with the field data, allowing quantitative and qualitative analyses. The chlorophyll-a images showed a consistent correlation with the in situ concentration values. The algorithms adjusted for these images achieved a reasonably good correlation coefficient with the field data, suggesting that the sensor may be responding to the whole water column and not just the surface. This led to fitting the image-derived data against the in situ chlorophyll-a averaged over the whole sampled water column down to the level of the first optical depth. This method resulted in higher chlorophyll concentration values at greater depths, because more chlorophyll values of the water column are integrated, and thus it better represents the biomass available in the water column. SST images and in situ SST measurements showed a mean difference ΔT (SST in situ - SST satellite) of about -0.14 °C, which is considered low and makes the results very good. The integration of total suspended matter, chlorophyll-a, sea surface temperature (SST) and auxiliary data enabled the recognition of some of the main sea-floor features of the continental shelf. The main features highlighted were the submerged canyons of the Apodi and Açu rivers, some of the beachrock lines and reefs, structural highs, and the continental shelf break, which occurs at depths of around -60 m. The results confirmed the high potential of AQUA-MODIS images for the environmental monitoring of sea areas, given the ease of detection of the two-dimensional field of suspended matter at the sea surface, the temperature and the chlorophyll-a concentration.
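
The kind of match-up statistics described above (correlation, empirical linear adjustment and mean difference between satellite estimates and in situ measurements) can be computed as in the sketch below; the SST values are hypothetical, purely to illustrate the procedure.

```python
# Sketch of a satellite / in situ match-up comparison: correlation coefficient,
# linear fit and mean difference (bias). Values are hypothetical, only to
# illustrate the kind of validation statistics described above.
import numpy as np

insitu = np.array([27.8, 28.1, 28.4, 28.0, 27.6, 28.3])   # hypothetical SST (degC)
sat    = np.array([27.9, 28.2, 28.6, 28.2, 27.7, 28.5])   # hypothetical MODIS SST

r = np.corrcoef(insitu, sat)[0, 1]             # correlation coefficient
slope, intercept = np.polyfit(sat, insitu, 1)  # empirical linear adjustment
bias = np.mean(insitu - sat)                   # DT = SST in situ - SST satellite

print(f"r = {r:.3f}, fit: insitu = {slope:.2f}*sat + {intercept:.2f}, bias = {bias:.2f} degC")
```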

Relevance:

10.00%

Publisher:

Abstract:

The increasing use of high-resolution shallow seismic methods for the investigation of geological, environmental or industrial problems has driven the development of processing techniques, flows and computational algorithms. Until recently, applying processing techniques to this kind of data was not common practice, and the data were interpreted as acquired. In order to facilitate and contribute to the improvement of the practices adopted, a free and open-source graphical application called OpenSeismic was developed, based on the free software Seismic Un*x, which is widely used in the processing of conventional seismic data for hydrocarbon reservoir exploration. The data used to validate the initiative were high-resolution marine seismic data acquired by the Laboratory of Geology and Marine Geophysics and Environmental Monitoring (GGEMMA) of the Federal University of Rio Grande do Norte (UFRN) for the SISPLAT Project, located in the region of the paleo-valley of the Açu River. These data were submitted to the processing flow developed by Gomes (2009) using the free software developed in this work, OpenSeismic, as well as other free software, Seismic Un*x, and the commercial software ProMAX, which, despite their peculiarities, produced similar results.