991 results for CNPQ::CIENCIAS EXATAS E DA TERRA::MATEMATICA APLICADA E ESTATÍSTICA
Abstract:
A real-space renormalization group method is used to investigate the criticality (phase diagrams, critical exponents, and universality classes) of the Z(4) model in two and three dimensions. The values of the interaction parameters are chosen so as to cover the complete phase diagrams of the model, which presents the following phases: (i) Paramagnetic (P); (ii) Ferromagnetic (F); (iii) Antiferromagnetic (AF); (iv) Intermediate Ferromagnetic (IF); and (v) Intermediate Antiferromagnetic (IAF). On the hierarchical lattices generated by the renormalization, the phase diagrams are exact; it is also possible to obtain approximate results for the square and simple cubic lattices. In the two-dimensional case a self-dual lattice is used, and the resulting phase diagram reproduces all the exact results known for the square lattice. The Migdal-Kadanoff transformation is applied to the three-dimensional case, and the additional phases previously suggested by Ditzian et al. are not found.
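For readers unfamiliar with the technique, the Migdal-Kadanoff recursion takes a particularly compact form in the simpler Ising (Z(2)) case; the version below, with bond moving performed before decimation, is a textbook illustration only, since the Z(4) model of the thesis carries two coupling constants per bond and its actual recursion acts on a two-parameter space:

\[
\tanh K' = \left[ \tanh\!\left( b^{\,d-1} K \right) \right]^{b}
\]

Here \(b\) is the length-rescaling factor and \(d\) the spatial dimension; fixed points \(K^{*}\) of the map give the critical couplings, and the thermal exponent follows from the linearization, \(y_t = \ln\!\left(\partial K'/\partial K\big|_{K^{*}}\right)/\ln b\).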
Abstract:
In the present work we use a plasma jet system with a hollow cathode to deposit thin TiO2 films on silicon substrates, as an alternative to the sol-gel, PECVD, dip-coating, and magnetron sputtering techniques. The cylindrical cathode, made from pure titanium, can be negatively polarized between 0 and 1200 V and supports an electrical current of up to 1 A. An Ar/O2 mixture, with a total flux of 20 sccm and an O2 percentage ranging between 0 and 30%, is passed through a cylindrical hole machined in the cathode. The plasma parameters and their influence on the properties of the deposited TiO2 films and on their deposition rate were studied. When the discharge occurs, titanium atoms are sputtered/evaporated; they are transported by the jet and deposited on the Si substrates located on the substrate holder facing the plasma jet system, at a distance ranging between 10 and 50 mm from the cathode. The working pressure was 10⁻³ mbar and the deposition time was 10-60 min. The deposited films were characterized by scanning electron microscopy and atomic force microscopy to check film uniformity and morphology, and by X-ray diffraction to analyze qualitatively the phases present. We also present a recently developed device, called the ionizing cage, derived from active screen plasma nitriding (ASPN) but based on the hollow cathode effect. In this process, the sample is enclosed in a cage to which the cathodic potential is applied. The samples were placed on an insulating substrate holder, remaining at floating potential, and were then treated in a reactive plasma in the hollow cathode regime. Moreover, the edge effect was completely eliminated, since the plasma was formed on the cage and not directly on the samples, and a uniform layer was obtained on all samples.
Abstract:
In this work we present two estimation methods for accelerated failure time models with random effects for grouped survival data. The first method, implemented in the SAS software via the NLMIXED procedure, uses an adaptive Gauss-Hermite quadrature to compute the marginal likelihood. The second method, implemented in the free software R, is based on penalized likelihood to estimate the parameters of the model. For the first approach we describe the main theoretical aspects and, for the second, we briefly present the adopted approach together with a simulation study investigating the performance of the method. We apply both models to real data on the operating time of oil wells from the Potiguar Basin (RN/CE).
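To make the quadrature idea concrete, here is a minimal sketch (not the thesis code) of how a Gauss-Hermite rule marginalizes a normal random intercept in a log-normal AFT model; the model specification, the function name, and the omission of censoring terms are simplifying assumptions for the example:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def group_marginal_loglik(log_t, X, beta, sigma, sigma_b, n_nodes=20):
    """Marginal log-likelihood of one group in a log-normal AFT model with
    a normal random intercept b ~ N(0, sigma_b^2), integrated out with a
    (non-adaptive) Gauss-Hermite rule. Censoring terms omitted for brevity."""
    z, w = hermgauss(n_nodes)            # physicists' nodes and weights
    b = np.sqrt(2.0) * sigma_b * z       # change of variables for N(0, sigma_b^2)
    mu = X @ beta                        # fixed-effect linear predictor
    # standardized residuals for every (observation, node) pair
    r = (log_t[:, None] - mu[:, None] - b[None, :]) / sigma
    log_dens = -0.5 * r**2 - np.log(sigma * np.sqrt(2.0 * np.pi))
    cond = log_dens.sum(axis=0)          # group log-likelihood at each node
    return np.log(np.dot(w, np.exp(cond)) / np.sqrt(np.pi))
```

The "adaptive" variant used by NLMIXED additionally recenters and rescales the nodes around each group's empirical Bayes mode, which the plain rule above omits.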
Abstract:
The aim of this work is to evaluate, using chemometrics, the influence of organic matter on the results of metal analyses (Zn, Pb, Al, Cu, Cr, Fe, Cd, and Ni) by Atomic Absorption Spectrometry (AAS), both in the extraction stage and in the reading stage. The samples used in this study were bottom sediment collected from the Jundiaí river in the vicinity of the city of Macaíba-RN, commercial humus, and water from the sewage treatment station of UFRN. The AAS analyses showed that the interference of the organic matter occurs in the extraction stage and not in the reading. Regarding X-Ray Fluorescence Spectrometry (XRFS), this work also evaluates the viability of the technique for the quantitative analysis of trace metals (Cr, Ni, Cu, Zn, Rb, Sr, and Pb) in leachates obtained by extraction with aqua regia into an aqueous solution. The samples consist of the fine fraction (<0.063 mm) of sediments from the marsh of the Jundiaí river. Pressed tablets prepared from the dry residue of those leachates allowed their analysis in solid form. This preliminary study shows that, for the partial chemical digestion of the fine fraction of bottom sediments used in environmental studies, XRFS applied to the analysis of dry residues from aqua regia leachates, compared with the analysis of the leachates by ICP-OES, presents relative errors below 10% for Cu, Pb, Sr, and Zn.
Abstract:
Clustering data is a very important task in data mining, image processing, and pattern recognition problems. One of the most popular clustering algorithms is Fuzzy C-Means (FCM). This thesis proposes a new way of calculating the cluster centers in the FCM procedure, here called ckMeans, which can also be used in some variants of FCM; in particular, we apply it to those variants that use other distances. The goal of this change is to reduce the number of iterations and the processing time of these algorithms without affecting the quality of the partition, or even to improve the number of correct classifications in some cases. We also developed an algorithm based on ckMeans to handle interval data, considering interval membership degrees. This algorithm allows the representation of data without converting interval data into punctual data, as happens with other extensions of FCM that deal with interval data. In order to validate the proposed methodologies, a comparison was made among the ckMeans, K-Means, and FCM clustering algorithms (since the center-calculation procedure proposed here is similar to that of K-Means), considering three different distances and several well-known databases. The results of Interval ckMeans were compared with those of other clustering algorithms applied to an interval database containing the minimum and maximum monthly temperatures for a given year for 37 cities distributed across the continents.
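For reference, here is a compact sketch of the standard FCM loop whose center-update step the ckMeans proposal replaces; the abstract describes that replacement only as K-Means-like, so the comment marking the spot is an assumption:

```python
import numpy as np

def fcm(X, c, m=2.0, tol=1e-5, max_iter=100, seed=0):
    """Standard Fuzzy C-Means. X: (n, d) data matrix, c: number of clusters,
    m > 1: fuzzifier. Returns cluster centers and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                      # columns sum to 1
    for _ in range(max_iter):
        Um = U ** m
        # ckMeans replaces this weighted-mean center update with a
        # K-Means-like computation (assumption based on the abstract).
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        dist = np.fmax(dist, 1e-12)         # guard against zero distances
        U_new = dist ** (-2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=0)          # standard membership update
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U
```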
Abstract:
Model-oriented strategies have been used to facilitate product customization in the software product lines (SPL) context and to generate the source code of the derived products through variability management. Most of these strategies use a UML (Unified Modeling Language)-based model specification. Despite its wide application, UML-based model specification has some limitations: it is essentially graphic, it is imprecise in representing the semantics of the system architecture, and it generates large models, thus hampering the visualization and comprehension of the system elements. In contrast, architecture description languages (ADLs) provide graphic and textual support for the structural representation of architectural elements, their constraints, and their interactions. This thesis introduces ArchSPL-MDD, a model-driven strategy in which models are specified and configured using the LightPL-ACME ADL. The strategy is associated with a generic process with systematic activities that enable customized source code to be generated automatically from the product model. The ArchSPL-MDD strategy integrates aspect-oriented software development (AOSD), model-driven development (MDD), and SPL, thus enabling the explicit modeling as well as the modularization of variabilities and crosscutting concerns. The process is instantiated by the ArchSPL-MDD tool, which supports the specification of domain models (the focus of the development) in LightPL-ACME. ArchSPL-MDD uses the Ginga Digital TV middleware as a case study. In order to evaluate the efficiency, applicability, expressiveness, and complexity of the ArchSPL-MDD strategy, a controlled experiment was carried out comparing the ArchSPL-MDD tool with the GingaForAll tool, which instantiates the process of the UML-based GingaForAll strategy. Both tools were used for configuring the products of the Ginga SPL and generating the product source code.
Abstract:
This thesis proposes the architecture of a new multiagent system framework for the hybridization of metaheuristics, inspired by the general Particle Swarm Optimization (PSO) framework. The main contribution is an effective approach to solving hard combinatorial optimization problems. PSO was chosen as inspiration because it is inherently multiagent, which allows exploring features of multiagent systems such as learning and cooperation techniques. In the proposed architecture, particles are autonomous agents with memory and with methods for learning and decision making, using search strategies to move through the solution space. The concepts of position and velocity originally defined in PSO are redefined for this approach. The proposed architecture was applied to the Traveling Salesman Problem and to the Quadratic Assignment Problem, and computational experiments were performed to test its effectiveness. The experimental results were promising, with satisfactory performance, even though the potential of the proposed architecture has not yet been fully explored. In future research, the proposed approach will also be applied to multiobjective combinatorial optimization problems, which are closer to real-world problems. In the context of applied research, we intend to work with both undergraduate and technical-level students on applying the proposed architecture to real-world problems.
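As an illustration of how position and velocity can be redefined for a permutation space such as the TSP's, here is a minimal sketch using the common swap-sequence encoding; this encoding, and all names in it, are assumptions for the example, not necessarily the thesis's definitions:

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour; dist is a square distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def swaps_towards(current, target):
    """'Velocity': the swap sequence that turns tour `current` into `target`."""
    cur = list(current)
    pos = {city: i for i, city in enumerate(cur)}
    swaps = []
    for i, city in enumerate(target):
        if cur[i] != city:
            j = pos[city]
            swaps.append((i, j))
            pos[cur[i]], pos[city] = j, i
            cur[i], cur[j] = cur[j], cur[i]
    return swaps

def move(tour, swaps, rate):
    """'Position update': apply each swap of the velocity with probability `rate`."""
    tour = list(tour)
    for i, j in swaps:
        if random.random() < rate:
            tour[i], tour[j] = tour[j], tour[i]
    return tour
```

In a swarm built on these pieces, each agent keeps its own tour and, at every step, applies a fraction of the swaps leading toward its personal best and toward the swarm's best tour, the discrete analogue of the PSO velocity update.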
Abstract:
Interval mathematics is a mathematical theory that emerged in the 1960s to address questions of accuracy and efficiency that arise in the practice of scientific computing and in the solution of numerical problems. Classical approaches to computability theory deal with discrete problems (for example, over the natural numbers, the integers, strings over a finite alphabet, graphs, etc.). However, fields of pure and applied mathematics deal with problems involving real and complex numbers. This happens, for example, in numerical analysis, dynamical systems, computational geometry, and optimization theory. Thus, a computational approach to continuous problems is desirable, or even necessary, to treat formally analog computations and scientific computing in general. In the literature there are different approaches to computability over the real numbers, but an important difference among them lies in how the real number is represented. There are basically two lines of study of computability on the continuum. In the first, an approximation of the output with arbitrary precision is computed from a reasonable approximation of the input [Bra95]. The other line of research on real computability was developed by Blum, Shub, and Smale [BSS89]. In this approach, the so-called BSS machines, a real number is viewed as a finished entity, and the computable functions are generated from a class of basic functions (in a way similar to the partial recursive functions). In this dissertation we study the BSS model, used to characterize a theory of computability over the real numbers, and extend it to model computability on the space of real intervals. We thus present an approach to interval computability that is epistemologically different from the one studied by Bedregal and Acióly [Bed96, BA97a, BA97b], in which a real interval is viewed as the limit of rational intervals and the computability of a real interval function depends on the computability of a function over the rational intervals.
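The first notion mentioned above can be stated compactly; the formulation below is one common reading of it, and the exact definition varies by author:

\[
f:\mathbb{R}\to\mathbb{R} \text{ is computable if some machine, fed any } (q_n)\subseteq\mathbb{Q} \text{ with } |q_n - x|\le 2^{-n}, \text{ outputs } (p_n)\subseteq\mathbb{Q} \text{ with } |p_n - f(x)|\le 2^{-n}.
\]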
Abstract:
Interval arithmetic, well known as Moore arithmetic, does not possess the same properties as the real numbers, and for this reason it faces a problem of an operational nature when we want to solve interval equations, as extensions of real equations, through the usual equality and interval arithmetic: intervals have no additive inverse, and the distributivity of multiplication over addition does not hold for every triple of intervals. The lack of those properties prevents the use of equational logic, both for solving an interval equation and for representing a real equation, and also for the algebraic verification of properties of a computational system whose data are real numbers represented by intervals. However, with the notions of information order and of approximation on intervals, introduced by Acióly [6] in 1991, the idea emerges of an interval equation representing a real equation satisfactorily, since the terms of the interval equation carry the information about the solution of the real equation. In 1999, Santiago proposed the notion of simple equality and, later, of local equality for intervals [8, 33]. Based on that idea, this dissertation extends Santiago's local groups to local algebras, following the idea of Σ-algebras according to Hennessy [31] (1988) and Santiago [7] (1995). One of the contributions of this dissertation is Theorem 5.1.3.2, which guarantees that, when a local Σ-equation t ≈ t′ is deduced from E in the proposed system SDedLoc(E), the interpretations of t and t′ are locally equal in any local Σ-algebra A that satisfies the set E of fixed local equations, whenever t and t′ have meaning in A. This ensures a kind of soundness between the local equational logic and the local algebras.
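Two standard one-line witnesses of the failures just mentioned, stated in Moore arithmetic:

\[
[1,2] - [1,2] = [-1,1] \neq [0,0],
\]
\[
[1,2]\cdot\bigl([1,1] + [-1,-1]\bigr) = [0,0] \subsetneq [-1,1] = [1,2]\cdot[1,1] + [1,2]\cdot[-1,-1],
\]

so no non-degenerate interval has an additive inverse, and multiplication is only subdistributive over addition.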
Abstract:
In this work we apply the technique of differential cryptanalysis, introduced in 1990 by Biham and Shamir, to the Papílio cryptosystem, developed by Karla Ramos, in order to test it and, most importantly, to prove its relevance alongside other block ciphers such as DES, Blowfish, and FEAL-N(X). This technique is based on the analysis of differences between plaintexts and their respective ciphertexts, in search of patterns that assist in the discovery of the subkeys and, consequently, of the master key. These differences are obtained by XOR operations. Through this analysis, in addition to obtaining patterns of Papílio, we seek its main characteristics and its behavior throughout its 16 rounds, identifying and replacing, when necessary, factors that can be improved in accordance with its pre-established definitions, thus providing greater security in the use of the algorithm.
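The central bookkeeping object of a differential attack is the difference distribution table of a cipher's S-boxes; the sketch below computes it for a made-up 3-bit S-box (Papílio's actual tables are not given in the abstract):

```python
def difference_distribution_table(sbox):
    """ddt[dx][dy] counts how often S(x) ^ S(x ^ dx) == dy over all inputs x.
    Rows with entries far above average expose exploitable differentials."""
    n = len(sbox)
    ddt = [[0] * n for _ in range(n)]
    for x in range(n):
        for dx in range(n):
            ddt[dx][sbox[x] ^ sbox[x ^ dx]] += 1
    return ddt

# A made-up 3-bit S-box, for illustration only (not Papílio's):
toy_sbox = [6, 4, 1, 3, 0, 7, 5, 2]
ddt = difference_distribution_table(toy_sbox)
```

An entry ddt[dx][dy] much larger than average signals a high-probability differential; chaining such differentials across the 16 rounds is what yields information about the subkeys.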
Abstract:
The course of Algorithms and Programming proves to be a real obstacle for many students in computing programs. Students who are not familiar with the new ways of thinking required by the course, or who lack certain skills it demands, encounter difficulties that sometimes result in failure and dropout. Faced with this problem, a survey on the difficulties experienced by students was conducted as a way to understand the problem and to guide solutions that try to solve or ease those difficulties. This paper describes a methodology to be applied in the classroom based on the concepts of Meaningful Learning of David Ausubel. In addition to this theory, a tool developed at UFRN, named Takkou, was used with the intent of better motivating students in algorithms classes and exercising their logical reasoning. Finally, a comparative evaluation of the suggested methodology against the traditional methodology was carried out, and the results are discussed.
Abstract:
The segmentation of an image aims to subdivide it into constituent regions or objects that have some relevant semantic content. This subdivision can also be applied to videos; in that case, however, the objects appear in the various frames that compose the video. The task of segmenting an image becomes more complex when it is composed of objects defined by textural features, for which color information alone is not a good descriptor. Fuzzy segmentation is a region-growing segmentation algorithm that uses affinity functions to assign to each element of an image a grade of membership in each object (between 0 and 1). This work presents a modification of the fuzzy segmentation algorithm aimed at improving its time and space complexity. The algorithm was adapted to segment color videos, treating them as 3D volumes. To perform the segmentation, either a conventional color model or a hybrid model obtained by a method for choosing the best channels was used. The fuzzy segmentation algorithm was also applied to texture segmentation by using adaptive affinity functions defined for the texture of each object. Two types of affinity functions were used: one defined using the normal (Gaussian) probability distribution and the other using the Skew divergence, a variation of the Kullback-Leibler divergence, which is a measure of the difference between two probability distributions. Finally, the algorithm was tested on some videos and also on texture mosaics composed of images from the Brodatz album.
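For reference, one common parameterization of the Skew divergence mentioned above (the exact convention used in the thesis is not stated in the abstract):

\[
s_\alpha(p, q) = D_{\mathrm{KL}}\bigl(p \,\big\|\, \alpha q + (1-\alpha)\, p\bigr),
\qquad
D_{\mathrm{KL}}(p \,\|\, r) = \sum_i p_i \log\frac{p_i}{r_i},
\]

which, unlike the raw Kullback-Leibler divergence, remains finite when \(q\) has zero entries, provided \(\alpha < 1\).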
Abstract:
Data clustering is applied in various fields such as data mining, image processing, and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means algorithm (FCM) is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers; therefore, the choice of a good set of initial centers is very important for the performance of the algorithm. In FCM, however, the choice of initial centers is made randomly, which makes it difficult to find a good set. This work proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm, which can also be used in variants of FCM; here, these initialization methods were applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers and, in doing so, to reduce the number of iterations these algorithms need to converge and their processing time, without affecting the quality of the clustering, or even improving it in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods when applied to various data sets.
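The abstract does not spell out the three proposed methods, so as a generic illustration of deterministic seeding here is a farthest-first traversal, a classic scheme of the same flavor (illustrative only, not the thesis's methods):

```python
import numpy as np

def farthest_first_centers(X, c):
    """Deterministic seeding by farthest-first traversal: start at the point
    nearest the overall mean, then repeatedly add the point farthest from the
    centers chosen so far. Illustrative only; not the thesis's three methods."""
    first = int(np.argmin(np.linalg.norm(X - X.mean(axis=0), axis=1)))
    centers = [X[first]]
    for _ in range(c - 1):
        d = np.min([np.linalg.norm(X - m, axis=1) for m in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    return np.array(centers)
```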
Abstract:
The monitoring of earth dams makes use of visual inspection and instrumentation to identify and characterize the deterioration that compromises the security of earth dams and associated structures. Visual inspection is subjective and can lead to misinterpretation or omission of important information, and some problems are detected too late. Instrumentation is efficient, but certain technical or operational issues can impose restrictions. Thereby, visual inspections and instrumentation can lead to a lack of information. Geophysics offers consolidated, low-cost methods that are non-invasive and non-destructive. They have strong potential and can be used to assist instrumentation. In cases where visual inspection and instrumentation do not provide all the necessary information, geophysical methods can provide more complete and relevant information. In order to test these ideas, geophysical acquisitions were performed using georadar (GPR), electrical resistivity, seismic refraction, and refraction microtremor (ReMi) on the dike of the dam in Sant Llorenç de Montgai, located in the province of Lleida, 145 km from Barcelona, Catalonia. The results confirmed that each of the geophysical methods used responded satisfactorily to the conditions of the earth dike, the anomalies present, and the geological features found, such as alluvium and carbonate and evaporite rocks. It was also confirmed that these methods, when used in an integrated manner, are able to reduce the ambiguities of individual interpretations. They improve the imaging of the interior of dikes and of major geological features, thus allowing inspection of the massif and its foundation. Consequently, the results obtained in this study demonstrate that these geophysical methods are sufficiently effective for inspecting earth dams and that they are an important complement to the instrumentation and visual inspection used in dam security.